Jan 23 03:14:51 np0005593233 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 23 03:14:51 np0005593233 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 23 03:14:51 np0005593233 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 03:14:51 np0005593233 kernel: BIOS-provided physical RAM map:
Jan 23 03:14:51 np0005593233 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 23 03:14:51 np0005593233 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 23 03:14:51 np0005593233 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 23 03:14:51 np0005593233 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 23 03:14:51 np0005593233 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 23 03:14:51 np0005593233 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 23 03:14:51 np0005593233 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 23 03:14:51 np0005593233 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 23 03:14:51 np0005593233 kernel: NX (Execute Disable) protection: active
Jan 23 03:14:51 np0005593233 kernel: APIC: Static calls initialized
Jan 23 03:14:51 np0005593233 kernel: SMBIOS 2.8 present.
Jan 23 03:14:51 np0005593233 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 23 03:14:51 np0005593233 kernel: Hypervisor detected: KVM
Jan 23 03:14:51 np0005593233 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 23 03:14:51 np0005593233 kernel: kvm-clock: using sched offset of 3793423952 cycles
Jan 23 03:14:51 np0005593233 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 23 03:14:51 np0005593233 kernel: tsc: Detected 2800.000 MHz processor
Jan 23 03:14:51 np0005593233 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 23 03:14:51 np0005593233 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 23 03:14:51 np0005593233 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 23 03:14:51 np0005593233 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 23 03:14:51 np0005593233 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 23 03:14:51 np0005593233 kernel: Using GB pages for direct mapping
Jan 23 03:14:51 np0005593233 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 23 03:14:51 np0005593233 kernel: ACPI: Early table checksum verification disabled
Jan 23 03:14:51 np0005593233 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 23 03:14:51 np0005593233 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:14:51 np0005593233 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:14:51 np0005593233 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:14:51 np0005593233 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 23 03:14:51 np0005593233 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:14:51 np0005593233 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:14:51 np0005593233 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 23 03:14:51 np0005593233 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 23 03:14:51 np0005593233 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 23 03:14:51 np0005593233 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 23 03:14:51 np0005593233 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 23 03:14:51 np0005593233 kernel: No NUMA configuration found
Jan 23 03:14:51 np0005593233 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 23 03:14:51 np0005593233 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 23 03:14:51 np0005593233 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 23 03:14:51 np0005593233 kernel: Zone ranges:
Jan 23 03:14:51 np0005593233 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 23 03:14:51 np0005593233 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 23 03:14:51 np0005593233 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 03:14:51 np0005593233 kernel:  Device   empty
Jan 23 03:14:51 np0005593233 kernel: Movable zone start for each node
Jan 23 03:14:51 np0005593233 kernel: Early memory node ranges
Jan 23 03:14:51 np0005593233 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 23 03:14:51 np0005593233 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 23 03:14:51 np0005593233 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 03:14:51 np0005593233 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 23 03:14:51 np0005593233 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 23 03:14:51 np0005593233 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 23 03:14:51 np0005593233 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 23 03:14:51 np0005593233 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 23 03:14:51 np0005593233 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 23 03:14:51 np0005593233 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 23 03:14:51 np0005593233 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 23 03:14:51 np0005593233 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 23 03:14:51 np0005593233 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 23 03:14:51 np0005593233 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 23 03:14:51 np0005593233 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 23 03:14:51 np0005593233 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 23 03:14:51 np0005593233 kernel: TSC deadline timer available
Jan 23 03:14:51 np0005593233 kernel: CPU topo: Max. logical packages:   8
Jan 23 03:14:51 np0005593233 kernel: CPU topo: Max. logical dies:       8
Jan 23 03:14:51 np0005593233 kernel: CPU topo: Max. dies per package:   1
Jan 23 03:14:51 np0005593233 kernel: CPU topo: Max. threads per core:   1
Jan 23 03:14:51 np0005593233 kernel: CPU topo: Num. cores per package:     1
Jan 23 03:14:51 np0005593233 kernel: CPU topo: Num. threads per package:   1
Jan 23 03:14:51 np0005593233 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 23 03:14:51 np0005593233 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 23 03:14:51 np0005593233 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 23 03:14:51 np0005593233 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 23 03:14:51 np0005593233 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 23 03:14:51 np0005593233 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 23 03:14:51 np0005593233 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 23 03:14:51 np0005593233 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 23 03:14:51 np0005593233 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 23 03:14:51 np0005593233 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 23 03:14:51 np0005593233 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 23 03:14:51 np0005593233 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 23 03:14:51 np0005593233 kernel: Booting paravirtualized kernel on KVM
Jan 23 03:14:51 np0005593233 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 23 03:14:51 np0005593233 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 23 03:14:51 np0005593233 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 23 03:14:51 np0005593233 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 23 03:14:51 np0005593233 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 03:14:51 np0005593233 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 23 03:14:51 np0005593233 kernel: random: crng init done
Jan 23 03:14:51 np0005593233 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: Fallback order for Node 0: 0 
Jan 23 03:14:51 np0005593233 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 23 03:14:51 np0005593233 kernel: Policy zone: Normal
Jan 23 03:14:51 np0005593233 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 03:14:51 np0005593233 kernel: software IO TLB: area num 8.
Jan 23 03:14:51 np0005593233 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 23 03:14:51 np0005593233 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 23 03:14:51 np0005593233 kernel: ftrace: allocated 194 pages with 3 groups
Jan 23 03:14:51 np0005593233 kernel: Dynamic Preempt: voluntary
Jan 23 03:14:51 np0005593233 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 03:14:51 np0005593233 kernel: rcu: 	RCU event tracing is enabled.
Jan 23 03:14:51 np0005593233 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 23 03:14:51 np0005593233 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 23 03:14:51 np0005593233 kernel: 	Rude variant of Tasks RCU enabled.
Jan 23 03:14:51 np0005593233 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 23 03:14:51 np0005593233 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 03:14:51 np0005593233 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 23 03:14:51 np0005593233 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 03:14:51 np0005593233 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 03:14:51 np0005593233 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 03:14:51 np0005593233 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 23 03:14:51 np0005593233 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 03:14:51 np0005593233 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 23 03:14:51 np0005593233 kernel: Console: colour VGA+ 80x25
Jan 23 03:14:51 np0005593233 kernel: printk: console [ttyS0] enabled
Jan 23 03:14:51 np0005593233 kernel: ACPI: Core revision 20230331
Jan 23 03:14:51 np0005593233 kernel: APIC: Switch to symmetric I/O mode setup
Jan 23 03:14:51 np0005593233 kernel: x2apic enabled
Jan 23 03:14:51 np0005593233 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 23 03:14:51 np0005593233 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 23 03:14:51 np0005593233 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 23 03:14:51 np0005593233 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 23 03:14:51 np0005593233 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 23 03:14:51 np0005593233 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 23 03:14:51 np0005593233 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 23 03:14:51 np0005593233 kernel: Spectre V2 : Mitigation: Retpolines
Jan 23 03:14:51 np0005593233 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 23 03:14:51 np0005593233 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 23 03:14:51 np0005593233 kernel: RETBleed: Mitigation: untrained return thunk
Jan 23 03:14:51 np0005593233 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 23 03:14:51 np0005593233 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 23 03:14:51 np0005593233 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 23 03:14:51 np0005593233 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 23 03:14:51 np0005593233 kernel: x86/bugs: return thunk changed
Jan 23 03:14:51 np0005593233 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 23 03:14:51 np0005593233 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 23 03:14:51 np0005593233 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 23 03:14:51 np0005593233 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 23 03:14:51 np0005593233 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 23 03:14:51 np0005593233 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 23 03:14:51 np0005593233 kernel: Freeing SMP alternatives memory: 40K
Jan 23 03:14:51 np0005593233 kernel: pid_max: default: 32768 minimum: 301
Jan 23 03:14:51 np0005593233 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 23 03:14:51 np0005593233 kernel: landlock: Up and running.
Jan 23 03:14:51 np0005593233 kernel: Yama: becoming mindful.
Jan 23 03:14:51 np0005593233 kernel: SELinux:  Initializing.
Jan 23 03:14:51 np0005593233 kernel: LSM support for eBPF active
Jan 23 03:14:51 np0005593233 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 23 03:14:51 np0005593233 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 23 03:14:51 np0005593233 kernel: ... version:                0
Jan 23 03:14:51 np0005593233 kernel: ... bit width:              48
Jan 23 03:14:51 np0005593233 kernel: ... generic registers:      6
Jan 23 03:14:51 np0005593233 kernel: ... value mask:             0000ffffffffffff
Jan 23 03:14:51 np0005593233 kernel: ... max period:             00007fffffffffff
Jan 23 03:14:51 np0005593233 kernel: ... fixed-purpose events:   0
Jan 23 03:14:51 np0005593233 kernel: ... event mask:             000000000000003f
Jan 23 03:14:51 np0005593233 kernel: signal: max sigframe size: 1776
Jan 23 03:14:51 np0005593233 kernel: rcu: Hierarchical SRCU implementation.
Jan 23 03:14:51 np0005593233 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 23 03:14:51 np0005593233 kernel: smp: Bringing up secondary CPUs ...
Jan 23 03:14:51 np0005593233 kernel: smpboot: x86: Booting SMP configuration:
Jan 23 03:14:51 np0005593233 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 23 03:14:51 np0005593233 kernel: smp: Brought up 1 node, 8 CPUs
Jan 23 03:14:51 np0005593233 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 23 03:14:51 np0005593233 kernel: node 0 deferred pages initialised in 96ms
Jan 23 03:14:51 np0005593233 kernel: Memory: 7763736K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618364K reserved, 0K cma-reserved)
Jan 23 03:14:51 np0005593233 kernel: devtmpfs: initialized
Jan 23 03:14:51 np0005593233 kernel: x86/mm: Memory block size: 128MB
Jan 23 03:14:51 np0005593233 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 03:14:51 np0005593233 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 23 03:14:51 np0005593233 kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 03:14:51 np0005593233 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 03:14:51 np0005593233 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 23 03:14:51 np0005593233 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 03:14:51 np0005593233 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 03:14:51 np0005593233 kernel: audit: initializing netlink subsys (disabled)
Jan 23 03:14:51 np0005593233 kernel: audit: type=2000 audit(1769156087.665:1): state=initialized audit_enabled=0 res=1
Jan 23 03:14:51 np0005593233 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 23 03:14:51 np0005593233 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 03:14:51 np0005593233 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 23 03:14:51 np0005593233 kernel: cpuidle: using governor menu
Jan 23 03:14:51 np0005593233 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 03:14:51 np0005593233 kernel: PCI: Using configuration type 1 for base access
Jan 23 03:14:51 np0005593233 kernel: PCI: Using configuration type 1 for extended access
Jan 23 03:14:51 np0005593233 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 23 03:14:51 np0005593233 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 03:14:51 np0005593233 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 03:14:51 np0005593233 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 03:14:51 np0005593233 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 03:14:51 np0005593233 kernel: Demotion targets for Node 0: null
Jan 23 03:14:51 np0005593233 kernel: cryptd: max_cpu_qlen set to 1000
Jan 23 03:14:51 np0005593233 kernel: ACPI: Added _OSI(Module Device)
Jan 23 03:14:51 np0005593233 kernel: ACPI: Added _OSI(Processor Device)
Jan 23 03:14:51 np0005593233 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 03:14:51 np0005593233 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 03:14:51 np0005593233 kernel: ACPI: Interpreter enabled
Jan 23 03:14:51 np0005593233 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 23 03:14:51 np0005593233 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 23 03:14:51 np0005593233 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 23 03:14:51 np0005593233 kernel: PCI: Using E820 reservations for host bridge windows
Jan 23 03:14:51 np0005593233 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 23 03:14:51 np0005593233 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 03:14:51 np0005593233 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [3] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [4] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [5] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [6] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [7] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [8] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [9] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [10] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [11] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [12] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [13] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [14] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [15] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [16] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [17] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [18] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [19] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [20] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [21] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [22] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [23] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [24] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [25] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [26] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [27] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [28] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [29] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [30] registered
Jan 23 03:14:51 np0005593233 kernel: acpiphp: Slot [31] registered
Jan 23 03:14:51 np0005593233 kernel: PCI host bridge to bus 0000:00
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 23 03:14:51 np0005593233 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 23 03:14:51 np0005593233 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 23 03:14:51 np0005593233 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 23 03:14:51 np0005593233 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 23 03:14:51 np0005593233 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 23 03:14:51 np0005593233 kernel: iommu: Default domain type: Translated
Jan 23 03:14:51 np0005593233 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 23 03:14:51 np0005593233 kernel: SCSI subsystem initialized
Jan 23 03:14:51 np0005593233 kernel: ACPI: bus type USB registered
Jan 23 03:14:51 np0005593233 kernel: usbcore: registered new interface driver usbfs
Jan 23 03:14:51 np0005593233 kernel: usbcore: registered new interface driver hub
Jan 23 03:14:51 np0005593233 kernel: usbcore: registered new device driver usb
Jan 23 03:14:51 np0005593233 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 23 03:14:51 np0005593233 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 23 03:14:51 np0005593233 kernel: PTP clock support registered
Jan 23 03:14:51 np0005593233 kernel: EDAC MC: Ver: 3.0.0
Jan 23 03:14:51 np0005593233 kernel: NetLabel: Initializing
Jan 23 03:14:51 np0005593233 kernel: NetLabel:  domain hash size = 128
Jan 23 03:14:51 np0005593233 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 23 03:14:51 np0005593233 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 23 03:14:51 np0005593233 kernel: PCI: Using ACPI for IRQ routing
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 23 03:14:51 np0005593233 kernel: vgaarb: loaded
Jan 23 03:14:51 np0005593233 kernel: clocksource: Switched to clocksource kvm-clock
Jan 23 03:14:51 np0005593233 kernel: VFS: Disk quotas dquot_6.6.0
Jan 23 03:14:51 np0005593233 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 23 03:14:51 np0005593233 kernel: pnp: PnP ACPI init
Jan 23 03:14:51 np0005593233 kernel: pnp: PnP ACPI: found 5 devices
Jan 23 03:14:51 np0005593233 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 23 03:14:51 np0005593233 kernel: NET: Registered PF_INET protocol family
Jan 23 03:14:51 np0005593233 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 23 03:14:51 np0005593233 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 03:14:51 np0005593233 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 23 03:14:51 np0005593233 kernel: NET: Registered PF_XDP protocol family
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 23 03:14:51 np0005593233 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 23 03:14:51 np0005593233 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 23 03:14:51 np0005593233 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 78905 usecs
Jan 23 03:14:51 np0005593233 kernel: PCI: CLS 0 bytes, default 64
Jan 23 03:14:51 np0005593233 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 23 03:14:51 np0005593233 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 23 03:14:51 np0005593233 kernel: Trying to unpack rootfs image as initramfs...
Jan 23 03:14:51 np0005593233 kernel: ACPI: bus type thunderbolt registered
Jan 23 03:14:51 np0005593233 kernel: Initialise system trusted keyrings
Jan 23 03:14:51 np0005593233 kernel: Key type blacklist registered
Jan 23 03:14:51 np0005593233 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 23 03:14:51 np0005593233 kernel: zbud: loaded
Jan 23 03:14:51 np0005593233 kernel: integrity: Platform Keyring initialized
Jan 23 03:14:51 np0005593233 kernel: integrity: Machine keyring initialized
Jan 23 03:14:51 np0005593233 kernel: Freeing initrd memory: 87956K
Jan 23 03:14:51 np0005593233 kernel: NET: Registered PF_ALG protocol family
Jan 23 03:14:51 np0005593233 kernel: xor: automatically using best checksumming function   avx       
Jan 23 03:14:51 np0005593233 kernel: Key type asymmetric registered
Jan 23 03:14:51 np0005593233 kernel: Asymmetric key parser 'x509' registered
Jan 23 03:14:51 np0005593233 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 23 03:14:51 np0005593233 kernel: io scheduler mq-deadline registered
Jan 23 03:14:51 np0005593233 kernel: io scheduler kyber registered
Jan 23 03:14:51 np0005593233 kernel: io scheduler bfq registered
Jan 23 03:14:51 np0005593233 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 23 03:14:51 np0005593233 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 23 03:14:51 np0005593233 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 23 03:14:51 np0005593233 kernel: ACPI: button: Power Button [PWRF]
Jan 23 03:14:51 np0005593233 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 23 03:14:51 np0005593233 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 23 03:14:51 np0005593233 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 23 03:14:51 np0005593233 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 23 03:14:51 np0005593233 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 23 03:14:51 np0005593233 kernel: Non-volatile memory driver v1.3
Jan 23 03:14:51 np0005593233 kernel: rdac: device handler registered
Jan 23 03:14:51 np0005593233 kernel: hp_sw: device handler registered
Jan 23 03:14:51 np0005593233 kernel: emc: device handler registered
Jan 23 03:14:51 np0005593233 kernel: alua: device handler registered
Jan 23 03:14:51 np0005593233 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 23 03:14:51 np0005593233 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 23 03:14:51 np0005593233 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 23 03:14:51 np0005593233 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 23 03:14:51 np0005593233 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 23 03:14:51 np0005593233 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 23 03:14:51 np0005593233 kernel: usb usb1: Product: UHCI Host Controller
Jan 23 03:14:51 np0005593233 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 23 03:14:51 np0005593233 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 23 03:14:51 np0005593233 kernel: hub 1-0:1.0: USB hub found
Jan 23 03:14:51 np0005593233 kernel: hub 1-0:1.0: 2 ports detected
Jan 23 03:14:51 np0005593233 kernel: usbcore: registered new interface driver usbserial_generic
Jan 23 03:14:51 np0005593233 kernel: usbserial: USB Serial support registered for generic
Jan 23 03:14:51 np0005593233 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 23 03:14:51 np0005593233 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 23 03:14:51 np0005593233 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 23 03:14:51 np0005593233 kernel: mousedev: PS/2 mouse device common for all mice
Jan 23 03:14:51 np0005593233 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 23 03:14:51 np0005593233 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 23 03:14:51 np0005593233 kernel: rtc_cmos 00:04: registered as rtc0
Jan 23 03:14:51 np0005593233 kernel: rtc_cmos 00:04: setting system clock to 2026-01-23T08:14:50 UTC (1769156090)
Jan 23 03:14:51 np0005593233 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 23 03:14:51 np0005593233 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 23 03:14:51 np0005593233 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 23 03:14:51 np0005593233 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 23 03:14:51 np0005593233 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 23 03:14:51 np0005593233 kernel: usbcore: registered new interface driver usbhid
Jan 23 03:14:51 np0005593233 kernel: usbhid: USB HID core driver
Jan 23 03:14:51 np0005593233 kernel: drop_monitor: Initializing network drop monitor service
Jan 23 03:14:51 np0005593233 kernel: Initializing XFRM netlink socket
Jan 23 03:14:51 np0005593233 kernel: NET: Registered PF_INET6 protocol family
Jan 23 03:14:51 np0005593233 kernel: Segment Routing with IPv6
Jan 23 03:14:51 np0005593233 kernel: NET: Registered PF_PACKET protocol family
Jan 23 03:14:51 np0005593233 kernel: mpls_gso: MPLS GSO support
Jan 23 03:14:51 np0005593233 kernel: IPI shorthand broadcast: enabled
Jan 23 03:14:51 np0005593233 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 23 03:14:51 np0005593233 kernel: AES CTR mode by8 optimization enabled
Jan 23 03:14:51 np0005593233 kernel: sched_clock: Marking stable (3998007999, 149170850)->(4430175409, -282996560)
Jan 23 03:14:51 np0005593233 kernel: registered taskstats version 1
Jan 23 03:14:51 np0005593233 kernel: Loading compiled-in X.509 certificates
Jan 23 03:14:51 np0005593233 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 03:14:51 np0005593233 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 23 03:14:51 np0005593233 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 23 03:14:51 np0005593233 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 23 03:14:51 np0005593233 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 23 03:14:51 np0005593233 kernel: Demotion targets for Node 0: null
Jan 23 03:14:51 np0005593233 kernel: page_owner is disabled
Jan 23 03:14:51 np0005593233 kernel: Key type .fscrypt registered
Jan 23 03:14:51 np0005593233 kernel: Key type fscrypt-provisioning registered
Jan 23 03:14:51 np0005593233 kernel: Key type big_key registered
Jan 23 03:14:51 np0005593233 kernel: Key type encrypted registered
Jan 23 03:14:51 np0005593233 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 23 03:14:51 np0005593233 kernel: Loading compiled-in module X.509 certificates
Jan 23 03:14:51 np0005593233 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 03:14:51 np0005593233 kernel: ima: Allocated hash algorithm: sha256
Jan 23 03:14:51 np0005593233 kernel: ima: No architecture policies found
Jan 23 03:14:51 np0005593233 kernel: evm: Initialising EVM extended attributes:
Jan 23 03:14:51 np0005593233 kernel: evm: security.selinux
Jan 23 03:14:51 np0005593233 kernel: evm: security.SMACK64 (disabled)
Jan 23 03:14:51 np0005593233 kernel: evm: security.SMACK64EXEC (disabled)
Jan 23 03:14:51 np0005593233 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 23 03:14:51 np0005593233 kernel: evm: security.SMACK64MMAP (disabled)
Jan 23 03:14:51 np0005593233 kernel: evm: security.apparmor (disabled)
Jan 23 03:14:51 np0005593233 kernel: evm: security.ima
Jan 23 03:14:51 np0005593233 kernel: evm: security.capability
Jan 23 03:14:51 np0005593233 kernel: evm: HMAC attrs: 0x1
Jan 23 03:14:51 np0005593233 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 23 03:14:51 np0005593233 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 23 03:14:51 np0005593233 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 23 03:14:51 np0005593233 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 23 03:14:51 np0005593233 kernel: usb 1-1: Manufacturer: QEMU
Jan 23 03:14:51 np0005593233 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 23 03:14:51 np0005593233 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 23 03:14:51 np0005593233 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 23 03:14:51 np0005593233 kernel: Running certificate verification RSA selftest
Jan 23 03:14:51 np0005593233 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 23 03:14:51 np0005593233 kernel: Running certificate verification ECDSA selftest
Jan 23 03:14:51 np0005593233 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 23 03:14:51 np0005593233 kernel: clk: Disabling unused clocks
Jan 23 03:14:51 np0005593233 kernel: Freeing unused decrypted memory: 2028K
Jan 23 03:14:51 np0005593233 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 23 03:14:51 np0005593233 kernel: Write protecting the kernel read-only data: 30720k
Jan 23 03:14:51 np0005593233 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 23 03:14:51 np0005593233 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 23 03:14:51 np0005593233 kernel: Run /init as init process
Jan 23 03:14:51 np0005593233 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 03:14:51 np0005593233 systemd: Detected virtualization kvm.
Jan 23 03:14:51 np0005593233 systemd: Detected architecture x86-64.
Jan 23 03:14:51 np0005593233 systemd: Running in initrd.
Jan 23 03:14:51 np0005593233 systemd: No hostname configured, using default hostname.
Jan 23 03:14:51 np0005593233 systemd: Hostname set to <localhost>.
Jan 23 03:14:51 np0005593233 systemd: Initializing machine ID from VM UUID.
Jan 23 03:14:51 np0005593233 systemd: Queued start job for default target Initrd Default Target.
Jan 23 03:14:51 np0005593233 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 03:14:51 np0005593233 systemd: Reached target Local Encrypted Volumes.
Jan 23 03:14:51 np0005593233 systemd: Reached target Initrd /usr File System.
Jan 23 03:14:51 np0005593233 systemd: Reached target Local File Systems.
Jan 23 03:14:51 np0005593233 systemd: Reached target Path Units.
Jan 23 03:14:51 np0005593233 systemd: Reached target Slice Units.
Jan 23 03:14:51 np0005593233 systemd: Reached target Swaps.
Jan 23 03:14:51 np0005593233 systemd: Reached target Timer Units.
Jan 23 03:14:51 np0005593233 systemd: Listening on D-Bus System Message Bus Socket.
Jan 23 03:14:51 np0005593233 systemd: Listening on Journal Socket (/dev/log).
Jan 23 03:14:51 np0005593233 systemd: Listening on Journal Socket.
Jan 23 03:14:51 np0005593233 systemd: Listening on udev Control Socket.
Jan 23 03:14:51 np0005593233 systemd: Listening on udev Kernel Socket.
Jan 23 03:14:51 np0005593233 systemd: Reached target Socket Units.
Jan 23 03:14:51 np0005593233 systemd: Starting Create List of Static Device Nodes...
Jan 23 03:14:51 np0005593233 systemd: Starting Journal Service...
Jan 23 03:14:51 np0005593233 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 03:14:51 np0005593233 systemd: Starting Apply Kernel Variables...
Jan 23 03:14:51 np0005593233 systemd: Starting Create System Users...
Jan 23 03:14:51 np0005593233 systemd: Starting Setup Virtual Console...
Jan 23 03:14:51 np0005593233 systemd: Finished Create List of Static Device Nodes.
Jan 23 03:14:51 np0005593233 systemd: Finished Apply Kernel Variables.
Jan 23 03:14:51 np0005593233 systemd: Finished Create System Users.
Jan 23 03:14:51 np0005593233 systemd-journald[307]: Journal started
Jan 23 03:14:51 np0005593233 systemd-journald[307]: Runtime Journal (/run/log/journal/5e159ac4110b464c8264d020fcde6246) is 8.0M, max 153.6M, 145.6M free.
Jan 23 03:14:51 np0005593233 systemd-sysusers[311]: Creating group 'users' with GID 100.
Jan 23 03:14:51 np0005593233 systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Jan 23 03:14:51 np0005593233 systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 23 03:14:51 np0005593233 systemd: Starting Create Static Device Nodes in /dev...
Jan 23 03:14:51 np0005593233 systemd: Started Journal Service.
Jan 23 03:14:51 np0005593233 systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 03:14:51 np0005593233 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 03:14:51 np0005593233 systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 03:14:51 np0005593233 systemd[1]: Finished Setup Virtual Console.
Jan 23 03:14:51 np0005593233 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 23 03:14:51 np0005593233 systemd[1]: Starting dracut cmdline hook...
Jan 23 03:14:51 np0005593233 dracut-cmdline[331]: dracut-9 dracut-057-102.git20250818.el9
Jan 23 03:14:51 np0005593233 dracut-cmdline[331]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 03:14:51 np0005593233 systemd[1]: Finished dracut cmdline hook.
Jan 23 03:14:51 np0005593233 systemd[1]: Starting dracut pre-udev hook...
Jan 23 03:14:51 np0005593233 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 23 03:14:51 np0005593233 kernel: device-mapper: uevent: version 1.0.3
Jan 23 03:14:51 np0005593233 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 23 03:14:51 np0005593233 kernel: RPC: Registered named UNIX socket transport module.
Jan 23 03:14:51 np0005593233 kernel: RPC: Registered udp transport module.
Jan 23 03:14:51 np0005593233 kernel: RPC: Registered tcp transport module.
Jan 23 03:14:51 np0005593233 kernel: RPC: Registered tcp-with-tls transport module.
Jan 23 03:14:51 np0005593233 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 23 03:14:52 np0005593233 rpc.statd[447]: Version 2.5.4 starting
Jan 23 03:14:52 np0005593233 rpc.statd[447]: Initializing NSM state
Jan 23 03:14:52 np0005593233 rpc.idmapd[452]: Setting log level to 0
Jan 23 03:14:52 np0005593233 systemd[1]: Finished dracut pre-udev hook.
Jan 23 03:14:52 np0005593233 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 03:14:52 np0005593233 systemd-udevd[465]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 03:14:52 np0005593233 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 03:14:52 np0005593233 systemd[1]: Starting dracut pre-trigger hook...
Jan 23 03:14:52 np0005593233 systemd[1]: Finished dracut pre-trigger hook.
Jan 23 03:14:52 np0005593233 systemd[1]: Starting Coldplug All udev Devices...
Jan 23 03:14:52 np0005593233 systemd[1]: Created slice Slice /system/modprobe.
Jan 23 03:14:52 np0005593233 systemd[1]: Starting Load Kernel Module configfs...
Jan 23 03:14:52 np0005593233 systemd[1]: Finished Coldplug All udev Devices.
Jan 23 03:14:52 np0005593233 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 03:14:52 np0005593233 systemd[1]: Reached target Network.
Jan 23 03:14:52 np0005593233 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 03:14:52 np0005593233 systemd[1]: Starting dracut initqueue hook...
Jan 23 03:14:52 np0005593233 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 03:14:52 np0005593233 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 03:14:52 np0005593233 systemd[1]: Mounting Kernel Configuration File System...
Jan 23 03:14:52 np0005593233 systemd[1]: Mounted Kernel Configuration File System.
Jan 23 03:14:52 np0005593233 systemd[1]: Reached target System Initialization.
Jan 23 03:14:52 np0005593233 systemd[1]: Reached target Basic System.
Jan 23 03:14:52 np0005593233 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 23 03:14:52 np0005593233 kernel: scsi host0: ata_piix
Jan 23 03:14:52 np0005593233 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 23 03:14:52 np0005593233 kernel: scsi host1: ata_piix
Jan 23 03:14:52 np0005593233 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 23 03:14:52 np0005593233 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 23 03:14:52 np0005593233 kernel: vda: vda1
Jan 23 03:14:52 np0005593233 kernel: ata1: found unknown device (class 0)
Jan 23 03:14:52 np0005593233 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 23 03:14:52 np0005593233 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 23 03:14:53 np0005593233 systemd-udevd[517]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:14:53 np0005593233 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 03:14:53 np0005593233 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 23 03:14:53 np0005593233 systemd[1]: Reached target Initrd Root Device.
Jan 23 03:14:53 np0005593233 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 23 03:14:53 np0005593233 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 23 03:14:53 np0005593233 systemd[1]: Finished dracut initqueue hook.
Jan 23 03:14:53 np0005593233 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 03:14:53 np0005593233 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 23 03:14:53 np0005593233 systemd[1]: Reached target Remote File Systems.
Jan 23 03:14:53 np0005593233 systemd[1]: Starting dracut pre-mount hook...
Jan 23 03:14:53 np0005593233 systemd[1]: Finished dracut pre-mount hook.
Jan 23 03:14:53 np0005593233 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 23 03:14:53 np0005593233 systemd-fsck[560]: /usr/sbin/fsck.xfs: XFS file system.
Jan 23 03:14:53 np0005593233 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 03:14:53 np0005593233 systemd[1]: Mounting /sysroot...
Jan 23 03:14:54 np0005593233 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 23 03:14:54 np0005593233 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 23 03:14:54 np0005593233 kernel: XFS (vda1): Ending clean mount
Jan 23 03:14:54 np0005593233 systemd[1]: Mounted /sysroot.
Jan 23 03:14:54 np0005593233 systemd[1]: Reached target Initrd Root File System.
Jan 23 03:14:54 np0005593233 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 23 03:14:54 np0005593233 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 23 03:14:54 np0005593233 systemd[1]: Reached target Initrd File Systems.
Jan 23 03:14:54 np0005593233 systemd[1]: Reached target Initrd Default Target.
Jan 23 03:14:54 np0005593233 systemd[1]: Starting dracut mount hook...
Jan 23 03:14:54 np0005593233 systemd[1]: Finished dracut mount hook.
Jan 23 03:14:54 np0005593233 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 23 03:14:54 np0005593233 rpc.idmapd[452]: exiting on signal 15
Jan 23 03:14:54 np0005593233 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 23 03:14:54 np0005593233 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Network.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Timer Units.
Jan 23 03:14:54 np0005593233 systemd[1]: dbus.socket: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 23 03:14:54 np0005593233 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Initrd Default Target.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Basic System.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Initrd Root Device.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Initrd /usr File System.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Path Units.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Remote File Systems.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Slice Units.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Socket Units.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target System Initialization.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Local File Systems.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Swaps.
Jan 23 03:14:54 np0005593233 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped dracut mount hook.
Jan 23 03:14:54 np0005593233 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped dracut pre-mount hook.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 23 03:14:54 np0005593233 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped dracut initqueue hook.
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped Coldplug All udev Devices.
Jan 23 03:14:54 np0005593233 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped dracut pre-trigger hook.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped Setup Virtual Console.
Jan 23 03:14:54 np0005593233 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-udevd.service: Consumed 2.730s CPU time.
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Closed udev Control Socket.
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Closed udev Kernel Socket.
Jan 23 03:14:54 np0005593233 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped dracut pre-udev hook.
Jan 23 03:14:54 np0005593233 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped dracut cmdline hook.
Jan 23 03:14:54 np0005593233 systemd[1]: Starting Cleanup udev Database...
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 23 03:14:54 np0005593233 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 23 03:14:54 np0005593233 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Stopped Create System Users.
Jan 23 03:14:54 np0005593233 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 23 03:14:54 np0005593233 systemd[1]: Finished Cleanup udev Database.
Jan 23 03:14:54 np0005593233 systemd[1]: Reached target Switch Root.
Jan 23 03:14:54 np0005593233 systemd[1]: Starting Switch Root...
Jan 23 03:14:54 np0005593233 systemd[1]: Switching root.
Jan 23 03:14:54 np0005593233 systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Jan 23 03:14:54 np0005593233 systemd-journald[307]: Journal stopped
Jan 23 03:14:56 np0005593233 kernel: audit: type=1404 audit(1769156094.913:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 23 03:14:56 np0005593233 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:14:56 np0005593233 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:14:56 np0005593233 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:14:56 np0005593233 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:14:56 np0005593233 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:14:56 np0005593233 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:14:56 np0005593233 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:14:56 np0005593233 kernel: audit: type=1403 audit(1769156095.072:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 23 03:14:56 np0005593233 systemd: Successfully loaded SELinux policy in 162.407ms.
Jan 23 03:14:56 np0005593233 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.541ms.
Jan 23 03:14:56 np0005593233 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 03:14:56 np0005593233 systemd: Detected virtualization kvm.
Jan 23 03:14:56 np0005593233 systemd: Detected architecture x86-64.
Jan 23 03:14:56 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:14:56 np0005593233 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 23 03:14:56 np0005593233 systemd: Stopped Switch Root.
Jan 23 03:14:56 np0005593233 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 23 03:14:56 np0005593233 systemd: Created slice Slice /system/getty.
Jan 23 03:14:56 np0005593233 systemd: Created slice Slice /system/serial-getty.
Jan 23 03:14:56 np0005593233 systemd: Created slice Slice /system/sshd-keygen.
Jan 23 03:14:56 np0005593233 systemd: Created slice User and Session Slice.
Jan 23 03:14:56 np0005593233 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 03:14:56 np0005593233 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 23 03:14:56 np0005593233 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 23 03:14:56 np0005593233 systemd: Reached target Local Encrypted Volumes.
Jan 23 03:14:56 np0005593233 systemd: Stopped target Switch Root.
Jan 23 03:14:56 np0005593233 systemd: Stopped target Initrd File Systems.
Jan 23 03:14:56 np0005593233 systemd: Stopped target Initrd Root File System.
Jan 23 03:14:56 np0005593233 systemd: Reached target Local Integrity Protected Volumes.
Jan 23 03:14:56 np0005593233 systemd: Reached target Path Units.
Jan 23 03:14:56 np0005593233 systemd: Reached target rpc_pipefs.target.
Jan 23 03:14:56 np0005593233 systemd: Reached target Slice Units.
Jan 23 03:14:56 np0005593233 systemd: Reached target Swaps.
Jan 23 03:14:56 np0005593233 systemd: Reached target Local Verity Protected Volumes.
Jan 23 03:14:56 np0005593233 systemd: Listening on RPCbind Server Activation Socket.
Jan 23 03:14:56 np0005593233 systemd: Reached target RPC Port Mapper.
Jan 23 03:14:56 np0005593233 systemd: Listening on Process Core Dump Socket.
Jan 23 03:14:56 np0005593233 systemd: Listening on initctl Compatibility Named Pipe.
Jan 23 03:14:56 np0005593233 systemd: Listening on udev Control Socket.
Jan 23 03:14:56 np0005593233 systemd: Listening on udev Kernel Socket.
Jan 23 03:14:56 np0005593233 systemd: Mounting Huge Pages File System...
Jan 23 03:14:56 np0005593233 systemd: Mounting POSIX Message Queue File System...
Jan 23 03:14:56 np0005593233 systemd: Mounting Kernel Debug File System...
Jan 23 03:14:56 np0005593233 systemd: Mounting Kernel Trace File System...
Jan 23 03:14:56 np0005593233 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 03:14:56 np0005593233 systemd: Starting Create List of Static Device Nodes...
Jan 23 03:14:56 np0005593233 systemd: Starting Load Kernel Module configfs...
Jan 23 03:14:56 np0005593233 systemd: Starting Load Kernel Module drm...
Jan 23 03:14:56 np0005593233 systemd: Starting Load Kernel Module efi_pstore...
Jan 23 03:14:56 np0005593233 systemd: Starting Load Kernel Module fuse...
Jan 23 03:14:56 np0005593233 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 23 03:14:56 np0005593233 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 23 03:14:56 np0005593233 systemd: Stopped File System Check on Root Device.
Jan 23 03:14:56 np0005593233 systemd: Stopped Journal Service.
Jan 23 03:14:56 np0005593233 systemd: Starting Journal Service...
Jan 23 03:14:56 np0005593233 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 03:14:56 np0005593233 systemd: Starting Generate network units from Kernel command line...
Jan 23 03:14:56 np0005593233 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 03:14:56 np0005593233 systemd: Starting Remount Root and Kernel File Systems...
Jan 23 03:14:56 np0005593233 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 03:14:56 np0005593233 systemd: Starting Apply Kernel Variables...
Jan 23 03:14:56 np0005593233 systemd: Starting Coldplug All udev Devices...
Jan 23 03:14:56 np0005593233 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 23 03:14:56 np0005593233 systemd-journald[685]: Journal started
Jan 23 03:14:56 np0005593233 systemd-journald[685]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 03:14:56 np0005593233 kernel: ACPI: bus type drm_connector registered
Jan 23 03:14:56 np0005593233 systemd[1]: Queued start job for default target Multi-User System.
Jan 23 03:14:56 np0005593233 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 23 03:14:56 np0005593233 systemd: Started Journal Service.
Jan 23 03:14:56 np0005593233 kernel: fuse: init (API version 7.37)
Jan 23 03:14:56 np0005593233 systemd[1]: Mounted Huge Pages File System.
Jan 23 03:14:56 np0005593233 systemd[1]: Mounted POSIX Message Queue File System.
Jan 23 03:14:56 np0005593233 systemd[1]: Mounted Kernel Debug File System.
Jan 23 03:14:56 np0005593233 systemd[1]: Mounted Kernel Trace File System.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Create List of Static Device Nodes.
Jan 23 03:14:56 np0005593233 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 03:14:56 np0005593233 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Load Kernel Module drm.
Jan 23 03:14:56 np0005593233 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 23 03:14:56 np0005593233 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Load Kernel Module fuse.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Generate network units from Kernel command line.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 23 03:14:56 np0005593233 systemd[1]: Mounting FUSE Control File System...
Jan 23 03:14:56 np0005593233 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Rebuild Hardware Database...
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 23 03:14:56 np0005593233 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Load/Save OS Random Seed...
Jan 23 03:14:56 np0005593233 systemd-journald[685]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 03:14:56 np0005593233 systemd-journald[685]: Received client request to flush runtime journal.
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Create System Users...
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Apply Kernel Variables.
Jan 23 03:14:56 np0005593233 systemd[1]: Mounted FUSE Control File System.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Load/Save OS Random Seed.
Jan 23 03:14:56 np0005593233 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Create System Users.
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Coldplug All udev Devices.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 03:14:56 np0005593233 systemd[1]: Reached target Preparation for Local File Systems.
Jan 23 03:14:56 np0005593233 systemd[1]: Reached target Local File Systems.
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 23 03:14:56 np0005593233 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 23 03:14:56 np0005593233 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 03:14:56 np0005593233 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Automatic Boot Loader Update...
Jan 23 03:14:56 np0005593233 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 03:14:56 np0005593233 bootctl[702]: Couldn't find EFI system partition, skipping.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Automatic Boot Loader Update.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Security Auditing Service...
Jan 23 03:14:56 np0005593233 systemd[1]: Starting RPC Bind...
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Rebuild Journal Catalog...
Jan 23 03:14:56 np0005593233 auditd[708]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 23 03:14:56 np0005593233 auditd[708]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 23 03:14:56 np0005593233 systemd[1]: Started RPC Bind.
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Rebuild Journal Catalog.
Jan 23 03:14:56 np0005593233 augenrules[713]: /sbin/augenrules: No change
Jan 23 03:14:56 np0005593233 augenrules[728]: No rules
Jan 23 03:14:56 np0005593233 augenrules[728]: enabled 1
Jan 23 03:14:56 np0005593233 augenrules[728]: failure 1
Jan 23 03:14:56 np0005593233 augenrules[728]: pid 708
Jan 23 03:14:56 np0005593233 augenrules[728]: rate_limit 0
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog_limit 8192
Jan 23 03:14:56 np0005593233 augenrules[728]: lost 0
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog 0
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog_wait_time 60000
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog_wait_time_actual 0
Jan 23 03:14:56 np0005593233 augenrules[728]: enabled 1
Jan 23 03:14:56 np0005593233 augenrules[728]: failure 1
Jan 23 03:14:56 np0005593233 augenrules[728]: pid 708
Jan 23 03:14:56 np0005593233 augenrules[728]: rate_limit 0
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog_limit 8192
Jan 23 03:14:56 np0005593233 augenrules[728]: lost 0
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog 0
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog_wait_time 60000
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog_wait_time_actual 0
Jan 23 03:14:56 np0005593233 augenrules[728]: enabled 1
Jan 23 03:14:56 np0005593233 augenrules[728]: failure 1
Jan 23 03:14:56 np0005593233 augenrules[728]: pid 708
Jan 23 03:14:56 np0005593233 augenrules[728]: rate_limit 0
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog_limit 8192
Jan 23 03:14:56 np0005593233 augenrules[728]: lost 0
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog 4
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog_wait_time 60000
Jan 23 03:14:56 np0005593233 augenrules[728]: backlog_wait_time_actual 0
Jan 23 03:14:56 np0005593233 systemd[1]: Started Security Auditing Service.
Jan 23 03:14:56 np0005593233 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 23 03:14:56 np0005593233 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 23 03:14:57 np0005593233 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 23 03:14:57 np0005593233 systemd[1]: Finished Rebuild Hardware Database.
Jan 23 03:14:57 np0005593233 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 03:14:57 np0005593233 systemd[1]: Starting Update is Completed...
Jan 23 03:14:57 np0005593233 systemd[1]: Finished Update is Completed.
Jan 23 03:14:57 np0005593233 systemd-udevd[736]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 03:14:57 np0005593233 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 03:14:57 np0005593233 systemd[1]: Reached target System Initialization.
Jan 23 03:14:57 np0005593233 systemd[1]: Started dnf makecache --timer.
Jan 23 03:14:57 np0005593233 systemd[1]: Started Daily rotation of log files.
Jan 23 03:14:57 np0005593233 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 23 03:14:57 np0005593233 systemd[1]: Reached target Timer Units.
Jan 23 03:14:57 np0005593233 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 03:14:57 np0005593233 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 23 03:14:57 np0005593233 systemd[1]: Reached target Socket Units.
Jan 23 03:14:57 np0005593233 systemd[1]: Starting D-Bus System Message Bus...
Jan 23 03:14:57 np0005593233 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 03:14:57 np0005593233 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 23 03:14:57 np0005593233 systemd[1]: Starting Load Kernel Module configfs...
Jan 23 03:14:57 np0005593233 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 03:14:57 np0005593233 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 03:14:57 np0005593233 systemd-udevd[748]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:14:57 np0005593233 systemd[1]: Started D-Bus System Message Bus.
Jan 23 03:14:57 np0005593233 systemd[1]: Reached target Basic System.
Jan 23 03:14:57 np0005593233 dbus-broker-lau[761]: Ready
Jan 23 03:14:57 np0005593233 systemd[1]: Starting NTP client/server...
Jan 23 03:14:57 np0005593233 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 23 03:14:57 np0005593233 chronyd[783]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 03:14:57 np0005593233 chronyd[783]: Loaded 0 symmetric keys
Jan 23 03:14:57 np0005593233 chronyd[783]: Using right/UTC timezone to obtain leap second data
Jan 23 03:14:57 np0005593233 chronyd[783]: Loaded seccomp filter (level 2)
Jan 23 03:14:58 np0005593233 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 23 03:14:58 np0005593233 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 23 03:14:58 np0005593233 systemd[1]: Starting IPv4 firewall with iptables...
Jan 23 03:14:58 np0005593233 systemd[1]: Started irqbalance daemon.
Jan 23 03:14:58 np0005593233 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 23 03:14:58 np0005593233 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 03:14:58 np0005593233 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 03:14:58 np0005593233 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 03:14:58 np0005593233 systemd[1]: Reached target sshd-keygen.target.
Jan 23 03:14:58 np0005593233 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 23 03:14:58 np0005593233 systemd[1]: Reached target User and Group Name Lookups.
Jan 23 03:14:58 np0005593233 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 23 03:14:58 np0005593233 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 23 03:14:58 np0005593233 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 23 03:14:58 np0005593233 systemd[1]: Starting User Login Management...
Jan 23 03:14:58 np0005593233 systemd[1]: Started NTP client/server.
Jan 23 03:14:58 np0005593233 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 23 03:14:58 np0005593233 kernel: kvm_amd: TSC scaling supported
Jan 23 03:14:58 np0005593233 kernel: kvm_amd: Nested Virtualization enabled
Jan 23 03:14:58 np0005593233 kernel: kvm_amd: Nested Paging enabled
Jan 23 03:14:58 np0005593233 kernel: kvm_amd: LBR virtualization supported
Jan 23 03:14:58 np0005593233 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 23 03:14:58 np0005593233 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 23 03:14:58 np0005593233 kernel: Console: switching to colour dummy device 80x25
Jan 23 03:14:58 np0005593233 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 23 03:14:58 np0005593233 kernel: [drm] features: -context_init
Jan 23 03:14:58 np0005593233 kernel: [drm] number of scanouts: 1
Jan 23 03:14:58 np0005593233 kernel: [drm] number of cap sets: 0
Jan 23 03:14:58 np0005593233 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 23 03:14:58 np0005593233 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 23 03:14:58 np0005593233 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 23 03:14:58 np0005593233 kernel: Console: switching to colour frame buffer device 128x48
Jan 23 03:14:58 np0005593233 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 23 03:14:58 np0005593233 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 23 03:14:58 np0005593233 systemd-logind[804]: New seat seat0.
Jan 23 03:14:58 np0005593233 systemd-logind[804]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 03:14:58 np0005593233 systemd-logind[804]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 03:14:58 np0005593233 systemd[1]: Started User Login Management.
Jan 23 03:14:58 np0005593233 iptables.init[795]: iptables: Applying firewall rules: [  OK  ]
Jan 23 03:14:58 np0005593233 systemd[1]: Finished IPv4 firewall with iptables.
Jan 23 03:14:59 np0005593233 cloud-init[844]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 23 Jan 2026 08:14:58 +0000. Up 12.44 seconds.
Jan 23 03:14:59 np0005593233 systemd[1]: run-cloud\x2dinit-tmp-tmpfr98ejkw.mount: Deactivated successfully.
Jan 23 03:14:59 np0005593233 systemd[1]: Starting Hostname Service...
Jan 23 03:14:59 np0005593233 systemd[1]: Started Hostname Service.
Jan 23 03:14:59 np0005593233 systemd-hostnamed[858]: Hostname set to <np0005593233.novalocal> (static)
Jan 23 03:14:59 np0005593233 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 23 03:14:59 np0005593233 systemd[1]: Reached target Preparation for Network.
Jan 23 03:14:59 np0005593233 systemd[1]: Starting Network Manager...
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6406] NetworkManager (version 1.54.3-2.el9) is starting... (boot:66815a1e-6ee4-4c69-a06a-3cb6c4c50564)
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6411] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6516] manager[0x5592e734a000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6562] hostname: hostname: using hostnamed
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6562] hostname: static hostname changed from (none) to "np0005593233.novalocal"
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6566] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6700] manager[0x5592e734a000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6700] manager[0x5592e734a000]: rfkill: WWAN hardware radio set enabled
Jan 23 03:14:59 np0005593233 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6768] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6768] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6769] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6769] manager: Networking is enabled by state file
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6771] settings: Loaded settings plugin: keyfile (internal)
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6782] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6802] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6811] dhcp: init: Using DHCP client 'internal'
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6814] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6826] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6834] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6842] device (lo): Activation: starting connection 'lo' (1b1b4b7e-5795-461d-975a-62ee381f5d15)
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6851] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6854] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6890] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6894] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 03:14:59 np0005593233 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6896] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6897] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6899] device (eth0): carrier: link connected
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6901] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6907] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6914] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 03:14:59 np0005593233 systemd[1]: Started Network Manager.
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6917] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6918] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6920] manager: NetworkManager state is now CONNECTING
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6921] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:14:59 np0005593233 systemd[1]: Reached target Network.
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6930] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6934] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6970] dhcp4 (eth0): state changed new lease, address=38.102.83.224
Jan 23 03:14:59 np0005593233 systemd[1]: Starting Network Manager Wait Online...
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6976] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.6995] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:14:59 np0005593233 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 23 03:14:59 np0005593233 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.7136] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.7138] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.7143] device (lo): Activation: successful, device activated.
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.7148] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.7150] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.7153] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.7155] device (eth0): Activation: successful, device activated.
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.7160] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 03:14:59 np0005593233 NetworkManager[862]: <info>  [1769156099.7163] manager: startup complete
Jan 23 03:14:59 np0005593233 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 23 03:14:59 np0005593233 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 03:14:59 np0005593233 systemd[1]: Reached target NFS client services.
Jan 23 03:14:59 np0005593233 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 03:14:59 np0005593233 systemd[1]: Reached target Remote File Systems.
Jan 23 03:14:59 np0005593233 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 03:14:59 np0005593233 systemd[1]: Finished Network Manager Wait Online.
Jan 23 03:14:59 np0005593233 systemd[1]: Starting Cloud-init: Network Stage...
Jan 23 03:15:00 np0005593233 cloud-init[925]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 23 Jan 2026 08:15:00 +0000. Up 13.67 seconds.
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: |  eth0  | True |        38.102.83.224         | 255.255.255.0 | global | fa:16:3e:57:69:e7 |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: |  eth0  | True | fe80::f816:3eff:fe57:69e7/64 |       .       |  link  | fa:16:3e:57:69:e7 |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 23 03:15:00 np0005593233 cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 03:15:01 np0005593233 cloud-init[925]: Generating public/private rsa key pair.
Jan 23 03:15:01 np0005593233 cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 23 03:15:01 np0005593233 cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 23 03:15:01 np0005593233 cloud-init[925]: The key fingerprint is:
Jan 23 03:15:01 np0005593233 cloud-init[925]: SHA256:JoxqqrBkA67GWThOqWqEWp+KyK1FyQZW785sbT5R4oM root@np0005593233.novalocal
Jan 23 03:15:01 np0005593233 cloud-init[925]: The key's randomart image is:
Jan 23 03:15:01 np0005593233 cloud-init[925]: +---[RSA 3072]----+
Jan 23 03:15:01 np0005593233 cloud-init[925]: |   .             |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |  . .            |
Jan 23 03:15:01 np0005593233 cloud-init[925]: | o   .           |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |. o oo . .       |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |o o=..= S        |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |+*+o+E.*         |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |O**o * oo        |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |X@+ + o.         |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |&ooo   ..        |
Jan 23 03:15:01 np0005593233 cloud-init[925]: +----[SHA256]-----+
Jan 23 03:15:01 np0005593233 cloud-init[925]: Generating public/private ecdsa key pair.
Jan 23 03:15:01 np0005593233 cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 23 03:15:01 np0005593233 cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 23 03:15:01 np0005593233 cloud-init[925]: The key fingerprint is:
Jan 23 03:15:01 np0005593233 cloud-init[925]: SHA256:LcrHtws3TWhKVWrbeSixHsJ5BixnLzeWfcs74i66TTI root@np0005593233.novalocal
Jan 23 03:15:01 np0005593233 cloud-init[925]: The key's randomart image is:
Jan 23 03:15:01 np0005593233 cloud-init[925]: +---[ECDSA 256]---+
Jan 23 03:15:01 np0005593233 cloud-init[925]: |            .    |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |       .   o     |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |      . = =      |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |       = B O o   |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |        S ^ * o  |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |     . + % * + . |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |      o E * . o  |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |       . O.o. .. |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |        oo+=o... |
Jan 23 03:15:01 np0005593233 cloud-init[925]: +----[SHA256]-----+
Jan 23 03:15:01 np0005593233 cloud-init[925]: Generating public/private ed25519 key pair.
Jan 23 03:15:01 np0005593233 cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 23 03:15:01 np0005593233 cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 23 03:15:01 np0005593233 cloud-init[925]: The key fingerprint is:
Jan 23 03:15:01 np0005593233 cloud-init[925]: SHA256:yJaHq8ZJWTlIYVp0mNYDXqimrqRJ6DPVz2sbUnwiE/I root@np0005593233.novalocal
Jan 23 03:15:01 np0005593233 cloud-init[925]: The key's randomart image is:
Jan 23 03:15:01 np0005593233 cloud-init[925]: +--[ED25519 256]--+
Jan 23 03:15:01 np0005593233 cloud-init[925]: |   .*Bo          |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |   =*oo          |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |  .=.o o         |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |  o +.*+         |
Jan 23 03:15:01 np0005593233 cloud-init[925]: | o  .E*=S.       |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |o  .oo+oo        |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |oo.o o+.         |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |==  +..+.        |
Jan 23 03:15:01 np0005593233 cloud-init[925]: |=.o.. .oo        |
Jan 23 03:15:01 np0005593233 cloud-init[925]: +----[SHA256]-----+
Jan 23 03:15:01 np0005593233 systemd[1]: Finished Cloud-init: Network Stage.
Jan 23 03:15:01 np0005593233 systemd[1]: Reached target Cloud-config availability.
Jan 23 03:15:01 np0005593233 systemd[1]: Reached target Network is Online.
Jan 23 03:15:01 np0005593233 systemd[1]: Starting Cloud-init: Config Stage...
Jan 23 03:15:01 np0005593233 systemd[1]: Starting Crash recovery kernel arming...
Jan 23 03:15:01 np0005593233 systemd[1]: Starting Notify NFS peers of a restart...
Jan 23 03:15:01 np0005593233 systemd[1]: Starting System Logging Service...
Jan 23 03:15:01 np0005593233 systemd[1]: Starting OpenSSH server daemon...
Jan 23 03:15:01 np0005593233 sm-notify[1008]: Version 2.5.4 starting
Jan 23 03:15:01 np0005593233 systemd[1]: Starting Permit User Sessions...
Jan 23 03:15:01 np0005593233 systemd[1]: Started Notify NFS peers of a restart.
Jan 23 03:15:01 np0005593233 systemd[1]: Finished Permit User Sessions.
Jan 23 03:15:01 np0005593233 systemd[1]: Started OpenSSH server daemon.
Jan 23 03:15:01 np0005593233 systemd[1]: Started Command Scheduler.
Jan 23 03:15:01 np0005593233 systemd[1]: Started Getty on tty1.
Jan 23 03:15:01 np0005593233 systemd[1]: Started Serial Getty on ttyS0.
Jan 23 03:15:01 np0005593233 systemd[1]: Reached target Login Prompts.
Jan 23 03:15:01 np0005593233 rsyslogd[1009]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1009" x-info="https://www.rsyslog.com"] start
Jan 23 03:15:01 np0005593233 rsyslogd[1009]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 23 03:15:01 np0005593233 systemd[1]: Started System Logging Service.
Jan 23 03:15:01 np0005593233 systemd[1]: Reached target Multi-User System.
Jan 23 03:15:01 np0005593233 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 23 03:15:02 np0005593233 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 23 03:15:02 np0005593233 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 23 03:15:02 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 03:15:02 np0005593233 kdumpctl[1020]: kdump: No kdump initial ramdisk found.
Jan 23 03:15:02 np0005593233 kdumpctl[1020]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 23 03:15:02 np0005593233 cloud-init[1166]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 23 Jan 2026 08:15:02 +0000. Up 15.68 seconds.
Jan 23 03:15:02 np0005593233 systemd[1]: Finished Cloud-init: Config Stage.
Jan 23 03:15:02 np0005593233 systemd[1]: Starting Cloud-init: Final Stage...
Jan 23 03:15:02 np0005593233 dracut[1287]: dracut-057-102.git20250818.el9
Jan 23 03:15:02 np0005593233 dracut[1289]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 23 03:15:03 np0005593233 cloud-init[1357]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 23 Jan 2026 08:15:03 +0000. Up 16.57 seconds.
Jan 23 03:15:03 np0005593233 cloud-init[1374]: #############################################################
Jan 23 03:15:03 np0005593233 cloud-init[1375]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 23 03:15:03 np0005593233 cloud-init[1380]: 256 SHA256:LcrHtws3TWhKVWrbeSixHsJ5BixnLzeWfcs74i66TTI root@np0005593233.novalocal (ECDSA)
Jan 23 03:15:03 np0005593233 cloud-init[1385]: 256 SHA256:yJaHq8ZJWTlIYVp0mNYDXqimrqRJ6DPVz2sbUnwiE/I root@np0005593233.novalocal (ED25519)
Jan 23 03:15:03 np0005593233 cloud-init[1387]: 3072 SHA256:JoxqqrBkA67GWThOqWqEWp+KyK1FyQZW785sbT5R4oM root@np0005593233.novalocal (RSA)
Jan 23 03:15:03 np0005593233 cloud-init[1388]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 23 03:15:03 np0005593233 cloud-init[1390]: #############################################################
Jan 23 03:15:03 np0005593233 cloud-init[1357]: Cloud-init v. 24.4-8.el9 finished at Fri, 23 Jan 2026 08:15:03 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 16.76 seconds
Jan 23 03:15:03 np0005593233 systemd[1]: Finished Cloud-init: Final Stage.
Jan 23 03:15:03 np0005593233 systemd[1]: Reached target Cloud-init target.
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 03:15:03 np0005593233 dracut[1289]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: memstrack is not available
Jan 23 03:15:04 np0005593233 dracut[1289]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 03:15:04 np0005593233 dracut[1289]: memstrack is not available
Jan 23 03:15:04 np0005593233 dracut[1289]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 03:15:04 np0005593233 dracut[1289]: *** Including module: systemd ***
Jan 23 03:15:04 np0005593233 dracut[1289]: *** Including module: fips ***
Jan 23 03:15:04 np0005593233 chronyd[783]: Selected source 23.159.16.194 (2.centos.pool.ntp.org)
Jan 23 03:15:04 np0005593233 chronyd[783]: System clock TAI offset set to 37 seconds
Jan 23 03:15:05 np0005593233 dracut[1289]: *** Including module: systemd-initrd ***
Jan 23 03:15:05 np0005593233 dracut[1289]: *** Including module: i18n ***
Jan 23 03:15:05 np0005593233 dracut[1289]: *** Including module: drm ***
Jan 23 03:15:05 np0005593233 dracut[1289]: *** Including module: prefixdevname ***
Jan 23 03:15:05 np0005593233 dracut[1289]: *** Including module: kernel-modules ***
Jan 23 03:15:05 np0005593233 kernel: block vda: the capability attribute has been deprecated.
Jan 23 03:15:06 np0005593233 dracut[1289]: *** Including module: kernel-modules-extra ***
Jan 23 03:15:06 np0005593233 dracut[1289]: *** Including module: qemu ***
Jan 23 03:15:06 np0005593233 dracut[1289]: *** Including module: fstab-sys ***
Jan 23 03:15:06 np0005593233 dracut[1289]: *** Including module: rootfs-block ***
Jan 23 03:15:06 np0005593233 dracut[1289]: *** Including module: terminfo ***
Jan 23 03:15:06 np0005593233 dracut[1289]: *** Including module: udev-rules ***
Jan 23 03:15:06 np0005593233 dracut[1289]: Skipping udev rule: 91-permissions.rules
Jan 23 03:15:06 np0005593233 dracut[1289]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 23 03:15:06 np0005593233 dracut[1289]: *** Including module: virtiofs ***
Jan 23 03:15:06 np0005593233 dracut[1289]: *** Including module: dracut-systemd ***
Jan 23 03:15:06 np0005593233 systemd[1]: serial-getty@ttyS0.service: Deactivated successfully.
Jan 23 03:15:07 np0005593233 dracut[1289]: *** Including module: usrmount ***
Jan 23 03:15:07 np0005593233 dracut[1289]: *** Including module: base ***
Jan 23 03:15:07 np0005593233 dracut[1289]: *** Including module: fs-lib ***
Jan 23 03:15:07 np0005593233 dracut[1289]: *** Including module: kdumpbase ***
Jan 23 03:15:07 np0005593233 systemd[1]: serial-getty@ttyS0.service: Scheduled restart job, restart counter is at 1.
Jan 23 03:15:07 np0005593233 systemd[1]: Stopped Serial Getty on ttyS0.
Jan 23 03:15:07 np0005593233 systemd[1]: Started Serial Getty on ttyS0.
Jan 23 03:15:07 np0005593233 dracut[1289]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 23 03:15:07 np0005593233 dracut[1289]:  microcode_ctl module: mangling fw_dir
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel" is ignored
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 23 03:15:07 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 23 03:15:08 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 23 03:15:08 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 23 03:15:08 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 23 03:15:08 np0005593233 dracut[1289]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 23 03:15:08 np0005593233 dracut[1289]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 23 03:15:08 np0005593233 dracut[1289]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 23 03:15:08 np0005593233 dracut[1289]: *** Including module: openssl ***
Jan 23 03:15:08 np0005593233 dracut[1289]: *** Including module: shutdown ***
Jan 23 03:15:08 np0005593233 dracut[1289]: *** Including module: squash ***
Jan 23 03:15:08 np0005593233 dracut[1289]: *** Including modules done ***
Jan 23 03:15:08 np0005593233 dracut[1289]: *** Installing kernel module dependencies ***
Jan 23 03:15:08 np0005593233 irqbalance[797]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 23 03:15:08 np0005593233 irqbalance[797]: IRQ 25 affinity is now unmanaged
Jan 23 03:15:08 np0005593233 irqbalance[797]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 23 03:15:08 np0005593233 irqbalance[797]: IRQ 31 affinity is now unmanaged
Jan 23 03:15:08 np0005593233 irqbalance[797]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 23 03:15:08 np0005593233 irqbalance[797]: IRQ 28 affinity is now unmanaged
Jan 23 03:15:08 np0005593233 irqbalance[797]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 23 03:15:08 np0005593233 irqbalance[797]: IRQ 32 affinity is now unmanaged
Jan 23 03:15:08 np0005593233 irqbalance[797]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 23 03:15:08 np0005593233 irqbalance[797]: IRQ 30 affinity is now unmanaged
Jan 23 03:15:08 np0005593233 irqbalance[797]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 23 03:15:08 np0005593233 irqbalance[797]: IRQ 29 affinity is now unmanaged
Jan 23 03:15:09 np0005593233 dracut[1289]: *** Installing kernel module dependencies done ***
Jan 23 03:15:09 np0005593233 dracut[1289]: *** Resolving executable dependencies ***
Jan 23 03:15:09 np0005593233 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:15:10 np0005593233 dracut[1289]: *** Resolving executable dependencies done ***
Jan 23 03:15:10 np0005593233 dracut[1289]: *** Generating early-microcode cpio image ***
Jan 23 03:15:10 np0005593233 dracut[1289]: *** Store current command line parameters ***
Jan 23 03:15:10 np0005593233 dracut[1289]: Stored kernel commandline:
Jan 23 03:15:10 np0005593233 dracut[1289]: No dracut internal kernel commandline stored in the initramfs
Jan 23 03:15:10 np0005593233 dracut[1289]: *** Install squash loader ***
Jan 23 03:15:11 np0005593233 dracut[1289]: *** Squashing the files inside the initramfs ***
Jan 23 03:15:13 np0005593233 dracut[1289]: *** Squashing the files inside the initramfs done ***
Jan 23 03:15:13 np0005593233 dracut[1289]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 23 03:15:13 np0005593233 dracut[1289]: *** Hardlinking files ***
Jan 23 03:15:13 np0005593233 dracut[1289]: *** Hardlinking files done ***
Jan 23 03:15:13 np0005593233 dracut[1289]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 23 03:15:14 np0005593233 kdumpctl[1020]: kdump: kexec: loaded kdump kernel
Jan 23 03:15:14 np0005593233 kdumpctl[1020]: kdump: Starting kdump: [OK]
Jan 23 03:15:14 np0005593233 systemd[1]: Finished Crash recovery kernel arming.
Jan 23 03:15:14 np0005593233 systemd[1]: Startup finished in 4.594s (kernel) + 3.773s (initrd) + 19.595s (userspace) = 27.963s.
Jan 23 03:15:29 np0005593233 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 03:16:11 np0005593233 systemd[1]: Created slice User Slice of UID 1000.
Jan 23 03:16:11 np0005593233 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 23 03:16:11 np0005593233 systemd-logind[804]: New session 1 of user zuul.
Jan 23 03:16:11 np0005593233 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 23 03:16:11 np0005593233 systemd[1]: Starting User Manager for UID 1000...
Jan 23 03:16:11 np0005593233 systemd[4315]: Queued start job for default target Main User Target.
Jan 23 03:16:12 np0005593233 systemd[4315]: Created slice User Application Slice.
Jan 23 03:16:12 np0005593233 systemd[4315]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 03:16:12 np0005593233 systemd[4315]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 03:16:12 np0005593233 systemd[4315]: Reached target Paths.
Jan 23 03:16:12 np0005593233 systemd[4315]: Reached target Timers.
Jan 23 03:16:12 np0005593233 systemd[4315]: Starting D-Bus User Message Bus Socket...
Jan 23 03:16:12 np0005593233 systemd[4315]: Starting Create User's Volatile Files and Directories...
Jan 23 03:16:12 np0005593233 systemd[4315]: Finished Create User's Volatile Files and Directories.
Jan 23 03:16:12 np0005593233 systemd[4315]: Listening on D-Bus User Message Bus Socket.
Jan 23 03:16:12 np0005593233 systemd[4315]: Reached target Sockets.
Jan 23 03:16:12 np0005593233 systemd[4315]: Reached target Basic System.
Jan 23 03:16:12 np0005593233 systemd[4315]: Reached target Main User Target.
Jan 23 03:16:12 np0005593233 systemd[4315]: Startup finished in 120ms.
Jan 23 03:16:12 np0005593233 systemd[1]: Started User Manager for UID 1000.
Jan 23 03:16:12 np0005593233 systemd[1]: Started Session 1 of User zuul.
Jan 23 03:16:12 np0005593233 python3[4397]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:16:18 np0005593233 python3[4425]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:16:30 np0005593233 python3[4483]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:16:32 np0005593233 python3[4523]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 23 03:16:34 np0005593233 python3[4549]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9EiUM/Wr1dKB5RqtCvSqVc226WpmY1hGtuMMJuG9WgOJFIcFQvrVt53c29e+5OTV1q1e4Rmvj7g1SyULPxSS7/DtzxyTWY79kkBpxNFmAUeUA4U0adsVRTquvEONpBa4UE3bqCTtaRQa8XED98xqS4bCBXmbcLROlQ4Qc91Uj3wxKY4/fplPdXYZdZXz3cxwEsyC6dRkYcfiUSowlrmecr3FZO6SJfG9H4YFxzwAu1R4led86PwzjZJyHfDeIHcdaDUVFcX2hGQv9iIqgYP58aTb2gRp2PxSQJfGAevolpgA3xrQKo2uBDBuRTC/hE81toPd5IIPQ3lX2JDXxauMMbmmxSjYCltaP2/bcvZ697yZh1vEmyz62itMHt6GV69XsjsX5jHWhY2RtQ6ZpsNSqrOHSUj4jlPcZEFk+4UshKJJZNaM1psuS+KAGeodosF43EuKDbWMGeqCe/kwZaBXj/Xxob+rLcVQBMVOBq+EHuNNKxSIqaNZiMz0RBf11CUk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:34 np0005593233 python3[4573]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:35 np0005593233 python3[4672]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:35 np0005593233 python3[4743]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769156194.8545573-252-96271306882806/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=87e5a27e13f74e79b84f2ddd13a58bce_id_rsa follow=False checksum=40e82d1acb27268baed51ce64c7c4dfd80f45a5d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:36 np0005593233 python3[4866]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:36 np0005593233 python3[4937]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769156196.1148815-307-181799575029277/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=87e5a27e13f74e79b84f2ddd13a58bce_id_rsa.pub follow=False checksum=1c043d58f1d2a49f415267f4f9437247d6d980d7 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:38 np0005593233 python3[4985]: ansible-ping Invoked with data=pong
Jan 23 03:16:39 np0005593233 python3[5009]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:16:41 np0005593233 python3[5067]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 23 03:16:42 np0005593233 python3[5099]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:42 np0005593233 python3[5123]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:42 np0005593233 python3[5147]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:43 np0005593233 python3[5171]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:43 np0005593233 python3[5195]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:43 np0005593233 python3[5219]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:45 np0005593233 python3[5245]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:46 np0005593233 python3[5323]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:46 np0005593233 python3[5396]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156205.7187066-32-151485575249646/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:47 np0005593233 python3[5444]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:47 np0005593233 python3[5468]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:47 np0005593233 python3[5492]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:48 np0005593233 python3[5516]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:48 np0005593233 python3[5540]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:48 np0005593233 python3[5564]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:48 np0005593233 python3[5588]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:49 np0005593233 python3[5612]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:49 np0005593233 python3[5636]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:49 np0005593233 python3[5660]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:50 np0005593233 python3[5684]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:50 np0005593233 python3[5708]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:50 np0005593233 python3[5732]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:50 np0005593233 python3[5756]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:51 np0005593233 python3[5780]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:51 np0005593233 python3[5804]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:51 np0005593233 python3[5828]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:52 np0005593233 python3[5852]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:52 np0005593233 python3[5876]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:53 np0005593233 python3[5900]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:53 np0005593233 python3[5924]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:53 np0005593233 python3[5948]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:54 np0005593233 python3[5972]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:54 np0005593233 python3[5996]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:54 np0005593233 python3[6020]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:54 np0005593233 python3[6044]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:56 np0005593233 python3[6070]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 03:16:56 np0005593233 systemd[1]: Starting Time & Date Service...
Jan 23 03:16:56 np0005593233 systemd[1]: Started Time & Date Service.
Jan 23 03:16:56 np0005593233 systemd-timedated[6072]: Changed time zone to 'UTC' (UTC).
Jan 23 03:16:58 np0005593233 python3[6101]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:58 np0005593233 python3[6177]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:59 np0005593233 python3[6248]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769156218.4613564-253-93037954623326/source _original_basename=tmpt10xbsjn follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:59 np0005593233 python3[6348]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:59 np0005593233 python3[6419]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769156219.2971213-302-223783386693856/source _original_basename=tmp85r3axud follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:17:00 np0005593233 python3[6521]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:17:01 np0005593233 python3[6594]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769156220.480234-382-14114533514908/source _original_basename=tmpdegg9plj follow=False checksum=332c94ac911d053598365a4ff7b72c4143f36dd6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:17:01 np0005593233 python3[6642]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:17:01 np0005593233 python3[6668]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:17:02 np0005593233 python3[6748]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:17:02 np0005593233 python3[6821]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156222.1881044-453-176466223723108/source _original_basename=tmp5ywhb3kj follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:17:03 np0005593233 python3[6872]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-51e8-11f9-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:17:04 np0005593233 python3[6900]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-51e8-11f9-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 23 03:17:05 np0005593233 python3[6929]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:17:26 np0005593233 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 03:17:38 np0005593233 python3[6957]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:18:12 np0005593233 systemd[4315]: Starting Mark boot as successful...
Jan 23 03:18:12 np0005593233 systemd[4315]: Finished Mark boot as successful.
Jan 23 03:18:38 np0005593233 systemd-logind[804]: Session 1 logged out. Waiting for processes to exit.
Jan 23 03:18:45 np0005593233 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 03:18:45 np0005593233 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 23 03:18:45 np0005593233 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 23 03:18:45 np0005593233 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 23 03:18:45 np0005593233 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 23 03:18:45 np0005593233 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 23 03:18:45 np0005593233 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 23 03:18:45 np0005593233 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 23 03:18:45 np0005593233 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 23 03:18:45 np0005593233 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.5600] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 03:18:45 np0005593233 systemd-udevd[6961]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.5816] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.5863] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.5867] device (eth1): carrier: link connected
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.5870] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.5878] policy: auto-activating connection 'Wired connection 1' (640d0f87-1153-3cd5-bb8f-96e84186f0c8)
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.5884] device (eth1): Activation: starting connection 'Wired connection 1' (640d0f87-1153-3cd5-bb8f-96e84186f0c8)
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.5884] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.5887] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.6276] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:18:45 np0005593233 NetworkManager[862]: <info>  [1769156325.6295] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:18:46 np0005593233 systemd-logind[804]: New session 3 of user zuul.
Jan 23 03:18:46 np0005593233 systemd[1]: Started Session 3 of User zuul.
Jan 23 03:18:46 np0005593233 python3[6992]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-ea81-0856-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:18:56 np0005593233 python3[7072]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:18:56 np0005593233 python3[7145]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769156336.1779392-155-192353670878755/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=4519dfe639408473d2f3e05ed975425922476cb1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:18:57 np0005593233 python3[7195]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:18:57 np0005593233 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 03:18:57 np0005593233 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 03:18:57 np0005593233 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 03:18:57 np0005593233 systemd[1]: Stopping Network Manager...
Jan 23 03:18:57 np0005593233 NetworkManager[862]: <info>  [1769156337.6336] caught SIGTERM, shutting down normally.
Jan 23 03:18:57 np0005593233 NetworkManager[862]: <info>  [1769156337.6372] dhcp4 (eth0): canceled DHCP transaction
Jan 23 03:18:57 np0005593233 NetworkManager[862]: <info>  [1769156337.6372] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:18:57 np0005593233 NetworkManager[862]: <info>  [1769156337.6373] dhcp4 (eth0): state changed no lease
Jan 23 03:18:57 np0005593233 NetworkManager[862]: <info>  [1769156337.6377] manager: NetworkManager state is now CONNECTING
Jan 23 03:18:57 np0005593233 NetworkManager[862]: <info>  [1769156337.6459] dhcp4 (eth1): canceled DHCP transaction
Jan 23 03:18:57 np0005593233 NetworkManager[862]: <info>  [1769156337.6460] dhcp4 (eth1): state changed no lease
Jan 23 03:18:57 np0005593233 NetworkManager[862]: <info>  [1769156337.6520] exiting (success)
Jan 23 03:18:57 np0005593233 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:18:57 np0005593233 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 03:18:57 np0005593233 systemd[1]: Stopped Network Manager.
Jan 23 03:18:57 np0005593233 systemd[1]: NetworkManager.service: Consumed 2.192s CPU time, 10.0M memory peak.
Jan 23 03:18:57 np0005593233 systemd[1]: Starting Network Manager...
Jan 23 03:18:57 np0005593233 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.7010] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:66815a1e-6ee4-4c69-a06a-3cb6c4c50564)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.7011] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.7093] manager[0x562034b70000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 03:18:57 np0005593233 systemd[1]: Starting Hostname Service...
Jan 23 03:18:57 np0005593233 systemd[1]: Started Hostname Service.
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.7965] hostname: hostname: using hostnamed
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.7966] hostname: static hostname changed from (none) to "np0005593233.novalocal"
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.7975] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.7982] manager[0x562034b70000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.7983] manager[0x562034b70000]: rfkill: WWAN hardware radio set enabled
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8024] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8025] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8026] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8027] manager: Networking is enabled by state file
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8030] settings: Loaded settings plugin: keyfile (internal)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8037] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8080] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8097] dhcp: init: Using DHCP client 'internal'
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8103] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8110] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8118] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8130] device (lo): Activation: starting connection 'lo' (1b1b4b7e-5795-461d-975a-62ee381f5d15)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8140] device (eth0): carrier: link connected
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8146] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8152] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8153] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8161] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8168] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8175] device (eth1): carrier: link connected
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8180] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8185] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (640d0f87-1153-3cd5-bb8f-96e84186f0c8) (indicated)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8187] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8192] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8199] device (eth1): Activation: starting connection 'Wired connection 1' (640d0f87-1153-3cd5-bb8f-96e84186f0c8)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8207] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 03:18:57 np0005593233 systemd[1]: Started Network Manager.
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8213] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8216] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8220] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8224] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8228] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8231] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8234] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8238] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8247] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8254] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8264] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8267] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8495] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8508] dhcp4 (eth0): state changed new lease, address=38.102.83.224
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8511] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8517] device (lo): Activation: successful, device activated.
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8529] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 03:18:57 np0005593233 systemd[1]: Starting Network Manager Wait Online...
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8601] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8629] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8631] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8634] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8637] device (eth0): Activation: successful, device activated.
Jan 23 03:18:57 np0005593233 NetworkManager[7202]: <info>  [1769156337.8641] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 03:18:58 np0005593233 python3[7279]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-ea81-0856-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:19:07 np0005593233 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:19:27 np0005593233 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.5437] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 03:19:43 np0005593233 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:19:43 np0005593233 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.5861] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.5865] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.5877] device (eth1): Activation: successful, device activated.
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.5885] manager: startup complete
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.5889] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <warn>  [1769156383.5897] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.5922] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 23 03:19:43 np0005593233 systemd[1]: Finished Network Manager Wait Online.
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6070] dhcp4 (eth1): canceled DHCP transaction
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6074] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6075] dhcp4 (eth1): state changed no lease
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6099] policy: auto-activating connection 'ci-private-network' (627d92e8-6942-5fb9-ae4f-28a944c718dc)
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6107] device (eth1): Activation: starting connection 'ci-private-network' (627d92e8-6942-5fb9-ae4f-28a944c718dc)
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6109] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6115] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6131] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6145] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6193] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6200] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:19:43 np0005593233 NetworkManager[7202]: <info>  [1769156383.6212] device (eth1): Activation: successful, device activated.
Jan 23 03:19:53 np0005593233 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:19:58 np0005593233 systemd[1]: session-3.scope: Deactivated successfully.
Jan 23 03:19:58 np0005593233 systemd[1]: session-3.scope: Consumed 1.903s CPU time.
Jan 23 03:19:58 np0005593233 systemd-logind[804]: Session 3 logged out. Waiting for processes to exit.
Jan 23 03:19:58 np0005593233 systemd-logind[804]: Removed session 3.
Jan 23 03:20:32 np0005593233 systemd-logind[804]: New session 4 of user zuul.
Jan 23 03:20:32 np0005593233 systemd[1]: Started Session 4 of User zuul.
Jan 23 03:20:33 np0005593233 python3[7391]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:20:33 np0005593233 python3[7464]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156432.9161382-373-115789207151713/source _original_basename=tmpnssd_1x7 follow=False checksum=2312614d64ff3d0f4e5be15d02ba4fd1b13cd8df backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:20:36 np0005593233 systemd[1]: session-4.scope: Deactivated successfully.
Jan 23 03:20:36 np0005593233 systemd-logind[804]: Session 4 logged out. Waiting for processes to exit.
Jan 23 03:20:36 np0005593233 systemd-logind[804]: Removed session 4.
Jan 23 03:21:12 np0005593233 systemd[4315]: Created slice User Background Tasks Slice.
Jan 23 03:21:12 np0005593233 systemd[4315]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 03:21:12 np0005593233 systemd[4315]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 03:26:26 np0005593233 systemd-logind[804]: New session 5 of user zuul.
Jan 23 03:26:26 np0005593233 systemd[1]: Started Session 5 of User zuul.
Jan 23 03:26:26 np0005593233 python3[7526]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-81c6-2885-000000000ca6-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:27 np0005593233 python3[7555]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:27 np0005593233 python3[7581]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:28 np0005593233 python3[7607]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:28 np0005593233 python3[7633]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:28 np0005593233 python3[7659]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:29 np0005593233 python3[7737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:26:29 np0005593233 python3[7810]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156789.1105494-367-178135042780692/source _original_basename=tmp4_o6bfid follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:31 np0005593233 python3[7860]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 03:26:31 np0005593233 systemd[1]: Reloading.
Jan 23 03:26:31 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:26:32 np0005593233 systemd[1]: Starting dnf makecache...
Jan 23 03:26:32 np0005593233 dnf[7894]: Failed determining last makecache time.
Jan 23 03:26:32 np0005593233 dnf[7894]: CentOS Stream 9 - BaseOS                         51 kB/s | 6.7 kB     00:00
Jan 23 03:26:33 np0005593233 dnf[7894]: CentOS Stream 9 - AppStream                      60 kB/s | 6.8 kB     00:00
Jan 23 03:26:33 np0005593233 dnf[7894]: CentOS Stream 9 - CRB                            61 kB/s | 6.6 kB     00:00
Jan 23 03:26:33 np0005593233 dnf[7894]: CentOS Stream 9 - Extras packages                32 kB/s | 7.3 kB     00:00
Jan 23 03:26:33 np0005593233 dnf[7894]: Metadata cache created.
Jan 23 03:26:33 np0005593233 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 03:26:33 np0005593233 systemd[1]: Finished dnf makecache.
Jan 23 03:26:34 np0005593233 python3[7927]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 23 03:26:34 np0005593233 python3[7954]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:34 np0005593233 python3[7982]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:35 np0005593233 python3[8010]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:35 np0005593233 python3[8038]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:36 np0005593233 python3[8065]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-81c6-2885-000000000cad-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:37 np0005593233 python3[8095]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 03:26:40 np0005593233 systemd[1]: session-5.scope: Deactivated successfully.
Jan 23 03:26:40 np0005593233 systemd[1]: session-5.scope: Consumed 4.388s CPU time.
Jan 23 03:26:40 np0005593233 systemd-logind[804]: Session 5 logged out. Waiting for processes to exit.
Jan 23 03:26:40 np0005593233 systemd-logind[804]: Removed session 5.
Jan 23 03:26:41 np0005593233 systemd-logind[804]: New session 6 of user zuul.
Jan 23 03:26:41 np0005593233 systemd[1]: Started Session 6 of User zuul.
Jan 23 03:26:42 np0005593233 python3[8128]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 03:26:47 np0005593233 setsebool[8164]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 23 03:26:47 np0005593233 setsebool[8164]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 23 03:26:58 np0005593233 kernel: SELinux:  Converting 385 SID table entries...
Jan 23 03:26:58 np0005593233 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:26:58 np0005593233 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:26:58 np0005593233 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:26:58 np0005593233 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:26:58 np0005593233 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:26:58 np0005593233 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:26:58 np0005593233 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:27:09 np0005593233 kernel: SELinux:  Converting 388 SID table entries...
Jan 23 03:27:09 np0005593233 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:27:09 np0005593233 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:27:09 np0005593233 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:27:09 np0005593233 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:27:09 np0005593233 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:27:09 np0005593233 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:27:09 np0005593233 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:27:28 np0005593233 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 03:27:28 np0005593233 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:27:28 np0005593233 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:27:28 np0005593233 systemd[1]: Reloading.
Jan 23 03:27:28 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:27:28 np0005593233 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:27:38 np0005593233 irqbalance[797]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 23 03:27:38 np0005593233 irqbalance[797]: IRQ 27 affinity is now unmanaged
Jan 23 03:28:12 np0005593233 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:28:12 np0005593233 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:28:12 np0005593233 systemd[1]: man-db-cache-update.service: Consumed 53.214s CPU time.
Jan 23 03:28:12 np0005593233 systemd[1]: run-r1d94c763fe4e420ebf2be0918428aea4.service: Deactivated successfully.
Jan 23 03:28:19 np0005593233 python3[29517]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-57be-5ddd-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:28:20 np0005593233 kernel: evm: overlay not supported
Jan 23 03:28:20 np0005593233 systemd[4315]: Starting D-Bus User Message Bus...
Jan 23 03:28:20 np0005593233 dbus-broker-launch[29574]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 23 03:28:20 np0005593233 dbus-broker-launch[29574]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 23 03:28:20 np0005593233 systemd[4315]: Started D-Bus User Message Bus.
Jan 23 03:28:20 np0005593233 dbus-broker-lau[29574]: Ready
Jan 23 03:28:20 np0005593233 systemd[4315]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 03:28:20 np0005593233 systemd[4315]: Created slice Slice /user.
Jan 23 03:28:20 np0005593233 systemd[4315]: podman-29555.scope: unit configures an IP firewall, but not running as root.
Jan 23 03:28:20 np0005593233 systemd[4315]: (This warning is only shown for the first unit using IP firewalling.)
Jan 23 03:28:20 np0005593233 systemd[4315]: Started podman-29555.scope.
Jan 23 03:28:20 np0005593233 systemd[4315]: Started podman-pause-2b937694.scope.
Jan 23 03:28:21 np0005593233 python3[29602]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.129.56.147:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.129.56.147:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:28:21 np0005593233 python3[29602]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 23 03:28:21 np0005593233 systemd[1]: session-6.scope: Deactivated successfully.
Jan 23 03:28:21 np0005593233 systemd[1]: session-6.scope: Consumed 45.805s CPU time.
Jan 23 03:28:21 np0005593233 systemd-logind[804]: Session 6 logged out. Waiting for processes to exit.
Jan 23 03:28:21 np0005593233 systemd-logind[804]: Removed session 6.
Jan 23 03:28:49 np0005593233 systemd-logind[804]: New session 7 of user zuul.
Jan 23 03:28:49 np0005593233 systemd[1]: Started Session 7 of User zuul.
Jan 23 03:28:50 np0005593233 python3[29640]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO5lLQe2ste4Gmi1Ir356q/C15WL/7fzZRoS9rMZVgSYU6jYrJKHH43bSyQAo3PQspZm2qMkx0r+2fgxF65A8l0= zuul@np0005593231.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:28:50 np0005593233 python3[29666]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO5lLQe2ste4Gmi1Ir356q/C15WL/7fzZRoS9rMZVgSYU6jYrJKHH43bSyQAo3PQspZm2qMkx0r+2fgxF65A8l0= zuul@np0005593231.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:28:51 np0005593233 python3[29692]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005593233.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 23 03:28:52 np0005593233 python3[29726]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO5lLQe2ste4Gmi1Ir356q/C15WL/7fzZRoS9rMZVgSYU6jYrJKHH43bSyQAo3PQspZm2qMkx0r+2fgxF65A8l0= zuul@np0005593231.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:28:52 np0005593233 python3[29804]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:28:53 np0005593233 python3[29877]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156932.3143203-168-175942300849986/source _original_basename=tmp79y_eks4 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:28:53 np0005593233 python3[29927]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 23 03:28:54 np0005593233 systemd[1]: Starting Hostname Service...
Jan 23 03:28:54 np0005593233 systemd[1]: Started Hostname Service.
Jan 23 03:28:54 np0005593233 systemd-hostnamed[29931]: Changed pretty hostname to 'compute-1'
Jan 23 03:28:54 np0005593233 systemd-hostnamed[29931]: Hostname set to <compute-1> (static)
Jan 23 03:28:54 np0005593233 NetworkManager[7202]: <info>  [1769156934.1301] hostname: static hostname changed from "np0005593233.novalocal" to "compute-1"
Jan 23 03:28:54 np0005593233 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:28:54 np0005593233 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:28:54 np0005593233 systemd[1]: session-7.scope: Deactivated successfully.
Jan 23 03:28:54 np0005593233 systemd[1]: session-7.scope: Consumed 2.402s CPU time.
Jan 23 03:28:54 np0005593233 systemd-logind[804]: Session 7 logged out. Waiting for processes to exit.
Jan 23 03:28:54 np0005593233 systemd-logind[804]: Removed session 7.
Jan 23 03:29:04 np0005593233 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:29:24 np0005593233 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 03:30:12 np0005593233 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 23 03:30:13 np0005593233 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 23 03:30:13 np0005593233 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 23 03:30:13 np0005593233 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 23 03:33:53 np0005593233 systemd-logind[804]: New session 8 of user zuul.
Jan 23 03:33:53 np0005593233 systemd[1]: Started Session 8 of User zuul.
Jan 23 03:33:54 np0005593233 python3[30031]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:33:56 np0005593233 python3[30147]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:57 np0005593233 python3[30220]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1862-34016-242726206157318/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:57 np0005593233 python3[30246]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:57 np0005593233 python3[30319]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1862-34016-242726206157318/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:57 np0005593233 python3[30345]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:58 np0005593233 python3[30418]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1862-34016-242726206157318/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:58 np0005593233 python3[30444]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:58 np0005593233 python3[30517]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1862-34016-242726206157318/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:59 np0005593233 python3[30544]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:59 np0005593233 python3[30617]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1862-34016-242726206157318/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:59 np0005593233 python3[30643]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:34:00 np0005593233 python3[30716]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1862-34016-242726206157318/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:34:00 np0005593233 python3[30742]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:34:00 np0005593233 python3[30815]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1862-34016-242726206157318/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:34:14 np0005593233 python3[30863]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:39:14 np0005593233 systemd[1]: session-8.scope: Deactivated successfully.
Jan 23 03:39:14 np0005593233 systemd[1]: session-8.scope: Consumed 5.187s CPU time.
Jan 23 03:39:14 np0005593233 systemd-logind[804]: Session 8 logged out. Waiting for processes to exit.
Jan 23 03:39:14 np0005593233 systemd-logind[804]: Removed session 8.
Jan 23 03:49:33 np0005593233 systemd-logind[804]: New session 9 of user zuul.
Jan 23 03:49:33 np0005593233 systemd[1]: Started Session 9 of User zuul.
Jan 23 03:49:34 np0005593233 python3.9[31029]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:49:35 np0005593233 python3.9[31210]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:49:43 np0005593233 systemd[1]: session-9.scope: Deactivated successfully.
Jan 23 03:49:43 np0005593233 systemd[1]: session-9.scope: Consumed 8.653s CPU time.
Jan 23 03:49:43 np0005593233 systemd-logind[804]: Session 9 logged out. Waiting for processes to exit.
Jan 23 03:49:43 np0005593233 systemd-logind[804]: Removed session 9.
Jan 23 03:50:00 np0005593233 systemd-logind[804]: New session 10 of user zuul.
Jan 23 03:50:00 np0005593233 systemd[1]: Started Session 10 of User zuul.
Jan 23 03:50:01 np0005593233 python3.9[31420]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 03:50:02 np0005593233 python3.9[31594]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:50:03 np0005593233 python3.9[31746]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:50:05 np0005593233 python3.9[31899]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:50:06 np0005593233 python3.9[32051]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:50:06 np0005593233 python3.9[32203]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:50:07 np0005593233 python3.9[32326]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158206.3496113-178-267250989587025/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:50:08 np0005593233 python3.9[32478]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:50:09 np0005593233 python3.9[32634]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:50:10 np0005593233 python3.9[32786]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:50:10 np0005593233 python3.9[32937]: ansible-ansible.builtin.service_facts Invoked
Jan 23 03:50:16 np0005593233 python3.9[33190]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:50:17 np0005593233 python3.9[33340]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:50:18 np0005593233 python3.9[33494]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:50:19 np0005593233 python3.9[33652]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:50:20 np0005593233 python3.9[33736]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:51:05 np0005593233 systemd[1]: Reloading.
Jan 23 03:51:05 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:51:05 np0005593233 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 23 03:51:05 np0005593233 systemd[1]: Reloading.
Jan 23 03:51:06 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:51:06 np0005593233 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 23 03:51:06 np0005593233 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 23 03:51:06 np0005593233 systemd[1]: Reloading.
Jan 23 03:51:06 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:51:06 np0005593233 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 23 03:51:06 np0005593233 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 23 03:51:06 np0005593233 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 23 03:52:11 np0005593233 kernel: SELinux:  Converting 2723 SID table entries...
Jan 23 03:52:11 np0005593233 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:52:11 np0005593233 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:52:11 np0005593233 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:52:11 np0005593233 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:52:11 np0005593233 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:52:11 np0005593233 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:52:11 np0005593233 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:52:11 np0005593233 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 23 03:52:11 np0005593233 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:52:11 np0005593233 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:52:11 np0005593233 systemd[1]: Reloading.
Jan 23 03:52:12 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:52:12 np0005593233 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:52:12 np0005593233 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:52:12 np0005593233 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:52:12 np0005593233 systemd[1]: man-db-cache-update.service: Consumed 1.140s CPU time.
Jan 23 03:52:12 np0005593233 systemd[1]: run-rba6cc5bf10e34a02ba436d22415e4c6c.service: Deactivated successfully.
Jan 23 03:52:14 np0005593233 python3.9[35255]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:52:16 np0005593233 python3.9[35536]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 03:52:17 np0005593233 python3.9[35688]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 03:52:20 np0005593233 python3.9[35841]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:52:21 np0005593233 python3.9[35993]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 03:52:23 np0005593233 python3.9[36145]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:52:23 np0005593233 python3.9[36297]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:52:26 np0005593233 python3.9[36420]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158343.2497342-668-76249663406141/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:52:30 np0005593233 python3.9[36572]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:52:35 np0005593233 python3.9[36724]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:52:36 np0005593233 python3.9[36877]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:52:37 np0005593233 python3.9[37029]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 03:52:37 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 03:52:37 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 03:52:38 np0005593233 python3.9[37183]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 03:52:39 np0005593233 python3.9[37341]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 03:52:40 np0005593233 python3.9[37501]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 03:52:41 np0005593233 python3.9[37654]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 03:52:42 np0005593233 python3.9[37812]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 03:52:43 np0005593233 python3.9[37964]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:52:46 np0005593233 python3.9[38117]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:52:47 np0005593233 python3.9[38269]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:52:47 np0005593233 python3.9[38392]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158366.66995-1024-275900312242556/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:52:48 np0005593233 python3.9[38544]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:52:48 np0005593233 systemd[1]: Starting Load Kernel Modules...
Jan 23 03:52:49 np0005593233 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 23 03:52:49 np0005593233 kernel: Bridge firewalling registered
Jan 23 03:52:49 np0005593233 systemd-modules-load[38548]: Inserted module 'br_netfilter'
Jan 23 03:52:49 np0005593233 systemd[1]: Finished Load Kernel Modules.
Jan 23 03:52:49 np0005593233 python3.9[38703]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:52:50 np0005593233 python3.9[38826]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158369.259375-1093-14283875979450/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:52:51 np0005593233 python3.9[38978]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:52:55 np0005593233 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 23 03:52:55 np0005593233 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 23 03:52:55 np0005593233 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:52:55 np0005593233 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:52:56 np0005593233 systemd[1]: Reloading.
Jan 23 03:52:56 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:52:56 np0005593233 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:52:58 np0005593233 irqbalance[797]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 23 03:52:58 np0005593233 irqbalance[797]: IRQ 26 affinity is now unmanaged
Jan 23 03:52:58 np0005593233 python3.9[40318]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:52:59 np0005593233 python3.9[41349]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 03:53:00 np0005593233 python3.9[42165]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:53:01 np0005593233 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:53:01 np0005593233 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:53:01 np0005593233 systemd[1]: man-db-cache-update.service: Consumed 4.778s CPU time.
Jan 23 03:53:01 np0005593233 systemd[1]: run-ra424f321a95a4d88b3ffe709e3e633c4.service: Deactivated successfully.
Jan 23 03:53:01 np0005593233 python3.9[43146]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:01 np0005593233 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 03:53:01 np0005593233 systemd[1]: Starting Authorization Manager...
Jan 23 03:53:01 np0005593233 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 03:53:01 np0005593233 polkitd[43364]: Started polkitd version 0.117
Jan 23 03:53:01 np0005593233 systemd[1]: Started Authorization Manager.
Jan 23 03:53:02 np0005593233 python3.9[43534]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:53:02 np0005593233 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 03:53:02 np0005593233 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 03:53:02 np0005593233 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 03:53:02 np0005593233 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 03:53:03 np0005593233 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 03:53:03 np0005593233 python3.9[43696]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 03:53:08 np0005593233 python3.9[43849]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:53:08 np0005593233 systemd[1]: Reloading.
Jan 23 03:53:08 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:53:09 np0005593233 python3.9[44037]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:53:09 np0005593233 systemd[1]: Reloading.
Jan 23 03:53:09 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:53:10 np0005593233 python3.9[44226]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:11 np0005593233 python3.9[44379]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:11 np0005593233 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 23 03:53:12 np0005593233 python3.9[44532]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:14 np0005593233 python3.9[44694]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:15 np0005593233 python3.9[44847]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:53:15 np0005593233 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 03:53:15 np0005593233 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 03:53:15 np0005593233 systemd[1]: Stopping Apply Kernel Variables...
Jan 23 03:53:15 np0005593233 systemd[1]: Starting Apply Kernel Variables...
Jan 23 03:53:15 np0005593233 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 03:53:15 np0005593233 systemd[1]: Finished Apply Kernel Variables.
Jan 23 03:53:15 np0005593233 systemd[1]: session-10.scope: Deactivated successfully.
Jan 23 03:53:15 np0005593233 systemd[1]: session-10.scope: Consumed 2min 18.274s CPU time.
Jan 23 03:53:15 np0005593233 systemd-logind[804]: Session 10 logged out. Waiting for processes to exit.
Jan 23 03:53:15 np0005593233 systemd-logind[804]: Removed session 10.
Jan 23 03:53:21 np0005593233 systemd-logind[804]: New session 11 of user zuul.
Jan 23 03:53:21 np0005593233 systemd[1]: Started Session 11 of User zuul.
Jan 23 03:53:22 np0005593233 python3.9[45030]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:53:24 np0005593233 python3.9[45186]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 03:53:24 np0005593233 python3.9[45339]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 03:53:26 np0005593233 python3.9[45497]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 03:53:27 np0005593233 python3.9[45657]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:53:28 np0005593233 python3.9[45741]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 03:53:31 np0005593233 python3.9[45904]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:53:43 np0005593233 kernel: SELinux:  Converting 2735 SID table entries...
Jan 23 03:53:43 np0005593233 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:53:43 np0005593233 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:53:43 np0005593233 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:53:43 np0005593233 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:53:43 np0005593233 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:53:43 np0005593233 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:53:43 np0005593233 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:53:43 np0005593233 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 23 03:53:43 np0005593233 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 23 03:53:44 np0005593233 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:53:44 np0005593233 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:53:44 np0005593233 systemd[1]: Reloading.
Jan 23 03:53:45 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:53:45 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:53:45 np0005593233 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:53:45 np0005593233 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:53:45 np0005593233 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:53:45 np0005593233 systemd[1]: run-rbef12e0d363948a08e51c84a245e37ef.service: Deactivated successfully.
Jan 23 03:53:49 np0005593233 python3.9[47002]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 03:53:49 np0005593233 systemd[1]: Reloading.
Jan 23 03:53:49 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:53:49 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:53:49 np0005593233 systemd[1]: Starting Open vSwitch Database Unit...
Jan 23 03:53:49 np0005593233 chown[47044]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 23 03:53:49 np0005593233 ovs-ctl[47049]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 23 03:53:49 np0005593233 ovs-ctl[47049]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 23 03:53:49 np0005593233 ovs-ctl[47049]: Starting ovsdb-server [  OK  ]
Jan 23 03:53:49 np0005593233 ovs-vsctl[47098]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 23 03:53:49 np0005593233 ovs-vsctl[47114]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"539cfa5a-1c2f-4cb4-97af-2edb819f72fc\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 23 03:53:49 np0005593233 ovs-ctl[47049]: Configuring Open vSwitch system IDs [  OK  ]
Jan 23 03:53:49 np0005593233 ovs-ctl[47049]: Enabling remote OVSDB managers [  OK  ]
Jan 23 03:53:49 np0005593233 systemd[1]: Started Open vSwitch Database Unit.
Jan 23 03:53:49 np0005593233 ovs-vsctl[47123]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 23 03:53:49 np0005593233 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 23 03:53:49 np0005593233 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 23 03:53:49 np0005593233 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 23 03:53:49 np0005593233 kernel: openvswitch: Open vSwitch switching datapath
Jan 23 03:53:49 np0005593233 ovs-ctl[47167]: Inserting openvswitch module [  OK  ]
Jan 23 03:53:50 np0005593233 ovs-ctl[47136]: Starting ovs-vswitchd [  OK  ]
Jan 23 03:53:50 np0005593233 ovs-ctl[47136]: Enabling remote OVSDB managers [  OK  ]
Jan 23 03:53:50 np0005593233 ovs-vsctl[47184]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 23 03:53:50 np0005593233 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 23 03:53:50 np0005593233 systemd[1]: Starting Open vSwitch...
Jan 23 03:53:50 np0005593233 systemd[1]: Finished Open vSwitch.
Jan 23 03:53:50 np0005593233 python3.9[47336]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:53:51 np0005593233 python3.9[47488]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 03:53:53 np0005593233 kernel: SELinux:  Converting 2749 SID table entries...
Jan 23 03:53:53 np0005593233 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:53:53 np0005593233 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:53:53 np0005593233 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:53:53 np0005593233 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:53:53 np0005593233 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:53:53 np0005593233 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:53:53 np0005593233 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:53:54 np0005593233 python3.9[47643]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:53:55 np0005593233 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 23 03:53:55 np0005593233 python3.9[47801]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:53:57 np0005593233 python3.9[47954]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:59 np0005593233 python3.9[48241]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 03:54:00 np0005593233 python3.9[48391]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:54:01 np0005593233 python3.9[48545]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:54:03 np0005593233 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:54:03 np0005593233 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:54:03 np0005593233 systemd[1]: Reloading.
Jan 23 03:54:03 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:54:03 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:54:03 np0005593233 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:54:05 np0005593233 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:54:05 np0005593233 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:54:05 np0005593233 systemd[1]: run-rc1aaef19c120406084a4cb712c748429.service: Deactivated successfully.
Jan 23 03:54:06 np0005593233 python3.9[48861]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:54:06 np0005593233 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 03:54:06 np0005593233 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 03:54:06 np0005593233 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 03:54:06 np0005593233 systemd[1]: Stopping Network Manager...
Jan 23 03:54:06 np0005593233 NetworkManager[7202]: <info>  [1769158446.4704] caught SIGTERM, shutting down normally.
Jan 23 03:54:06 np0005593233 NetworkManager[7202]: <info>  [1769158446.4760] dhcp4 (eth0): canceled DHCP transaction
Jan 23 03:54:06 np0005593233 NetworkManager[7202]: <info>  [1769158446.4760] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:54:06 np0005593233 NetworkManager[7202]: <info>  [1769158446.4760] dhcp4 (eth0): state changed no lease
Jan 23 03:54:06 np0005593233 NetworkManager[7202]: <info>  [1769158446.4764] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 03:54:06 np0005593233 NetworkManager[7202]: <info>  [1769158446.4838] exiting (success)
Jan 23 03:54:06 np0005593233 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:54:06 np0005593233 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:54:06 np0005593233 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 03:54:06 np0005593233 systemd[1]: Stopped Network Manager.
Jan 23 03:54:06 np0005593233 systemd[1]: NetworkManager.service: Consumed 15.828s CPU time, 4.1M memory peak, read 0B from disk, written 28.5K to disk.
Jan 23 03:54:06 np0005593233 systemd[1]: Starting Network Manager...
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.5898] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:66815a1e-6ee4-4c69-a06a-3cb6c4c50564)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.5902] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.5996] manager[0x5647231ac000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 03:54:06 np0005593233 systemd[1]: Starting Hostname Service...
Jan 23 03:54:06 np0005593233 systemd[1]: Started Hostname Service.
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6777] hostname: hostname: using hostnamed
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6778] hostname: static hostname changed from (none) to "compute-1"
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6783] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6788] manager[0x5647231ac000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6790] manager[0x5647231ac000]: rfkill: WWAN hardware radio set enabled
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6811] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6820] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6821] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6821] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6822] manager: Networking is enabled by state file
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6824] settings: Loaded settings plugin: keyfile (internal)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6827] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6851] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6860] dhcp: init: Using DHCP client 'internal'
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6862] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6868] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6875] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6882] device (lo): Activation: starting connection 'lo' (1b1b4b7e-5795-461d-975a-62ee381f5d15)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6890] device (eth0): carrier: link connected
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6894] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6899] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6900] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6907] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6915] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6922] device (eth1): carrier: link connected
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6925] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6930] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (627d92e8-6942-5fb9-ae4f-28a944c718dc) (indicated)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6931] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6937] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6945] device (eth1): Activation: starting connection 'ci-private-network' (627d92e8-6942-5fb9-ae4f-28a944c718dc)
Jan 23 03:54:06 np0005593233 systemd[1]: Started Network Manager.
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6955] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6981] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6984] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6986] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6989] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6994] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.6999] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7002] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7008] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7019] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7026] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7041] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7063] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7074] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7076] dhcp4 (eth0): state changed new lease, address=38.102.83.224
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7079] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7084] device (lo): Activation: successful, device activated.
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7095] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7170] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7176] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7178] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 systemd[1]: Starting Network Manager Wait Online...
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7182] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7184] device (eth1): Activation: successful, device activated.
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7200] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7202] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7205] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7208] device (eth0): Activation: successful, device activated.
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7213] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 03:54:06 np0005593233 NetworkManager[48871]: <info>  [1769158446.7216] manager: startup complete
Jan 23 03:54:06 np0005593233 systemd[1]: Finished Network Manager Wait Online.
Jan 23 03:54:07 np0005593233 python3.9[49087]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:54:12 np0005593233 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:54:12 np0005593233 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:54:12 np0005593233 systemd[1]: Reloading.
Jan 23 03:54:12 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:54:12 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:54:12 np0005593233 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:54:13 np0005593233 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:54:13 np0005593233 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:54:13 np0005593233 systemd[1]: run-r46c9eb06f9cc44f5881e6e60d32e7f85.service: Deactivated successfully.
Jan 23 03:54:16 np0005593233 python3.9[49548]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:54:16 np0005593233 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:54:17 np0005593233 python3.9[49700]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:18 np0005593233 python3.9[49854]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:18 np0005593233 python3.9[50006]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:19 np0005593233 python3.9[50158]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:20 np0005593233 python3.9[50310]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:21 np0005593233 python3.9[50462]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:54:21 np0005593233 python3.9[50585]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158460.7191575-648-20280164363431/.source _original_basename=.93noi5o0 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:22 np0005593233 python3.9[50737]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:23 np0005593233 python3.9[50889]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 23 03:54:24 np0005593233 python3.9[51041]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:26 np0005593233 python3.9[51468]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 23 03:54:28 np0005593233 ansible-async_wrapper.py[51643]: Invoked with j253580900780 300 /home/zuul/.ansible/tmp/ansible-tmp-1769158467.1642954-846-8974121492145/AnsiballZ_edpm_os_net_config.py _
Jan 23 03:54:28 np0005593233 ansible-async_wrapper.py[51646]: Starting module and watcher
Jan 23 03:54:28 np0005593233 ansible-async_wrapper.py[51646]: Start watching 51647 (300)
Jan 23 03:54:28 np0005593233 ansible-async_wrapper.py[51647]: Start module (51647)
Jan 23 03:54:28 np0005593233 ansible-async_wrapper.py[51643]: Return async_wrapper task started.
Jan 23 03:54:28 np0005593233 python3.9[51648]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 23 03:54:29 np0005593233 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 23 03:54:29 np0005593233 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 23 03:54:29 np0005593233 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 23 03:54:29 np0005593233 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 23 03:54:29 np0005593233 kernel: cfg80211: failed to load regulatory.db
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.2819] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.2841] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3442] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3445] audit: op="connection-add" uuid="822a3304-ab8f-4a8c-9ad3-5391aa39952d" name="br-ex-br" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3463] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3465] audit: op="connection-add" uuid="e562d914-d3c3-4b3b-b38b-fdd615745ffe" name="br-ex-port" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3480] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3482] audit: op="connection-add" uuid="6db2dc89-8b75-44a5-89b8-2b997ca94a20" name="eth1-port" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3496] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3498] audit: op="connection-add" uuid="cfe63951-6ec2-457f-a59d-ca45ded1e50c" name="vlan20-port" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3512] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3514] audit: op="connection-add" uuid="c69f57bb-5f3c-4bd0-a9d7-d5e65876703c" name="vlan21-port" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3527] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3529] audit: op="connection-add" uuid="f6cd6db1-22ab-4174-9874-c51916c9cfe7" name="vlan22-port" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3541] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3543] audit: op="connection-add" uuid="4a4fba87-357d-41a5-b4d9-4420040cd832" name="vlan23-port" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3573] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3595] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.3597] audit: op="connection-add" uuid="e7a59cbf-4b5f-4c83-b04c-11cba88d92d8" name="br-ex-if" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.4937] audit: op="connection-update" uuid="627d92e8-6942-5fb9-ae4f-28a944c718dc" name="ci-private-network" args="connection.slave-type,connection.master,connection.timestamp,connection.controller,connection.port-type,ovs-external-ids.data,ipv6.routing-rules,ipv6.method,ipv6.addr-gen-mode,ipv6.addresses,ipv6.dns,ipv6.routes,ipv4.routing-rules,ipv4.method,ipv4.never-default,ipv4.addresses,ipv4.dns,ipv4.routes,ovs-interface.type" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.4963] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.4966] audit: op="connection-add" uuid="bf5302c1-9b37-4a4c-941a-13070fb01fad" name="vlan20-if" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.4994] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.4997] audit: op="connection-add" uuid="997e85d2-197b-4bb2-8975-1ff51c42f9c4" name="vlan21-if" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5025] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5028] audit: op="connection-add" uuid="2db4eeff-70b8-4da4-bd1e-5c68940cc5b6" name="vlan22-if" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5056] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5059] audit: op="connection-add" uuid="26907dff-bfe1-4961-9e1c-ba8b7406580b" name="vlan23-if" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5079] audit: op="connection-delete" uuid="640d0f87-1153-3cd5-bb8f-96e84186f0c8" name="Wired connection 1" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5105] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5112] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5127] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5136] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (822a3304-ab8f-4a8c-9ad3-5391aa39952d)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5137] audit: op="connection-activate" uuid="822a3304-ab8f-4a8c-9ad3-5391aa39952d" name="br-ex-br" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5142] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5144] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5154] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5162] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (e562d914-d3c3-4b3b-b38b-fdd615745ffe)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5166] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5168] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5177] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5185] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (6db2dc89-8b75-44a5-89b8-2b997ca94a20)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5188] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5190] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5200] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5208] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (cfe63951-6ec2-457f-a59d-ca45ded1e50c)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5213] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5215] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5225] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5233] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (c69f57bb-5f3c-4bd0-a9d7-d5e65876703c)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5236] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5238] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5248] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5257] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (f6cd6db1-22ab-4174-9874-c51916c9cfe7)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5261] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5263] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5273] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5281] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (4a4fba87-357d-41a5-b4d9-4420040cd832)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5283] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5289] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5293] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5305] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5307] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5313] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5321] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (e7a59cbf-4b5f-4c83-b04c-11cba88d92d8)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5323] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5330] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5334] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5337] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5340] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5360] device (eth1): disconnecting for new activation request.
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5362] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5368] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5372] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5375] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5381] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5383] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5390] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5398] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (bf5302c1-9b37-4a4c-941a-13070fb01fad)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5400] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5406] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5410] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5412] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5418] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5420] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5426] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5435] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (997e85d2-197b-4bb2-8975-1ff51c42f9c4)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5437] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5443] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5447] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5450] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5454] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5456] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5460] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5466] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (2db4eeff-70b8-4da4-bd1e-5c68940cc5b6)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5467] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5471] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5473] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5475] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5479] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <warn>  [1769158470.5480] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5484] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5491] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (26907dff-bfe1-4961-9e1c-ba8b7406580b)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5492] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5496] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5498] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5500] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5502] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5518] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=51649 uid=0 result="success"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5520] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5524] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5527] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5536] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5540] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5545] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5550] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5552] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:54:30 np0005593233 kernel: ovs-system: entered promiscuous mode
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5575] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5581] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 systemd-udevd[51653]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:54:30 np0005593233 kernel: Timeout policy base is empty
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5585] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5587] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5607] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5612] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5616] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5618] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5627] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5632] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5637] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5640] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5648] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5655] dhcp4 (eth0): canceled DHCP transaction
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5656] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5656] dhcp4 (eth0): state changed no lease
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5661] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5678] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5687] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51649 uid=0 result="fail" reason="Device is not activated"
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5693] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.5702] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:54:30 np0005593233 kernel: br-ex: entered promiscuous mode
Jan 23 03:54:30 np0005593233 kernel: vlan22: entered promiscuous mode
Jan 23 03:54:30 np0005593233 systemd-udevd[51654]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:54:30 np0005593233 kernel: vlan20: entered promiscuous mode
Jan 23 03:54:30 np0005593233 kernel: vlan21: entered promiscuous mode
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.6923] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.6931] dhcp4 (eth0): state changed new lease, address=38.102.83.224
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.6946] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.6968] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.6983] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.6994] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:30 np0005593233 NetworkManager[48871]: <info>  [1769158470.7004] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:30 np0005593233 kernel: vlan23: entered promiscuous mode
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3569] device (eth1): disconnecting for new activation request.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3572] audit: op="connection-activate" uuid="627d92e8-6942-5fb9-ae4f-28a944c718dc" name="ci-private-network" pid=51649 uid=0 result="success"
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3572] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3756] device (eth1): Activation: starting connection 'ci-private-network' (627d92e8-6942-5fb9-ae4f-28a944c718dc)
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3760] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3761] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3763] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3763] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3764] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3765] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3766] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3796] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3797] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3801] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3807] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3810] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3814] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3817] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3821] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3824] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3827] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3830] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3833] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3836] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3840] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3843] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3847] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3850] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3855] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51649 uid=0 result="success"
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3874] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3881] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3904] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3907] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3913] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3924] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3939] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3943] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3950] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3955] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3959] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3967] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3969] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3970] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3973] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3976] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3983] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3986] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3991] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.3995] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.4001] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.4002] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.4003] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.4007] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.4013] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:31 np0005593233 NetworkManager[48871]: <info>  [1769158471.4017] device (eth1): Activation: successful, device activated.
Jan 23 03:54:32 np0005593233 python3.9[52006]: ansible-ansible.legacy.async_status Invoked with jid=j253580900780.51643 mode=status _async_dir=/root/.ansible_async
Jan 23 03:54:32 np0005593233 NetworkManager[48871]: <info>  [1769158472.7564] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51649 uid=0 result="success"
Jan 23 03:54:32 np0005593233 NetworkManager[48871]: <info>  [1769158472.9320] checkpoint[0x564723182950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 23 03:54:32 np0005593233 NetworkManager[48871]: <info>  [1769158472.9323] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51649 uid=0 result="success"
Jan 23 03:54:33 np0005593233 ansible-async_wrapper.py[51646]: 51647 still running (300)
Jan 23 03:54:33 np0005593233 NetworkManager[48871]: <info>  [1769158473.2631] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51649 uid=0 result="success"
Jan 23 03:54:33 np0005593233 NetworkManager[48871]: <info>  [1769158473.2644] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51649 uid=0 result="success"
Jan 23 03:54:33 np0005593233 NetworkManager[48871]: <info>  [1769158473.5137] audit: op="networking-control" arg="global-dns-configuration" pid=51649 uid=0 result="success"
Jan 23 03:54:33 np0005593233 NetworkManager[48871]: <info>  [1769158473.5180] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 23 03:54:33 np0005593233 NetworkManager[48871]: <info>  [1769158473.5220] audit: op="networking-control" arg="global-dns-configuration" pid=51649 uid=0 result="success"
Jan 23 03:54:33 np0005593233 NetworkManager[48871]: <info>  [1769158473.5256] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51649 uid=0 result="success"
Jan 23 03:54:33 np0005593233 NetworkManager[48871]: <info>  [1769158473.7440] checkpoint[0x564723182a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 23 03:54:33 np0005593233 NetworkManager[48871]: <info>  [1769158473.7445] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51649 uid=0 result="success"
Jan 23 03:54:33 np0005593233 ansible-async_wrapper.py[51647]: Module complete (51647)
Jan 23 03:54:35 np0005593233 python3.9[52112]: ansible-ansible.legacy.async_status Invoked with jid=j253580900780.51643 mode=status _async_dir=/root/.ansible_async
Jan 23 03:54:36 np0005593233 python3.9[52212]: ansible-ansible.legacy.async_status Invoked with jid=j253580900780.51643 mode=cleanup _async_dir=/root/.ansible_async
Jan 23 03:54:36 np0005593233 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 03:54:37 np0005593233 python3.9[52366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:54:37 np0005593233 python3.9[52489]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158476.6838055-927-246112524278406/.source.returncode _original_basename=.p740gan8 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:38 np0005593233 ansible-async_wrapper.py[51646]: Done in kid B.
Jan 23 03:54:38 np0005593233 python3.9[52641]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:54:39 np0005593233 python3.9[52765]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158478.0824373-975-79620480298486/.source.cfg _original_basename=.pnitut0l follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:40 np0005593233 python3.9[52917]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:54:40 np0005593233 systemd[1]: Reloading Network Manager...
Jan 23 03:54:40 np0005593233 NetworkManager[48871]: <info>  [1769158480.2284] audit: op="reload" arg="0" pid=52921 uid=0 result="success"
Jan 23 03:54:40 np0005593233 NetworkManager[48871]: <info>  [1769158480.2298] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 23 03:54:40 np0005593233 systemd[1]: Reloaded Network Manager.
Jan 23 03:54:41 np0005593233 systemd[1]: session-11.scope: Deactivated successfully.
Jan 23 03:54:41 np0005593233 systemd[1]: session-11.scope: Consumed 51.795s CPU time.
Jan 23 03:54:41 np0005593233 systemd-logind[804]: Session 11 logged out. Waiting for processes to exit.
Jan 23 03:54:41 np0005593233 systemd-logind[804]: Removed session 11.
Jan 23 03:54:46 np0005593233 systemd-logind[804]: New session 12 of user zuul.
Jan 23 03:54:46 np0005593233 systemd[1]: Started Session 12 of User zuul.
Jan 23 03:54:47 np0005593233 python3.9[53105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:54:48 np0005593233 python3.9[53259]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:54:49 np0005593233 python3.9[53453]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:54:50 np0005593233 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:54:50 np0005593233 systemd[1]: session-12.scope: Deactivated successfully.
Jan 23 03:54:50 np0005593233 systemd[1]: session-12.scope: Consumed 2.560s CPU time.
Jan 23 03:54:50 np0005593233 systemd-logind[804]: Session 12 logged out. Waiting for processes to exit.
Jan 23 03:54:50 np0005593233 systemd-logind[804]: Removed session 12.
Jan 23 03:54:56 np0005593233 systemd-logind[804]: New session 13 of user zuul.
Jan 23 03:54:56 np0005593233 systemd[1]: Started Session 13 of User zuul.
Jan 23 03:54:57 np0005593233 python3.9[53635]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:54:58 np0005593233 python3.9[53790]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:54:59 np0005593233 python3.9[53946]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:55:00 np0005593233 python3.9[54030]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:55:02 np0005593233 python3.9[54184]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:55:04 np0005593233 python3.9[54379]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:05 np0005593233 python3.9[54531]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:55:05 np0005593233 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1702556588-merged.mount: Deactivated successfully.
Jan 23 03:55:05 np0005593233 podman[54532]: 2026-01-23 08:55:05.929097758 +0000 UTC m=+0.056142543 system refresh
Jan 23 03:55:05 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 03:55:09 np0005593233 python3.9[54694]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:09 np0005593233 python3.9[54817]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158508.529324-198-139873896516235/.source.json follow=False _original_basename=podman_network_config.j2 checksum=aa6ee25b6ad6aab5d38ae40d1d30336917ea66c9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:10 np0005593233 python3.9[54969]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:11 np0005593233 python3.9[55092]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158510.1815763-243-193225256718872/.source.conf follow=False _original_basename=registries.conf.j2 checksum=7d6103ee1a01cd01d921f72f1af62704e0a47ff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:12 np0005593233 python3.9[55244]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:12 np0005593233 python3.9[55396]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:13 np0005593233 python3.9[55548]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:14 np0005593233 python3.9[55700]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:15 np0005593233 python3.9[55852]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:55:17 np0005593233 python3.9[56005]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:55:18 np0005593233 python3.9[56159]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:55:19 np0005593233 python3.9[56311]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:55:20 np0005593233 python3.9[56463]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:55:21 np0005593233 python3.9[56616]: ansible-service_facts Invoked
Jan 23 03:55:21 np0005593233 network[56633]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 03:55:21 np0005593233 network[56634]: 'network-scripts' will be removed from distribution in near future.
Jan 23 03:55:21 np0005593233 network[56635]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 03:55:27 np0005593233 python3.9[57087]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:55:30 np0005593233 python3.9[57240]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 03:55:31 np0005593233 python3.9[57392]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:32 np0005593233 python3.9[57517]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158531.3285427-676-165733252073912/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:33 np0005593233 python3.9[57671]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:33 np0005593233 python3.9[57796]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158532.7938952-721-220233448980264/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:35 np0005593233 python3.9[57950]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:37 np0005593233 python3.9[58104]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:55:38 np0005593233 python3.9[58188]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:55:41 np0005593233 python3.9[58342]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:55:42 np0005593233 python3.9[58426]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:55:42 np0005593233 chronyd[783]: chronyd exiting
Jan 23 03:55:42 np0005593233 systemd[1]: Stopping NTP client/server...
Jan 23 03:55:42 np0005593233 systemd[1]: chronyd.service: Deactivated successfully.
Jan 23 03:55:42 np0005593233 systemd[1]: Stopped NTP client/server.
Jan 23 03:55:42 np0005593233 systemd[1]: Starting NTP client/server...
Jan 23 03:55:42 np0005593233 chronyd[58434]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 03:55:42 np0005593233 chronyd[58434]: Frequency -28.392 +/- 0.140 ppm read from /var/lib/chrony/drift
Jan 23 03:55:42 np0005593233 chronyd[58434]: Loaded seccomp filter (level 2)
Jan 23 03:55:42 np0005593233 systemd[1]: Started NTP client/server.
Jan 23 03:55:43 np0005593233 systemd[1]: session-13.scope: Deactivated successfully.
Jan 23 03:55:43 np0005593233 systemd[1]: session-13.scope: Consumed 26.920s CPU time.
Jan 23 03:55:43 np0005593233 systemd-logind[804]: Session 13 logged out. Waiting for processes to exit.
Jan 23 03:55:43 np0005593233 systemd-logind[804]: Removed session 13.
Jan 23 03:55:49 np0005593233 systemd-logind[804]: New session 14 of user zuul.
Jan 23 03:55:49 np0005593233 systemd[1]: Started Session 14 of User zuul.
Jan 23 03:55:50 np0005593233 python3.9[58615]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:50 np0005593233 python3.9[58767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:51 np0005593233 python3.9[58890]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158550.258912-63-229300782083268/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:51 np0005593233 systemd[1]: session-14.scope: Deactivated successfully.
Jan 23 03:55:51 np0005593233 systemd[1]: session-14.scope: Consumed 1.645s CPU time.
Jan 23 03:55:51 np0005593233 systemd-logind[804]: Session 14 logged out. Waiting for processes to exit.
Jan 23 03:55:51 np0005593233 systemd-logind[804]: Removed session 14.
Jan 23 03:55:57 np0005593233 systemd-logind[804]: New session 15 of user zuul.
Jan 23 03:55:57 np0005593233 systemd[1]: Started Session 15 of User zuul.
Jan 23 03:55:58 np0005593233 python3.9[59068]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:55:59 np0005593233 python3.9[59224]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:00 np0005593233 python3.9[59399]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:01 np0005593233 python3.9[59522]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769158559.5721703-84-120701295227231/.source.json _original_basename=.atyp9xts follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:02 np0005593233 python3.9[59674]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:02 np0005593233 python3.9[59797]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158561.880914-153-215907021382334/.source _original_basename=.h7an_0fy follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:03 np0005593233 python3.9[59949]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:56:04 np0005593233 python3.9[60101]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:04 np0005593233 python3.9[60224]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158563.8705935-225-123948016527078/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:56:05 np0005593233 python3.9[60376]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:05 np0005593233 python3.9[60499]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158565.0098252-225-270282601851600/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:56:06 np0005593233 python3.9[60651]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:07 np0005593233 python3.9[60803]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:07 np0005593233 python3.9[60926]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158566.932538-336-65626887946779/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:08 np0005593233 python3.9[61078]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:09 np0005593233 python3.9[61201]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158568.146291-381-73561796074260/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:10 np0005593233 python3.9[61353]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:10 np0005593233 systemd[1]: Reloading.
Jan 23 03:56:10 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:10 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:10 np0005593233 systemd[1]: Reloading.
Jan 23 03:56:10 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:10 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:10 np0005593233 systemd[1]: Starting EDPM Container Shutdown...
Jan 23 03:56:10 np0005593233 systemd[1]: Finished EDPM Container Shutdown.
Jan 23 03:56:11 np0005593233 python3.9[61581]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:12 np0005593233 python3.9[61704]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158571.2239184-450-208323251866706/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:13 np0005593233 python3.9[61856]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:13 np0005593233 python3.9[61979]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158572.6508505-495-260608646067574/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:14 np0005593233 python3.9[62131]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:14 np0005593233 systemd[1]: Reloading.
Jan 23 03:56:14 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:14 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:14 np0005593233 systemd[1]: Reloading.
Jan 23 03:56:14 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:14 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:15 np0005593233 systemd[1]: Starting Create netns directory...
Jan 23 03:56:15 np0005593233 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 03:56:15 np0005593233 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 03:56:15 np0005593233 systemd[1]: Finished Create netns directory.
Jan 23 03:56:15 np0005593233 python3.9[62358]: ansible-ansible.builtin.service_facts Invoked
Jan 23 03:56:16 np0005593233 network[62375]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 03:56:16 np0005593233 network[62376]: 'network-scripts' will be removed from distribution in near future.
Jan 23 03:56:16 np0005593233 network[62377]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 03:56:20 np0005593233 python3.9[62639]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:20 np0005593233 systemd[1]: Reloading.
Jan 23 03:56:20 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:20 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:20 np0005593233 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 23 03:56:20 np0005593233 iptables.init[62679]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 23 03:56:20 np0005593233 iptables.init[62679]: iptables: Flushing firewall rules: [  OK  ]
Jan 23 03:56:20 np0005593233 systemd[1]: iptables.service: Deactivated successfully.
Jan 23 03:56:20 np0005593233 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 23 03:56:21 np0005593233 python3.9[62875]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:22 np0005593233 python3.9[63029]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:22 np0005593233 systemd[1]: Reloading.
Jan 23 03:56:23 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:23 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:23 np0005593233 systemd[1]: Starting Netfilter Tables...
Jan 23 03:56:23 np0005593233 systemd[1]: Finished Netfilter Tables.
Jan 23 03:56:24 np0005593233 python3.9[63222]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:56:25 np0005593233 python3.9[63375]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:25 np0005593233 python3.9[63500]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158584.7711282-702-270122212809261/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:26 np0005593233 python3.9[63653]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:56:26 np0005593233 systemd[1]: Reloading OpenSSH server daemon...
Jan 23 03:56:26 np0005593233 systemd[1]: Reloaded OpenSSH server daemon.
Jan 23 03:56:27 np0005593233 python3.9[63809]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:28 np0005593233 python3.9[63961]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:28 np0005593233 python3.9[64084]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158587.6495936-795-92343699805435/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:29 np0005593233 python3.9[64236]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 03:56:29 np0005593233 systemd[1]: Starting Time & Date Service...
Jan 23 03:56:30 np0005593233 systemd[1]: Started Time & Date Service.
Jan 23 03:56:31 np0005593233 python3.9[64392]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:32 np0005593233 python3.9[64544]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:33 np0005593233 python3.9[64667]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158592.0663378-900-252286781011828/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:33 np0005593233 python3.9[64819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:34 np0005593233 python3.9[64942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158593.2962809-945-236966654775474/.source.yaml _original_basename=.kxv239xb follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:35 np0005593233 python3.9[65094]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:35 np0005593233 python3.9[65217]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158594.598355-990-146916063551048/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:36 np0005593233 python3.9[65369]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:56:36 np0005593233 python3.9[65522]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:56:37 np0005593233 python3[65675]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 03:56:38 np0005593233 python3.9[65827]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:39 np0005593233 python3.9[65950]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158598.414317-1107-266119830574807/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:40 np0005593233 python3.9[66102]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:40 np0005593233 python3.9[66225]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158599.8067722-1152-180543470470123/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:41 np0005593233 python3.9[66377]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:42 np0005593233 python3.9[66500]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158601.077538-1197-102934712743544/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:42 np0005593233 python3.9[66652]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:43 np0005593233 python3.9[66775]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158602.4025762-1242-90473127926075/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:44 np0005593233 python3.9[66927]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:44 np0005593233 python3.9[67050]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158603.6905217-1287-14239043330288/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:45 np0005593233 python3.9[67202]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:46 np0005593233 python3.9[67354]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:56:47 np0005593233 python3.9[67513]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:48 np0005593233 python3.9[67666]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:48 np0005593233 python3.9[67818]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:49 np0005593233 python3.9[67970]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 03:56:50 np0005593233 python3.9[68123]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 03:56:51 np0005593233 systemd[1]: session-15.scope: Deactivated successfully.
Jan 23 03:56:51 np0005593233 systemd[1]: session-15.scope: Consumed 36.843s CPU time.
Jan 23 03:56:51 np0005593233 systemd-logind[804]: Session 15 logged out. Waiting for processes to exit.
Jan 23 03:56:51 np0005593233 systemd-logind[804]: Removed session 15.
Jan 23 03:56:56 np0005593233 systemd-logind[804]: New session 16 of user zuul.
Jan 23 03:56:56 np0005593233 systemd[1]: Started Session 16 of User zuul.
Jan 23 03:56:57 np0005593233 python3.9[68304]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 03:56:58 np0005593233 python3.9[68456]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:56:59 np0005593233 python3.9[68608]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:57:00 np0005593233 python3.9[68760]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD7jdzOPltwN8PSb4q9DCiO5zY7TIK6sENpltjjN4gdZgxOTsj/dxnfxJlO2lYI1dFyyFnDdZj88a4x1KI5Bnnvl5KRvvZiianfivZWKq9Ngf9fzf7+5CsDFBiu6a7GAfXMf9FocVpqlXf7fsXmb5Iv2xUpNnye4EFIuW965X3SNrRpujRnDe+i0lIwrOsus4R86qn38MWOLfPBAWFYdBaVfTUYjC0eT/I81Y/T2RKqf7XK/bsuHobZ+/a7lymuPsS9L0DFg25ZoIlvkPUVfZxTO5FCyw8GMR+AgbnMQyHwx2JAmewwH3M2l+zVdDQjsE1ZRFlJCmwle9LBa1oFhuLfxLqsykQploeB5Ch/VppbnRQ/GamwWLU5HEKMH2wZ6IymURW7nSStlEhNWvK+Bb9rIy65M6AFOEW94xId4nc+IraS6rc2cuM3Rp97S/6olqjlFDZisdUwdAlhIKuJjA7SsYZ6HyCEbRN3mvMnWbkqpyY605kewQ6kdmucNeWgRtk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIE++PPNOKtggGl2mGWEm1DV2WpblvGA/F2TEEVeMrsU2#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP3uOoytpWGDF46u3wwDFxwF05HMnZd51GvbceZrDgZRmc5sxbF+OawPD9kGTcjnaUTzvqWgbFNvcmpuaNTnpzc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsLbdPIA8nc52wSKcOItc1xJ6faU3FwhWecUgXZZC+Q1wLSrdN9vgOExBhQSwwodluzJ5/GT9VbCuujyBvk7RMEim1+fw7T58Th56PR8y2lL6F6F3ni4S21QxInTLml+/id8wwEZAkFjbCF/AjCRDyH7a6H4wIZtd5ZuzWJuuBENNdtu/qD1QQYkNegqllogNpkdpAFZgvee26yw2sbCX8kpbJoJsowaQUckoRtT2jj7985CLxErKZ8YO8ZozjfuCDCKbcJT0KFimievJZmKXvGaWG5H+P509XDsfN62aQr22US8FbYjdK1lfrJoetkc/MK4h7QuCs6MH2qYiqXIkJYKMSReM+sH3X7V7pSWSUkr0DHREVvBGcC2lRSx45lUCTEtcTY7XmxGORvCORMYla0l1H3mEIkfYLS4sXYtRSHkyFnyQgbNP5MnrmXlK0vrAA81r5U+dOhIL/H2e7S4xcLItH7weUOHIAmCj266mm9+xJyyd7NZ+eUgS0Md5p4Bc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUSudroiFEdRPXgUCqRHbNRLelYP5RQGMMCn6zD8pfH#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDJLsx8RxJz6M7PIyGcFdzR+Ldl788501Y8ZWLJ8hnDzMCaRkGjzE+kzO/uN75IEtV3aVEl1jNQlk7wON+lORGQ=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq2Yxebv3BUxXHPuf6nN00teEMYUUVEWMZOqcwNO1dyibdbyxre6VweeeiBR/lerW1mIcmB67juCuLffEgDo8uPtZx9HrD1psd+ji78YeJuvbKIEcTwdtGF0I8PeogHunx+4KBxFsHeF6JHN9+H7lTHiSSIDFzk9BwDkAKEWsYHe8z+5SPDU//XiYNv0drE59KiQF586rnjPR3VZk6WaR+hp2PiHbUUSOvnyB4kI4bCXSCU/Oxv7HDvgeCJapABjisMZg4aiteZ7EaD1yVndkQiS6OxfOGP1srgtNkRL4Idc/XCFXH754lbRd8GzUF0n8N0HbWTcFDuTU+bvhuIH+3EDNxsDQkSCdJTw2EPb/mqZVdXSFxLXUBcXnYkBWZirpgC3g6okg2RQU2bxigFs7lFwJT6QE+wz0DK7Z3ib0XQxjRlY6PIwn1D2soMwKVarxpeM2FfsGrHMHaHioRTVbKpzBMA1oUICSUCvzyhd0I43cO2rUEK/8EMYSsTVRulKs=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII4nVnNUbCVQAtKJF7UUtMQxNhMw9eVlRVofBpQ70iUi#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPqfkBgoQjr/gZBK1F9K576GMtkxSY6lVgROItGrW+R9EA2lvnOt71IGO0M0lGVvCkTtLktdNpSsYnBu2cJn+4c=#012 create=True mode=0644 path=/tmp/ansible.y3pf2ajk state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:57:01 np0005593233 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 03:57:01 np0005593233 python3.9[68914]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.y3pf2ajk' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:02 np0005593233 python3.9[69068]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.y3pf2ajk state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:57:03 np0005593233 systemd[1]: session-16.scope: Deactivated successfully.
Jan 23 03:57:03 np0005593233 systemd[1]: session-16.scope: Consumed 3.683s CPU time.
Jan 23 03:57:03 np0005593233 systemd-logind[804]: Session 16 logged out. Waiting for processes to exit.
Jan 23 03:57:03 np0005593233 systemd-logind[804]: Removed session 16.
Jan 23 03:57:09 np0005593233 systemd-logind[804]: New session 17 of user zuul.
Jan 23 03:57:09 np0005593233 systemd[1]: Started Session 17 of User zuul.
Jan 23 03:57:10 np0005593233 python3.9[69246]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:57:11 np0005593233 python3.9[69402]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 03:57:13 np0005593233 python3.9[69556]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:57:14 np0005593233 python3.9[69709]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:15 np0005593233 python3.9[69862]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:57:16 np0005593233 python3.9[70016]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:17 np0005593233 python3.9[70171]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:57:17 np0005593233 systemd[1]: session-17.scope: Deactivated successfully.
Jan 23 03:57:17 np0005593233 systemd[1]: session-17.scope: Consumed 4.465s CPU time.
Jan 23 03:57:17 np0005593233 systemd-logind[804]: Session 17 logged out. Waiting for processes to exit.
Jan 23 03:57:17 np0005593233 systemd-logind[804]: Removed session 17.
Jan 23 03:57:23 np0005593233 systemd-logind[804]: New session 18 of user zuul.
Jan 23 03:57:23 np0005593233 systemd[1]: Started Session 18 of User zuul.
Jan 23 03:57:24 np0005593233 python3.9[70349]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:57:26 np0005593233 python3.9[70505]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:57:26 np0005593233 python3.9[70589]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 03:57:29 np0005593233 python3.9[70740]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:30 np0005593233 python3.9[70891]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 03:57:31 np0005593233 python3.9[71041]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:57:31 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 03:57:32 np0005593233 python3.9[71192]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:57:33 np0005593233 systemd[1]: session-18.scope: Deactivated successfully.
Jan 23 03:57:33 np0005593233 systemd[1]: session-18.scope: Consumed 5.924s CPU time.
Jan 23 03:57:33 np0005593233 systemd-logind[804]: Session 18 logged out. Waiting for processes to exit.
Jan 23 03:57:33 np0005593233 systemd-logind[804]: Removed session 18.
Jan 23 03:57:40 np0005593233 systemd-logind[804]: New session 19 of user zuul.
Jan 23 03:57:40 np0005593233 systemd[1]: Started Session 19 of User zuul.
Jan 23 03:57:46 np0005593233 python3[71958]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:57:48 np0005593233 python3[72054]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 03:57:50 np0005593233 python3[72081]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 03:57:50 np0005593233 python3[72107]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:50 np0005593233 kernel: loop: module loaded
Jan 23 03:57:50 np0005593233 kernel: loop3: detected capacity change from 0 to 14680064
Jan 23 03:57:51 np0005593233 python3[72142]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:51 np0005593233 lvm[72146]: PV /dev/loop3 not used.
Jan 23 03:57:51 np0005593233 lvm[72155]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 03:57:51 np0005593233 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 23 03:57:51 np0005593233 lvm[72157]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 23 03:57:51 np0005593233 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 23 03:57:51 np0005593233 python3[72235]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:57:52 np0005593233 python3[72308]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158671.524178-36870-243819163456427/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:57:52 np0005593233 chronyd[58434]: Selected source 216.232.132.95 (pool.ntp.org)
Jan 23 03:57:53 np0005593233 python3[72358]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:57:53 np0005593233 systemd[1]: Reloading.
Jan 23 03:57:53 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:57:53 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:57:53 np0005593233 systemd[1]: Starting Ceph OSD losetup...
Jan 23 03:57:53 np0005593233 bash[72397]: /dev/loop3: [64513]:4328450 (/var/lib/ceph-osd-0.img)
Jan 23 03:57:53 np0005593233 systemd[1]: Finished Ceph OSD losetup.
Jan 23 03:57:53 np0005593233 lvm[72398]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 03:57:53 np0005593233 lvm[72398]: VG ceph_vg0 finished
Jan 23 03:57:56 np0005593233 python3[72422]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:00:08 np0005593233 systemd-logind[804]: New session 20 of user ceph-admin.
Jan 23 04:00:08 np0005593233 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 04:00:08 np0005593233 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 04:00:08 np0005593233 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 04:00:08 np0005593233 systemd[1]: Starting User Manager for UID 42477...
Jan 23 04:00:08 np0005593233 systemd-logind[804]: New session 22 of user ceph-admin.
Jan 23 04:00:08 np0005593233 systemd[72470]: Queued start job for default target Main User Target.
Jan 23 04:00:08 np0005593233 systemd[72470]: Created slice User Application Slice.
Jan 23 04:00:08 np0005593233 systemd[72470]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:00:08 np0005593233 systemd[72470]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:00:08 np0005593233 systemd[72470]: Reached target Paths.
Jan 23 04:00:08 np0005593233 systemd[72470]: Reached target Timers.
Jan 23 04:00:08 np0005593233 systemd[72470]: Starting D-Bus User Message Bus Socket...
Jan 23 04:00:08 np0005593233 systemd[72470]: Starting Create User's Volatile Files and Directories...
Jan 23 04:00:08 np0005593233 systemd[72470]: Finished Create User's Volatile Files and Directories.
Jan 23 04:00:08 np0005593233 systemd[72470]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:00:08 np0005593233 systemd[72470]: Reached target Sockets.
Jan 23 04:00:08 np0005593233 systemd[72470]: Reached target Basic System.
Jan 23 04:00:08 np0005593233 systemd[72470]: Reached target Main User Target.
Jan 23 04:00:08 np0005593233 systemd[72470]: Startup finished in 135ms.
Jan 23 04:00:08 np0005593233 systemd[1]: Started User Manager for UID 42477.
Jan 23 04:00:08 np0005593233 systemd[1]: Started Session 20 of User ceph-admin.
Jan 23 04:00:08 np0005593233 systemd[1]: Started Session 22 of User ceph-admin.
Jan 23 04:00:09 np0005593233 systemd-logind[804]: New session 23 of user ceph-admin.
Jan 23 04:00:09 np0005593233 systemd[1]: Started Session 23 of User ceph-admin.
Jan 23 04:00:09 np0005593233 systemd-logind[804]: New session 24 of user ceph-admin.
Jan 23 04:00:09 np0005593233 systemd[1]: Started Session 24 of User ceph-admin.
Jan 23 04:00:10 np0005593233 systemd-logind[804]: New session 25 of user ceph-admin.
Jan 23 04:00:10 np0005593233 systemd[1]: Started Session 25 of User ceph-admin.
Jan 23 04:00:10 np0005593233 systemd-logind[804]: New session 26 of user ceph-admin.
Jan 23 04:00:10 np0005593233 systemd[1]: Started Session 26 of User ceph-admin.
Jan 23 04:00:11 np0005593233 systemd-logind[804]: New session 27 of user ceph-admin.
Jan 23 04:00:11 np0005593233 systemd[1]: Started Session 27 of User ceph-admin.
Jan 23 04:00:11 np0005593233 systemd-logind[804]: New session 28 of user ceph-admin.
Jan 23 04:00:11 np0005593233 systemd[1]: Started Session 28 of User ceph-admin.
Jan 23 04:00:12 np0005593233 systemd-logind[804]: New session 29 of user ceph-admin.
Jan 23 04:00:12 np0005593233 systemd[1]: Started Session 29 of User ceph-admin.
Jan 23 04:00:12 np0005593233 systemd-logind[804]: New session 30 of user ceph-admin.
Jan 23 04:00:12 np0005593233 systemd[1]: Started Session 30 of User ceph-admin.
Jan 23 04:00:13 np0005593233 systemd-logind[804]: New session 31 of user ceph-admin.
Jan 23 04:00:13 np0005593233 systemd[1]: Started Session 31 of User ceph-admin.
Jan 23 04:00:13 np0005593233 systemd-logind[804]: New session 32 of user ceph-admin.
Jan 23 04:00:13 np0005593233 systemd[1]: Started Session 32 of User ceph-admin.
Jan 23 04:00:13 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:14 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:15 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:15 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:15 np0005593233 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73442 (sysctl)
Jan 23 04:00:16 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:16 np0005593233 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 23 04:00:16 np0005593233 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 23 04:00:17 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:17 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:17 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:20 np0005593233 systemd[1]: var-lib-containers-storage-overlay-compat731264534-lower\x2dmapped.mount: Deactivated successfully.
Jan 23 04:00:50 np0005593233 podman[73715]: 2026-01-23 09:00:50.116937932 +0000 UTC m=+32.476375181 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:00:50 np0005593233 podman[73715]: 2026-01-23 09:00:50.133235336 +0000 UTC m=+32.492672555 container create 9c4614101bc82464081de37a1682b40626fafeb64f089271e315207fe34b143f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_tesla, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:00:50 np0005593233 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 23 04:00:50 np0005593233 systemd[1]: Started libpod-conmon-9c4614101bc82464081de37a1682b40626fafeb64f089271e315207fe34b143f.scope.
Jan 23 04:00:50 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:00:50 np0005593233 podman[73715]: 2026-01-23 09:00:50.232163505 +0000 UTC m=+32.591600754 container init 9c4614101bc82464081de37a1682b40626fafeb64f089271e315207fe34b143f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_tesla, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 23 04:00:50 np0005593233 podman[73715]: 2026-01-23 09:00:50.238878358 +0000 UTC m=+32.598315577 container start 9c4614101bc82464081de37a1682b40626fafeb64f089271e315207fe34b143f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_tesla, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:00:50 np0005593233 podman[73715]: 2026-01-23 09:00:50.243266098 +0000 UTC m=+32.602703367 container attach 9c4614101bc82464081de37a1682b40626fafeb64f089271e315207fe34b143f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_tesla, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Jan 23 04:00:50 np0005593233 exciting_tesla[73776]: 167 167
Jan 23 04:00:50 np0005593233 systemd[1]: libpod-9c4614101bc82464081de37a1682b40626fafeb64f089271e315207fe34b143f.scope: Deactivated successfully.
Jan 23 04:00:50 np0005593233 podman[73715]: 2026-01-23 09:00:50.248252454 +0000 UTC m=+32.607689673 container died 9c4614101bc82464081de37a1682b40626fafeb64f089271e315207fe34b143f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_tesla, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Jan 23 04:00:50 np0005593233 systemd[1]: var-lib-containers-storage-overlay-b8cb9ca8953924835da8f05861578ceaa9380a94b4374a512d3700c8b93d1472-merged.mount: Deactivated successfully.
Jan 23 04:00:50 np0005593233 podman[73715]: 2026-01-23 09:00:50.285572162 +0000 UTC m=+32.645009371 container remove 9c4614101bc82464081de37a1682b40626fafeb64f089271e315207fe34b143f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=exciting_tesla, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 23 04:00:50 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:50 np0005593233 systemd[1]: libpod-conmon-9c4614101bc82464081de37a1682b40626fafeb64f089271e315207fe34b143f.scope: Deactivated successfully.
Jan 23 04:00:50 np0005593233 podman[73801]: 2026-01-23 09:00:50.455037695 +0000 UTC m=+0.043040065 container create b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 23 04:00:50 np0005593233 podman[73801]: 2026-01-23 09:00:50.43761511 +0000 UTC m=+0.025617500 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:00:50 np0005593233 systemd[1]: Started libpod-conmon-b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d.scope.
Jan 23 04:00:50 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:00:50 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9193c6f6d948922bcaa430c8fb9af8e7c49f3056ba2a471671bbd0947cc798d2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:00:50 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9193c6f6d948922bcaa430c8fb9af8e7c49f3056ba2a471671bbd0947cc798d2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:00:50 np0005593233 podman[73801]: 2026-01-23 09:00:50.601453839 +0000 UTC m=+0.189456229 container init b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:00:50 np0005593233 podman[73801]: 2026-01-23 09:00:50.610413554 +0000 UTC m=+0.198415924 container start b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 23 04:00:50 np0005593233 podman[73801]: 2026-01-23 09:00:50.614978388 +0000 UTC m=+0.202980858 container attach b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]: [
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:    {
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        "available": false,
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        "ceph_device": false,
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        "lsm_data": {},
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        "lvs": [],
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        "path": "/dev/sr0",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        "rejected_reasons": [
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "Has a FileSystem",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "Insufficient space (<5GB)"
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        ],
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        "sys_api": {
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "actuators": null,
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "device_nodes": "sr0",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "devname": "sr0",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "human_readable_size": "482.00 KB",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "id_bus": "ata",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "model": "QEMU DVD-ROM",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "nr_requests": "2",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "parent": "/dev/sr0",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "partitions": {},
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "path": "/dev/sr0",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "removable": "1",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "rev": "2.5+",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "ro": "0",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "rotational": "1",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "sas_address": "",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "sas_device_handle": "",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "scheduler_mode": "mq-deadline",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "sectors": 0,
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "sectorsize": "2048",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "size": 493568.0,
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "support_discard": "2048",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "type": "disk",
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:            "vendor": "QEMU"
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:        }
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]:    }
Jan 23 04:00:51 np0005593233 flamboyant_jang[73817]: ]
Jan 23 04:00:51 np0005593233 systemd[1]: libpod-b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d.scope: Deactivated successfully.
Jan 23 04:00:51 np0005593233 systemd[1]: libpod-b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d.scope: Consumed 1.294s CPU time.
Jan 23 04:00:51 np0005593233 podman[73801]: 2026-01-23 09:00:51.899077829 +0000 UTC m=+1.487080199 container died b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:00:51 np0005593233 systemd[1]: var-lib-containers-storage-overlay-9193c6f6d948922bcaa430c8fb9af8e7c49f3056ba2a471671bbd0947cc798d2-merged.mount: Deactivated successfully.
Jan 23 04:00:51 np0005593233 podman[73801]: 2026-01-23 09:00:51.957853512 +0000 UTC m=+1.545855882 container remove b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_jang, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 04:00:51 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:51 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:51 np0005593233 systemd[1]: libpod-conmon-b4a83ed303d9543216e861f30037d82f0fa533391c839422d77cdfb25be0d29d.scope: Deactivated successfully.
Jan 23 04:00:57 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:57 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:57 np0005593233 podman[76609]: 2026-01-23 09:00:57.241645683 +0000 UTC m=+0.020416168 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:00:57 np0005593233 podman[76609]: 2026-01-23 09:00:57.413237484 +0000 UTC m=+0.192007949 container create 5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_shirley, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 04:00:57 np0005593233 systemd[1]: Started libpod-conmon-5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93.scope.
Jan 23 04:00:57 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:00:57 np0005593233 podman[76609]: 2026-01-23 09:00:57.628339222 +0000 UTC m=+0.407109697 container init 5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_shirley, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:00:57 np0005593233 podman[76609]: 2026-01-23 09:00:57.636299249 +0000 UTC m=+0.415069714 container start 5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_shirley, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:00:57 np0005593233 podman[76609]: 2026-01-23 09:00:57.6400148 +0000 UTC m=+0.418785275 container attach 5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_shirley, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Jan 23 04:00:57 np0005593233 elastic_shirley[76625]: 167 167
Jan 23 04:00:57 np0005593233 systemd[1]: libpod-5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93.scope: Deactivated successfully.
Jan 23 04:00:57 np0005593233 conmon[76625]: conmon 5de3403fbbbb0f177857 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93.scope/container/memory.events
Jan 23 04:00:57 np0005593233 podman[76609]: 2026-01-23 09:00:57.643472835 +0000 UTC m=+0.422243300 container died 5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_shirley, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:00:57 np0005593233 podman[76609]: 2026-01-23 09:00:57.679099207 +0000 UTC m=+0.457869682 container remove 5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elastic_shirley, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:00:57 np0005593233 systemd[1]: libpod-conmon-5de3403fbbbb0f177857e2e2488cbd7f972586b968d4654c2087025d71cf4e93.scope: Deactivated successfully.
Jan 23 04:00:57 np0005593233 systemd[1]: Reloading.
Jan 23 04:00:57 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:00:57 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:00:57 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:57 np0005593233 systemd[1]: Reloading.
Jan 23 04:00:58 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:00:58 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:00:58 np0005593233 systemd[1]: Reached target All Ceph clusters and services.
Jan 23 04:00:58 np0005593233 systemd[1]: Reloading.
Jan 23 04:00:58 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:00:58 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:00:58 np0005593233 systemd[1]: Reached target Ceph cluster e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:00:58 np0005593233 systemd[1]: Reloading.
Jan 23 04:00:58 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:00:58 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:00:58 np0005593233 systemd[1]: Reloading.
Jan 23 04:00:59 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:00:59 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:00:59 np0005593233 systemd[1]: Created slice Slice /system/ceph-e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:00:59 np0005593233 systemd[1]: Reached target System Time Set.
Jan 23 04:00:59 np0005593233 systemd[1]: Reached target System Time Synchronized.
Jan 23 04:00:59 np0005593233 systemd[1]: Starting Ceph crash.compute-1 for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:00:59 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:59 np0005593233 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:00:59 np0005593233 podman[76880]: 2026-01-23 09:00:59.500807003 +0000 UTC m=+0.040730562 container create 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:00:59 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d881562a3725e67a067049707d6e36db63f3e8fa188cbf981f77be135e04202/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:00:59 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d881562a3725e67a067049707d6e36db63f3e8fa188cbf981f77be135e04202/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:00:59 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d881562a3725e67a067049707d6e36db63f3e8fa188cbf981f77be135e04202/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:00:59 np0005593233 podman[76880]: 2026-01-23 09:00:59.579280274 +0000 UTC m=+0.119203913 container init 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 04:00:59 np0005593233 podman[76880]: 2026-01-23 09:00:59.482834643 +0000 UTC m=+0.022758222 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:00:59 np0005593233 podman[76880]: 2026-01-23 09:00:59.589173894 +0000 UTC m=+0.129097483 container start 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 23 04:00:59 np0005593233 bash[76880]: 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc
Jan 23 04:00:59 np0005593233 systemd[1]: Started Ceph crash.compute-1 for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:00:59 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1[76896]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 23 04:01:00 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1[76896]: 2026-01-23T09:01:00.000+0000 7fbf49d5a640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 04:01:00 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1[76896]: 2026-01-23T09:01:00.000+0000 7fbf49d5a640 -1 AuthRegistry(0x7fbf44067440) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 04:01:00 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1[76896]: 2026-01-23T09:01:00.001+0000 7fbf49d5a640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 04:01:00 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1[76896]: 2026-01-23T09:01:00.001+0000 7fbf49d5a640 -1 AuthRegistry(0x7fbf49d59000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 04:01:00 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1[76896]: 2026-01-23T09:01:00.004+0000 7fbf437fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 04:01:00 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1[76896]: 2026-01-23T09:01:00.004+0000 7fbf49d5a640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 23 04:01:00 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1[76896]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 23 04:01:00 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1[76896]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 23 04:01:00 np0005593233 podman[77051]: 2026-01-23 09:01:00.304186499 +0000 UTC m=+0.046769787 container create 39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_panini, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:00 np0005593233 systemd[1]: Started libpod-conmon-39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291.scope.
Jan 23 04:01:00 np0005593233 podman[77051]: 2026-01-23 09:01:00.284214495 +0000 UTC m=+0.026797803 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:00 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:00 np0005593233 podman[77051]: 2026-01-23 09:01:00.412575806 +0000 UTC m=+0.155159114 container init 39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_panini, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True)
Jan 23 04:01:00 np0005593233 podman[77051]: 2026-01-23 09:01:00.421039857 +0000 UTC m=+0.163623145 container start 39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_panini, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:01:00 np0005593233 podman[77051]: 2026-01-23 09:01:00.424481541 +0000 UTC m=+0.167064849 container attach 39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_panini, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef)
Jan 23 04:01:00 np0005593233 great_panini[77067]: 167 167
Jan 23 04:01:00 np0005593233 systemd[1]: libpod-39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291.scope: Deactivated successfully.
Jan 23 04:01:00 np0005593233 conmon[77067]: conmon 39386f2d6ccd8ddebbda <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291.scope/container/memory.events
Jan 23 04:01:00 np0005593233 podman[77051]: 2026-01-23 09:01:00.429044936 +0000 UTC m=+0.171628214 container died 39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_panini, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:00 np0005593233 systemd[1]: var-lib-containers-storage-overlay-ef0c7ce8761c779e8ae08566e75f7d5943d06e554d8b7992af0f2cbbcbfe6577-merged.mount: Deactivated successfully.
Jan 23 04:01:00 np0005593233 podman[77051]: 2026-01-23 09:01:00.472431459 +0000 UTC m=+0.215014747 container remove 39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=great_panini, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:01:00 np0005593233 systemd[1]: libpod-conmon-39386f2d6ccd8ddebbdaeaf38a4afb88b7a00a56ff28830219c784c71276f291.scope: Deactivated successfully.
Jan 23 04:01:00 np0005593233 podman[77090]: 2026-01-23 09:01:00.666004819 +0000 UTC m=+0.070452153 container create cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_almeida, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True)
Jan 23 04:01:00 np0005593233 systemd[1]: Started libpod-conmon-cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e.scope.
Jan 23 04:01:00 np0005593233 podman[77090]: 2026-01-23 09:01:00.622192674 +0000 UTC m=+0.026640058 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:00 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:00 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6eaf1b7f57272e8c27987dea63ab851de54070441db8b2d25794a87c2064535/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:00 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6eaf1b7f57272e8c27987dea63ab851de54070441db8b2d25794a87c2064535/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:00 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6eaf1b7f57272e8c27987dea63ab851de54070441db8b2d25794a87c2064535/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:00 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6eaf1b7f57272e8c27987dea63ab851de54070441db8b2d25794a87c2064535/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:00 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6eaf1b7f57272e8c27987dea63ab851de54070441db8b2d25794a87c2064535/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:00 np0005593233 podman[77090]: 2026-01-23 09:01:00.756311332 +0000 UTC m=+0.160758676 container init cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_almeida, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:01:00 np0005593233 podman[77090]: 2026-01-23 09:01:00.768191267 +0000 UTC m=+0.172638591 container start cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_almeida, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:01:00 np0005593233 podman[77090]: 2026-01-23 09:01:00.771747973 +0000 UTC m=+0.176195307 container attach cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_almeida, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 23 04:01:01 np0005593233 loving_almeida[77107]: --> passed data devices: 0 physical, 1 LVM
Jan 23 04:01:01 np0005593233 loving_almeida[77107]: --> relative data size: 1.0
Jan 23 04:01:01 np0005593233 loving_almeida[77107]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:01:01 np0005593233 loving_almeida[77107]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 6df7941f-8366-4880-b94b-b9b3810e23e9
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Jan 23 04:01:02 np0005593233 lvm[77169]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:01:02 np0005593233 lvm[77169]: VG ceph_vg0 finished
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: stderr: got monmap epoch 1
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: --> Creating keyring file for osd.1
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Jan 23 04:01:02 np0005593233 loving_almeida[77107]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid 6df7941f-8366-4880-b94b-b9b3810e23e9 --setuser ceph --setgroup ceph
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: stderr: 2026-01-23T09:01:02.927+0000 7f0749d1c740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: stderr: 2026-01-23T09:01:02.927+0000 7f0749d1c740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: stderr: 2026-01-23T09:01:02.927+0000 7f0749d1c740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: stderr: 2026-01-23T09:01:02.927+0000 7f0749d1c740 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: --> ceph-volume lvm activate successful for osd ID: 1
Jan 23 04:01:06 np0005593233 loving_almeida[77107]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 23 04:01:06 np0005593233 systemd[1]: libpod-cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e.scope: Deactivated successfully.
Jan 23 04:01:06 np0005593233 systemd[1]: libpod-cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e.scope: Consumed 2.591s CPU time.
Jan 23 04:01:06 np0005593233 podman[78075]: 2026-01-23 09:01:06.24379102 +0000 UTC m=+0.027565173 container died cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_almeida, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:01:06 np0005593233 systemd[1]: var-lib-containers-storage-overlay-f6eaf1b7f57272e8c27987dea63ab851de54070441db8b2d25794a87c2064535-merged.mount: Deactivated successfully.
Jan 23 04:01:06 np0005593233 podman[78075]: 2026-01-23 09:01:06.393722557 +0000 UTC m=+0.177496710 container remove cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=loving_almeida, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Jan 23 04:01:06 np0005593233 systemd[1]: libpod-conmon-cb24936ee53295393b2b5d26d37be739069b2bb1a387868bd6809e10af7d9c1e.scope: Deactivated successfully.
Jan 23 04:01:07 np0005593233 podman[78225]: 2026-01-23 09:01:06.97007319 +0000 UTC m=+0.020012159 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:07 np0005593233 podman[78225]: 2026-01-23 09:01:07.247903338 +0000 UTC m=+0.297842297 container create ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_borg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 23 04:01:07 np0005593233 systemd[1]: Started libpod-conmon-ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0.scope.
Jan 23 04:01:07 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:08 np0005593233 podman[78225]: 2026-01-23 09:01:08.235334191 +0000 UTC m=+1.285273230 container init ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_borg, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:01:08 np0005593233 podman[78225]: 2026-01-23 09:01:08.248794263 +0000 UTC m=+1.298733252 container start ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_borg, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:08 np0005593233 eloquent_borg[78241]: 167 167
Jan 23 04:01:08 np0005593233 systemd[1]: libpod-ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0.scope: Deactivated successfully.
Jan 23 04:01:08 np0005593233 conmon[78241]: conmon ac978eb25dd60413203c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0.scope/container/memory.events
Jan 23 04:01:08 np0005593233 podman[78225]: 2026-01-23 09:01:08.599501839 +0000 UTC m=+1.649440808 container attach ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_borg, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:01:08 np0005593233 podman[78225]: 2026-01-23 09:01:08.600897679 +0000 UTC m=+1.650836658 container died ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_borg, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:01:09 np0005593233 systemd[1]: var-lib-containers-storage-overlay-5cec064782670d277e0642e917bef2fc163f5935b7062df0d9ca793bddee6cc5-merged.mount: Deactivated successfully.
Jan 23 04:01:10 np0005593233 podman[78225]: 2026-01-23 09:01:10.008429229 +0000 UTC m=+3.058368228 container remove ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_borg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 23 04:01:10 np0005593233 systemd[1]: libpod-conmon-ac978eb25dd60413203cdde11453f1748f898ed74bdb0de8877c1050623fa4d0.scope: Deactivated successfully.
Jan 23 04:01:10 np0005593233 podman[78265]: 2026-01-23 09:01:10.209520498 +0000 UTC m=+0.038141273 container create 7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_saha, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 04:01:10 np0005593233 systemd[1]: Started libpod-conmon-7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83.scope.
Jan 23 04:01:10 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:10 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a045ae354ae4903a1e20256dd047abe1bbad9efeedd6e067821dca551684c0cf/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:10 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a045ae354ae4903a1e20256dd047abe1bbad9efeedd6e067821dca551684c0cf/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:10 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a045ae354ae4903a1e20256dd047abe1bbad9efeedd6e067821dca551684c0cf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:10 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a045ae354ae4903a1e20256dd047abe1bbad9efeedd6e067821dca551684c0cf/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:10 np0005593233 podman[78265]: 2026-01-23 09:01:10.193973187 +0000 UTC m=+0.022593972 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:10 np0005593233 podman[78265]: 2026-01-23 09:01:10.295112468 +0000 UTC m=+0.123733253 container init 7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_saha, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 04:01:10 np0005593233 podman[78265]: 2026-01-23 09:01:10.308867529 +0000 UTC m=+0.137488344 container start 7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_saha, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Jan 23 04:01:10 np0005593233 podman[78265]: 2026-01-23 09:01:10.313672745 +0000 UTC m=+0.142293550 container attach 7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_saha, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:01:11 np0005593233 strange_saha[78281]: {
Jan 23 04:01:11 np0005593233 strange_saha[78281]:    "1": [
Jan 23 04:01:11 np0005593233 strange_saha[78281]:        {
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "devices": [
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "/dev/loop3"
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            ],
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "lv_name": "ceph_lv0",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "lv_size": "7511998464",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=4rapWc-68ZU-tTxz-KhUY-szMN-fCcG-ExiaJI,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=e1533653-0a5a-584c-b34b-8689f0d32e77,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=6df7941f-8366-4880-b94b-b9b3810e23e9,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "lv_uuid": "4rapWc-68ZU-tTxz-KhUY-szMN-fCcG-ExiaJI",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "name": "ceph_lv0",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "tags": {
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.block_uuid": "4rapWc-68ZU-tTxz-KhUY-szMN-fCcG-ExiaJI",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.cephx_lockbox_secret": "",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.cluster_fsid": "e1533653-0a5a-584c-b34b-8689f0d32e77",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.cluster_name": "ceph",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.crush_device_class": "",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.encrypted": "0",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.osd_fsid": "6df7941f-8366-4880-b94b-b9b3810e23e9",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.osd_id": "1",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.type": "block",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:                "ceph.vdo": "0"
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            },
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "type": "block",
Jan 23 04:01:11 np0005593233 strange_saha[78281]:            "vg_name": "ceph_vg0"
Jan 23 04:01:11 np0005593233 strange_saha[78281]:        }
Jan 23 04:01:11 np0005593233 strange_saha[78281]:    ]
Jan 23 04:01:11 np0005593233 strange_saha[78281]: }
Jan 23 04:01:11 np0005593233 systemd[1]: libpod-7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83.scope: Deactivated successfully.
Jan 23 04:01:11 np0005593233 conmon[78281]: conmon 7336aa9297bf127a0d87 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83.scope/container/memory.events
Jan 23 04:01:11 np0005593233 podman[78290]: 2026-01-23 09:01:11.164165051 +0000 UTC m=+0.031875146 container died 7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_saha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Jan 23 04:01:11 np0005593233 systemd[1]: var-lib-containers-storage-overlay-a045ae354ae4903a1e20256dd047abe1bbad9efeedd6e067821dca551684c0cf-merged.mount: Deactivated successfully.
Jan 23 04:01:11 np0005593233 podman[78290]: 2026-01-23 09:01:11.30045164 +0000 UTC m=+0.168161715 container remove 7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_saha, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:11 np0005593233 systemd[1]: libpod-conmon-7336aa9297bf127a0d87f75457b870f157ebdecf2e6025b40fa3a10ee1cbbc83.scope: Deactivated successfully.
Jan 23 04:01:11 np0005593233 podman[78445]: 2026-01-23 09:01:11.964851702 +0000 UTC m=+0.045077981 container create ee19d011cd3d229e83a38fd361f22b0e1738b9ae091bb633d9a0a698a2bf365d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_lamport, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:01:12 np0005593233 systemd[1]: Started libpod-conmon-ee19d011cd3d229e83a38fd361f22b0e1738b9ae091bb633d9a0a698a2bf365d.scope.
Jan 23 04:01:12 np0005593233 podman[78445]: 2026-01-23 09:01:11.944998228 +0000 UTC m=+0.025224507 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:12 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:12 np0005593233 podman[78445]: 2026-01-23 09:01:12.057622226 +0000 UTC m=+0.137848515 container init ee19d011cd3d229e83a38fd361f22b0e1738b9ae091bb633d9a0a698a2bf365d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_lamport, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 23 04:01:12 np0005593233 podman[78445]: 2026-01-23 09:01:12.065042337 +0000 UTC m=+0.145268596 container start ee19d011cd3d229e83a38fd361f22b0e1738b9ae091bb633d9a0a698a2bf365d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_lamport, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 23 04:01:12 np0005593233 podman[78445]: 2026-01-23 09:01:12.068932587 +0000 UTC m=+0.149158866 container attach ee19d011cd3d229e83a38fd361f22b0e1738b9ae091bb633d9a0a698a2bf365d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_lamport, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:01:12 np0005593233 xenodochial_lamport[78462]: 167 167
Jan 23 04:01:12 np0005593233 systemd[1]: libpod-ee19d011cd3d229e83a38fd361f22b0e1738b9ae091bb633d9a0a698a2bf365d.scope: Deactivated successfully.
Jan 23 04:01:12 np0005593233 podman[78445]: 2026-01-23 09:01:12.070623245 +0000 UTC m=+0.150849534 container died ee19d011cd3d229e83a38fd361f22b0e1738b9ae091bb633d9a0a698a2bf365d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_lamport, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Jan 23 04:01:12 np0005593233 systemd[1]: var-lib-containers-storage-overlay-8e092ef5dd07522c2f950091bb0f1b61c9a7c8748dad728678d9e249d7422d24-merged.mount: Deactivated successfully.
Jan 23 04:01:12 np0005593233 podman[78445]: 2026-01-23 09:01:12.11376644 +0000 UTC m=+0.193992719 container remove ee19d011cd3d229e83a38fd361f22b0e1738b9ae091bb633d9a0a698a2bf365d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_lamport, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 23 04:01:12 np0005593233 systemd[1]: libpod-conmon-ee19d011cd3d229e83a38fd361f22b0e1738b9ae091bb633d9a0a698a2bf365d.scope: Deactivated successfully.
Jan 23 04:01:12 np0005593233 podman[78492]: 2026-01-23 09:01:12.35574878 +0000 UTC m=+0.049893398 container create e09139bccc766504a61d94ab4f333d3de340dcc0ae0bd5628931e7c375660c76 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:12 np0005593233 systemd[1]: Started libpod-conmon-e09139bccc766504a61d94ab4f333d3de340dcc0ae0bd5628931e7c375660c76.scope.
Jan 23 04:01:12 np0005593233 podman[78492]: 2026-01-23 09:01:12.338408608 +0000 UTC m=+0.032553216 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:12 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c848cb62d6f247b7c1519e1de0a9a83527ef785c1a63596fb50d6b60048a2098/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c848cb62d6f247b7c1519e1de0a9a83527ef785c1a63596fb50d6b60048a2098/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c848cb62d6f247b7c1519e1de0a9a83527ef785c1a63596fb50d6b60048a2098/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c848cb62d6f247b7c1519e1de0a9a83527ef785c1a63596fb50d6b60048a2098/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c848cb62d6f247b7c1519e1de0a9a83527ef785c1a63596fb50d6b60048a2098/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:12 np0005593233 podman[78492]: 2026-01-23 09:01:12.457826438 +0000 UTC m=+0.151971086 container init e09139bccc766504a61d94ab4f333d3de340dcc0ae0bd5628931e7c375660c76 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate-test, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 04:01:12 np0005593233 podman[78492]: 2026-01-23 09:01:12.473950126 +0000 UTC m=+0.168094744 container start e09139bccc766504a61d94ab4f333d3de340dcc0ae0bd5628931e7c375660c76 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate-test, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:01:12 np0005593233 podman[78492]: 2026-01-23 09:01:12.47832732 +0000 UTC m=+0.172471988 container attach e09139bccc766504a61d94ab4f333d3de340dcc0ae0bd5628931e7c375660c76 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 23 04:01:13 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate-test[78508]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Jan 23 04:01:13 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate-test[78508]:                            [--no-systemd] [--no-tmpfs]
Jan 23 04:01:13 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate-test[78508]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 23 04:01:13 np0005593233 systemd[1]: libpod-e09139bccc766504a61d94ab4f333d3de340dcc0ae0bd5628931e7c375660c76.scope: Deactivated successfully.
Jan 23 04:01:13 np0005593233 podman[78513]: 2026-01-23 09:01:13.2232892 +0000 UTC m=+0.024743744 container died e09139bccc766504a61d94ab4f333d3de340dcc0ae0bd5628931e7c375660c76 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:01:13 np0005593233 systemd[1]: var-lib-containers-storage-overlay-c848cb62d6f247b7c1519e1de0a9a83527ef785c1a63596fb50d6b60048a2098-merged.mount: Deactivated successfully.
Jan 23 04:01:13 np0005593233 podman[78513]: 2026-01-23 09:01:13.28668222 +0000 UTC m=+0.088136744 container remove e09139bccc766504a61d94ab4f333d3de340dcc0ae0bd5628931e7c375660c76 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate-test, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:13 np0005593233 systemd[1]: libpod-conmon-e09139bccc766504a61d94ab4f333d3de340dcc0ae0bd5628931e7c375660c76.scope: Deactivated successfully.
Jan 23 04:01:13 np0005593233 systemd[1]: Reloading.
Jan 23 04:01:13 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:01:13 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:01:13 np0005593233 systemd[1]: Reloading.
Jan 23 04:01:13 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:01:13 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:01:14 np0005593233 systemd[1]: Starting Ceph osd.1 for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:01:14 np0005593233 podman[78671]: 2026-01-23 09:01:14.360178557 +0000 UTC m=+0.118729712 container create 36724b6a508e3b56323c0321e0c3e755b39f9e6de7897d0155e4e8d63266d029 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:14 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:14 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34649404c38087ad5703f58cc7bc4620d14c592420a0a2def220dda0cdc883c1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:14 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34649404c38087ad5703f58cc7bc4620d14c592420a0a2def220dda0cdc883c1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:14 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34649404c38087ad5703f58cc7bc4620d14c592420a0a2def220dda0cdc883c1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:14 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34649404c38087ad5703f58cc7bc4620d14c592420a0a2def220dda0cdc883c1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:14 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34649404c38087ad5703f58cc7bc4620d14c592420a0a2def220dda0cdc883c1/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:14 np0005593233 podman[78671]: 2026-01-23 09:01:14.337705028 +0000 UTC m=+0.096256193 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:14 np0005593233 podman[78671]: 2026-01-23 09:01:14.450160641 +0000 UTC m=+0.208711816 container init 36724b6a508e3b56323c0321e0c3e755b39f9e6de7897d0155e4e8d63266d029 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:01:14 np0005593233 podman[78671]: 2026-01-23 09:01:14.457764567 +0000 UTC m=+0.216315722 container start 36724b6a508e3b56323c0321e0c3e755b39f9e6de7897d0155e4e8d63266d029 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 04:01:14 np0005593233 podman[78671]: 2026-01-23 09:01:14.462713877 +0000 UTC m=+0.221265132 container attach 36724b6a508e3b56323c0321e0c3e755b39f9e6de7897d0155e4e8d63266d029 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 23 04:01:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate[78686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 23 04:01:15 np0005593233 bash[78671]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 23 04:01:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate[78686]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 23 04:01:15 np0005593233 bash[78671]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 23 04:01:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate[78686]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 23 04:01:15 np0005593233 bash[78671]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 23 04:01:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate[78686]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:01:15 np0005593233 bash[78671]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:01:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate[78686]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:15 np0005593233 bash[78671]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate[78686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 23 04:01:15 np0005593233 bash[78671]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 23 04:01:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate[78686]: --> ceph-volume raw activate successful for osd ID: 1
Jan 23 04:01:15 np0005593233 bash[78671]: --> ceph-volume raw activate successful for osd ID: 1
Jan 23 04:01:15 np0005593233 systemd[1]: libpod-36724b6a508e3b56323c0321e0c3e755b39f9e6de7897d0155e4e8d63266d029.scope: Deactivated successfully.
Jan 23 04:01:15 np0005593233 podman[78804]: 2026-01-23 09:01:15.432813938 +0000 UTC m=+0.025487675 container died 36724b6a508e3b56323c0321e0c3e755b39f9e6de7897d0155e4e8d63266d029 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 23 04:01:15 np0005593233 systemd[1]: var-lib-containers-storage-overlay-34649404c38087ad5703f58cc7bc4620d14c592420a0a2def220dda0cdc883c1-merged.mount: Deactivated successfully.
Jan 23 04:01:15 np0005593233 podman[78804]: 2026-01-23 09:01:15.507227421 +0000 UTC m=+0.099901158 container remove 36724b6a508e3b56323c0321e0c3e755b39f9e6de7897d0155e4e8d63266d029 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3)
Jan 23 04:01:15 np0005593233 podman[78861]: 2026-01-23 09:01:15.698671136 +0000 UTC m=+0.046544443 container create daf323854723725599189eb44ec931f5b4270357f38c0305d70859dd9cc742fc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:01:15 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/146a8cf03478affea6aa8e661b4ef03d35b316fb541cc1341a2520803bbeeddb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:15 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/146a8cf03478affea6aa8e661b4ef03d35b316fb541cc1341a2520803bbeeddb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:15 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/146a8cf03478affea6aa8e661b4ef03d35b316fb541cc1341a2520803bbeeddb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:15 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/146a8cf03478affea6aa8e661b4ef03d35b316fb541cc1341a2520803bbeeddb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:15 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/146a8cf03478affea6aa8e661b4ef03d35b316fb541cc1341a2520803bbeeddb/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:15 np0005593233 podman[78861]: 2026-01-23 09:01:15.770508225 +0000 UTC m=+0.118381542 container init daf323854723725599189eb44ec931f5b4270357f38c0305d70859dd9cc742fc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:15 np0005593233 podman[78861]: 2026-01-23 09:01:15.678552715 +0000 UTC m=+0.026426052 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:15 np0005593233 podman[78861]: 2026-01-23 09:01:15.776083334 +0000 UTC m=+0.123956641 container start daf323854723725599189eb44ec931f5b4270357f38c0305d70859dd9cc742fc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:01:15 np0005593233 bash[78861]: daf323854723725599189eb44ec931f5b4270357f38c0305d70859dd9cc742fc
Jan 23 04:01:15 np0005593233 systemd[1]: Started Ceph osd.1 for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: pidfile_write: ignore empty --pid-file
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bdev(0x55f13278d800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bdev(0x55f13278d800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bdev(0x55f13278d800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bdev(0x55f13278d800 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bdev(0x55f1335c5800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bdev(0x55f1335c5800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bdev(0x55f1335c5800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bdev(0x55f1335c5800 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Jan 23 04:01:15 np0005593233 ceph-osd[78880]: bdev(0x55f1335c5800 /var/lib/ceph/osd/ceph-1/block) close
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f13278d800 /var/lib/ceph/osd/ceph-1/block) close
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: load: jerasure load: lrc 
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) close
Jan 23 04:01:16 np0005593233 podman[79040]: 2026-01-23 09:01:16.501099507 +0000 UTC m=+0.031937697 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) close
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133646c00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133647400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133647400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133647400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133647400 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluefs mount
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluefs mount shared_bdev_used = 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Git sha 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: DB SUMMARY
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: DB Session ID:  4TLM6MKVCXAHCQEBCGXM
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                                     Options.env: 0x55f133617c70
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                                Options.info_log: 0x55f13280aba0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.write_buffer_manager: 0x55f133720460
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.row_cache: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                              Options.wal_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.wal_compression: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.max_background_jobs: 4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Compression algorithms supported:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: 	kZSTD supported: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a600)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f132800dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132800dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132800dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132800dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132800dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132800dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a600)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132800dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132800430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132800430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f132800430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1918d96e-b725-411f-a15d-c0935d629952
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158876919290, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158876919476, "job": 1, "event": "recovery_finished"}
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: freelist init
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: freelist _read_cfg
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bluefs umount
Jan 23 04:01:16 np0005593233 ceph-osd[78880]: bdev(0x55f133647400 /var/lib/ceph/osd/ceph-1/block) close
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bdev(0x55f133647400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bdev(0x55f133647400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bdev(0x55f133647400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bdev(0x55f133647400 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bluefs mount
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bluefs mount shared_bdev_used = 4718592
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Git sha 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: DB SUMMARY
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: DB Session ID:  4TLM6MKVCXAHCQEBCGXN
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                                     Options.env: 0x55f13284c690
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                                Options.info_log: 0x55f132814380
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.write_buffer_manager: 0x55f133720460
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.row_cache: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                              Options.wal_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.wal_compression: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.max_background_jobs: 4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Compression algorithms supported:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: 	kZSTD supported: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a2a0)
                                                 cache_index_and_filter_blocks: 1
                                                 cache_index_and_filter_blocks_with_high_priority: 0
                                                 pin_l0_filter_and_index_blocks_in_cache: 0
                                                 pin_top_level_index_and_filter: 1
                                                 index_type: 0
                                                 data_block_index_type: 0
                                                 index_shortening: 1
                                                 data_block_hash_table_util_ratio: 0.750000
                                                 checksum: 4
                                                 no_block_cache: 0
                                                 block_cache: 0x55f132801610
                                                 block_cache_name: BinnedLRUCache
                                                 block_cache_options:
                                                   capacity : 483183820
                                                   num_shard_bits : 4
                                                   strict_capacity_limit : 0
                                                   high_pri_pool_ratio: 0.000
                                                 block_cache_compressed: (nil)
                                                 persistent_cache: (nil)
                                                 block_size: 4096
                                                 block_size_deviation: 10
                                                 block_restart_interval: 16
                                                 index_block_restart_interval: 1
                                                 metadata_block_size: 4096
                                                 partition_filters: 0
                                                 use_delta_encoding: 1
                                                 filter_policy: bloomfilter
                                                 whole_key_filtering: 1
                                                 verify_compression: 0
                                                 read_amp_bytes_per_bit: 0
                                                 format_version: 5
                                                 enable_index_compression: 1
                                                 block_align: 0
                                                 max_auto_readahead_size: 262144
                                                 prepopulate_block_cache: 0
                                                 initial_auto_readahead_size: 8192
                                                 num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a2a0)
                                                 cache_index_and_filter_blocks: 1
                                                 cache_index_and_filter_blocks_with_high_priority: 0
                                                 pin_l0_filter_and_index_blocks_in_cache: 0
                                                 pin_top_level_index_and_filter: 1
                                                 index_type: 0
                                                 data_block_index_type: 0
                                                 index_shortening: 1
                                                 data_block_hash_table_util_ratio: 0.750000
                                                 checksum: 4
                                                 no_block_cache: 0
                                                 block_cache: 0x55f132801610
                                                 block_cache_name: BinnedLRUCache
                                                 block_cache_options:
                                                   capacity : 483183820
                                                   num_shard_bits : 4
                                                   strict_capacity_limit : 0
                                                   high_pri_pool_ratio: 0.000
                                                 block_cache_compressed: (nil)
                                                 persistent_cache: (nil)
                                                 block_size: 4096
                                                 block_size_deviation: 10
                                                 block_restart_interval: 16
                                                 index_block_restart_interval: 1
                                                 metadata_block_size: 4096
                                                 partition_filters: 0
                                                 use_delta_encoding: 1
                                                 filter_policy: bloomfilter
                                                 whole_key_filtering: 1
                                                 verify_compression: 0
                                                 read_amp_bytes_per_bit: 0
                                                 format_version: 5
                                                 enable_index_compression: 1
                                                 block_align: 0
                                                 max_auto_readahead_size: 262144
                                                 prepopulate_block_cache: 0
                                                 initial_auto_readahead_size: 8192
                                                 num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a2a0)
                                                 cache_index_and_filter_blocks: 1
                                                 cache_index_and_filter_blocks_with_high_priority: 0
                                                 pin_l0_filter_and_index_blocks_in_cache: 0
                                                 pin_top_level_index_and_filter: 1
                                                 index_type: 0
                                                 data_block_index_type: 0
                                                 index_shortening: 1
                                                 data_block_hash_table_util_ratio: 0.750000
                                                 checksum: 4
                                                 no_block_cache: 0
                                                 block_cache: 0x55f132801610
                                                 block_cache_name: BinnedLRUCache
                                                 block_cache_options:
                                                   capacity : 483183820
                                                   num_shard_bits : 4
                                                   strict_capacity_limit : 0
                                                   high_pri_pool_ratio: 0.000
                                                 block_cache_compressed: (nil)
                                                 persistent_cache: (nil)
                                                 block_size: 4096
                                                 block_size_deviation: 10
                                                 block_restart_interval: 16
                                                 index_block_restart_interval: 1
                                                 metadata_block_size: 4096
                                                 partition_filters: 0
                                                 use_delta_encoding: 1
                                                 filter_policy: bloomfilter
                                                 whole_key_filtering: 1
                                                 verify_compression: 0
                                                 read_amp_bytes_per_bit: 0
                                                 format_version: 5
                                                 enable_index_compression: 1
                                                 block_align: 0
                                                 max_auto_readahead_size: 262144
                                                 prepopulate_block_cache: 0
                                                 initial_auto_readahead_size: 8192
                                                 num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a2a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f132801610
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a2a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f132801610
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a2a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f132801610
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280a2a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132801610#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280bea0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55f132801770#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280bea0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f132801770
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:           Options.merge_operator: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f13280bea0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f132801770
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.compression: LZ4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1918d96e-b725-411f-a15d-c0935d629952
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158877181935, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 04:01:17 np0005593233 podman[79040]: 2026-01-23 09:01:17.84672848 +0000 UTC m=+1.377566590 container create 37e532be78db6e80682319a094bd7a42fd0230b5de62bdada66e3ec46688f48c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_rubin, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS)
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158877847327, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158877, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1918d96e-b725-411f-a15d-c0935d629952", "db_session_id": "4TLM6MKVCXAHCQEBCGXN", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158877940069, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158877, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1918d96e-b725-411f-a15d-c0935d629952", "db_session_id": "4TLM6MKVCXAHCQEBCGXN", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158877955798, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158877, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1918d96e-b725-411f-a15d-c0935d629952", "db_session_id": "4TLM6MKVCXAHCQEBCGXN", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158877960889, "job": 1, "event": "recovery_finished"}
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 23 04:01:17 np0005593233 systemd[1]: Started libpod-conmon-37e532be78db6e80682319a094bd7a42fd0230b5de62bdada66e3ec46688f48c.scope.
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55f1328d3c00
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: DB pointer 0x55f133709a00
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.8 total, 0.8 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.8 total, 0.8 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.7 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f132801610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.8 total, 0.8 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f132801610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 
collections: 1 last_copies: 8 last_secs: 3.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.8 total, 0.8 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 
0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f132801610#2 capacity: 460.80 MB usag
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: _get_class not permitted to load lua
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: _get_class not permitted to load sdk
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: _get_class not permitted to load test_remote_reads
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: osd.1 0 load_pgs
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: osd.1 0 load_pgs opened 0 pgs
Jan 23 04:01:17 np0005593233 ceph-osd[78880]: osd.1 0 log_to_monitors true
Jan 23 04:01:17 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1[78876]: 2026-01-23T09:01:17.989+0000 7efe4e826740 -1 osd.1 0 log_to_monitors true
Jan 23 04:01:18 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:18 np0005593233 podman[79040]: 2026-01-23 09:01:18.02846429 +0000 UTC m=+1.559302380 container init 37e532be78db6e80682319a094bd7a42fd0230b5de62bdada66e3ec46688f48c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_rubin, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Jan 23 04:01:18 np0005593233 podman[79040]: 2026-01-23 09:01:18.035082398 +0000 UTC m=+1.565920488 container start 37e532be78db6e80682319a094bd7a42fd0230b5de62bdada66e3ec46688f48c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_rubin, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 04:01:18 np0005593233 podman[79040]: 2026-01-23 09:01:18.038398742 +0000 UTC m=+1.569236832 container attach 37e532be78db6e80682319a094bd7a42fd0230b5de62bdada66e3ec46688f48c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_rubin, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef)
Jan 23 04:01:18 np0005593233 brave_rubin[79436]: 167 167
Jan 23 04:01:18 np0005593233 systemd[1]: libpod-37e532be78db6e80682319a094bd7a42fd0230b5de62bdada66e3ec46688f48c.scope: Deactivated successfully.
Jan 23 04:01:18 np0005593233 podman[79040]: 2026-01-23 09:01:18.04288716 +0000 UTC m=+1.573725290 container died 37e532be78db6e80682319a094bd7a42fd0230b5de62bdada66e3ec46688f48c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_rubin, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:18 np0005593233 systemd[1]: var-lib-containers-storage-overlay-2fb094aebf0026b0175587bba66964e1adfca59dc16552391e83d4bc383ef73a-merged.mount: Deactivated successfully.
Jan 23 04:01:18 np0005593233 podman[79040]: 2026-01-23 09:01:18.085233752 +0000 UTC m=+1.616071842 container remove 37e532be78db6e80682319a094bd7a42fd0230b5de62bdada66e3ec46688f48c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_rubin, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Jan 23 04:01:18 np0005593233 systemd[1]: libpod-conmon-37e532be78db6e80682319a094bd7a42fd0230b5de62bdada66e3ec46688f48c.scope: Deactivated successfully.
Jan 23 04:01:18 np0005593233 podman[79493]: 2026-01-23 09:01:18.254420795 +0000 UTC m=+0.059158060 container create c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_meitner, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Jan 23 04:01:18 np0005593233 podman[79493]: 2026-01-23 09:01:18.229013664 +0000 UTC m=+0.033750959 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:18 np0005593233 systemd[1]: Started libpod-conmon-c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed.scope.
Jan 23 04:01:18 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:18 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c40add513e9c4dc98ae028f0a394abb736f501346953f8a3769f43a19ad4d55f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:18 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c40add513e9c4dc98ae028f0a394abb736f501346953f8a3769f43a19ad4d55f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:18 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c40add513e9c4dc98ae028f0a394abb736f501346953f8a3769f43a19ad4d55f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:18 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c40add513e9c4dc98ae028f0a394abb736f501346953f8a3769f43a19ad4d55f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:18 np0005593233 podman[79493]: 2026-01-23 09:01:18.43635958 +0000 UTC m=+0.241096915 container init c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_meitner, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:01:18 np0005593233 podman[79493]: 2026-01-23 09:01:18.449546865 +0000 UTC m=+0.254284120 container start c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_meitner, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Jan 23 04:01:18 np0005593233 podman[79493]: 2026-01-23 09:01:18.453650061 +0000 UTC m=+0.258387496 container attach c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_meitner, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:01:18 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 23 04:01:18 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 23 04:01:19 np0005593233 admiring_meitner[79509]: {
Jan 23 04:01:19 np0005593233 admiring_meitner[79509]:    "6df7941f-8366-4880-b94b-b9b3810e23e9": {
Jan 23 04:01:19 np0005593233 admiring_meitner[79509]:        "ceph_fsid": "e1533653-0a5a-584c-b34b-8689f0d32e77",
Jan 23 04:01:19 np0005593233 admiring_meitner[79509]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Jan 23 04:01:19 np0005593233 admiring_meitner[79509]:        "osd_id": 1,
Jan 23 04:01:19 np0005593233 admiring_meitner[79509]:        "osd_uuid": "6df7941f-8366-4880-b94b-b9b3810e23e9",
Jan 23 04:01:19 np0005593233 admiring_meitner[79509]:        "type": "bluestore"
Jan 23 04:01:19 np0005593233 admiring_meitner[79509]:    }
Jan 23 04:01:19 np0005593233 admiring_meitner[79509]: }
Jan 23 04:01:19 np0005593233 systemd[1]: libpod-c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed.scope: Deactivated successfully.
Jan 23 04:01:19 np0005593233 systemd[1]: libpod-c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed.scope: Consumed 1.064s CPU time.
Jan 23 04:01:19 np0005593233 podman[79493]: 2026-01-23 09:01:19.509653271 +0000 UTC m=+1.314390566 container died c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_meitner, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:19 np0005593233 systemd[1]: var-lib-containers-storage-overlay-c40add513e9c4dc98ae028f0a394abb736f501346953f8a3769f43a19ad4d55f-merged.mount: Deactivated successfully.
Jan 23 04:01:19 np0005593233 podman[79493]: 2026-01-23 09:01:19.607694564 +0000 UTC m=+1.412431849 container remove c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_meitner, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 23 04:01:19 np0005593233 systemd[1]: libpod-conmon-c0626962cbc26f19e199f65b1c7afa1a9e1d815cf5c6f7921ec460b52c43a9ed.scope: Deactivated successfully.
Jan 23 04:01:19 np0005593233 ceph-osd[78880]: osd.1 0 done with init, starting boot process
Jan 23 04:01:19 np0005593233 ceph-osd[78880]: osd.1 0 start_boot
Jan 23 04:01:19 np0005593233 ceph-osd[78880]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 23 04:01:19 np0005593233 ceph-osd[78880]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 23 04:01:19 np0005593233 ceph-osd[78880]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 23 04:01:19 np0005593233 ceph-osd[78880]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 23 04:01:19 np0005593233 ceph-osd[78880]: osd.1 0  bench count 12288000 bsize 4 KiB
Jan 23 04:01:20 np0005593233 podman[79758]: 2026-01-23 09:01:20.945961829 +0000 UTC m=+0.092473697 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:01:21 np0005593233 podman[79758]: 2026-01-23 09:01:21.130290111 +0000 UTC m=+0.276801969 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2)
Jan 23 04:01:23 np0005593233 podman[80079]: 2026-01-23 09:01:23.196651496 +0000 UTC m=+0.066233692 container create 3cacd770a0804b1f204b1f9b58afba1e1f6c8f5d48ab789cafd3391800f34aa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_swanson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Jan 23 04:01:23 np0005593233 systemd[1]: Started libpod-conmon-3cacd770a0804b1f204b1f9b58afba1e1f6c8f5d48ab789cafd3391800f34aa1.scope.
Jan 23 04:01:23 np0005593233 podman[80079]: 2026-01-23 09:01:23.155073795 +0000 UTC m=+0.024656001 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:23 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:23 np0005593233 podman[80079]: 2026-01-23 09:01:23.298476166 +0000 UTC m=+0.168058472 container init 3cacd770a0804b1f204b1f9b58afba1e1f6c8f5d48ab789cafd3391800f34aa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_swanson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:01:23 np0005593233 podman[80079]: 2026-01-23 09:01:23.304428275 +0000 UTC m=+0.174010461 container start 3cacd770a0804b1f204b1f9b58afba1e1f6c8f5d48ab789cafd3391800f34aa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_swanson, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:01:23 np0005593233 elegant_swanson[80095]: 167 167
Jan 23 04:01:23 np0005593233 systemd[1]: libpod-3cacd770a0804b1f204b1f9b58afba1e1f6c8f5d48ab789cafd3391800f34aa1.scope: Deactivated successfully.
Jan 23 04:01:23 np0005593233 podman[80079]: 2026-01-23 09:01:23.314196873 +0000 UTC m=+0.183779059 container attach 3cacd770a0804b1f204b1f9b58afba1e1f6c8f5d48ab789cafd3391800f34aa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_swanson, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:01:23 np0005593233 podman[80079]: 2026-01-23 09:01:23.314759239 +0000 UTC m=+0.184341435 container died 3cacd770a0804b1f204b1f9b58afba1e1f6c8f5d48ab789cafd3391800f34aa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_swanson, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Jan 23 04:01:23 np0005593233 systemd[1]: var-lib-containers-storage-overlay-f8fcc70a4d2ced89b7c5959e2c3cc748439681a95cf2ddb37c11b7dcc7901f24-merged.mount: Deactivated successfully.
Jan 23 04:01:23 np0005593233 podman[80079]: 2026-01-23 09:01:23.401414309 +0000 UTC m=+0.270996525 container remove 3cacd770a0804b1f204b1f9b58afba1e1f6c8f5d48ab789cafd3391800f34aa1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_swanson, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:23 np0005593233 systemd[1]: libpod-conmon-3cacd770a0804b1f204b1f9b58afba1e1f6c8f5d48ab789cafd3391800f34aa1.scope: Deactivated successfully.
Jan 23 04:01:23 np0005593233 podman[80117]: 2026-01-23 09:01:23.584758164 +0000 UTC m=+0.041777697 container create 524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_gauss, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:01:23 np0005593233 systemd[1]: Started libpod-conmon-524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f.scope.
Jan 23 04:01:23 np0005593233 podman[80117]: 2026-01-23 09:01:23.568486432 +0000 UTC m=+0.025505985 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:23 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:23 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b274eb14feb7015b9771658f6e5bd7629959d840ca0b2bb52d50fa7d0f491ed/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:23 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b274eb14feb7015b9771658f6e5bd7629959d840ca0b2bb52d50fa7d0f491ed/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:23 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b274eb14feb7015b9771658f6e5bd7629959d840ca0b2bb52d50fa7d0f491ed/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:23 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b274eb14feb7015b9771658f6e5bd7629959d840ca0b2bb52d50fa7d0f491ed/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:23 np0005593233 podman[80117]: 2026-01-23 09:01:23.720979891 +0000 UTC m=+0.177999454 container init 524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_gauss, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Jan 23 04:01:23 np0005593233 podman[80117]: 2026-01-23 09:01:23.728651599 +0000 UTC m=+0.185671132 container start 524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_gauss, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 04:01:23 np0005593233 podman[80117]: 2026-01-23 09:01:23.735947616 +0000 UTC m=+0.192967149 container attach 524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_gauss, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 16.837 iops: 4310.202 elapsed_sec: 0.696
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: log_channel(cluster) log [WRN] : OSD bench result of 4310.201773 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: osd.1 0 waiting for initial osdmap
Jan 23 04:01:24 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1[78876]: 2026-01-23T09:01:24.387+0000 7efe4afbd640 -1 osd.1 0 waiting for initial osdmap
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: osd.1 8 crush map has features 288514050185494528, adjusting msgr requires for clients
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: osd.1 8 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: osd.1 8 crush map has features 3314932999778484224, adjusting msgr requires for osds
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: osd.1 8 check_osdmap_features require_osd_release unknown -> reef
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: osd.1 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: osd.1 8 set_numa_affinity not setting numa affinity
Jan 23 04:01:24 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-1[78876]: 2026-01-23T09:01:24.415+0000 7efe45dce640 -1 osd.1 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 04:01:24 np0005593233 ceph-osd[78880]: osd.1 8 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]: [
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:    {
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        "available": false,
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        "ceph_device": false,
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        "lsm_data": {},
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        "lvs": [],
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        "path": "/dev/sr0",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        "rejected_reasons": [
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "Has a FileSystem",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "Insufficient space (<5GB)"
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        ],
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        "sys_api": {
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "actuators": null,
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "device_nodes": "sr0",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "devname": "sr0",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "human_readable_size": "482.00 KB",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "id_bus": "ata",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "model": "QEMU DVD-ROM",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "nr_requests": "2",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "parent": "/dev/sr0",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "partitions": {},
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "path": "/dev/sr0",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "removable": "1",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "rev": "2.5+",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "ro": "0",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "rotational": "1",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "sas_address": "",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "sas_device_handle": "",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "scheduler_mode": "mq-deadline",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "sectors": 0,
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "sectorsize": "2048",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "size": 493568.0,
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "support_discard": "2048",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "type": "disk",
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:            "vendor": "QEMU"
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:        }
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]:    }
Jan 23 04:01:25 np0005593233 nifty_gauss[80133]: ]
Jan 23 04:01:25 np0005593233 ceph-osd[78880]: osd.1 9 state: booting -> active
Jan 23 04:01:25 np0005593233 systemd[1]: libpod-524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f.scope: Deactivated successfully.
Jan 23 04:01:25 np0005593233 systemd[1]: libpod-524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f.scope: Consumed 1.481s CPU time.
Jan 23 04:01:25 np0005593233 podman[80117]: 2026-01-23 09:01:25.192726565 +0000 UTC m=+1.649746108 container died 524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_gauss, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:01:25 np0005593233 systemd[1]: var-lib-containers-storage-overlay-0b274eb14feb7015b9771658f6e5bd7629959d840ca0b2bb52d50fa7d0f491ed-merged.mount: Deactivated successfully.
Jan 23 04:01:25 np0005593233 podman[80117]: 2026-01-23 09:01:25.254866109 +0000 UTC m=+1.711885642 container remove 524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_gauss, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Jan 23 04:01:25 np0005593233 systemd[1]: libpod-conmon-524260337c0a1bf3d2ad4353a1d807ee1b7808d7f8c7457bf1b06c849f4cd97f.scope: Deactivated successfully.
Jan 23 04:01:26 np0005593233 ceph-osd[78880]: osd.1 10 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 23 04:01:26 np0005593233 ceph-osd[78880]: osd.1 10 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Jan 23 04:01:26 np0005593233 ceph-osd[78880]: osd.1 10 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 23 04:01:26 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 10 pg[1.0( empty local-lis/les=0/0 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [1] r=0 lpr=10 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:01:27 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 11 pg[1.0( empty local-lis/les=10/11 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [1] r=0 lpr=10 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:01:55 np0005593233 podman[81342]: 2026-01-23 09:01:55.094004474 +0000 UTC m=+0.046861501 container create b7849317b8dd54fa20fc07b87335b2160c5be40cc88913a33b0999e236d40f3c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_spence, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 23 04:01:55 np0005593233 systemd[1]: Started libpod-conmon-b7849317b8dd54fa20fc07b87335b2160c5be40cc88913a33b0999e236d40f3c.scope.
Jan 23 04:01:55 np0005593233 podman[81342]: 2026-01-23 09:01:55.073825072 +0000 UTC m=+0.026682209 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:55 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:55 np0005593233 podman[81342]: 2026-01-23 09:01:55.201054364 +0000 UTC m=+0.153911431 container init b7849317b8dd54fa20fc07b87335b2160c5be40cc88913a33b0999e236d40f3c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_spence, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True)
Jan 23 04:01:55 np0005593233 podman[81342]: 2026-01-23 09:01:55.214968439 +0000 UTC m=+0.167825476 container start b7849317b8dd54fa20fc07b87335b2160c5be40cc88913a33b0999e236d40f3c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_spence, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:01:55 np0005593233 podman[81342]: 2026-01-23 09:01:55.218712425 +0000 UTC m=+0.171569492 container attach b7849317b8dd54fa20fc07b87335b2160c5be40cc88913a33b0999e236d40f3c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_spence, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 23 04:01:55 np0005593233 modest_spence[81358]: 167 167
Jan 23 04:01:55 np0005593233 systemd[1]: libpod-b7849317b8dd54fa20fc07b87335b2160c5be40cc88913a33b0999e236d40f3c.scope: Deactivated successfully.
Jan 23 04:01:55 np0005593233 podman[81342]: 2026-01-23 09:01:55.224429507 +0000 UTC m=+0.177286564 container died b7849317b8dd54fa20fc07b87335b2160c5be40cc88913a33b0999e236d40f3c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_spence, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Jan 23 04:01:55 np0005593233 systemd[1]: var-lib-containers-storage-overlay-7f1103f23d1d19825384709d6c2057d4000af2d6b7469205840e079a9ca2e0bd-merged.mount: Deactivated successfully.
Jan 23 04:01:55 np0005593233 podman[81342]: 2026-01-23 09:01:55.26926589 +0000 UTC m=+0.222122927 container remove b7849317b8dd54fa20fc07b87335b2160c5be40cc88913a33b0999e236d40f3c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=modest_spence, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 04:01:55 np0005593233 systemd[1]: libpod-conmon-b7849317b8dd54fa20fc07b87335b2160c5be40cc88913a33b0999e236d40f3c.scope: Deactivated successfully.
Jan 23 04:01:55 np0005593233 podman[81377]: 2026-01-23 09:01:55.354652054 +0000 UTC m=+0.046617404 container create 59bf8892d55132a626d24ef3b6e7fdf9b772ac84cefaf9d0950320de6d6c7bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_napier, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Jan 23 04:01:55 np0005593233 systemd[1]: Started libpod-conmon-59bf8892d55132a626d24ef3b6e7fdf9b772ac84cefaf9d0950320de6d6c7bbb.scope.
Jan 23 04:01:55 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:01:55 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0843b71ada32fdc471582bd9bc3be1644f2ab0215550bdeebff6a407313fd1cd/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:55 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0843b71ada32fdc471582bd9bc3be1644f2ab0215550bdeebff6a407313fd1cd/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:55 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0843b71ada32fdc471582bd9bc3be1644f2ab0215550bdeebff6a407313fd1cd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:55 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0843b71ada32fdc471582bd9bc3be1644f2ab0215550bdeebff6a407313fd1cd/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:55 np0005593233 podman[81377]: 2026-01-23 09:01:55.423987203 +0000 UTC m=+0.115952573 container init 59bf8892d55132a626d24ef3b6e7fdf9b772ac84cefaf9d0950320de6d6c7bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_napier, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:01:55 np0005593233 podman[81377]: 2026-01-23 09:01:55.335757748 +0000 UTC m=+0.027723108 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:55 np0005593233 podman[81377]: 2026-01-23 09:01:55.430889769 +0000 UTC m=+0.122855119 container start 59bf8892d55132a626d24ef3b6e7fdf9b772ac84cefaf9d0950320de6d6c7bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_napier, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:01:55 np0005593233 podman[81377]: 2026-01-23 09:01:55.4362288 +0000 UTC m=+0.128194270 container attach 59bf8892d55132a626d24ef3b6e7fdf9b772ac84cefaf9d0950320de6d6c7bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_napier, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:01:55 np0005593233 systemd[1]: libpod-59bf8892d55132a626d24ef3b6e7fdf9b772ac84cefaf9d0950320de6d6c7bbb.scope: Deactivated successfully.
Jan 23 04:01:55 np0005593233 podman[81377]: 2026-01-23 09:01:55.613515565 +0000 UTC m=+0.305480935 container died 59bf8892d55132a626d24ef3b6e7fdf9b772ac84cefaf9d0950320de6d6c7bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_napier, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:55 np0005593233 systemd[1]: var-lib-containers-storage-overlay-0843b71ada32fdc471582bd9bc3be1644f2ab0215550bdeebff6a407313fd1cd-merged.mount: Deactivated successfully.
Jan 23 04:01:55 np0005593233 podman[81377]: 2026-01-23 09:01:55.653301683 +0000 UTC m=+0.345267033 container remove 59bf8892d55132a626d24ef3b6e7fdf9b772ac84cefaf9d0950320de6d6c7bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_napier, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:01:55 np0005593233 systemd[1]: libpod-conmon-59bf8892d55132a626d24ef3b6e7fdf9b772ac84cefaf9d0950320de6d6c7bbb.scope: Deactivated successfully.
Jan 23 04:01:55 np0005593233 systemd[1]: Reloading.
Jan 23 04:01:55 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:01:55 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:01:55 np0005593233 systemd[1]: Reloading.
Jan 23 04:01:56 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:01:56 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:01:56 np0005593233 systemd[1]: Starting Ceph mon.compute-1 for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:01:56 np0005593233 podman[81554]: 2026-01-23 09:01:56.454357206 +0000 UTC m=+0.024013103 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:56 np0005593233 podman[81554]: 2026-01-23 09:01:56.573602671 +0000 UTC m=+0.143258538 container create af70d30f511442a746c09c7024f06ac4b0720cd208e03af2f109613e6f2578aa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-1, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:56 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce7c2f0db28a9e6d03f6892ae7255c01a8e79c1286d0726703c6a7534f24a1a2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:56 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce7c2f0db28a9e6d03f6892ae7255c01a8e79c1286d0726703c6a7534f24a1a2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:56 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce7c2f0db28a9e6d03f6892ae7255c01a8e79c1286d0726703c6a7534f24a1a2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:56 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce7c2f0db28a9e6d03f6892ae7255c01a8e79c1286d0726703c6a7534f24a1a2/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:56 np0005593233 podman[81554]: 2026-01-23 09:01:56.682072891 +0000 UTC m=+0.251728788 container init af70d30f511442a746c09c7024f06ac4b0720cd208e03af2f109613e6f2578aa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default)
Jan 23 04:01:56 np0005593233 podman[81554]: 2026-01-23 09:01:56.691648873 +0000 UTC m=+0.261304740 container start af70d30f511442a746c09c7024f06ac4b0720cd208e03af2f109613e6f2578aa (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:56 np0005593233 bash[81554]: af70d30f511442a746c09c7024f06ac4b0720cd208e03af2f109613e6f2578aa
Jan 23 04:01:56 np0005593233 systemd[1]: Started Ceph mon.compute-1 for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: pidfile_write: ignore empty --pid-file
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: load: jerasure load: lrc 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Git sha 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: DB SUMMARY
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: DB Session ID:  KQET2Q3DBZ4VI5YCO0ZU
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                                     Options.env: 0x55962c77fc40
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                                Options.info_log: 0x55962d2d2fc0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                                 Options.wal_dir: 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                    Options.write_buffer_manager: 0x55962d2e2b40
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                               Options.row_cache: None
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                              Options.wal_filter: None
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.wal_compression: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.max_background_jobs: 2
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.max_total_wal_size: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:       Options.compaction_readahead_size: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Compression algorithms supported:
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: #011kZSTD supported: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: #011kXpressCompression supported: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: #011kBZip2Compression supported: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: #011kLZ4Compression supported: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: #011kZlibCompression supported: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: #011kLZ4HCCompression supported: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: #011kSnappyCompression supported: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:           Options.merge_operator: 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55962d2d2c00)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55962d2cb1f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:        Options.write_buffer_size: 33554432
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:  Options.max_write_buffer_number: 2
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:          Options.compression: NoCompression
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d3e46583-96e0-4f5f-ac42-7b628f2a09c0
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158916736360, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158916738478, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158916738601, "job": 1, "event": "recovery_finished"}
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55962d2f4e00
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: DB pointer 0x55962d3fc000
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 512.00 MB usage: 0.86 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.64 KB,0.00012219%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid e1533653-0a5a-584c-b34b-8689f0d32e77
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(???) e0 preinit fsid e1533653-0a5a-584c-b34b-8689f0d32e77
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).mds e1 new map
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 3314933000852226048, adjusting msgr requires
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: Updating compute-2:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.conf
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: Updating compute-2:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.client.admin.keyring
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: Deploying daemon mon.compute-2 on compute-2
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: Cluster is now healthy
Jan 23 04:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 23 04:02:02 np0005593233 ceph-mon[81574]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Jan 23 04:02:02 np0005593233 ceph-mon[81574]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 23 04:02:02 np0005593233 ceph-mon[81574]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 23 04:02:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: Deploying daemon mon.compute-1 on compute-1
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-0 calling monitor election
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-2 calling monitor election
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.nrjyzu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.nrjyzu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2026-01-23T09:01:55.467949Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864308,os=Linux}
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-0 calling monitor election
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-2 calling monitor election
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-1 calling monitor election
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:06 np0005593233 podman[81752]: 2026-01-23 09:02:06.59568223 +0000 UTC m=+0.050845945 container create bb1776eee4ef600501a3783e7005e21633662df3585224e664e06be85d2fbc2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_thompson, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 23 04:02:06 np0005593233 systemd[1]: Started libpod-conmon-bb1776eee4ef600501a3783e7005e21633662df3585224e664e06be85d2fbc2d.scope.
Jan 23 04:02:06 np0005593233 podman[81752]: 2026-01-23 09:02:06.570529135 +0000 UTC m=+0.025692890 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:06 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:02:06 np0005593233 podman[81752]: 2026-01-23 09:02:06.690337817 +0000 UTC m=+0.145501522 container init bb1776eee4ef600501a3783e7005e21633662df3585224e664e06be85d2fbc2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_thompson, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 23 04:02:06 np0005593233 podman[81752]: 2026-01-23 09:02:06.702697858 +0000 UTC m=+0.157861573 container start bb1776eee4ef600501a3783e7005e21633662df3585224e664e06be85d2fbc2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_thompson, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef)
Jan 23 04:02:06 np0005593233 podman[81752]: 2026-01-23 09:02:06.706777524 +0000 UTC m=+0.161941229 container attach bb1776eee4ef600501a3783e7005e21633662df3585224e664e06be85d2fbc2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:06 np0005593233 recursing_thompson[81768]: 167 167
Jan 23 04:02:06 np0005593233 systemd[1]: libpod-bb1776eee4ef600501a3783e7005e21633662df3585224e664e06be85d2fbc2d.scope: Deactivated successfully.
Jan 23 04:02:06 np0005593233 podman[81752]: 2026-01-23 09:02:06.713973848 +0000 UTC m=+0.169137563 container died bb1776eee4ef600501a3783e7005e21633662df3585224e664e06be85d2fbc2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_thompson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Jan 23 04:02:06 np0005593233 systemd[1]: var-lib-containers-storage-overlay-bd9e65c8a718088f211ecdfe1e8e269be6578c87ec4a178e1b7be6b6d5e60a65-merged.mount: Deactivated successfully.
Jan 23 04:02:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e12 _set_new_cache_sizes cache_size:1019935685 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:06 np0005593233 podman[81752]: 2026-01-23 09:02:06.761109586 +0000 UTC m=+0.216273291 container remove bb1776eee4ef600501a3783e7005e21633662df3585224e664e06be85d2fbc2d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_thompson, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 23 04:02:06 np0005593233 systemd[1]: libpod-conmon-bb1776eee4ef600501a3783e7005e21633662df3585224e664e06be85d2fbc2d.scope: Deactivated successfully.
Jan 23 04:02:06 np0005593233 systemd[1]: Reloading.
Jan 23 04:02:06 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:02:06 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:02:07 np0005593233 systemd[1]: Reloading.
Jan 23 04:02:07 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:02:07 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:02:07 np0005593233 systemd[1]: Starting Ceph mgr.compute-1.wsgywz for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:02:07 np0005593233 podman[81911]: 2026-01-23 09:02:07.690012148 +0000 UTC m=+0.019814173 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.wsgywz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:02:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.wsgywz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 04:02:08 np0005593233 ceph-mon[81574]: Deploying daemon mgr.compute-1.wsgywz on compute-1
Jan 23 04:02:08 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/936567403' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:08 np0005593233 podman[81911]: 2026-01-23 09:02:08.064904741 +0000 UTC m=+0.394706776 container create 792b1d7bb8e5b0e8ba98e7d0f71e083368fd6359e2cdf1e1efc803ee7c8742f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Jan 23 04:02:08 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9aa66cb542c92e00baf75fbd7072d0e1d6dbbbb429c359e1a9e79949031b0f3e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:08 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9aa66cb542c92e00baf75fbd7072d0e1d6dbbbb429c359e1a9e79949031b0f3e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:08 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9aa66cb542c92e00baf75fbd7072d0e1d6dbbbb429c359e1a9e79949031b0f3e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:08 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9aa66cb542c92e00baf75fbd7072d0e1d6dbbbb429c359e1a9e79949031b0f3e/merged/var/lib/ceph/mgr/ceph-compute-1.wsgywz supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:08 np0005593233 podman[81911]: 2026-01-23 09:02:08.738270888 +0000 UTC m=+1.068072983 container init 792b1d7bb8e5b0e8ba98e7d0f71e083368fd6359e2cdf1e1efc803ee7c8742f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 23 04:02:08 np0005593233 podman[81911]: 2026-01-23 09:02:08.748957011 +0000 UTC m=+1.078759046 container start 792b1d7bb8e5b0e8ba98e7d0f71e083368fd6359e2cdf1e1efc803ee7c8742f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 04:02:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e13 e13: 2 total, 2 up, 2 in
Jan 23 04:02:08 np0005593233 ceph-mgr[81930]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:02:08 np0005593233 ceph-mgr[81930]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Jan 23 04:02:08 np0005593233 ceph-mgr[81930]: pidfile_write: ignore empty --pid-file
Jan 23 04:02:08 np0005593233 bash[81911]: 792b1d7bb8e5b0e8ba98e7d0f71e083368fd6359e2cdf1e1efc803ee7c8742f5
Jan 23 04:02:08 np0005593233 systemd[1]: Started Ceph mgr.compute-1.wsgywz for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:02:09 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 13 pg[2.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=13) [1] r=0 lpr=13 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:09 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/936567403' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:09 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'alerts'
Jan 23 04:02:09 np0005593233 ceph-mgr[81930]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:02:09 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'balancer'
Jan 23 04:02:09 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:09.437+0000 7fd1c9015140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:02:09 np0005593233 ceph-mgr[81930]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:02:09 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'cephadm'
Jan 23 04:02:09 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:09.712+0000 7fd1c9015140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:02:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e14 e14: 2 total, 2 up, 2 in
Jan 23 04:02:10 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 14 pg[2.0( empty local-lis/les=13/14 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=13) [1] r=0 lpr=13 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:02:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 23 04:02:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e14 _set_new_cache_sizes cache_size:1020053293 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:12 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'crash'
Jan 23 04:02:12 np0005593233 ceph-mgr[81930]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:02:12 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'dashboard'
Jan 23 04:02:12 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:12.432+0000 7fd1c9015140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:02:13 np0005593233 systemd[72470]: Starting Mark boot as successful...
Jan 23 04:02:13 np0005593233 systemd[72470]: Finished Mark boot as successful.
Jan 23 04:02:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e15 e15: 2 total, 2 up, 2 in
Jan 23 04:02:13 np0005593233 ceph-mon[81574]: Deploying daemon crash.compute-2 on compute-2
Jan 23 04:02:13 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2880218519' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:13 np0005593233 ceph-mon[81574]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:14 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:02:15 np0005593233 ceph-mgr[81930]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:02:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:15.190+0000 7fd1c9015140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:02:15 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:02:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e16 e16: 2 total, 2 up, 2 in
Jan 23 04:02:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:15 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2880218519' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:02:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:02:15 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]:  from numpy import show_config as show_numpy_config
Jan 23 04:02:16 np0005593233 ceph-mgr[81930]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:02:16 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'influx'
Jan 23 04:02:16 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:15.999+0000 7fd1c9015140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:02:16 np0005593233 ceph-mgr[81930]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:02:16 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:16.263+0000 7fd1c9015140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:02:16 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'insights'
Jan 23 04:02:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e17 e17: 2 total, 2 up, 2 in
Jan 23 04:02:16 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/136452763' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:02:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:02:16 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'iostat'
Jan 23 04:02:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e17 _set_new_cache_sizes cache_size:1020054715 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:16 np0005593233 ceph-mgr[81930]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:02:16 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:02:16 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:16.886+0000 7fd1c9015140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:02:17 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/136452763' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e18 e18: 2 total, 2 up, 2 in
Jan 23 04:02:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e19 e19: 3 total, 2 up, 3 in
Jan 23 04:02:18 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.102:0/4125312751' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1694e9fb-559e-40c4-a465-98d21c9c2b03"}]: dispatch
Jan 23 04:02:18 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1694e9fb-559e-40c4-a465-98d21c9c2b03"}]: dispatch
Jan 23 04:02:18 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "1694e9fb-559e-40c4-a465-98d21c9c2b03"}]': finished
Jan 23 04:02:18 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:18 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/3224534207' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:18 np0005593233 ceph-mon[81574]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:19 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'localpool'
Jan 23 04:02:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e20 e20: 3 total, 2 up, 3 in
Jan 23 04:02:19 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:02:19 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'mirroring'
Jan 23 04:02:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e21 e21: 3 total, 2 up, 3 in
Jan 23 04:02:20 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/3224534207' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:20 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'nfs'
Jan 23 04:02:20 np0005593233 ceph-mgr[81930]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:02:20 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:02:20 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:20.948+0000 7fd1c9015140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:02:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e22 e22: 3 total, 2 up, 3 in
Jan 23 04:02:21 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2806909960' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:21 np0005593233 ceph-mgr[81930]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:02:21 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:02:21 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:21.658+0000 7fd1c9015140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:02:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e22 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:21 np0005593233 ceph-mgr[81930]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:02:21 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'osd_support'
Jan 23 04:02:21 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:21.953+0000 7fd1c9015140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:02:22 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2806909960' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e23 e23: 3 total, 2 up, 3 in
Jan 23 04:02:22 np0005593233 ceph-mgr[81930]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:02:22 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:22.224+0000 7fd1c9015140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:02:22 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:02:22 np0005593233 ceph-mgr[81930]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:02:22 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'progress'
Jan 23 04:02:22 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:22.542+0000 7fd1c9015140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:02:23 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/3443060591' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:23 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 23 04:02:23 np0005593233 ceph-mgr[81930]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:02:23 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'prometheus'
Jan 23 04:02:23 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:23.198+0000 7fd1c9015140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:02:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e24 e24: 3 total, 2 up, 3 in
Jan 23 04:02:24 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 24 pg[7.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [1] r=0 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:24 np0005593233 ceph-mon[81574]: Deploying daemon osd.2 on compute-2
Jan 23 04:02:24 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/3443060591' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:24 np0005593233 ceph-mon[81574]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e25 e25: 3 total, 2 up, 3 in
Jan 23 04:02:24 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 25 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [1] r=0 lpr=24 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:24 np0005593233 ceph-mgr[81930]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:02:24 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:02:24 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:24.883+0000 7fd1c9015140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:02:25 np0005593233 ceph-mgr[81930]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:02:25 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'restful'
Jan 23 04:02:25 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:25.212+0000 7fd1c9015140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:02:25 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/927391621' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 23 04:02:25 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/927391621' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 23 04:02:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e26 e26: 3 total, 2 up, 3 in
Jan 23 04:02:26 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'rgw'
Jan 23 04:02:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e26 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:26 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/174650588' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 23 04:02:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e27 e27: 3 total, 2 up, 3 in
Jan 23 04:02:27 np0005593233 ceph-mgr[81930]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:02:27 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:27.169+0000 7fd1c9015140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:02:27 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'rook'
Jan 23 04:02:27 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/174650588' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 23 04:02:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e28 e28: 3 total, 2 up, 3 in
Jan 23 04:02:29 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/4038784885' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 23 04:02:30 np0005593233 ceph-mgr[81930]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:02:30 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:30.127+0000 7fd1c9015140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:02:30 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'selftest'
Jan 23 04:02:30 np0005593233 ceph-mon[81574]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:30 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/4038784885' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 23 04:02:30 np0005593233 ceph-mon[81574]: from='osd.2 [v2:192.168.122.102:6800/2199131998,v1:192.168.122.102:6801/2199131998]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:02:30 np0005593233 ceph-mon[81574]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:02:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:30 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/1670773661' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 23 04:02:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e29 e29: 3 total, 2 up, 3 in
Jan 23 04:02:30 np0005593233 ceph-mgr[81930]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:02:30 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:02:30 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:30.518+0000 7fd1c9015140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:02:30 np0005593233 ceph-mgr[81930]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:02:30 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'stats'
Jan 23 04:02:30 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:30.939+0000 7fd1c9015140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:02:31 np0005593233 podman[82188]: 2026-01-23 09:02:31.111527302 +0000 UTC m=+0.083262229 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:02:31 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'status'
Jan 23 04:02:31 np0005593233 podman[82188]: 2026-01-23 09:02:31.293688367 +0000 UTC m=+0.265423194 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Jan 23 04:02:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e30 e30: 3 total, 2 up, 3 in
Jan 23 04:02:31 np0005593233 ceph-mon[81574]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 23 04:02:31 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/1670773661' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 23 04:02:31 np0005593233 ceph-mon[81574]: from='osd.2 [v2:192.168.122.102:6800/2199131998,v1:192.168.122.102:6801/2199131998]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:02:31 np0005593233 ceph-mon[81574]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:02:31 np0005593233 ceph-mgr[81930]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:02:31 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'telegraf'
Jan 23 04:02:31 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:31.675+0000 7fd1c9015140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:02:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:31 np0005593233 ceph-mgr[81930]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:02:31 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'telemetry'
Jan 23 04:02:31 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:31.957+0000 7fd1c9015140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:02:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:32 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/3627784438' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 23 04:02:32 np0005593233 ceph-mon[81574]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Jan 23 04:02:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e31 e31: 3 total, 2 up, 3 in
Jan 23 04:02:32 np0005593233 ceph-mgr[81930]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:02:32 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:02:32 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:32.681+0000 7fd1c9015140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:02:33 np0005593233 ceph-mgr[81930]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:02:33 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'volumes'
Jan 23 04:02:33 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:33.411+0000 7fd1c9015140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:02:33 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/3627784438' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 23 04:02:34 np0005593233 ceph-mgr[81930]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:02:34 np0005593233 ceph-mgr[81930]: mgr[py] Loading python module 'zabbix'
Jan 23 04:02:34 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:34.207+0000 7fd1c9015140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:02:34 np0005593233 ceph-mgr[81930]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:02:34 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-1-wsgywz[81926]: 2026-01-23T09:02:34.510+0000 7fd1c9015140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:02:34 np0005593233 ceph-mgr[81930]: ms_deliver_dispatch: unhandled message 0x5588b22211e0 mon_map magic: 0 v1 from mon.2 v2:192.168.122.101:3300/0
Jan 23 04:02:34 np0005593233 ceph-mgr[81930]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 04:02:34 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/1955789217' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 23 04:02:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e32 e32: 3 total, 2 up, 3 in
Jan 23 04:02:35 np0005593233 ceph-mon[81574]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:35 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/1955789217' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 23 04:02:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:02:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:02:36 np0005593233 ceph-mon[81574]: Adjusting osd_memory_target on compute-2 to 127.9M
Jan 23 04:02:36 np0005593233 ceph-mon[81574]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 04:02:36 np0005593233 ceph-mon[81574]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 04:02:36 np0005593233 ceph-mon[81574]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 04:02:36 np0005593233 ceph-mon[81574]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:02:36 np0005593233 ceph-mon[81574]: OSD bench result of 5183.157191 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 04:02:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e33 e33: 3 total, 3 up, 3 in
Jan 23 04:02:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: Updating compute-0:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.conf
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: Updating compute-2:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.conf
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: Updating compute-1:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.conf
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: Cluster is now healthy
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: osd.2 [v2:192.168.122.102:6800/2199131998,v1:192.168.122.102:6801/2199131998] boot
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:02:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e34 e34: 3 total, 3 up, 3 in
Jan 23 04:02:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e35 e35: 3 total, 3 up, 3 in
Jan 23 04:02:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:38 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2470226038' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 23 04:02:38 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2470226038' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 23 04:02:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e36 e36: 3 total, 3 up, 3 in
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 35 pg[2.0( empty local-lis/les=13/14 n=0 ec=13/13 lis/c=13/13 les/c/f=14/14/0 sis=35 pruub=10.545331001s) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active pruub 92.308898926s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.0( empty local-lis/les=13/14 n=0 ec=13/13 lis/c=13/13 les/c/f=14/14/0 sis=35 pruub=10.545331001s) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown pruub 92.308898926s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.1( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.6( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.2( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.a( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.b( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.3( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.4( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.c( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.d( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.e( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.f( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.12( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.13( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.10( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.11( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.14( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.15( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.18( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.19( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.16( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.17( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.1c( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.1d( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.1a( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.1b( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.1e( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.1f( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.7( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.5( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.9( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 36 pg[2.8( empty local-lis/les=13/14 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e37 e37: 3 total, 3 up, 3 in
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.1f( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.1c( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.1b( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.a( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.9( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.8( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.6( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.1e( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.1( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.4( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.5( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.3( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.1d( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.b( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.f( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.0( empty local-lis/les=35/37 n=0 ec=13/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.7( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.d( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.10( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.e( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.14( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.13( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.c( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.15( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.11( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.16( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.17( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.18( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.19( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.1a( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.12( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 37 pg[2.2( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=13/13 les/c/f=14/14/0 sis=35) [1] r=0 lpr=35 pi=[13,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:40 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2903992486' entity='client.admin' 
Jan 23 04:02:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e38 e38: 3 total, 3 up, 3 in
Jan 23 04:02:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:42 np0005593233 ceph-mon[81574]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 04:02:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:42 np0005593233 ceph-mon[81574]: Saving service ingress.rgw.default spec with placement count:2
Jan 23 04:02:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e39 e39: 3 total, 3 up, 3 in
Jan 23 04:02:43 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Jan 23 04:02:43 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Jan 23 04:02:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Jan 23 04:02:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e2 new map
Jan 23 04:02:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e2 print_map
e2
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	2
flags	12 joinable allow_snaps allow_multimds_snaps
created	2026-01-23T09:02:44.728889+0000
modified	2026-01-23T09:02:44.729005+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
max_xattr_size	65536
required_client_features	{}
last_failure	0
last_failure_osd_epoch	0
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
max_mds	1
in	
up	{}
failed	
damaged	
stopped	
data_pools	[7]
metadata_pool	6
inline_data	disabled
balancer	
bal_rank_mask	-1
standby_count_wanted	0
Jan 23 04:02:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 40 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=10.853440285s) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active pruub 98.527908325s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=10.853440285s) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown pruub 98.527908325s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.7( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.8( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.9( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.10( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.11( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.12( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.13( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.14( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.15( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.16( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.17( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.19( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.4( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.6( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.1c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.1d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.1e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:46 np0005593233 ceph-mon[81574]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:02:46 np0005593233 ceph-mon[81574]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 23 04:02:46 np0005593233 ceph-mon[81574]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 23 04:02:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.yntofk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:02:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.1c( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.1d( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.12( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.11( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.10( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.16( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.1f( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.14( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.17( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.15( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.a( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.b( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.8( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.9( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.e( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.6( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.5( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.0( empty local-lis/les=40/42 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.4( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.1( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.3( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.13( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.7( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.d( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.c( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.1e( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.19( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.2( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.18( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.f( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.1b( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 42 pg[7.1a( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [1] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:47 np0005593233 ceph-mon[81574]: Reconfiguring mgr.compute-0.yntofk (monmap changed)...
Jan 23 04:02:47 np0005593233 ceph-mon[81574]: Reconfiguring daemon mgr.compute-0.yntofk on compute-0
Jan 23 04:02:47 np0005593233 ceph-mon[81574]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:02:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: Reconfiguring osd.0 (monmap changed)...
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: Reconfiguring daemon osd.0 on compute-0
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:02:48 np0005593233 podman[83420]: 2026-01-23 09:02:48.456444318 +0000 UTC m=+0.048270068 container create bc39f915cd3a4009bcbb3a27fe6c385639529a07e08a4d45528bc442f827ba84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_golick, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:48 np0005593233 systemd[1]: Started libpod-conmon-bc39f915cd3a4009bcbb3a27fe6c385639529a07e08a4d45528bc442f827ba84.scope.
Jan 23 04:02:48 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:02:48 np0005593233 podman[83420]: 2026-01-23 09:02:48.436431477 +0000 UTC m=+0.028257257 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:48 np0005593233 podman[83420]: 2026-01-23 09:02:48.542692654 +0000 UTC m=+0.134518404 container init bc39f915cd3a4009bcbb3a27fe6c385639529a07e08a4d45528bc442f827ba84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_golick, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Jan 23 04:02:48 np0005593233 podman[83420]: 2026-01-23 09:02:48.552710865 +0000 UTC m=+0.144536625 container start bc39f915cd3a4009bcbb3a27fe6c385639529a07e08a4d45528bc442f827ba84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_golick, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:48 np0005593233 podman[83420]: 2026-01-23 09:02:48.557208152 +0000 UTC m=+0.149033992 container attach bc39f915cd3a4009bcbb3a27fe6c385639529a07e08a4d45528bc442f827ba84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_golick, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Jan 23 04:02:48 np0005593233 dazzling_golick[83436]: 167 167
Jan 23 04:02:48 np0005593233 systemd[1]: libpod-bc39f915cd3a4009bcbb3a27fe6c385639529a07e08a4d45528bc442f827ba84.scope: Deactivated successfully.
Jan 23 04:02:48 np0005593233 podman[83420]: 2026-01-23 09:02:48.562292175 +0000 UTC m=+0.154117925 container died bc39f915cd3a4009bcbb3a27fe6c385639529a07e08a4d45528bc442f827ba84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_golick, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:02:48 np0005593233 systemd[1]: var-lib-containers-storage-overlay-f01ff6fb5d771dbf4b548f9398e2e458c4280d62544d00de4358a1bc38d46352-merged.mount: Deactivated successfully.
Jan 23 04:02:48 np0005593233 podman[83420]: 2026-01-23 09:02:48.600294124 +0000 UTC m=+0.192119894 container remove bc39f915cd3a4009bcbb3a27fe6c385639529a07e08a4d45528bc442f827ba84 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_golick, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True)
Jan 23 04:02:48 np0005593233 systemd[1]: libpod-conmon-bc39f915cd3a4009bcbb3a27fe6c385639529a07e08a4d45528bc442f827ba84.scope: Deactivated successfully.
Jan 23 04:02:49 np0005593233 podman[83572]: 2026-01-23 09:02:49.464984924 +0000 UTC m=+0.275253990 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:49 np0005593233 ceph-mon[81574]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 23 04:02:49 np0005593233 ceph-mon[81574]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 23 04:02:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 23 04:02:49 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2579638420' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 23 04:02:49 np0005593233 podman[83572]: 2026-01-23 09:02:49.894190822 +0000 UTC m=+0.704459828 container create 0536e8bffd0d775478df74d4c6a9b5e082b42fdedf599a99a6247283b05cab7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_meitner, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 04:02:49 np0005593233 systemd[1]: Started libpod-conmon-0536e8bffd0d775478df74d4c6a9b5e082b42fdedf599a99a6247283b05cab7b.scope.
Jan 23 04:02:49 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:02:50 np0005593233 podman[83572]: 2026-01-23 09:02:50.011891278 +0000 UTC m=+0.822160324 container init 0536e8bffd0d775478df74d4c6a9b5e082b42fdedf599a99a6247283b05cab7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_meitner, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 23 04:02:50 np0005593233 podman[83572]: 2026-01-23 09:02:50.025526803 +0000 UTC m=+0.835795769 container start 0536e8bffd0d775478df74d4c6a9b5e082b42fdedf599a99a6247283b05cab7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_meitner, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:50 np0005593233 podman[83572]: 2026-01-23 09:02:50.029375543 +0000 UTC m=+0.839644539 container attach 0536e8bffd0d775478df74d4c6a9b5e082b42fdedf599a99a6247283b05cab7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_meitner, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:50 np0005593233 nostalgic_meitner[83588]: 167 167
Jan 23 04:02:50 np0005593233 systemd[1]: libpod-0536e8bffd0d775478df74d4c6a9b5e082b42fdedf599a99a6247283b05cab7b.scope: Deactivated successfully.
Jan 23 04:02:50 np0005593233 podman[83572]: 2026-01-23 09:02:50.031308453 +0000 UTC m=+0.841577409 container died 0536e8bffd0d775478df74d4c6a9b5e082b42fdedf599a99a6247283b05cab7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_meitner, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:02:50 np0005593233 systemd[1]: var-lib-containers-storage-overlay-1be65e870747b9ef6861a75bf554e93bf50215be0006959db7f6e3ed91ffe13f-merged.mount: Deactivated successfully.
Jan 23 04:02:50 np0005593233 podman[83572]: 2026-01-23 09:02:50.088854972 +0000 UTC m=+0.899123978 container remove 0536e8bffd0d775478df74d4c6a9b5e082b42fdedf599a99a6247283b05cab7b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_meitner, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:50 np0005593233 systemd[1]: libpod-conmon-0536e8bffd0d775478df74d4c6a9b5e082b42fdedf599a99a6247283b05cab7b.scope: Deactivated successfully.
Jan 23 04:02:50 np0005593233 ceph-mon[81574]: Reconfiguring osd.1 (monmap changed)...
Jan 23 04:02:50 np0005593233 ceph-mon[81574]: Reconfiguring daemon osd.1 on compute-1
Jan 23 04:02:50 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2579638420' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 23 04:02:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:02:50 np0005593233 podman[83729]: 2026-01-23 09:02:50.90205502 +0000 UTC m=+0.057485088 container create f8b9c402119294c4cd0a270b5d69989b6349ec5160de0820b5160c81c442d6b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_boyd, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Jan 23 04:02:50 np0005593233 systemd[1]: Started libpod-conmon-f8b9c402119294c4cd0a270b5d69989b6349ec5160de0820b5160c81c442d6b3.scope.
Jan 23 04:02:50 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:02:50 np0005593233 podman[83729]: 2026-01-23 09:02:50.874203235 +0000 UTC m=+0.029633393 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:50 np0005593233 podman[83729]: 2026-01-23 09:02:50.980241617 +0000 UTC m=+0.135671705 container init f8b9c402119294c4cd0a270b5d69989b6349ec5160de0820b5160c81c442d6b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_boyd, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 04:02:50 np0005593233 podman[83729]: 2026-01-23 09:02:50.986830858 +0000 UTC m=+0.142260916 container start f8b9c402119294c4cd0a270b5d69989b6349ec5160de0820b5160c81c442d6b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_boyd, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 23 04:02:50 np0005593233 podman[83729]: 2026-01-23 09:02:50.990485013 +0000 UTC m=+0.145915081 container attach f8b9c402119294c4cd0a270b5d69989b6349ec5160de0820b5160c81c442d6b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_boyd, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:02:50 np0005593233 blissful_boyd[83746]: 167 167
Jan 23 04:02:50 np0005593233 systemd[1]: libpod-f8b9c402119294c4cd0a270b5d69989b6349ec5160de0820b5160c81c442d6b3.scope: Deactivated successfully.
Jan 23 04:02:50 np0005593233 podman[83729]: 2026-01-23 09:02:50.994873798 +0000 UTC m=+0.150303866 container died f8b9c402119294c4cd0a270b5d69989b6349ec5160de0820b5160c81c442d6b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_boyd, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:02:51 np0005593233 systemd[1]: var-lib-containers-storage-overlay-bae7684ae1e3656501088941c59f2cf282e2ef5d3db88d5baf902e635ae20d5e-merged.mount: Deactivated successfully.
Jan 23 04:02:51 np0005593233 podman[83729]: 2026-01-23 09:02:51.035883876 +0000 UTC m=+0.191313944 container remove f8b9c402119294c4cd0a270b5d69989b6349ec5160de0820b5160c81c442d6b3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_boyd, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 23 04:02:51 np0005593233 systemd[1]: libpod-conmon-f8b9c402119294c4cd0a270b5d69989b6349ec5160de0820b5160c81c442d6b3.scope: Deactivated successfully.
Jan 23 04:02:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.nrjyzu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Jan 23 04:02:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.19( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.842750549s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752540588s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.13( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.929204941s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.839019775s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.1f( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928771019s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838577271s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.18( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.842698097s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752532959s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.1d( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.919963837s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.829811096s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.18( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.842620850s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752532959s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.19( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.842667580s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752540588s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.13( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.929100990s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.839019775s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.1d( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.919743538s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.829811096s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.1f( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928691864s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838577271s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.10( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928226471s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838531494s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.16( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928246498s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838577271s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.11( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928128242s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838470459s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.10( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928189278s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838531494s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.16( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928214073s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838577271s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.11( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928091049s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838470459s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.15( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.842066765s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752502441s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.15( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.842041969s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752502441s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.12( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.842017174s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752555847s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.10( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841835976s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752441406s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.14( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928102493s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838722229s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.f( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841753960s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752403259s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.10( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841810226s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752441406s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.f( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841637611s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752403259s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.12( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841951370s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752555847s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.14( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.928056717s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838722229s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.e( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841552734s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752449036s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.e( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841530800s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752449036s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.a( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927897453s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838821411s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.a( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927866936s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838821411s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.d( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841379166s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752418518s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.d( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841344833s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752418518s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.8( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927712440s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838882446s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.c( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841208458s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752388000s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.9( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927671432s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838897705s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.9( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927648544s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838897705s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.8( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927626610s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838882446s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.c( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841116905s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752388000s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.b( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927557945s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838874817s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.b( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.841012955s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752380371s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.b( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927513123s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838874817s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.b( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840985298s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752380371s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.e( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927425385s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838912964s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.6( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927392006s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838920593s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.e( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927385330s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838912964s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.6( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927365303s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838920593s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.5( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927279472s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838920593s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.5( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927249908s) [2] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838920593s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.5( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840610504s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752334595s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840544701s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752304077s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.5( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840581894s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752334595s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.4( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927206993s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.838989258s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.4( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840487480s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752319336s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.4( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.927171707s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.838989258s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.4( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840466499s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752319336s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.6( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840095520s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752014160s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.6( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840072632s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752014160s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840466499s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752304077s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.2( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926963806s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.839096069s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.9( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839805603s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.751960754s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.3( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926865578s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.839019775s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.13( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840292931s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752464294s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.9( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839779854s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.751960754s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.3( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926799774s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.839019775s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.13( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.840236664s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752464294s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.2( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926934242s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.839096069s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1c( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.835844994s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.748344421s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1b( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839344025s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.751861572s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1c( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.835814476s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.748344421s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1b( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839318275s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.751861572s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.f( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926525116s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.839096069s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.a( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839400291s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.751808167s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.1e( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926477432s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.839057922s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.1e( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926450729s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.839057922s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.f( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926486969s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.839096069s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.a( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839197159s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.751808167s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1d( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839683533s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752357483s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.18( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926405907s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.839103699s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1d( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839664459s) [2] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752357483s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.18( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926383018s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.839103699s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.1b( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926341057s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 104.839126587s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[7.1b( empty local-lis/les=40/42 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43 pruub=9.926320076s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.839126587s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1e( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839424133s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.752265930s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1f( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.835471153s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 106.748313904s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1f( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.835441589s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.748313904s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[2.1e( empty local-lis/les=35/37 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43 pruub=11.839387894s) [0] r=-1 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 106.752265930s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.1a( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.15( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.1f( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[4.13( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.11( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.14( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.10( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.16( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.13( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.15( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.16( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.10( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.1c( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.9( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.d( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.c( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.7( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.3( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.f( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.5( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.1( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.f( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.a( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.2( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[3.1c( empty local-lis/les=0/0 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.1b( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[5.18( empty local-lis/les=0/0 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[4.d( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[4.e( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.d( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[4.c( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[4.18( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.5( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[4.1a( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[4.5( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.7( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[4.a( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.8( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.3( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[4.1b( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 43 pg[6.19( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: Reconfiguring mgr.compute-2.nrjyzu (monmap changed)...
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: Reconfiguring daemon mgr.compute-2.nrjyzu on compute-2
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.3 deep-scrub starts
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.3 deep-scrub ok
Jan 23 04:02:53 np0005593233 podman[83935]: 2026-01-23 09:02:53.450376677 +0000 UTC m=+0.075953360 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default)
Jan 23 04:02:53 np0005593233 podman[83935]: 2026-01-23 09:02:53.562859816 +0000 UTC m=+0.188436429 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 23 04:02:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.16( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.11( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.1f( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[4.13( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.10( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.15( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.16( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.14( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.10( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.13( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.15( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.c( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.f( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.8( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.9( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.d( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[4.a( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[4.d( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.a( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.a( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.7( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[4.5( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.2( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.7( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.3( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.5( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.1( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.5( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.2( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.3( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.f( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[4.e( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[4.c( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.e( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.1c( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.1b( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.d( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.19( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[4.1a( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[4.1b( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[3.1c( empty local-lis/les=43/44 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[5.18( empty local-lis/les=43/44 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43) [1] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[6.1a( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 44 pg[4.18( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:02:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:02:55 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/2587378952' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 23 04:02:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:57 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.7 deep-scrub starts
Jan 23 04:02:57 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.7 deep-scrub ok
Jan 23 04:02:57 np0005593233 systemd[1]: session-19.scope: Deactivated successfully.
Jan 23 04:02:57 np0005593233 systemd[1]: session-19.scope: Consumed 8.912s CPU time.
Jan 23 04:02:57 np0005593233 systemd-logind[804]: Session 19 logged out. Waiting for processes to exit.
Jan 23 04:02:57 np0005593233 systemd-logind[804]: Removed session 19.
Jan 23 04:02:59 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.8 deep-scrub starts
Jan 23 04:02:59 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.8 deep-scrub ok
Jan 23 04:02:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.nxrebk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:03:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.nxrebk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:03:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:01 np0005593233 ceph-mon[81574]: Deploying daemon rgw.rgw.compute-2.nxrebk on compute-2
Jan 23 04:03:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 23 04:03:03 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Jan 23 04:03:03 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Jan 23 04:03:03 np0005593233 podman[84162]: 2026-01-23 09:03:03.447039609 +0000 UTC m=+0.058579246 container create 1ae9b7b21124739a7bdcd419e2f113ae2564ae484862529997201c38364cf0c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_varahamihira, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 23 04:03:03 np0005593233 systemd[1]: Started libpod-conmon-1ae9b7b21124739a7bdcd419e2f113ae2564ae484862529997201c38364cf0c3.scope.
Jan 23 04:03:03 np0005593233 podman[84162]: 2026-01-23 09:03:03.417065068 +0000 UTC m=+0.028604795 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:03:03 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:03:03 np0005593233 podman[84162]: 2026-01-23 09:03:03.537107095 +0000 UTC m=+0.148646802 container init 1ae9b7b21124739a7bdcd419e2f113ae2564ae484862529997201c38364cf0c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_varahamihira, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:03:03 np0005593233 podman[84162]: 2026-01-23 09:03:03.545426591 +0000 UTC m=+0.156966268 container start 1ae9b7b21124739a7bdcd419e2f113ae2564ae484862529997201c38364cf0c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_varahamihira, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:03:03 np0005593233 reverent_varahamihira[84179]: 167 167
Jan 23 04:03:03 np0005593233 systemd[1]: libpod-1ae9b7b21124739a7bdcd419e2f113ae2564ae484862529997201c38364cf0c3.scope: Deactivated successfully.
Jan 23 04:03:03 np0005593233 podman[84162]: 2026-01-23 09:03:03.550339979 +0000 UTC m=+0.161879646 container attach 1ae9b7b21124739a7bdcd419e2f113ae2564ae484862529997201c38364cf0c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_varahamihira, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 23 04:03:03 np0005593233 podman[84162]: 2026-01-23 09:03:03.551399767 +0000 UTC m=+0.162939484 container died 1ae9b7b21124739a7bdcd419e2f113ae2564ae484862529997201c38364cf0c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_varahamihira, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:03:03 np0005593233 systemd[1]: var-lib-containers-storage-overlay-8e05fe0510b7e3ce050e6a4b9250294dd466f29516641791d66c25ad982bee07-merged.mount: Deactivated successfully.
Jan 23 04:03:03 np0005593233 podman[84162]: 2026-01-23 09:03:03.605140177 +0000 UTC m=+0.216679814 container remove 1ae9b7b21124739a7bdcd419e2f113ae2564ae484862529997201c38364cf0c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_varahamihira, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:03:03 np0005593233 systemd[1]: libpod-conmon-1ae9b7b21124739a7bdcd419e2f113ae2564ae484862529997201c38364cf0c3.scope: Deactivated successfully.
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.102:0/1632836641' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.odtvxh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.odtvxh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: Deploying daemon rgw.rgw.compute-1.odtvxh on compute-1
Jan 23 04:03:03 np0005593233 systemd[1]: Reloading.
Jan 23 04:03:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 23 04:03:03 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:03 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:03 np0005593233 systemd[1]: Reloading.
Jan 23 04:03:04 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:04 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:04 np0005593233 systemd[1]: Starting Ceph rgw.rgw.compute-1.odtvxh for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:03:04 np0005593233 podman[84318]: 2026-01-23 09:03:04.529642444 +0000 UTC m=+0.046286017 container create 1311505032e63b7ad744fabe1f1d1bea1845c017566415b56d2e2b6ce8d34111 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-rgw-rgw-compute-1-odtvxh, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 23 04:03:04 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bf5b8d213e9bfa6b62bef5e1530eaf1a1ef9b72bac8849646b3c01f9e45e5da/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:04 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bf5b8d213e9bfa6b62bef5e1530eaf1a1ef9b72bac8849646b3c01f9e45e5da/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:04 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bf5b8d213e9bfa6b62bef5e1530eaf1a1ef9b72bac8849646b3c01f9e45e5da/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:04 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bf5b8d213e9bfa6b62bef5e1530eaf1a1ef9b72bac8849646b3c01f9e45e5da/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.odtvxh supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:04 np0005593233 podman[84318]: 2026-01-23 09:03:04.593361763 +0000 UTC m=+0.110005366 container init 1311505032e63b7ad744fabe1f1d1bea1845c017566415b56d2e2b6ce8d34111 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-rgw-rgw-compute-1-odtvxh, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 23 04:03:04 np0005593233 podman[84318]: 2026-01-23 09:03:04.598438065 +0000 UTC m=+0.115081638 container start 1311505032e63b7ad744fabe1f1d1bea1845c017566415b56d2e2b6ce8d34111 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-rgw-rgw-compute-1-odtvxh, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 23 04:03:04 np0005593233 bash[84318]: 1311505032e63b7ad744fabe1f1d1bea1845c017566415b56d2e2b6ce8d34111
Jan 23 04:03:04 np0005593233 podman[84318]: 2026-01-23 09:03:04.510464694 +0000 UTC m=+0.027108277 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:03:04 np0005593233 systemd[1]: Started Ceph rgw.rgw.compute-1.odtvxh for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:03:04 np0005593233 radosgw[84337]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:03:04 np0005593233 radosgw[84337]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Jan 23 04:03:04 np0005593233 radosgw[84337]: framework: beast
Jan 23 04:03:04 np0005593233 radosgw[84337]: framework conf key: endpoint, val: 192.168.122.101:8082
Jan 23 04:03:04 np0005593233 radosgw[84337]: init_numa not setting numa affinity
Jan 23 04:03:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 23 04:03:04 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.102:0/1632836641' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jgxhia", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jgxhia", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: Deploying daemon rgw.rgw.compute-0.jgxhia on compute-0
Jan 23 04:03:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 23 04:03:06 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Jan 23 04:03:06 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 23 04:03:06 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 49 pg[10.0( empty local-lis/les=0/0 n=0 ec=49/49 lis/c=0/0 les/c/f=0/0/0 sis=49) [1] r=0 lpr=49 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/2162450091' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.cfzfln", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.cfzfln", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:03:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e49 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 23 04:03:07 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 50 pg[10.0( empty local-lis/les=49/50 n=0 ec=49/49 lis/c=0/0 les/c/f=0/0/0 sis=49) [1] r=0 lpr=49 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:07 np0005593233 ceph-mon[81574]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 04:03:07 np0005593233 ceph-mon[81574]: Deploying daemon mds.cephfs.compute-2.cfzfln on compute-2
Jan 23 04:03:07 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/671819638' entity='client.rgw.rgw.compute-0.jgxhia' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:07 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.101:0/2162450091' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:07 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.102:0/1632836641' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:07 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:07 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/249273750' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/671819638' entity='client.rgw.rgw.compute-0.jgxhia' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.djntrk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.djntrk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.102:0/448850748' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.101:0/249273750' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/1806314718' entity='client.rgw.rgw.compute-0.jgxhia' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e3 new map
Jan 23 04:03:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:02:44.729005+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.cfzfln{-1:24154} state up:standby seq 1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e4 new map
Jan 23 04:03:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:08.999863+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:creating seq 1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 23 04:03:09 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Jan 23 04:03:09 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Jan 23 04:03:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 23 04:03:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Jan 23 04:03:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/249273750' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: Deploying daemon mds.cephfs.compute-0.djntrk on compute-0
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: daemon mds.cephfs.compute-2.cfzfln assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: Cluster is now healthy
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: daemon mds.cephfs.compute-2.cfzfln is now active in filesystem cephfs as rank 0
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/1806314718' entity='client.rgw.rgw.compute-0.jgxhia' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/1806314718' entity='client.rgw.rgw.compute-0.jgxhia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.102:0/448850748' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.101:0/249273750' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e5 new map
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:10.006927+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 2 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e6 new map
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:10.006927+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 2 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 23 04:03:10 np0005593233 podman[84546]: 2026-01-23 09:03:10.917303237 +0000 UTC m=+0.050139377 container create 21518ed888c870ab3f022cd0ba990107693df7eddb61c2c96f0ba3dfeaca62b4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_rosalind, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef)
Jan 23 04:03:10 np0005593233 systemd[1]: Started libpod-conmon-21518ed888c870ab3f022cd0ba990107693df7eddb61c2c96f0ba3dfeaca62b4.scope.
Jan 23 04:03:10 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:03:10 np0005593233 podman[84546]: 2026-01-23 09:03:10.901073994 +0000 UTC m=+0.033910164 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:03:10 np0005593233 podman[84546]: 2026-01-23 09:03:10.999326103 +0000 UTC m=+0.132162273 container init 21518ed888c870ab3f022cd0ba990107693df7eddb61c2c96f0ba3dfeaca62b4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_rosalind, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Jan 23 04:03:11 np0005593233 podman[84546]: 2026-01-23 09:03:11.008109352 +0000 UTC m=+0.140945502 container start 21518ed888c870ab3f022cd0ba990107693df7eddb61c2c96f0ba3dfeaca62b4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_rosalind, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:03:11 np0005593233 podman[84546]: 2026-01-23 09:03:11.011632103 +0000 UTC m=+0.144468273 container attach 21518ed888c870ab3f022cd0ba990107693df7eddb61c2c96f0ba3dfeaca62b4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_rosalind, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 23 04:03:11 np0005593233 focused_rosalind[84562]: 167 167
Jan 23 04:03:11 np0005593233 systemd[1]: libpod-21518ed888c870ab3f022cd0ba990107693df7eddb61c2c96f0ba3dfeaca62b4.scope: Deactivated successfully.
Jan 23 04:03:11 np0005593233 podman[84546]: 2026-01-23 09:03:11.016771927 +0000 UTC m=+0.149608127 container died 21518ed888c870ab3f022cd0ba990107693df7eddb61c2c96f0ba3dfeaca62b4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_rosalind, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:03:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.elkrlx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:03:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.elkrlx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:03:11 np0005593233 ceph-mon[81574]: Deploying daemon mds.cephfs.compute-1.elkrlx on compute-1
Jan 23 04:03:11 np0005593233 ceph-mon[81574]: from='client.? 192.168.122.100:0/1806314718' entity='client.rgw.rgw.compute-0.jgxhia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:03:11 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:03:11 np0005593233 ceph-mon[81574]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:03:11 np0005593233 systemd[1]: var-lib-containers-storage-overlay-5f2bfee716cf5901be6bd720fcbfa9d81c8119c8c1120b0cff51b1e0bd104bd3-merged.mount: Deactivated successfully.
Jan 23 04:03:11 np0005593233 podman[84546]: 2026-01-23 09:03:11.062271662 +0000 UTC m=+0.195107812 container remove 21518ed888c870ab3f022cd0ba990107693df7eddb61c2c96f0ba3dfeaca62b4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=focused_rosalind, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 23 04:03:11 np0005593233 radosgw[84337]: LDAP not started since no server URIs were provided in the configuration.
Jan 23 04:03:11 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-rgw-rgw-compute-1-odtvxh[84333]: 2026-01-23T09:03:11.131+0000 7fce899f1940 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 23 04:03:11 np0005593233 radosgw[84337]: framework: beast
Jan 23 04:03:11 np0005593233 radosgw[84337]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 23 04:03:11 np0005593233 radosgw[84337]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 23 04:03:11 np0005593233 systemd[1]: libpod-conmon-21518ed888c870ab3f022cd0ba990107693df7eddb61c2c96f0ba3dfeaca62b4.scope: Deactivated successfully.
Jan 23 04:03:11 np0005593233 systemd[1]: Reloading.
Jan 23 04:03:11 np0005593233 radosgw[84337]: starting handler: beast
Jan 23 04:03:11 np0005593233 radosgw[84337]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:03:11 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:11 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:11 np0005593233 radosgw[84337]: mgrc service_daemon_register rgw.24161 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.odtvxh,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864308,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=3e406311-b9dd-4a98-8793-19b1bcf2f2db,zone_name=default,zonegroup_id=b6190aad-4d81-46cc-a15a-858fefbf7de5,zonegroup_name=default}
Jan 23 04:03:11 np0005593233 systemd[1]: Reloading.
Jan 23 04:03:11 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:11 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:11 np0005593233 systemd[1]: Starting Ceph mds.cephfs.compute-1.elkrlx for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:03:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:12 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Jan 23 04:03:12 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Jan 23 04:03:12 np0005593233 podman[85245]: 2026-01-23 09:03:12.168482632 +0000 UTC m=+0.041078501 container create 9a6a352b8a3ec374992b064e4c047322f562057637707d463b4a88f446b70ef2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mds-cephfs-compute-1-elkrlx, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef)
Jan 23 04:03:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5ebd46ea4107031c6d6a41aed36e6c842c87d84b28798b33f0b630fd6b96ce0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5ebd46ea4107031c6d6a41aed36e6c842c87d84b28798b33f0b630fd6b96ce0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5ebd46ea4107031c6d6a41aed36e6c842c87d84b28798b33f0b630fd6b96ce0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5ebd46ea4107031c6d6a41aed36e6c842c87d84b28798b33f0b630fd6b96ce0/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.elkrlx supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:12 np0005593233 podman[85245]: 2026-01-23 09:03:12.229786808 +0000 UTC m=+0.102382717 container init 9a6a352b8a3ec374992b064e4c047322f562057637707d463b4a88f446b70ef2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mds-cephfs-compute-1-elkrlx, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:03:12 np0005593233 podman[85245]: 2026-01-23 09:03:12.236714949 +0000 UTC m=+0.109310828 container start 9a6a352b8a3ec374992b064e4c047322f562057637707d463b4a88f446b70ef2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mds-cephfs-compute-1-elkrlx, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Jan 23 04:03:12 np0005593233 bash[85245]: 9a6a352b8a3ec374992b064e4c047322f562057637707d463b4a88f446b70ef2
Jan 23 04:03:12 np0005593233 podman[85245]: 2026-01-23 09:03:12.152805323 +0000 UTC m=+0.025401212 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:03:12 np0005593233 systemd[1]: Started Ceph mds.cephfs.compute-1.elkrlx for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:03:12 np0005593233 ceph-mds[85262]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:03:12 np0005593233 ceph-mds[85262]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Jan 23 04:03:12 np0005593233 ceph-mds[85262]: main not setting numa affinity
Jan 23 04:03:12 np0005593233 ceph-mds[85262]: pidfile_write: ignore empty --pid-file
Jan 23 04:03:12 np0005593233 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mds-cephfs-compute-1-elkrlx[85258]: starting mds.cephfs.compute-1.elkrlx at 
Jan 23 04:03:12 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Updating MDS map to version 6 from mon.2
Jan 23 04:03:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e7 new map
Jan 23 04:03:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:13.047531+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.elkrlx{-1:24167} state up:standby seq 1 addr [v2:192.168.122.101:6804/4162024387,v1:192.168.122.101:6805/4162024387] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:13 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Updating MDS map to version 7 from mon.2
Jan 23 04:03:13 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Monitors have assigned me to become a standby.
Jan 23 04:03:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:14 np0005593233 ceph-mon[81574]: Deploying daemon haproxy.rgw.default.compute-0.iyrury on compute-0
Jan 23 04:03:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e8 new map
Jan 23 04:03:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e8 print_map#012e8#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:13.047531+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.elkrlx{-1:24167} state up:standby seq 1 addr [v2:192.168.122.101:6804/4162024387,v1:192.168.122.101:6805/4162024387] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e9 new map
Jan 23 04:03:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).mds e9 print_map#012e9#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:13.047531+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.elkrlx{-1:24167} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/4162024387,v1:192.168.122.101:6805/4162024387] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:17 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Updating MDS map to version 9 from mon.2
Jan 23 04:03:18 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:18 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:18 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:18 np0005593233 ceph-mon[81574]: Deploying daemon haproxy.rgw.default.compute-2.xmknsp on compute-2
Jan 23 04:03:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.005000143s ======
Jan 23 04:03:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:18.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000143s
Jan 23 04:03:19 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Jan 23 04:03:19 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Jan 23 04:03:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:03:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:20.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:03:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:22.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:23.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:23 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Jan 23 04:03:23 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Jan 23 04:03:24 np0005593233 ceph-mon[81574]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:03:24 np0005593233 ceph-mon[81574]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:03:24 np0005593233 ceph-mon[81574]: Deploying daemon keepalived.rgw.default.compute-2.tkmlem on compute-2
Jan 23 04:03:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:03:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:24.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:03:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:25.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:26.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:27.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:28.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:03:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:29.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:03:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:30.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:31.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:31 np0005593233 ceph-mon[81574]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:03:31 np0005593233 ceph-mon[81574]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:03:31 np0005593233 ceph-mon[81574]: Deploying daemon keepalived.rgw.default.compute-0.qsixev on compute-0
Jan 23 04:03:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:32.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:33 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.7 deep-scrub starts
Jan 23 04:03:33 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.7 deep-scrub ok
Jan 23 04:03:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:33.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:34.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:35 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.c scrub starts
Jan 23 04:03:35 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.c scrub ok
Jan 23 04:03:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:35.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:36 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.d scrub starts
Jan 23 04:03:36 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.d scrub ok
Jan 23 04:03:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:03:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:36.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:03:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:37.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:38 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Jan 23 04:03:38 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Jan 23 04:03:38 np0005593233 podman[85502]: 2026-01-23 09:03:38.400535754 +0000 UTC m=+0.068935949 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:03:38 np0005593233 podman[85502]: 2026-01-23 09:03:38.548780426 +0000 UTC m=+0.217180611 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:03:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:38.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:39 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Jan 23 04:03:39 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Jan 23 04:03:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:39.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:40.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:03:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:03:41 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.17 deep-scrub starts
Jan 23 04:03:41 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.17 deep-scrub ok
Jan 23 04:03:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:41.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:42.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 23 04:03:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:03:43 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Jan 23 04:03:43 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Jan 23 04:03:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:43.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 23 04:03:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:03:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:03:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:03:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:03:44 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.1a deep-scrub starts
Jan 23 04:03:44 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.1a deep-scrub ok
Jan 23 04:03:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:44.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:45 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Jan 23 04:03:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 23 04:03:45 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Jan 23 04:03:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:03:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:03:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:03:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:45.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 23 04:03:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 57 pg[10.0( v 50'48 (0'0,50'48] local-lis/les=49/50 n=8 ec=49/49 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=9.623423576s) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 50'47 mlcod 50'47 active pruub 157.782730103s@ mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 57 pg[10.0( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=49/49 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=9.623423576s) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 50'47 mlcod 0'0 unknown pruub 157.782730103s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:03:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:03:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:03:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:46.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 23 04:03:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:47.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.10( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.12( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.7( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1f( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1b( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1e( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.11( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1c( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1d( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1a( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.19( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.18( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.6( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.5( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.4( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.3( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.b( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.8( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.9( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.a( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.c( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.d( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.e( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.f( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1( v 50'48 (0'0,50'48] local-lis/les=49/50 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.2( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.14( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.13( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.15( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.16( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.17( v 50'48 lc 0'0 (0'0,50'48] local-lis/les=49/50 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.10( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.7( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1f( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.12( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1e( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1b( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.11( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1d( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.6( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.18( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.19( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.5( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1c( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.4( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.3( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1a( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.b( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.a( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.9( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.d( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.0( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=49/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 50'47 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.e( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.f( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.14( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.8( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.2( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.1( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.13( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.15( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.16( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.c( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 58 pg[10.17( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=49/49 les/c/f=50/50/0 sis=57) [1] r=0 lpr=57 pi=[49,57)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:48 np0005593233 podman[85977]: 2026-01-23 09:03:48.022470015 +0000 UTC m=+0.059917201 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:03:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 23 04:03:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:03:48 np0005593233 podman[85977]: 2026-01-23 09:03:48.340260928 +0000 UTC m=+0.377708034 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Jan 23 04:03:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000058s ======
Jan 23 04:03:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:48.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Jan 23 04:03:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:03:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:49.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:03:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:50.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:51.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 23 04:03:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 23 04:03:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.12( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.159576416s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.234649658s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.12( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.159499168s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.234649658s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.10( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.154307365s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.230270386s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.1b( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158814430s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.234802246s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.10( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.154241562s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.230270386s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.1e( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158686638s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.234741211s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.1e( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158668518s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.234741211s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.1b( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158687592s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.234802246s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.19( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158475876s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.234970093s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.18( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158370018s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.234939575s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.11( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158835411s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.234832764s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.19( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158384323s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.234970093s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.18( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158349037s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.234939575s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.4( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158225060s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.235092163s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.4( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158200264s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.235092163s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.3( v 60'51 (0'0,60'51] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158059120s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=58'49 lcod 58'50 mlcod 58'50 active pruub 165.235137939s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.5( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.158011436s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.234970093s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.8( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157857895s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.235183716s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.5( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157712936s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.234970093s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.3( v 60'51 (0'0,60'51] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157695770s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=58'49 lcod 58'50 mlcod 0'0 unknown NOTIFY pruub 165.235137939s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.1( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157683372s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.235397339s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.1( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157666206s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.235397339s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.2( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157526970s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.235382080s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.2( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157512665s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.235382080s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.13( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157436371s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.235412598s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.13( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157418251s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.235412598s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.15( v 60'51 (0'0,60'51] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157264709s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=58'49 lcod 58'50 mlcod 58'50 active pruub 165.235412598s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.14( v 60'51 (0'0,60'51] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157102585s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=58'49 lcod 58'50 mlcod 58'50 active pruub 165.235321045s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.15( v 60'51 (0'0,60'51] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157201767s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=58'49 lcod 58'50 mlcod 0'0 unknown NOTIFY pruub 165.235412598s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.8( v 50'48 (0'0,50'48] local-lis/les=57/58 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157821655s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.235183716s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.11( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.157844543s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.234832764s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.f( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.156375885s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active pruub 165.235305786s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.f( v 50'48 (0'0,50'48] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.156170845s) [2] r=-1 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 165.235305786s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[10.14( v 60'51 (0'0,60'51] local-lis/les=57/58 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61 pruub=11.156352997s) [0] r=-1 lpr=61 pi=[57,61)/1 crt=58'49 lcod 58'50 mlcod 0'0 unknown NOTIFY pruub 165.235321045s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[8.12( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.12( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[8.10( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[8.17( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.7( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[8.4( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.14( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.5( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[8.8( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.f( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.4( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.1( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.1e( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.1d( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.1b( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[8.18( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[8.19( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.1a( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[8.1b( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[8.14( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 61 pg[11.1c( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:52.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[8.12( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[8.19( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.12( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[8.10( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.1a( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.1e( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.1d( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[8.18( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.1c( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.1b( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[8.1b( v 46'4 lc 0'0 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=46'4 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[8.4( v 46'4 (0'0,46'4] local-lis/les=61/62 n=1 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.7( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.4( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.5( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[8.8( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.1( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.14( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[11.f( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [1] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[8.14( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 62 pg[8.17( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [1] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:03:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:03:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 23 04:03:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:03:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:53.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:54.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:03:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:55.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:03:56 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.1a scrub starts
Jan 23 04:03:56 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.1a scrub ok
Jan 23 04:03:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:56.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:57 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.13 deep-scrub starts
Jan 23 04:03:57 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.13 deep-scrub ok
Jan 23 04:03:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:03:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:57.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:03:58 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Jan 23 04:03:58 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Jan 23 04:03:58 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 23 04:03:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 23 04:03:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:03:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:58.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:03:59 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.d deep-scrub starts
Jan 23 04:03:59 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.d deep-scrub ok
Jan 23 04:03:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 23 04:03:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:03:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:03:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:59.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:04:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 23 04:04:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 23 04:04:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:00.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:01 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.e scrub starts
Jan 23 04:04:01 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.e scrub ok
Jan 23 04:04:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:01.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 23 04:04:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 23 04:04:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 23 04:04:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:02.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:04:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:03.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:04:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 23 04:04:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 23 04:04:04 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Jan 23 04:04:04 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Jan 23 04:04:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 23 04:04:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:04.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:05 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.c deep-scrub starts
Jan 23 04:04:05 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.c deep-scrub ok
Jan 23 04:04:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:05.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 23 04:04:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:04:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:06.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:04:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:07 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Jan 23 04:04:07 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Jan 23 04:04:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:07.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:08.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 23 04:04:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 23 04:04:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:04:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:09.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:04:09 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 23 04:04:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 23 04:04:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:10.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 23 04:04:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 23 04:04:10 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 71 pg[9.16( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=71) [1] r=0 lpr=71 pi=[57,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:10 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 71 pg[9.e( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=71) [1] r=0 lpr=71 pi=[57,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:10 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 71 pg[9.6( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=71) [1] r=0 lpr=71 pi=[57,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:10 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 71 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=71) [1] r=0 lpr=71 pi=[57,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.5 deep-scrub starts
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.5 deep-scrub ok
Jan 23 04:04:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:11.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e71 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 23 04:04:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 72 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=72) [1]/[0] r=-1 lpr=72 pi=[57,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 72 pg[9.6( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=72) [1]/[0] r=-1 lpr=72 pi=[57,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 72 pg[9.e( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=72) [1]/[0] r=-1 lpr=72 pi=[57,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 72 pg[9.6( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=72) [1]/[0] r=-1 lpr=72 pi=[57,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 72 pg[9.e( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=72) [1]/[0] r=-1 lpr=72 pi=[57,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 72 pg[9.16( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=72) [1]/[0] r=-1 lpr=72 pi=[57,72)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 72 pg[9.16( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=72) [1]/[0] r=-1 lpr=72 pi=[57,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:11 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 72 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=72) [1]/[0] r=-1 lpr=72 pi=[57,72)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:12.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 23 04:04:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:13.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 23 04:04:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 74 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 74 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 74 pg[9.6( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 74 pg[9.6( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 74 pg[9.e( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 74 pg[9.e( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 74 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 74 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.015000428s ======
Jan 23 04:04:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:14.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.015000428s
Jan 23 04:04:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 23 04:04:14 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 75 pg[9.6( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=6 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:14 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 75 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:14 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 75 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:14 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 75 pg[9.e( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=6 ec=57/47 lis/c=72/57 les/c/f=73/59/0 sis=74) [1] r=0 lpr=74 pi=[57,74)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:15.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:16 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Jan 23 04:04:16 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Jan 23 04:04:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:16.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:17.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 23 04:04:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:18.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:19 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 23 04:04:19 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 23 04:04:19 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Jan 23 04:04:19 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Jan 23 04:04:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:19.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 23 04:04:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 23 04:04:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:20.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 23 04:04:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 23 04:04:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:21.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e78 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 23 04:04:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:22.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:23 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.d scrub starts
Jan 23 04:04:23 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.d scrub ok
Jan 23 04:04:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 23 04:04:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:23.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 23 04:04:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:24.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:25.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:26 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Jan 23 04:04:26 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Jan 23 04:04:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:26.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:27.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:28 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Jan 23 04:04:28 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Jan 23 04:04:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 23 04:04:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 23 04:04:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:28.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:29.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 23 04:04:30 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Jan 23 04:04:30 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Jan 23 04:04:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 23 04:04:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 23 04:04:30 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 83 pg[9.a( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=83) [1] r=0 lpr=83 pi=[57,83)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:30 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 83 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=83) [1] r=0 lpr=83 pi=[57,83)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:30.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:31.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 23 04:04:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 23 04:04:31 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 84 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=84) [1]/[0] r=-1 lpr=84 pi=[57,84)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:31 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 84 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=84) [1]/[0] r=-1 lpr=84 pi=[57,84)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:31 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 84 pg[9.a( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=84) [1]/[0] r=-1 lpr=84 pi=[57,84)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:31 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 84 pg[9.a( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=84) [1]/[0] r=-1 lpr=84 pi=[57,84)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 23 04:04:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:32.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:33.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 23 04:04:33 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 86 pg[9.a( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[57,86)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:33 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 86 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[57,86)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:33 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 86 pg[9.a( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[57,86)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:33 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 86 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[57,86)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 23 04:04:34 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 87 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[57,86)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:34 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 87 pg[9.a( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=6 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [1] r=0 lpr=86 pi=[57,86)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:04:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:34.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:04:35 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Jan 23 04:04:35 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Jan 23 04:04:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:35.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:36.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:37.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:38.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 23 04:04:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 23 04:04:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:39.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 23 04:04:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:40.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 23 04:04:41 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Jan 23 04:04:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:04:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:41.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:04:41 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Jan 23 04:04:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 23 04:04:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 23 04:04:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:42.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:43 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 90 pg[9.d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=90) [1] r=0 lpr=90 pi=[72,90)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:43 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 90 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=90) [1] r=0 lpr=90 pi=[72,90)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 23 04:04:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 04:04:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:43.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 23 04:04:44 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 91 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] r=-1 lpr=91 pi=[72,91)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:44 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 91 pg[9.d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] r=-1 lpr=91 pi=[72,91)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:44 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 91 pg[9.d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] r=-1 lpr=91 pi=[72,91)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:44 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 91 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] r=-1 lpr=91 pi=[72,91)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 04:04:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 23 04:04:44 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.16 deep-scrub starts
Jan 23 04:04:44 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.16 deep-scrub ok
Jan 23 04:04:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:44.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 23 04:04:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 23 04:04:45 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.15 deep-scrub starts
Jan 23 04:04:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:45.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:45 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.15 deep-scrub ok
Jan 23 04:04:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 23 04:04:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 23 04:04:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 93 pg[9.f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=93) [1] r=0 lpr=93 pi=[67,93)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 93 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=93) [1] r=0 lpr=93 pi=[67,93)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 93 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93) [1] r=0 lpr=93 pi=[72,93)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 93 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93) [1] r=0 lpr=93 pi=[72,93)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 93 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93) [1] r=0 lpr=93 pi=[72,93)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:46 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 93 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93) [1] r=0 lpr=93 pi=[72,93)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:46 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Jan 23 04:04:46 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Jan 23 04:04:46 np0005593233 systemd-logind[804]: New session 33 of user zuul.
Jan 23 04:04:46 np0005593233 systemd[1]: Started Session 33 of User zuul.
Jan 23 04:04:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:46.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 23 04:04:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 23 04:04:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 94 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] r=-1 lpr=94 pi=[67,94)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 94 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] r=-1 lpr=94 pi=[67,94)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 94 pg[9.f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] r=-1 lpr=94 pi=[67,94)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 94 pg[9.f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] r=-1 lpr=94 pi=[67,94)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 94 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=93/94 n=6 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93) [1] r=0 lpr=93 pi=[72,93)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:47 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 94 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=93/94 n=5 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93) [1] r=0 lpr=93 pi=[72,93)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:47.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:47 np0005593233 python3.9[86300]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:04:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 23 04:04:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:48.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 23 04:04:48 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 96 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96) [1] r=0 lpr=96 pi=[67,96)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:48 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 96 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96) [1] r=0 lpr=96 pi=[67,96)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:48 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 96 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96) [1] r=0 lpr=96 pi=[67,96)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:48 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 96 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96) [1] r=0 lpr=96 pi=[67,96)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:49 np0005593233 python3.9[86514]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:04:49 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Jan 23 04:04:49 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Jan 23 04:04:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:49.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 23 04:04:49 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 97 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=96/97 n=5 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96) [1] r=0 lpr=96 pi=[67,96)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:49 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 97 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=96/97 n=6 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96) [1] r=0 lpr=96 pi=[67,96)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:50 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Jan 23 04:04:50 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Jan 23 04:04:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:50.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:51.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 23 04:04:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 04:04:52 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 98 pg[9.10( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=98) [1] r=0 lpr=98 pi=[57,98)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:52.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 04:04:53 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Jan 23 04:04:53 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Jan 23 04:04:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:04:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:53.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:04:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 23 04:04:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 99 pg[9.10( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=99) [1]/[0] r=-1 lpr=99 pi=[57,99)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:53 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 99 pg[9.10( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=99) [1]/[0] r=-1 lpr=99 pi=[57,99)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 23 04:04:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:54.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:54 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 100 pg[9.11( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=100) [1] r=0 lpr=100 pi=[57,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:55.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:56 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.c scrub starts
Jan 23 04:04:56 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.c scrub ok
Jan 23 04:04:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:56.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:57.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:57 np0005593233 podman[86741]: 2026-01-23 09:04:57.489170588 +0000 UTC m=+0.056940993 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef)
Jan 23 04:04:57 np0005593233 podman[86741]: 2026-01-23 09:04:57.606277704 +0000 UTC m=+0.174048079 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:04:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:04:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:58.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:04:58 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 23 04:04:59 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Jan 23 04:04:59 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Jan 23 04:04:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:04:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:04:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:59.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 23 04:05:00 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 101 pg[9.11( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=101) [1]/[0] r=-1 lpr=101 pi=[57,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:00 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 101 pg[9.11( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=101) [1]/[0] r=-1 lpr=101 pi=[57,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:00 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 101 pg[9.10( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=99/57 les/c/f=100/59/0 sis=101) [1] r=0 lpr=101 pi=[57,101)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:00 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 101 pg[9.10( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=99/57 les/c/f=100/59/0 sis=101) [1] r=0 lpr=101 pi=[57,101)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:00 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Jan 23 04:05:00 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Jan 23 04:05:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 23 04:05:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 23 04:05:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:00.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:01.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 23 04:05:01 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 102 pg[9.12( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=102) [1] r=0 lpr=102 pi=[57,102)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:01 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 102 pg[9.10( v 53'1142 (0'0,53'1142] local-lis/les=101/102 n=6 ec=57/47 lis/c=99/57 les/c/f=100/59/0 sis=101) [1] r=0 lpr=101 pi=[57,101)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:02 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Jan 23 04:05:02 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.332114) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102332443, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6842, "num_deletes": 255, "total_data_size": 12194698, "memory_usage": 12466384, "flush_reason": "Manual Compaction"}
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102420468, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 7285540, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 257, "largest_seqno": 6847, "table_properties": {"data_size": 7259204, "index_size": 17005, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8261, "raw_key_size": 76785, "raw_average_key_size": 23, "raw_value_size": 7195772, "raw_average_value_size": 2191, "num_data_blocks": 757, "num_entries": 3283, "num_filter_entries": 3283, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 1769158916, "file_creation_time": 1769159102, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 88403 microseconds, and 35548 cpu microseconds.
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.420653) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 7285540 bytes OK
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.420720) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.425501) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.425606) EVENT_LOG_v1 {"time_micros": 1769159102425596, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.425666) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 12158345, prev total WAL file size 12165776, number of live WAL files 2.
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.430215) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(7114KB) 8(1648B)]
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102430581, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 7287188, "oldest_snapshot_seqno": -1}
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 23 04:05:02 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 103 pg[9.11( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=101/57 les/c/f=102/59/0 sis=103) [1] r=0 lpr=103 pi=[57,103)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:02 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 103 pg[9.11( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=101/57 les/c/f=102/59/0 sis=103) [1] r=0 lpr=103 pi=[57,103)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:02 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 103 pg[9.12( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=103) [1]/[0] r=-1 lpr=103 pi=[57,103)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:02 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 103 pg[9.12( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=103) [1]/[0] r=-1 lpr=103 pi=[57,103)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3032 keys, 7282008 bytes, temperature: kUnknown
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102508943, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 7282008, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7256356, "index_size": 16966, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7621, "raw_key_size": 72607, "raw_average_key_size": 23, "raw_value_size": 7196052, "raw_average_value_size": 2373, "num_data_blocks": 756, "num_entries": 3032, "num_filter_entries": 3032, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769159102, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.509366) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 7282008 bytes
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.510991) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 92.9 rd, 92.8 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(6.9, 0.0 +0.0 blob) out(6.9 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3288, records dropped: 256 output_compression: NoCompression
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.511012) EVENT_LOG_v1 {"time_micros": 1769159102511003, "job": 4, "event": "compaction_finished", "compaction_time_micros": 78472, "compaction_time_cpu_micros": 26171, "output_level": 6, "num_output_files": 1, "total_output_size": 7282008, "num_input_records": 3288, "num_output_records": 3032, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102512487, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102512569, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 23 04:05:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:05:02.429675) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:05:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:02.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:03.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 23 04:05:04 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.f deep-scrub starts
Jan 23 04:05:04 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.f deep-scrub ok
Jan 23 04:05:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 23 04:05:04 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 104 pg[9.11( v 53'1142 (0'0,53'1142] local-lis/les=103/104 n=6 ec=57/47 lis/c=101/57 les/c/f=102/59/0 sis=103) [1] r=0 lpr=103 pi=[57,103)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:04.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:05 np0005593233 systemd[1]: session-33.scope: Deactivated successfully.
Jan 23 04:05:05 np0005593233 systemd[1]: session-33.scope: Consumed 8.743s CPU time.
Jan 23 04:05:05 np0005593233 systemd-logind[804]: Session 33 logged out. Waiting for processes to exit.
Jan 23 04:05:05 np0005593233 systemd-logind[804]: Removed session 33.
Jan 23 04:05:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:05.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 23 04:05:06 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 105 pg[9.12( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=103/57 les/c/f=104/59/0 sis=105) [1] r=0 lpr=105 pi=[57,105)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:06 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 105 pg[9.12( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=103/57 les/c/f=104/59/0 sis=105) [1] r=0 lpr=105 pi=[57,105)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:06 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.d scrub starts
Jan 23 04:05:06 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.d scrub ok
Jan 23 04:05:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:05:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:05:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:06.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 23 04:05:07 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 106 pg[9.12( v 53'1142 (0'0,53'1142] local-lis/les=105/106 n=5 ec=57/47 lis/c=103/57 les/c/f=104/59/0 sis=105) [1] r=0 lpr=105 pi=[57,105)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:07.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 23 04:05:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:08.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 23 04:05:09 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.f scrub starts
Jan 23 04:05:09 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.f scrub ok
Jan 23 04:05:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:09.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 23 04:05:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 23 04:05:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 23 04:05:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:10.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:11 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.a scrub starts
Jan 23 04:05:11 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.a scrub ok
Jan 23 04:05:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 23 04:05:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:11.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:12 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Jan 23 04:05:12 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Jan 23 04:05:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 23 04:05:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 23 04:05:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:12.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:12 np0005593233 systemd[72470]: Created slice User Background Tasks Slice.
Jan 23 04:05:12 np0005593233 systemd[72470]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 04:05:12 np0005593233 systemd[72470]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 04:05:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 109 pg[9.15( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=109) [1] r=0 lpr=109 pi=[72,109)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:13.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 23 04:05:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 110 pg[9.15( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=110) [1]/[2] r=-1 lpr=110 pi=[72,110)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:13 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 110 pg[9.15( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=110) [1]/[2] r=-1 lpr=110 pi=[72,110)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 23 04:05:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:14.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 23 04:05:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 23 04:05:15 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 111 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=111 pruub=11.595157623s) [2] r=-1 lpr=111 pi=[74,111)/1 crt=53'1142 mlcod 0'0 active pruub 248.861068726s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:15 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 111 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=111 pruub=11.595061302s) [2] r=-1 lpr=111 pi=[74,111)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 248.861068726s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:15.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 23 04:05:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 23 04:05:16 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 112 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=110/72 les/c/f=111/73/0 sis=112) [1] r=0 lpr=112 pi=[72,112)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:16 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 112 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=112) [2]/[1] r=0 lpr=112 pi=[74,112)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:16 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 112 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=110/72 les/c/f=111/73/0 sis=112) [1] r=0 lpr=112 pi=[72,112)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:16 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 112 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=112) [2]/[1] r=0 lpr=112 pi=[74,112)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:16.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:17.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 23 04:05:17 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 113 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=112/113 n=5 ec=57/47 lis/c=110/72 les/c/f=111/73/0 sis=112) [1] r=0 lpr=112 pi=[72,112)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:17 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 113 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=112/113 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=112) [2]/[1] async=[2] r=0 lpr=112 pi=[74,112)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:18 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Jan 23 04:05:18 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Jan 23 04:05:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 23 04:05:18 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 114 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=112/113 n=5 ec=57/47 lis/c=112/74 les/c/f=113/75/0 sis=114 pruub=14.940567970s) [2] async=[2] r=-1 lpr=114 pi=[74,114)/1 crt=53'1142 mlcod 53'1142 active pruub 255.438583374s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:18 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 114 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=112/113 n=5 ec=57/47 lis/c=112/74 les/c/f=113/75/0 sis=114 pruub=14.940318108s) [2] r=-1 lpr=114 pi=[74,114)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 255.438583374s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:18.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:19.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 23 04:05:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:20.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:20 np0005593233 systemd-logind[804]: New session 34 of user zuul.
Jan 23 04:05:20 np0005593233 systemd[1]: Started Session 34 of User zuul.
Jan 23 04:05:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:21.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:21 np0005593233 python3.9[87198]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 04:05:22 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Jan 23 04:05:22 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Jan 23 04:05:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:22.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:23 np0005593233 python3.9[87372]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:05:23 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.e scrub starts
Jan 23 04:05:23 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.e scrub ok
Jan 23 04:05:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:23.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:24.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:25.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:25 np0005593233 python3.9[87528]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:05:26 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.3 deep-scrub starts
Jan 23 04:05:26 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.3 deep-scrub ok
Jan 23 04:05:26 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 23 04:05:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 23 04:05:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:26.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:27 np0005593233 python3.9[87681]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:05:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 23 04:05:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:27.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:28 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.a deep-scrub starts
Jan 23 04:05:28 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.a deep-scrub ok
Jan 23 04:05:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 23 04:05:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 23 04:05:28 np0005593233 python3.9[87835]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:05:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:28.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:29 np0005593233 python3.9[87987]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:05:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 23 04:05:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:30 np0005593233 python3.9[88137]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:05:30 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.8 deep-scrub starts
Jan 23 04:05:30 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.8 deep-scrub ok
Jan 23 04:05:30 np0005593233 network[88154]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:05:30 np0005593233 network[88155]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:05:30 np0005593233 network[88156]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:05:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 23 04:05:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 23 04:05:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:30.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 23 04:05:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 23 04:05:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:31.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:32 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Jan 23 04:05:32 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Jan 23 04:05:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 04:05:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 23 04:05:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:32.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:33 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Jan 23 04:05:33 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.19 scrub ok
Jan 23 04:05:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 04:05:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 23 04:05:33 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 120 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=120 pruub=13.341302872s) [0] r=-1 lpr=120 pi=[86,120)/1 crt=53'1142 mlcod 0'0 active pruub 268.752349854s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:33 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 121 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=120 pruub=13.340609550s) [0] r=-1 lpr=120 pi=[86,120)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 268.752349854s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:33.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 23 04:05:33 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 122 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=122) [0]/[1] r=0 lpr=122 pi=[86,122)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:33 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 122 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=122) [0]/[1] r=0 lpr=122 pi=[86,122)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 23 04:05:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:34.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 23 04:05:34 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 123 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=122/123 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=122) [0]/[1] async=[0] r=0 lpr=122 pi=[86,122)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 23 04:05:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:35.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:35 np0005593233 python3.9[88416]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:05:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 23 04:05:35 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 124 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=122/123 n=5 ec=57/47 lis/c=122/86 les/c/f=123/87/0 sis=124 pruub=15.002258301s) [0] async=[0] r=-1 lpr=124 pi=[86,124)/1 crt=53'1142 mlcod 53'1142 active pruub 272.903198242s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:35 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 124 pg[9.1a( v 53'1142 (0'0,53'1142] local-lis/les=122/123 n=5 ec=57/47 lis/c=122/86 les/c/f=123/87/0 sis=124 pruub=15.001556396s) [0] r=-1 lpr=124 pi=[86,124)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 272.903198242s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:36 np0005593233 python3.9[88566]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:05:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:36.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 23 04:05:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 23 04:05:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:37.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 23 04:05:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 23 04:05:38 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.a scrub starts
Jan 23 04:05:38 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 4.a scrub ok
Jan 23 04:05:38 np0005593233 python3.9[88720]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:05:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:38.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 23 04:05:39 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.7 deep-scrub starts
Jan 23 04:05:39 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 6.7 deep-scrub ok
Jan 23 04:05:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 23 04:05:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:39.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:39 np0005593233 python3.9[88878]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:05:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 23 04:05:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 127 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=93/94 n=5 ec=57/47 lis/c=93/93 les/c/f=94/94/0 sis=127 pruub=11.046813965s) [2] r=-1 lpr=127 pi=[93,127)/1 crt=53'1142 mlcod 0'0 active pruub 273.223419189s@ mbc={}] start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 127 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=93/94 n=5 ec=57/47 lis/c=93/93 les/c/f=94/94/0 sis=127 pruub=11.046676636s) [2] r=-1 lpr=127 pi=[93,127)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 273.223419189s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 23 04:05:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 128 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=128 pruub=10.545815468s) [0] r=-1 lpr=128 pi=[74,128)/1 crt=53'1142 mlcod 0'0 active pruub 272.861602783s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:40 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 128 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=128 pruub=10.545728683s) [0] r=-1 lpr=128 pi=[74,128)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 272.861602783s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:40.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:40 np0005593233 python3.9[88962]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:05:41 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.6 scrub starts
Jan 23 04:05:41 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.6 scrub ok
Jan 23 04:05:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 23 04:05:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 23 04:05:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 23 04:05:41 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 129 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=129) [0]/[1] r=0 lpr=129 pi=[74,129)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:41 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 129 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=74/75 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=129) [0]/[1] r=0 lpr=129 pi=[74,129)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:41 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 129 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=93/94 n=5 ec=57/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[1] r=0 lpr=129 pi=[93,129)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:41 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 129 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=93/94 n=5 ec=57/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[1] r=0 lpr=129 pi=[93,129)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:41.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:05:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 23 04:05:42 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 130 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=96/97 n=5 ec=57/47 lis/c=96/96 les/c/f=97/97/0 sis=130 pruub=11.551455498s) [0] r=-1 lpr=130 pi=[96,130)/1 crt=53'1142 mlcod 0'0 active pruub 275.895507812s@ mbc={}] start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:42 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 130 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=96/97 n=5 ec=57/47 lis/c=96/96 les/c/f=97/97/0 sis=130 pruub=11.551339149s) [0] r=-1 lpr=130 pi=[96,130)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 275.895507812s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:42 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 130 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=129/130 n=5 ec=57/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[1] async=[2] r=0 lpr=129 pi=[93,129)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:42 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 130 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=129/130 n=5 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=129) [0]/[1] async=[0] r=0 lpr=129 pi=[74,129)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:42.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:05:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 23 04:05:43 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 131 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=129/130 n=5 ec=57/47 lis/c=129/93 les/c/f=130/94/0 sis=131 pruub=14.983423233s) [2] async=[2] r=-1 lpr=131 pi=[93,131)/1 crt=53'1142 mlcod 53'1142 active pruub 280.350952148s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:43 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 131 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=129/130 n=5 ec=57/47 lis/c=129/93 les/c/f=130/94/0 sis=131 pruub=14.983318329s) [2] r=-1 lpr=131 pi=[93,131)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 280.350952148s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:43 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 131 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=129/130 n=5 ec=57/47 lis/c=129/74 les/c/f=130/75/0 sis=131 pruub=14.989916801s) [0] async=[0] r=-1 lpr=131 pi=[74,131)/1 crt=53'1142 mlcod 53'1142 active pruub 280.357604980s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:43 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 131 pg[9.1e( v 53'1142 (0'0,53'1142] local-lis/les=129/130 n=5 ec=57/47 lis/c=129/74 les/c/f=130/75/0 sis=131 pruub=14.989864349s) [0] r=-1 lpr=131 pi=[74,131)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 280.357604980s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:43 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 131 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=96/97 n=5 ec=57/47 lis/c=96/96 les/c/f=97/97/0 sis=131) [0]/[1] r=0 lpr=131 pi=[96,131)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:43 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 131 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=96/97 n=5 ec=57/47 lis/c=96/96 les/c/f=97/97/0 sis=131) [0]/[1] r=0 lpr=131 pi=[96,131)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:43.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:44 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Jan 23 04:05:44 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Jan 23 04:05:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 23 04:05:44 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 132 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=131/132 n=5 ec=57/47 lis/c=96/96 les/c/f=97/97/0 sis=131) [0]/[1] async=[0] r=0 lpr=131 pi=[96,131)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:44.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:45 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.9 scrub starts
Jan 23 04:05:45 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.9 scrub ok
Jan 23 04:05:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:45.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 23 04:05:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 133 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=131/132 n=5 ec=57/47 lis/c=131/96 les/c/f=132/97/0 sis=133 pruub=14.860004425s) [0] async=[0] r=-1 lpr=133 pi=[96,133)/1 crt=53'1142 mlcod 53'1142 active pruub 282.404296875s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:45 np0005593233 ceph-osd[78880]: osd.1 pg_epoch: 133 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=131/132 n=5 ec=57/47 lis/c=131/96 les/c/f=132/97/0 sis=133 pruub=14.859866142s) [0] r=-1 lpr=133 pi=[96,133)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 282.404296875s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:46 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.a scrub starts
Jan 23 04:05:46 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.a scrub ok
Jan 23 04:05:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:46.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 23 04:05:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:47.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:48 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.b scrub starts
Jan 23 04:05:48 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.b scrub ok
Jan 23 04:05:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:48.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:49.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:50 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.c scrub starts
Jan 23 04:05:50 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.c scrub ok
Jan 23 04:05:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:50.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:51.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:52.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:53.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:54 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.d scrub starts
Jan 23 04:05:54 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.d scrub ok
Jan 23 04:05:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:54.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:05:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:55.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:05:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:56.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:05:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:57.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:58.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:05:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:05:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:59.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:06:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:00.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:01 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.e scrub starts
Jan 23 04:06:01 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.e scrub ok
Jan 23 04:06:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:01.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:02 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.16 scrub starts
Jan 23 04:06:02 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.16 scrub ok
Jan 23 04:06:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:02.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:03.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:04.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:04 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.17 scrub starts
Jan 23 04:06:04 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.17 scrub ok
Jan 23 04:06:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:05.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:05 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Jan 23 04:06:05 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Jan 23 04:06:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:06 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.1c scrub starts
Jan 23 04:06:06 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.1c scrub ok
Jan 23 04:06:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:06:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:06.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:06:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:07.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:07 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.1d scrub starts
Jan 23 04:06:07 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.1d scrub ok
Jan 23 04:06:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:08.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:06:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:09.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:06:09 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Jan 23 04:06:09 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Jan 23 04:06:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:10.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:11.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:11 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.12 deep-scrub starts
Jan 23 04:06:11 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.12 deep-scrub ok
Jan 23 04:06:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:06:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:12.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:06:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:13.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:14 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.12 scrub starts
Jan 23 04:06:14 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.12 scrub ok
Jan 23 04:06:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:14.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:15.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:06:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:16 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.10 scrub starts
Jan 23 04:06:16 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.10 scrub ok
Jan 23 04:06:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:16.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:06:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:17.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:06:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:18.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:06:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:19.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:19 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Jan 23 04:06:19 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Jan 23 04:06:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:20.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:21.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:22.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:23.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:24 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.4 scrub starts
Jan 23 04:06:24 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.4 scrub ok
Jan 23 04:06:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:24.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:25.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:26.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:27.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:28 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.7 scrub starts
Jan 23 04:06:28 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.7 scrub ok
Jan 23 04:06:28 np0005593233 python3.9[89440]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:06:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:28.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:29.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:29 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.5 scrub starts
Jan 23 04:06:29 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.5 scrub ok
Jan 23 04:06:30 np0005593233 python3.9[89727]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 04:06:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:30.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:31.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:31 np0005593233 python3.9[89879]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 04:06:32 np0005593233 python3.9[90031]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:06:32 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.f scrub starts
Jan 23 04:06:32 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.f scrub ok
Jan 23 04:06:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:32.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:33.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:33 np0005593233 python3.9[90183]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 04:06:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:34.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:35 np0005593233 python3.9[90335]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:06:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:35.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:35 np0005593233 python3.9[90487]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:06:36 np0005593233 python3.9[90565]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:06:36 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.8 scrub starts
Jan 23 04:06:36 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.8 scrub ok
Jan 23 04:06:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:36.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:37.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:37 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1 scrub starts
Jan 23 04:06:37 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1 scrub ok
Jan 23 04:06:37 np0005593233 python3.9[90717]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:06:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:38.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:39 np0005593233 python3.9[90871]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 04:06:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:39.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:40 np0005593233 python3.9[91024]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 04:06:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:40.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:41.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:41 np0005593233 python3.9[91177]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:06:41 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.4 scrub starts
Jan 23 04:06:41 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.4 scrub ok
Jan 23 04:06:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:42 np0005593233 python3.9[91329]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 04:06:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:42.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:43.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:43 np0005593233 python3.9[91481]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:06:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:44.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:45.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:45 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1b scrub starts
Jan 23 04:06:45 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1b scrub ok
Jan 23 04:06:46 np0005593233 python3.9[91635]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:06:46 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.18 scrub starts
Jan 23 04:06:46 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.18 scrub ok
Jan 23 04:06:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:46.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:47 np0005593233 python3.9[91787]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:06:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:47.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:48 np0005593233 python3.9[91865]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:06:48 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.14 deep-scrub starts
Jan 23 04:06:48 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.14 deep-scrub ok
Jan 23 04:06:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:48.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:49.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:50 np0005593233 python3.9[92017]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:06:50 np0005593233 python3.9[92095]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:06:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000059s ======
Jan 23 04:06:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:50.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000059s
Jan 23 04:06:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:51.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:52 np0005593233 python3.9[92247]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:06:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:53.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:53.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:53 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1d deep-scrub starts
Jan 23 04:06:53 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1d deep-scrub ok
Jan 23 04:06:54 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1a scrub starts
Jan 23 04:06:54 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1a scrub ok
Jan 23 04:06:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:55.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:55 np0005593233 python3.9[92398]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:06:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:55.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:56 np0005593233 python3.9[92550]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 04:06:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000059s ======
Jan 23 04:06:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:57.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000059s
Jan 23 04:06:57 np0005593233 python3.9[92700]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:06:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:06:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:57.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:06:58 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.14 deep-scrub starts
Jan 23 04:06:58 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.14 deep-scrub ok
Jan 23 04:06:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:59.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:59 np0005593233 python3.9[92852]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:06:59 np0005593233 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 04:06:59 np0005593233 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 04:06:59 np0005593233 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 04:06:59 np0005593233 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 04:06:59 np0005593233 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 04:06:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:06:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:59.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:00 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1c scrub starts
Jan 23 04:07:00 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1c scrub ok
Jan 23 04:07:00 np0005593233 python3.9[93013]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 04:07:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:07:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:01.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:07:01 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.19 scrub starts
Jan 23 04:07:01 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.19 scrub ok
Jan 23 04:07:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:01.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:02 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.1b deep-scrub starts
Jan 23 04:07:02 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 8.1b deep-scrub ok
Jan 23 04:07:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:03.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:03.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:05.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:05 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1e scrub starts
Jan 23 04:07:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:05.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:05 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 11.1e scrub ok
Jan 23 04:07:06 np0005593233 python3.9[93165]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:07:06 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.e scrub starts
Jan 23 04:07:06 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.e scrub ok
Jan 23 04:07:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:07.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:07 np0005593233 python3.9[93319]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:07:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:07:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:07.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:07:07 np0005593233 systemd[1]: session-34.scope: Deactivated successfully.
Jan 23 04:07:07 np0005593233 systemd[1]: session-34.scope: Consumed 1min 11.212s CPU time.
Jan 23 04:07:07 np0005593233 systemd-logind[804]: Session 34 logged out. Waiting for processes to exit.
Jan 23 04:07:07 np0005593233 systemd-logind[804]: Removed session 34.
Jan 23 04:07:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:09.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:09.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:09 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.6 scrub starts
Jan 23 04:07:09 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.6 scrub ok
Jan 23 04:07:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:11.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000059s ======
Jan 23 04:07:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:11.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000059s
Jan 23 04:07:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:12 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.a scrub starts
Jan 23 04:07:12 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.a scrub ok
Jan 23 04:07:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:13.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:13.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:13 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.d scrub starts
Jan 23 04:07:13 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.d scrub ok
Jan 23 04:07:13 np0005593233 systemd-logind[804]: New session 35 of user zuul.
Jan 23 04:07:13 np0005593233 systemd[1]: Started Session 35 of User zuul.
Jan 23 04:07:15 np0005593233 python3.9[93500]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:07:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:07:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:15.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:07:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:07:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:15.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:07:16 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.f scrub starts
Jan 23 04:07:16 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.f scrub ok
Jan 23 04:07:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:17.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:17 np0005593233 python3.9[93656]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 04:07:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:17.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:18 np0005593233 python3.9[93809]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:07:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:19.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:19 np0005593233 python3.9[93893]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:07:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:19.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:21.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:21 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Jan 23 04:07:21 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Jan 23 04:07:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:21.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:22 np0005593233 python3.9[94046]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:07:22 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.11 deep-scrub starts
Jan 23 04:07:22 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.11 deep-scrub ok
Jan 23 04:07:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:23.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:23.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:24 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.12 deep-scrub starts
Jan 23 04:07:24 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.12 deep-scrub ok
Jan 23 04:07:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:25.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:25 np0005593233 python3.9[94329]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:07:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:25.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:26 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:07:26 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:26 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:07:26 np0005593233 python3.9[94482]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:07:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:27.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:27.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:27 np0005593233 python3.9[94634]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 04:07:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:29.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:29 np0005593233 python3.9[94784]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:07:29 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Jan 23 04:07:29 np0005593233 ceph-osd[78880]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Jan 23 04:07:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:29.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:30 np0005593233 python3.9[94942]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:07:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:31.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:31.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:32 np0005593233 python3.9[95146]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:07:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:33.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:33.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:35 np0005593233 python3.9[95433]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 04:07:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:35.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:35.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:36 np0005593233 python3.9[95583]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:07:36 np0005593233 python3.9[95737]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:07:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:37.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:37.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:07:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:39.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:07:39 np0005593233 python3.9[95890]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:07:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:39.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:41.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:07:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:41.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:07:42 np0005593233 python3.9[96043]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:07:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:43 np0005593233 python3.9[96197]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 23 04:07:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:43.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:43.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:44 np0005593233 systemd-logind[804]: Session 35 logged out. Waiting for processes to exit.
Jan 23 04:07:44 np0005593233 systemd[1]: session-35.scope: Deactivated successfully.
Jan 23 04:07:44 np0005593233 systemd[1]: session-35.scope: Consumed 20.076s CPU time.
Jan 23 04:07:44 np0005593233 systemd-logind[804]: Removed session 35.
Jan 23 04:07:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:45.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:45.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:47.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:07:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:47.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:07:47 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 23 04:07:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:47.862683) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:07:47 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 23 04:07:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159267862851, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2810, "num_deletes": 251, "total_data_size": 5709066, "memory_usage": 5779352, "flush_reason": "Manual Compaction"}
Jan 23 04:07:47 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159268010051, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3706461, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6852, "largest_seqno": 9657, "table_properties": {"data_size": 3695135, "index_size": 6860, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 29501, "raw_average_key_size": 21, "raw_value_size": 3669857, "raw_average_value_size": 2732, "num_data_blocks": 302, "num_entries": 1343, "num_filter_entries": 1343, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159102, "oldest_key_time": 1769159102, "file_creation_time": 1769159267, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 147461 microseconds, and 17002 cpu microseconds.
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.010189) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3706461 bytes OK
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.010254) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.159798) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.159891) EVENT_LOG_v1 {"time_micros": 1769159268159875, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.159978) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5695745, prev total WAL file size 5695745, number of live WAL files 2.
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.163165) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3619KB)], [15(7111KB)]
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159268163337, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 10988469, "oldest_snapshot_seqno": -1}
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3854 keys, 9442022 bytes, temperature: kUnknown
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159268316241, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9442022, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9410560, "index_size": 20728, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9669, "raw_key_size": 92932, "raw_average_key_size": 24, "raw_value_size": 9335294, "raw_average_value_size": 2422, "num_data_blocks": 905, "num_entries": 3854, "num_filter_entries": 3854, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769159268, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.316894) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9442022 bytes
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.320549) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 71.7 rd, 61.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 6.9 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(5.5) write-amplify(2.5) OK, records in: 4375, records dropped: 521 output_compression: NoCompression
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.320573) EVENT_LOG_v1 {"time_micros": 1769159268320561, "job": 6, "event": "compaction_finished", "compaction_time_micros": 153277, "compaction_time_cpu_micros": 34335, "output_level": 6, "num_output_files": 1, "total_output_size": 9442022, "num_input_records": 4375, "num_output_records": 3854, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159268321620, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159268323275, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.162816) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.323409) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.323417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.323418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.323420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:07:48.323421) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:07:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:49.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:07:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:07:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:49.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:07:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:51.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:51.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:51 np0005593233 systemd-logind[804]: New session 36 of user zuul.
Jan 23 04:07:51 np0005593233 systemd[1]: Started Session 36 of User zuul.
Jan 23 04:07:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:53 np0005593233 python3.9[96375]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:07:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:53.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:53.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:54 np0005593233 python3.9[96529]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:07:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:07:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:55.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:07:55 np0005593233 python3.9[96722]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:07:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:55.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:56 np0005593233 systemd-logind[804]: Session 36 logged out. Waiting for processes to exit.
Jan 23 04:07:56 np0005593233 systemd[1]: session-36.scope: Deactivated successfully.
Jan 23 04:07:56 np0005593233 systemd[1]: session-36.scope: Consumed 2.625s CPU time.
Jan 23 04:07:56 np0005593233 systemd-logind[804]: Removed session 36.
Jan 23 04:07:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:57.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:57.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:07:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:59.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:07:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:07:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:59.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:01.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:01.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:02 np0005593233 systemd-logind[804]: New session 37 of user zuul.
Jan 23 04:08:02 np0005593233 systemd[1]: Started Session 37 of User zuul.
Jan 23 04:08:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:03.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:03.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:04 np0005593233 python3.9[96901]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:08:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:05.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:05 np0005593233 python3.9[97055]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:08:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:05.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:06 np0005593233 python3.9[97211]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:08:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:07.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:07 np0005593233 python3.9[97295]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:08:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:07.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:09.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:09.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:09 np0005593233 python3.9[97448]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:08:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:11.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:11 np0005593233 python3.9[97643]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:11.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:12 np0005593233 python3.9[97795]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:08:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:12 np0005593233 python3.9[97960]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:08:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:13.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:13 np0005593233 python3.9[98038]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:13.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:14 np0005593233 python3.9[98190]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:08:14 np0005593233 python3.9[98268]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:15.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:15 np0005593233 python3.9[98420]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:15.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:16 np0005593233 python3.9[98572]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:16 np0005593233 python3.9[98724]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:17.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:17 np0005593233 python3.9[98876]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:17.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:18 np0005593233 python3.9[99028]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:08:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:19.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:19.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:21.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:21 np0005593233 python3.9[99181]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:08:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:21.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:22 np0005593233 python3.9[99335]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:08:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:22 np0005593233 python3.9[99487]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:08:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:23.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:23.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:23 np0005593233 python3.9[99639]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:08:24 np0005593233 python3.9[99792]: ansible-service_facts Invoked
Jan 23 04:08:24 np0005593233 network[99809]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:08:24 np0005593233 network[99810]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:08:24 np0005593233 network[99811]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:08:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:25.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:25.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:27.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:27.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:29.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:29.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:31.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:31.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.003000084s ======
Jan 23 04:08:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:33.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000084s
Jan 23 04:08:33 np0005593233 python3.9[100364]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:08:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:33.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:35.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:08:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:35.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:08:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:37.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:37.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:08:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:08:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:39.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:39.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:41.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:41.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:43.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:43.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.588694) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159324589039, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 748, "num_deletes": 254, "total_data_size": 1450383, "memory_usage": 1464624, "flush_reason": "Manual Compaction"}
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159324624345, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 622003, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9663, "largest_seqno": 10405, "table_properties": {"data_size": 618910, "index_size": 1001, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7879, "raw_average_key_size": 19, "raw_value_size": 612447, "raw_average_value_size": 1527, "num_data_blocks": 45, "num_entries": 401, "num_filter_entries": 401, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159268, "oldest_key_time": 1769159268, "file_creation_time": 1769159324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 35648 microseconds, and 5112 cpu microseconds.
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.624523) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 622003 bytes OK
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.624577) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.686468) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.686567) EVENT_LOG_v1 {"time_micros": 1769159324686555, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.686602) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1446405, prev total WAL file size 1462378, number of live WAL files 2.
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.687904) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323535' seq:0, type:0; will stop at (end)
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(607KB)], [18(9220KB)]
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159324688091, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 10064025, "oldest_snapshot_seqno": -1}
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3756 keys, 7583887 bytes, temperature: kUnknown
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159324892220, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 7583887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7555969, "index_size": 17440, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9413, "raw_key_size": 91365, "raw_average_key_size": 24, "raw_value_size": 7485198, "raw_average_value_size": 1992, "num_data_blocks": 762, "num_entries": 3756, "num_filter_entries": 3756, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769159324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.892735) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 7583887 bytes
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.895113) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 49.3 rd, 37.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 9.0 +0.0 blob) out(7.2 +0.0 blob), read-write-amplify(28.4) write-amplify(12.2) OK, records in: 4255, records dropped: 499 output_compression: NoCompression
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.895134) EVENT_LOG_v1 {"time_micros": 1769159324895125, "job": 8, "event": "compaction_finished", "compaction_time_micros": 204278, "compaction_time_cpu_micros": 22706, "output_level": 6, "num_output_files": 1, "total_output_size": 7583887, "num_input_records": 4255, "num_output_records": 3756, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159324895460, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159324897299, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.687644) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.897355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.897361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.897362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.897364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:08:44.897365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:45.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:45 np0005593233 python3.9[100548]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 04:08:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:45.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:47 np0005593233 python3.9[100700]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:08:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:47.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:47 np0005593233 python3.9[100778]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:08:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:47.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:08:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:48 np0005593233 python3.9[100930]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:08:48 np0005593233 python3.9[101032]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:49.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:50.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000054s ======
Jan 23 04:08:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:51.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 23 04:08:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:52.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:52 np0005593233 python3.9[101210]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:53.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:54.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:55.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:55 np0005593233 python3.9[101362]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:08:56 np0005593233 python3.9[101446]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:08:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:56.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:08:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:57.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:58.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:08:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:08:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:59.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:00.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:01.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:02.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:03.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:04 np0005593233 systemd[1]: session-37.scope: Deactivated successfully.
Jan 23 04:09:04 np0005593233 systemd[1]: session-37.scope: Consumed 25.812s CPU time.
Jan 23 04:09:04 np0005593233 systemd-logind[804]: Session 37 logged out. Waiting for processes to exit.
Jan 23 04:09:04 np0005593233 systemd-logind[804]: Removed session 37.
Jan 23 04:09:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:04.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:09:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:05.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:09:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:06.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:07.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:08.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:09.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:10.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:09:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:11.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:09:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:12.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:13.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:14.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:15.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:16.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:17.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:18 np0005593233 systemd-logind[804]: New session 38 of user zuul.
Jan 23 04:09:18 np0005593233 systemd[1]: Started Session 38 of User zuul.
Jan 23 04:09:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:18.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:19.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:19 np0005593233 python3.9[101629]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:20 np0005593233 python3.9[101781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:20.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:21 np0005593233 python3.9[101859]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:21.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:21 np0005593233 systemd-logind[804]: Session 38 logged out. Waiting for processes to exit.
Jan 23 04:09:21 np0005593233 systemd[1]: session-38.scope: Deactivated successfully.
Jan 23 04:09:21 np0005593233 systemd[1]: session-38.scope: Consumed 1.698s CPU time.
Jan 23 04:09:21 np0005593233 systemd-logind[804]: Removed session 38.
Jan 23 04:09:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:22.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:23.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:24.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:25.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:26.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:27.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:28 np0005593233 systemd-logind[804]: New session 39 of user zuul.
Jan 23 04:09:28 np0005593233 systemd[1]: Started Session 39 of User zuul.
Jan 23 04:09:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:28.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:29.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:29 np0005593233 python3.9[102039]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:09:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:30.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:31 np0005593233 python3.9[102195]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:31.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:31 np0005593233 python3.9[102370]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:32 np0005593233 python3.9[102448]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.4pncmcn6 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:32.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:33.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:33 np0005593233 python3.9[102600]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:33 np0005593233 python3.9[102678]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.4kum8bsc recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:34 np0005593233 python3.9[102830]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:09:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:34.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:35.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:35 np0005593233 python3.9[102982]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:35 np0005593233 python3.9[103060]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:09:36 np0005593233 python3.9[103212]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:36.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:37 np0005593233 python3.9[103290]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:09:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:37.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:37 np0005593233 python3.9[103442]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:38 np0005593233 python3.9[103594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:38.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:39 np0005593233 python3.9[103672]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:39.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:39 np0005593233 python3.9[103824]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:40 np0005593233 python3.9[103903]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:40.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:41.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:41 np0005593233 python3.9[104055]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:09:41 np0005593233 systemd[1]: Reloading.
Jan 23 04:09:41 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:09:41 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:09:42 np0005593233 python3.9[104245]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:42.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:43 np0005593233 python3.9[104323]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:09:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:43.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:09:43 np0005593233 python3.9[104475]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:44 np0005593233 python3.9[104553]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:44.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:45 np0005593233 python3.9[104705]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:09:45 np0005593233 systemd[1]: Reloading.
Jan 23 04:09:45 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:09:45 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:09:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:45.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:45 np0005593233 systemd[1]: Starting Create netns directory...
Jan 23 04:09:45 np0005593233 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:09:45 np0005593233 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:09:45 np0005593233 systemd[1]: Finished Create netns directory.
Jan 23 04:09:46 np0005593233 python3.9[104898]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:09:46 np0005593233 network[104915]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:09:46 np0005593233 network[104916]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:09:46 np0005593233 network[104917]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:09:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:46.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:47.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:48.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:09:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:49.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:09:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:51.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:51.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:09:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:09:52 np0005593233 python3.9[105309]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:09:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:53.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:09:53 np0005593233 python3.9[105387]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:53.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:54 np0005593233 python3.9[105539]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:54 np0005593233 python3.9[105691]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:09:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:55.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:09:55 np0005593233 python3.9[105769]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:55.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:56 np0005593233 python3.9[105921]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 04:09:56 np0005593233 systemd[1]: Starting Time & Date Service...
Jan 23 04:09:56 np0005593233 systemd[1]: Started Time & Date Service.
Jan 23 04:09:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:57.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:09:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:57.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:09:57 np0005593233 python3.9[106077]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:58 np0005593233 python3.9[106229]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:58 np0005593233 python3.9[106357]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:59.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:09:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:59.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:59 np0005593233 python3.9[106509]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:00 np0005593233 python3.9[106587]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.20os9w6a recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 04:10:00 np0005593233 python3.9[106739]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:01.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:01 np0005593233 python3.9[106817]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:01.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:02 np0005593233 python3.9[106969]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:03.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:03 np0005593233 python3[107122]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:10:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:03.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:03 np0005593233 python3.9[107274]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:04 np0005593233 python3.9[107352]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:05.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:05 np0005593233 python3.9[107504]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:05.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:05 np0005593233 python3.9[107629]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159404.56886-901-192025295636255/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:06 np0005593233 python3.9[107781]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:07.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:07 np0005593233 python3.9[107859]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:07.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:08 np0005593233 python3.9[108011]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:08 np0005593233 python3.9[108089]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:09.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:09.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:09 np0005593233 python3.9[108241]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:09 np0005593233 python3.9[108319]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:10 np0005593233 python3.9[108471]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:11.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:11.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:11 np0005593233 python3.9[108626]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:12 np0005593233 python3.9[108778]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:13 np0005593233 python3.9[108930]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:10:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:13.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:10:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:13.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:13 np0005593233 python3.9[109082]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:10:14 np0005593233 python3.9[109234]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:10:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:15.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:15 np0005593233 systemd[1]: session-39.scope: Deactivated successfully.
Jan 23 04:10:15 np0005593233 systemd[1]: session-39.scope: Consumed 31.538s CPU time.
Jan 23 04:10:15 np0005593233 systemd-logind[804]: Session 39 logged out. Waiting for processes to exit.
Jan 23 04:10:15 np0005593233 systemd-logind[804]: Removed session 39.
Jan 23 04:10:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:15.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:17.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:10:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:17.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:10:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:19.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:10:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:19.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:10:20 np0005593233 systemd-logind[804]: New session 40 of user zuul.
Jan 23 04:10:20 np0005593233 systemd[1]: Started Session 40 of User zuul.
Jan 23 04:10:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:21.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:21.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:21 np0005593233 python3.9[109414]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 04:10:22 np0005593233 python3.9[109566]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:10:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:23 np0005593233 python3.9[109720]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 23 04:10:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:23.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:10:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:23.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:10:23 np0005593233 python3.9[109872]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.4c_pgl0p follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:24 np0005593233 python3.9[109997]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.4c_pgl0p mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159423.2949677-108-53003618618079/.source.4c_pgl0p _original_basename=.9fiu2piq follow=False checksum=10ad371b9444ca89894e9504601831d6af2e14d2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:25.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:25.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:25 np0005593233 python3.9[110149]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:10:26 np0005593233 python3.9[110301]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsLbdPIA8nc52wSKcOItc1xJ6faU3FwhWecUgXZZC+Q1wLSrdN9vgOExBhQSwwodluzJ5/GT9VbCuujyBvk7RMEim1+fw7T58Th56PR8y2lL6F6F3ni4S21QxInTLml+/id8wwEZAkFjbCF/AjCRDyH7a6H4wIZtd5ZuzWJuuBENNdtu/qD1QQYkNegqllogNpkdpAFZgvee26yw2sbCX8kpbJoJsowaQUckoRtT2jj7985CLxErKZ8YO8ZozjfuCDCKbcJT0KFimievJZmKXvGaWG5H+P509XDsfN62aQr22US8FbYjdK1lfrJoetkc/MK4h7QuCs6MH2qYiqXIkJYKMSReM+sH3X7V7pSWSUkr0DHREVvBGcC2lRSx45lUCTEtcTY7XmxGORvCORMYla0l1H3mEIkfYLS4sXYtRSHkyFnyQgbNP5MnrmXlK0vrAA81r5U+dOhIL/H2e7S4xcLItH7weUOHIAmCj266mm9+xJyyd7NZ+eUgS0Md5p4Bc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUSudroiFEdRPXgUCqRHbNRLelYP5RQGMMCn6zD8pfH#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDJLsx8RxJz6M7PIyGcFdzR+Ldl788501Y8ZWLJ8hnDzMCaRkGjzE+kzO/uN75IEtV3aVEl1jNQlk7wON+lORGQ=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD7jdzOPltwN8PSb4q9DCiO5zY7TIK6sENpltjjN4gdZgxOTsj/dxnfxJlO2lYI1dFyyFnDdZj88a4x1KI5Bnnvl5KRvvZiianfivZWKq9Ngf9fzf7+5CsDFBiu6a7GAfXMf9FocVpqlXf7fsXmb5Iv2xUpNnye4EFIuW965X3SNrRpujRnDe+i0lIwrOsus4R86qn38MWOLfPBAWFYdBaVfTUYjC0eT/I81Y/T2RKqf7XK/bsuHobZ+/a7lymuPsS9L0DFg25ZoIlvkPUVfZxTO5FCyw8GMR+AgbnMQyHwx2JAmewwH3M2l+zVdDQjsE1ZRFlJCmwle9LBa1oFhuLfxLqsykQploeB5Ch/VppbnRQ/GamwWLU5HEKMH2wZ6IymURW7nSStlEhNWvK+Bb9rIy65M6AFOEW94xId4nc+IraS6rc2cuM3Rp97S/6olqjlFDZisdUwdAlhIKuJjA7SsYZ6HyCEbRN3mvMnWbkqpyY605kewQ6kdmucNeWgRtk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIE++PPNOKtggGl2mGWEm1DV2WpblvGA/F2TEEVeMrsU2#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP3uOoytpWGDF46u3wwDFxwF05HMnZd51GvbceZrDgZRmc5sxbF+OawPD9kGTcjnaUTzvqWgbFNvcmpuaNTnpzc=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq2Yxebv3BUxXHPuf6nN00teEMYUUVEWMZOqcwNO1dyibdbyxre6VweeeiBR/lerW1mIcmB67juCuLffEgDo8uPtZx9HrD1psd+ji78YeJuvbKIEcTwdtGF0I8PeogHunx+4KBxFsHeF6JHN9+H7lTHiSSIDFzk9BwDkAKEWsYHe8z+5SPDU//XiYNv0drE59KiQF586rnjPR3VZk6WaR+hp2PiHbUUSOvnyB4kI4bCXSCU/Oxv7HDvgeCJapABjisMZg4aiteZ7EaD1yVndkQiS6OxfOGP1srgtNkRL4Idc/XCFXH754lbRd8GzUF0n8N0HbWTcFDuTU+bvhuIH+3EDNxsDQkSCdJTw2EPb/mqZVdXSFxLXUBcXnYkBWZirpgC3g6okg2RQU2bxigFs7lFwJT6QE+wz0DK7Z3ib0XQxjRlY6PIwn1D2soMwKVarxpeM2FfsGrHMHaHioRTVbKpzBMA1oUICSUCvzyhd0I43cO2rUEK/8EMYSsTVRulKs=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII4nVnNUbCVQAtKJF7UUtMQxNhMw9eVlRVofBpQ70iUi#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPqfkBgoQjr/gZBK1F9K576GMtkxSY6lVgROItGrW+R9EA2lvnOt71IGO0M0lGVvCkTtLktdNpSsYnBu2cJn+4c=#012 create=True mode=0644 path=/tmp/ansible.4c_pgl0p state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:26 np0005593233 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 04:10:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:27.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:27.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:27 np0005593233 python3.9[110455]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.4c_pgl0p' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:28 np0005593233 python3.9[110609]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.4c_pgl0p state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:28 np0005593233 systemd[1]: session-40.scope: Deactivated successfully.
Jan 23 04:10:28 np0005593233 systemd[1]: session-40.scope: Consumed 5.672s CPU time.
Jan 23 04:10:28 np0005593233 systemd-logind[804]: Session 40 logged out. Waiting for processes to exit.
Jan 23 04:10:28 np0005593233 systemd-logind[804]: Removed session 40.
Jan 23 04:10:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:29.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:29.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:31.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:31.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:33.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:33.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:34 np0005593233 systemd-logind[804]: New session 41 of user zuul.
Jan 23 04:10:34 np0005593233 systemd[1]: Started Session 41 of User zuul.
Jan 23 04:10:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:35.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:35.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:35 np0005593233 python3.9[110788]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:10:36 np0005593233 python3.9[110944]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 04:10:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:37.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:37.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:37 np0005593233 python3.9[111098]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:10:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:38 np0005593233 python3.9[111251]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:39.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:39.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:39 np0005593233 python3.9[111404]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:10:40 np0005593233 python3.9[111556]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:41 np0005593233 systemd[1]: session-41.scope: Deactivated successfully.
Jan 23 04:10:41 np0005593233 systemd[1]: session-41.scope: Consumed 4.351s CPU time.
Jan 23 04:10:41 np0005593233 systemd-logind[804]: Session 41 logged out. Waiting for processes to exit.
Jan 23 04:10:41 np0005593233 systemd-logind[804]: Removed session 41.
Jan 23 04:10:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:10:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:41.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:10:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:41.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:43.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:43.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:10:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:45.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:10:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:45.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:46 np0005593233 systemd-logind[804]: New session 42 of user zuul.
Jan 23 04:10:46 np0005593233 systemd[1]: Started Session 42 of User zuul.
Jan 23 04:10:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:47.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:10:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:47.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:10:47 np0005593233 python3.9[111734]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:10:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:48 np0005593233 python3.9[111890]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:10:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:49.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:49.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:49 np0005593233 python3.9[111974]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:10:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:51.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:51.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:52 np0005593233 python3.9[112125]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:10:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:53.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:10:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:10:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:53.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:10:54 np0005593233 python3.9[112276]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:10:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:55.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:10:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:55.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:10:55 np0005593233 python3.9[112426]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:10:56 np0005593233 python3.9[112576]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:10:56 np0005593233 systemd[1]: session-42.scope: Deactivated successfully.
Jan 23 04:10:56 np0005593233 systemd[1]: session-42.scope: Consumed 6.415s CPU time.
Jan 23 04:10:56 np0005593233 systemd-logind[804]: Session 42 logged out. Waiting for processes to exit.
Jan 23 04:10:56 np0005593233 systemd-logind[804]: Removed session 42.
Jan 23 04:10:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:57.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:57.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:59.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:10:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:10:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:10:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:10:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:59.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:11:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:11:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:11:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:01.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:01.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:02 np0005593233 systemd-logind[804]: New session 43 of user zuul.
Jan 23 04:11:02 np0005593233 systemd[1]: Started Session 43 of User zuul.
Jan 23 04:11:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:03.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:03.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:04 np0005593233 python3.9[113005]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:11:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:05.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:05.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:05 np0005593233 python3.9[113161]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:06 np0005593233 python3.9[113363]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:11:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:11:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:07.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:07.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:07 np0005593233 python3.9[113515]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:08 np0005593233 python3.9[113638]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159466.880023-155-153781200555697/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=2e3a605733dc7abb1d9bccd9972e73f2904529a9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:08 np0005593233 python3.9[113790]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:09.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:09.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:09 np0005593233 python3.9[113913]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159468.4236012-155-36267869470574/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b5c90b44c6774a0fb2738dc9aefa548e4239c50f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:10 np0005593233 python3.9[114065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:10 np0005593233 python3.9[114188]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159469.7189991-155-69214904339932/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=30ac7fdf1144f84e8c1b8f3bd3259a803a555a33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:11.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:11.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:11 np0005593233 python3.9[114340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:12 np0005593233 python3.9[114492]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:12 np0005593233 python3.9[114644]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:13.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:13.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:13 np0005593233 python3.9[114767]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159472.4994874-344-150739547956317/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=c8d7781c0bfc2335dbc825e347c24fcc3522728b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:14 np0005593233 python3.9[114919]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:14 np0005593233 python3.9[115042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159473.824116-344-54107794930187/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=4d54572c36838e9e23d527be56268c8c0160f31d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:15.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:15.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:15 np0005593233 python3.9[115194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:16 np0005593233 python3.9[115317]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159475.0956025-344-33002542945942/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=ef65ee70045874cfa6dbf194ee53b6f573c2a20b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:17 np0005593233 python3.9[115469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:17.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:17.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:17 np0005593233 python3.9[115621]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:11:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.8 total, 600.0 interval#012Cumulative writes: 5794 writes, 24K keys, 5794 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5794 writes, 902 syncs, 6.42 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5794 writes, 24K keys, 5794 commit groups, 1.0 writes per commit group, ingest: 18.98 MB, 0.03 MB/s#012Interval WAL: 5794 writes, 902 syncs, 6.42 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.8 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.7 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f132801610#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.8 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f132801610#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.8 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtab
Jan 23 04:11:18 np0005593233 python3.9[115773]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:19 np0005593233 python3.9[115896]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159478.003566-532-84830556757081/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=d91ad8909b3bd8fde43f686d0cf336dcfd53cfdd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:19.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:19.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:19 np0005593233 python3.9[116048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:20 np0005593233 python3.9[116171]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159479.1480792-532-96857983787048/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=4d54572c36838e9e23d527be56268c8c0160f31d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:21 np0005593233 python3.9[116323]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:21.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:21.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:21 np0005593233 python3.9[116446]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159480.5653176-532-242425261589760/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=800fcab52c35f5dee3d2e2e0edd09fe1b89625e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:22 np0005593233 python3.9[116598]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:23.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:23.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:23 np0005593233 python3.9[116750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:24 np0005593233 python3.9[116873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159483.1389544-739-265881360985653/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:25.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:25 np0005593233 python3.9[117025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:25.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:25 np0005593233 python3.9[117177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:26 np0005593233 python3.9[117300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159485.5266082-815-58007066287314/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:27.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:27.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:27 np0005593233 python3.9[117452]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:28 np0005593233 python3.9[117604]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:29 np0005593233 python3.9[117727]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159487.8076274-888-13095898036159/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:29.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:29.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:29 np0005593233 python3.9[117879]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:30 np0005593233 python3.9[118031]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:31.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:31 np0005593233 python3.9[118154]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159490.1595912-962-77442076620369/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:31.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:32 np0005593233 python3.9[118306]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:32 np0005593233 python3.9[118458]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:33.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:33.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:33 np0005593233 python3.9[118581]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159492.2956524-1026-124569817892012/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:34 np0005593233 python3.9[118733]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:35 np0005593233 python3.9[118885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:35.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:35.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:35 np0005593233 python3.9[119008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159494.472016-1100-163680581347140/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:36 np0005593233 systemd[1]: session-43.scope: Deactivated successfully.
Jan 23 04:11:36 np0005593233 systemd[1]: session-43.scope: Consumed 24.258s CPU time.
Jan 23 04:11:36 np0005593233 systemd-logind[804]: Session 43 logged out. Waiting for processes to exit.
Jan 23 04:11:36 np0005593233 systemd-logind[804]: Removed session 43.
Jan 23 04:11:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:37.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:37.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:39.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:39.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.026828) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501026985, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1824, "num_deletes": 251, "total_data_size": 4652281, "memory_usage": 4725864, "flush_reason": "Manual Compaction"}
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501071760, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 3042263, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10410, "largest_seqno": 12229, "table_properties": {"data_size": 3034673, "index_size": 4597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 14784, "raw_average_key_size": 19, "raw_value_size": 3019675, "raw_average_value_size": 3962, "num_data_blocks": 207, "num_entries": 762, "num_filter_entries": 762, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159324, "oldest_key_time": 1769159324, "file_creation_time": 1769159501, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 44959 microseconds, and 8101 cpu microseconds.
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.071839) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 3042263 bytes OK
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.071868) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.073166) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.073190) EVENT_LOG_v1 {"time_micros": 1769159501073185, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.073209) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4644062, prev total WAL file size 4644062, number of live WAL files 2.
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.074783) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2970KB)], [21(7406KB)]
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501074905, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 10626150, "oldest_snapshot_seqno": -1}
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4001 keys, 8626343 bytes, temperature: kUnknown
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501172985, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 8626343, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8596339, "index_size": 18886, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10053, "raw_key_size": 97081, "raw_average_key_size": 24, "raw_value_size": 8520756, "raw_average_value_size": 2129, "num_data_blocks": 817, "num_entries": 4001, "num_filter_entries": 4001, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769159501, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.173551) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 8626343 bytes
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.175543) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 108.1 rd, 87.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 7.2 +0.0 blob) out(8.2 +0.0 blob), read-write-amplify(6.3) write-amplify(2.8) OK, records in: 4518, records dropped: 517 output_compression: NoCompression
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.175564) EVENT_LOG_v1 {"time_micros": 1769159501175554, "job": 10, "event": "compaction_finished", "compaction_time_micros": 98279, "compaction_time_cpu_micros": 22479, "output_level": 6, "num_output_files": 1, "total_output_size": 8626343, "num_input_records": 4518, "num_output_records": 4001, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501176383, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501178032, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.074600) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.178196) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.178207) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.178212) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.178216) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:11:41.178221) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:41.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:11:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:41.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:43.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:43.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:44 np0005593233 systemd-logind[804]: New session 44 of user zuul.
Jan 23 04:11:44 np0005593233 systemd[1]: Started Session 44 of User zuul.
Jan 23 04:11:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:45.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:45.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:45 np0005593233 python3.9[119188]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:46 np0005593233 python3.9[119340]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:47.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:47.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:47 np0005593233 python3.9[119463]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159506.1093476-63-33701893535139/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=9a6a528427b32e6ef98709d36c90302cf328f9ef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:48 np0005593233 python3.9[119615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:49 np0005593233 python3.9[119738]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159507.8268313-63-36651408194751/.source.conf _original_basename=ceph.conf follow=False checksum=e4aedaaab1f9b40918a770d92609389e4ab78681 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:49.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:49.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:49 np0005593233 systemd[1]: session-44.scope: Deactivated successfully.
Jan 23 04:11:49 np0005593233 systemd[1]: session-44.scope: Consumed 2.750s CPU time.
Jan 23 04:11:49 np0005593233 systemd-logind[804]: Session 44 logged out. Waiting for processes to exit.
Jan 23 04:11:49 np0005593233 systemd-logind[804]: Removed session 44.
Jan 23 04:11:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:11:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:51.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:11:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:51.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:53.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:53.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:55.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:55.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:11:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2090 writes, 12K keys, 2090 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s#012Cumulative WAL: 2090 writes, 2090 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2090 writes, 12K keys, 2090 commit groups, 1.0 writes per commit group, ingest: 23.20 MB, 0.04 MB/s#012Interval WAL: 2090 writes, 2090 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     43.9      0.32              0.07         5    0.064       0      0       0.0       0.0#012  L6      1/0    8.23 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.2     69.5     58.8      0.53              0.11         4    0.134     16K   1793       0.0       0.0#012 Sum      1/0    8.23 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.2     43.6     53.2      0.85              0.17         9    0.095     16K   1793       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.2     43.7     53.3      0.85              0.17         8    0.106     16K   1793       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     69.5     58.8      0.53              0.11         4    0.134     16K   1793       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     44.2      0.32              0.07         4    0.079       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.014, interval 0.014#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.9 seconds#012Interval compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 308.00 MB usage: 1.51 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(74,1.34 MB,0.436609%) FilterBlock(9,54.48 KB,0.0172751%) IndexBlock(9,118.23 KB,0.0374881%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:11:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:57.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:11:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:57.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:11:57 np0005593233 systemd-logind[804]: New session 45 of user zuul.
Jan 23 04:11:57 np0005593233 systemd[1]: Started Session 45 of User zuul.
Jan 23 04:11:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:58 np0005593233 python3.9[119916]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:11:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:59.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:11:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:11:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:59.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:12:00 np0005593233 python3.9[120072]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:01 np0005593233 python3.9[120224]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:01.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:01.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:01 np0005593233 python3.9[120374]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:12:02 np0005593233 python3.9[120526]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 04:12:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:03.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:03.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:05.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:05 np0005593233 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 23 04:12:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:05.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:05 np0005593233 python3.9[120682]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:12:06 np0005593233 python3.9[120839]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:12:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:07.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:12:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:12:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 23 04:12:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:07.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:12:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:12:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:09.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:09 np0005593233 python3.9[121048]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:12:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:09.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:10 np0005593233 python3[121203]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 23 04:12:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:12:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:11.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:12:11 np0005593233 python3.9[121357]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:12:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:11.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:12:12 np0005593233 python3.9[121509]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:12 np0005593233 python3.9[121587]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:13.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:13.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:13 np0005593233 python3.9[121739]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:14 np0005593233 python3.9[121817]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.iwaxbzl5 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:15 np0005593233 python3.9[121993]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:15.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:15.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:15 np0005593233 python3.9[122097]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:16 np0005593233 python3.9[122249]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:17.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:17.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:17 np0005593233 python3[122402]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:12:18 np0005593233 python3.9[122554]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:19.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:19.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:19 np0005593233 python3.9[122679]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159538.243401-432-102252014682122/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:20 np0005593233 python3.9[122831]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:12:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:21.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:12:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:21.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:21 np0005593233 python3.9[122956]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159540.0127172-477-70771107310794/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:22 np0005593233 python3.9[123108]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:23 np0005593233 python3.9[123233]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159541.8720696-522-183168405100087/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:23.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:23.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:24 np0005593233 python3.9[123385]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:24 np0005593233 python3.9[123510]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159543.3400376-567-218495664435411/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:25.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:25.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:25 np0005593233 python3.9[123662]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:26 np0005593233 python3.9[123787]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159544.9429162-612-147323890824338/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:27.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:27 np0005593233 python3.9[123939]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:27.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:28 np0005593233 python3.9[124091]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:29 np0005593233 python3.9[124246]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:29.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:29.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:30 np0005593233 python3.9[124398]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:31 np0005593233 python3.9[124551]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:12:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:31.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:31.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:32 np0005593233 python3.9[124705]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:32 np0005593233 python3.9[124860]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:12:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:33.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:12:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:33.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:34 np0005593233 python3.9[125010]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:12:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:35.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:35.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:36 np0005593233 python3.9[125163]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:36 np0005593233 ovs-vsctl[125164]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 23 04:12:37 np0005593233 python3.9[125316]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:37.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:37.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:38 np0005593233 python3.9[125471]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:38 np0005593233 ovs-vsctl[125472]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 23 04:12:38 np0005593233 python3.9[125622]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:12:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:12:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:39.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:12:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:39.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:39 np0005593233 python3.9[125776]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:40 np0005593233 python3.9[125928]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:41 np0005593233 python3.9[126006]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:41.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:41.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:42 np0005593233 python3.9[126158]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:42 np0005593233 python3.9[126236]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:43.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:43 np0005593233 python3.9[126388]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:12:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:43.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:12:44 np0005593233 python3.9[126540]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:44 np0005593233 python3.9[126618]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:45.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:45.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:45 np0005593233 python3.9[126770]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:46 np0005593233 python3.9[126848]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:47 np0005593233 python3.9[127000]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:12:47 np0005593233 systemd[1]: Reloading.
Jan 23 04:12:47 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:12:47 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:12:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:47.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:47.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:48 np0005593233 python3.9[127191]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:49 np0005593233 python3.9[127269]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:49.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:49.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:49 np0005593233 python3.9[127421]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:50 np0005593233 python3.9[127499]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:51 np0005593233 python3.9[127651]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:12:51 np0005593233 systemd[1]: Reloading.
Jan 23 04:12:51 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:12:51 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:12:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:12:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:51.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:12:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:51.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:51 np0005593233 systemd[1]: Starting Create netns directory...
Jan 23 04:12:51 np0005593233 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:12:51 np0005593233 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:12:51 np0005593233 systemd[1]: Finished Create netns directory.
Jan 23 04:12:52 np0005593233 python3.9[127846]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:53.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:53 np0005593233 python3.9[127998]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:53.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:54 np0005593233 python3.9[128121]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159572.9691236-1365-107114095110002/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:12:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:55.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:12:55 np0005593233 python3.9[128273]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:55.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:56 np0005593233 python3.9[128425]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:57.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:57.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:57 np0005593233 python3.9[128578]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:58 np0005593233 python3.9[128701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159576.5448964-1464-43788656525017/.source.json _original_basename=.1xllnevh follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:59 np0005593233 python3.9[128851]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:12:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:59.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:12:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:12:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:12:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:59.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:01.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:01.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:01 np0005593233 python3.9[129274]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 23 04:13:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:03 np0005593233 python3.9[129426]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 04:13:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:03.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:03.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:04 np0005593233 python3[129578]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 04:13:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:05.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:05.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:07.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:07.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:09.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:09 np0005593233 podman[129591]: 2026-01-23 09:13:09.579448444 +0000 UTC m=+4.980979261 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 23 04:13:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:09.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:09 np0005593233 podman[129708]: 2026-01-23 09:13:09.740454901 +0000 UTC m=+0.051397877 container create f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 23 04:13:09 np0005593233 podman[129708]: 2026-01-23 09:13:09.717150724 +0000 UTC m=+0.028093710 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 23 04:13:09 np0005593233 python3[129578]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 23 04:13:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:11.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:11.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:11 np0005593233 python3.9[129898]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:13:12 np0005593233 python3.9[130052]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:13:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:13 np0005593233 python3.9[130128]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:13:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:13.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:13.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:14 np0005593233 python3.9[130279]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159593.3472626-1698-10299869976749/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:13:14 np0005593233 python3.9[130355]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:13:14 np0005593233 systemd[1]: Reloading.
Jan 23 04:13:14 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:13:14 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:13:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:15.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:15.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:15 np0005593233 python3.9[130554]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:13:15 np0005593233 systemd[1]: Reloading.
Jan 23 04:13:15 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:13:15 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:13:16 np0005593233 systemd[1]: Starting ovn_controller container...
Jan 23 04:13:16 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:13:16 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ded2943128563d0273fb351909c4a4ec1fa20309a16ebcd209513c775aaebf76/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 23 04:13:16 np0005593233 systemd[1]: Started /usr/bin/podman healthcheck run f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe.
Jan 23 04:13:16 np0005593233 podman[130638]: 2026-01-23 09:13:16.387219279 +0000 UTC m=+0.316515313 container init f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:13:16 np0005593233 ovn_controller[130653]: + sudo -E kolla_set_configs
Jan 23 04:13:16 np0005593233 podman[130638]: 2026-01-23 09:13:16.412956873 +0000 UTC m=+0.342252877 container start f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:13:16 np0005593233 systemd[1]: Created slice User Slice of UID 0.
Jan 23 04:13:16 np0005593233 edpm-start-podman-container[130638]: ovn_controller
Jan 23 04:13:16 np0005593233 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 23 04:13:16 np0005593233 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 23 04:13:16 np0005593233 systemd[1]: Starting User Manager for UID 0...
Jan 23 04:13:16 np0005593233 edpm-start-podman-container[130637]: Creating additional drop-in dependency for "ovn_controller" (f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe)
Jan 23 04:13:16 np0005593233 podman[130659]: 2026-01-23 09:13:16.519089058 +0000 UTC m=+0.094829102 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 04:13:16 np0005593233 systemd[1]: f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe-41d3e984e9ded1ec.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 04:13:16 np0005593233 systemd[1]: f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe-41d3e984e9ded1ec.service: Failed with result 'exit-code'.
Jan 23 04:13:16 np0005593233 systemd[1]: Reloading.
Jan 23 04:13:16 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:13:16 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:13:16 np0005593233 systemd[130687]: Queued start job for default target Main User Target.
Jan 23 04:13:16 np0005593233 systemd[130687]: Created slice User Application Slice.
Jan 23 04:13:16 np0005593233 systemd[130687]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 23 04:13:16 np0005593233 systemd[130687]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:13:16 np0005593233 systemd[130687]: Reached target Paths.
Jan 23 04:13:16 np0005593233 systemd[130687]: Reached target Timers.
Jan 23 04:13:16 np0005593233 systemd[130687]: Starting D-Bus User Message Bus Socket...
Jan 23 04:13:16 np0005593233 systemd[130687]: Starting Create User's Volatile Files and Directories...
Jan 23 04:13:16 np0005593233 systemd[130687]: Finished Create User's Volatile Files and Directories.
Jan 23 04:13:16 np0005593233 systemd[130687]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:13:16 np0005593233 systemd[130687]: Reached target Sockets.
Jan 23 04:13:16 np0005593233 systemd[130687]: Reached target Basic System.
Jan 23 04:13:16 np0005593233 systemd[130687]: Reached target Main User Target.
Jan 23 04:13:16 np0005593233 systemd[130687]: Startup finished in 148ms.
Jan 23 04:13:16 np0005593233 systemd[1]: Started User Manager for UID 0.
Jan 23 04:13:16 np0005593233 systemd[1]: Started ovn_controller container.
Jan 23 04:13:16 np0005593233 systemd[1]: Started Session c1 of User root.
Jan 23 04:13:16 np0005593233 ovn_controller[130653]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 04:13:16 np0005593233 ovn_controller[130653]: INFO:__main__:Validating config file
Jan 23 04:13:16 np0005593233 ovn_controller[130653]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 04:13:16 np0005593233 ovn_controller[130653]: INFO:__main__:Writing out command to execute
Jan 23 04:13:16 np0005593233 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 23 04:13:16 np0005593233 ovn_controller[130653]: ++ cat /run_command
Jan 23 04:13:16 np0005593233 ovn_controller[130653]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 04:13:16 np0005593233 ovn_controller[130653]: + ARGS=
Jan 23 04:13:16 np0005593233 ovn_controller[130653]: + sudo kolla_copy_cacerts
Jan 23 04:13:17 np0005593233 systemd[1]: Started Session c2 of User root.
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: + [[ ! -n '' ]]
Jan 23 04:13:17 np0005593233 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: + . kolla_extend_start
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: + umask 0022
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.0730] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.0741] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:13:17 np0005593233 kernel: br-int: entered promiscuous mode
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <warn>  [1769159597.0745] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:13:17 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.0754] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.0760] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.0763] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00011|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00012|features|INFO|OVS Feature: ct_flush, state: supported
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00013|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00014|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00015|main|INFO|OVS feature set changed, force recompute.
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00016|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00019|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00020|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 23 04:13:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:13:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:13:17 np0005593233 systemd-udevd[130784]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 04:13:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:17Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.2579] manager: (ovn-e9717b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.2646] manager: (ovn-3ec410-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 23 04:13:17 np0005593233 kernel: genev_sys_6081: entered promiscuous mode
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.2823] device (genev_sys_6081): carrier: link connected
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.2827] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Jan 23 04:13:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:17.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:17.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:17 np0005593233 NetworkManager[48871]: <info>  [1769159597.7061] manager: (ovn-d80bc7-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 23 04:13:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:18 np0005593233 python3.9[130914]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 04:13:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:19.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:13:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:19.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:13:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:21.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:21 np0005593233 python3.9[131066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:21.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:13:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:13:22 np0005593233 python3.9[131189]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159601.0544229-1833-256112139414522/.source.yaml _original_basename=.7y_db7i9 follow=False checksum=d3cbb0a9c550a24d080b6861631678a3f2e708bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:13:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:22 np0005593233 python3.9[131341]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:13:22 np0005593233 ovs-vsctl[131342]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 23 04:13:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:23.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:23.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:23 np0005593233 python3.9[131517]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:13:23 np0005593233 ovs-vsctl[131546]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 23 04:13:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:13:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:13:24 np0005593233 python3.9[131699]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:13:24 np0005593233 ovs-vsctl[131700]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 23 04:13:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:25.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:25 np0005593233 systemd[1]: session-45.scope: Deactivated successfully.
Jan 23 04:13:25 np0005593233 systemd[1]: session-45.scope: Consumed 1min 1.501s CPU time.
Jan 23 04:13:25 np0005593233 systemd-logind[804]: Session 45 logged out. Waiting for processes to exit.
Jan 23 04:13:25 np0005593233 systemd-logind[804]: Removed session 45.
Jan 23 04:13:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:25.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:27 np0005593233 systemd[1]: Stopping User Manager for UID 0...
Jan 23 04:13:27 np0005593233 systemd[130687]: Activating special unit Exit the Session...
Jan 23 04:13:27 np0005593233 systemd[130687]: Stopped target Main User Target.
Jan 23 04:13:27 np0005593233 systemd[130687]: Stopped target Basic System.
Jan 23 04:13:27 np0005593233 systemd[130687]: Stopped target Paths.
Jan 23 04:13:27 np0005593233 systemd[130687]: Stopped target Sockets.
Jan 23 04:13:27 np0005593233 systemd[130687]: Stopped target Timers.
Jan 23 04:13:27 np0005593233 systemd[130687]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:13:27 np0005593233 systemd[130687]: Closed D-Bus User Message Bus Socket.
Jan 23 04:13:27 np0005593233 systemd[130687]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:13:27 np0005593233 systemd[130687]: Removed slice User Application Slice.
Jan 23 04:13:27 np0005593233 systemd[130687]: Reached target Shutdown.
Jan 23 04:13:27 np0005593233 systemd[130687]: Finished Exit the Session.
Jan 23 04:13:27 np0005593233 systemd[130687]: Reached target Exit the Session.
Jan 23 04:13:27 np0005593233 systemd[1]: user@0.service: Deactivated successfully.
Jan 23 04:13:27 np0005593233 systemd[1]: Stopped User Manager for UID 0.
Jan 23 04:13:27 np0005593233 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 23 04:13:27 np0005593233 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 23 04:13:27 np0005593233 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 23 04:13:27 np0005593233 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 23 04:13:27 np0005593233 systemd[1]: Removed slice User Slice of UID 0.
Jan 23 04:13:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:27.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:27.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:29.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:29.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:31.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:31.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:32 np0005593233 systemd-logind[804]: New session 47 of user zuul.
Jan 23 04:13:32 np0005593233 systemd[1]: Started Session 47 of User zuul.
Jan 23 04:13:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:33 np0005593233 python3.9[131879]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:13:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:33.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:33.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:34 np0005593233 python3.9[132035]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:35.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:35 np0005593233 python3.9[132188]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:35.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:36 np0005593233 python3.9[132340]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:36 np0005593233 python3.9[132492]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:37.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:37 np0005593233 python3.9[132644]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:37.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:38 np0005593233 python3.9[132798]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:13:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:39.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:39.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:39 np0005593233 python3.9[132956]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 04:13:41 np0005593233 python3.9[133106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:41.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:41.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:42 np0005593233 python3.9[133227]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159620.6774275-219-187811190059157/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:42 np0005593233 python3.9[133377]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:43 np0005593233 python3.9[133498]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159622.242969-264-122829542701202/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:43.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:43.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:44 np0005593233 python3.9[133650]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:13:45 np0005593233 python3.9[133734]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:13:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:45.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:45.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:47Z|00025|memory|INFO|16256 kB peak resident set size after 30.1 seconds
Jan 23 04:13:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:13:47Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 23 04:13:47 np0005593233 podman[133786]: 2026-01-23 09:13:47.117727881 +0000 UTC m=+0.114451017 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 23 04:13:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:47.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:47.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:47 np0005593233 python3.9[133914]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:13:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:48 np0005593233 python3.9[134067]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:49 np0005593233 python3.9[134188]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159628.2383623-375-174309635261018/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:49.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:49.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:50 np0005593233 python3.9[134338]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:50 np0005593233 python3.9[134459]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159629.5853665-375-278819965022030/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:13:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:51.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:13:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:51.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:52 np0005593233 python3.9[134609]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:52 np0005593233 python3.9[134730]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159631.58705-507-174511147570918/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:53.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:53 np0005593233 python3.9[134881]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:53.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:54 np0005593233 python3.9[135002]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159632.9031644-507-123323039718978/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:55 np0005593233 python3.9[135152]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:13:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:55.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:55.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:56 np0005593233 python3.9[135306]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:57 np0005593233 python3.9[135458]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:57.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:57.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:57 np0005593233 python3.9[135536]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:58 np0005593233 python3.9[135688]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:59 np0005593233 python3.9[135766]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:13:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:59.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:13:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:13:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:59.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:59 np0005593233 python3.9[135918]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:00 np0005593233 python3.9[136070]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:01 np0005593233 python3.9[136148]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:01.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:01.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:01 np0005593233 python3.9[136300]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:02 np0005593233 python3.9[136378]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:14:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:03.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:14:03 np0005593233 python3.9[136530]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:14:03 np0005593233 systemd[1]: Reloading.
Jan 23 04:14:03 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:03 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:03.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:05 np0005593233 python3.9[136718]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:05.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:05.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:05 np0005593233 python3.9[136796]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:06 np0005593233 python3.9[136948]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:07 np0005593233 python3.9[137026]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:07.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:07.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:08 np0005593233 python3.9[137178]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:14:08 np0005593233 systemd[1]: Reloading.
Jan 23 04:14:08 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:08 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:08 np0005593233 systemd[1]: Starting Create netns directory...
Jan 23 04:14:08 np0005593233 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:14:08 np0005593233 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:14:08 np0005593233 systemd[1]: Finished Create netns directory.
Jan 23 04:14:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:09.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:09 np0005593233 python3.9[137372]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:14:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:09.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:10 np0005593233 python3.9[137524]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:11 np0005593233 python3.9[137647]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159649.8754504-960-220455141194137/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:14:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:14:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:11.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:14:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:11.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:12 np0005593233 python3.9[137799]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:12 np0005593233 python3.9[137951]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:14:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:13.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:13.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:13 np0005593233 python3.9[138103]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:14 np0005593233 python3.9[138226]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159653.2144759-1059-121350227856532/.source.json _original_basename=.4xdn6cgx follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:15 np0005593233 python3.9[138376]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:15.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:15.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:17.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:17 np0005593233 podman[138771]: 2026-01-23 09:14:17.663988743 +0000 UTC m=+0.119832197 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 04:14:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:17.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:17 np0005593233 python3.9[138816]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 23 04:14:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:18 np0005593233 python3.9[138977]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 04:14:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:19.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:19.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:20 np0005593233 python3[139129]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 04:14:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:21.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:21.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:23.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:23.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:25.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:14:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:25.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:14:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:14:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:27.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:14:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:27.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:29 np0005593233 podman[139143]: 2026-01-23 09:14:29.021029443 +0000 UTC m=+8.815201154 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:14:29 np0005593233 podman[139393]: 2026-01-23 09:14:29.165533182 +0000 UTC m=+0.059192248 container create 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:14:29 np0005593233 podman[139393]: 2026-01-23 09:14:29.135219888 +0000 UTC m=+0.028878994 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:14:29 np0005593233 python3[139129]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:14:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:29.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:29.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:14:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:31.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:14:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:31.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:33.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:33.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:35.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:35.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:35 np0005593233 python3.9[139595]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:14:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:14:37 np0005593233 python3.9[139749]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:37.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:14:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:14:37 np0005593233 python3.9[139825]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:14:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:37.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:38 np0005593233 python3.9[139976]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159677.726601-1293-19758014521329/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:39 np0005593233 python3.9[140052]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:14:39 np0005593233 systemd[1]: Reloading.
Jan 23 04:14:39 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:39 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:39.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:14:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:39.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:14:40 np0005593233 python3.9[140162]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:14:40 np0005593233 systemd[1]: Reloading.
Jan 23 04:14:40 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:40 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:40 np0005593233 systemd[1]: Starting ovn_metadata_agent container...
Jan 23 04:14:40 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:14:40 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8da5511bbd9ef1abfbd776e9c46bb65f2d0e1ad7ec6be655ab97b9f734f1b043/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 23 04:14:40 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8da5511bbd9ef1abfbd776e9c46bb65f2d0e1ad7ec6be655ab97b9f734f1b043/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:14:40 np0005593233 systemd[1]: Started /usr/bin/podman healthcheck run 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357.
Jan 23 04:14:40 np0005593233 podman[140203]: 2026-01-23 09:14:40.594030348 +0000 UTC m=+0.157491970 container init 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: + sudo -E kolla_set_configs
Jan 23 04:14:40 np0005593233 podman[140203]: 2026-01-23 09:14:40.63129027 +0000 UTC m=+0.194751902 container start 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 04:14:40 np0005593233 edpm-start-podman-container[140203]: ovn_metadata_agent
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Validating config file
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Copying service configuration files
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Writing out command to execute
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 23 04:14:40 np0005593233 edpm-start-podman-container[140202]: Creating additional drop-in dependency for "ovn_metadata_agent" (2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357)
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: ++ cat /run_command
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: + CMD=neutron-ovn-metadata-agent
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: + ARGS=
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: + sudo kolla_copy_cacerts
Jan 23 04:14:40 np0005593233 systemd[1]: Reloading.
Jan 23 04:14:40 np0005593233 podman[140226]: 2026-01-23 09:14:40.749380596 +0000 UTC m=+0.097290164 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: Running command: 'neutron-ovn-metadata-agent'
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: + [[ ! -n '' ]]
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: + . kolla_extend_start
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: + umask 0022
Jan 23 04:14:40 np0005593233 ovn_metadata_agent[140219]: + exec neutron-ovn-metadata-agent
Jan 23 04:14:40 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:40 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:41 np0005593233 systemd[1]: Started ovn_metadata_agent container.
Jan 23 04:14:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:41.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:41.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:42 np0005593233 python3.9[140456]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.542 140224 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.543 140224 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.543 140224 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.543 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.543 140224 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.543 140224 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.544 140224 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.544 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.544 140224 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.544 140224 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.544 140224 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.544 140224 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.544 140224 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.544 140224 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.544 140224 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.545 140224 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.545 140224 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.545 140224 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.545 140224 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.545 140224 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.545 140224 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.545 140224 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.545 140224 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.545 140224 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.546 140224 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.547 140224 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.547 140224 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.547 140224 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.547 140224 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.547 140224 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.547 140224 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.547 140224 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.548 140224 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.549 140224 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.550 140224 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.550 140224 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.550 140224 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.550 140224 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.550 140224 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.550 140224 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.550 140224 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.550 140224 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.550 140224 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.551 140224 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.551 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.551 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.551 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.551 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.551 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.551 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.551 140224 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.552 140224 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.552 140224 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.552 140224 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.552 140224 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.552 140224 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.552 140224 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.552 140224 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.552 140224 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.552 140224 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.553 140224 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.553 140224 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.553 140224 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.553 140224 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.553 140224 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.553 140224 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.553 140224 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.553 140224 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.553 140224 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.554 140224 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.555 140224 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.555 140224 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.555 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.555 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.555 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.555 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.555 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.555 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.555 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.556 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.556 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.556 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.556 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.556 140224 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.556 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.556 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.556 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.556 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.557 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.557 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.557 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.557 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.557 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.557 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.557 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.557 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.557 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.558 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.558 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.558 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.558 140224 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.558 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.558 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.558 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.558 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.558 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.559 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.559 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.559 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.559 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.559 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.559 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.559 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.559 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.559 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.560 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.560 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.560 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.560 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.560 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.560 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.560 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.560 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.560 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.561 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.561 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.561 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.561 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.561 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.561 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.561 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.561 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.561 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.562 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.562 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.562 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.562 140224 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.562 140224 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.562 140224 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.562 140224 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.562 140224 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.562 140224 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.563 140224 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.563 140224 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.563 140224 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.563 140224 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.563 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.563 140224 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.563 140224 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.563 140224 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.563 140224 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.564 140224 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.564 140224 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.564 140224 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.564 140224 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.564 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.564 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.564 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.564 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.564 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.565 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.565 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.565 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.565 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.565 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.565 140224 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.565 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.565 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.566 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.566 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.566 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.566 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.566 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.566 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.566 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.566 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.566 140224 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.567 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.567 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.567 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.567 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.567 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.567 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.567 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.567 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.567 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.568 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.569 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.569 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.569 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.569 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.569 140224 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.569 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.569 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.569 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.569 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.570 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.570 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.570 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.570 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.570 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.570 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.570 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.570 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.570 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.571 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.571 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.571 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.571 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.571 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.571 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.571 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.571 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.571 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.572 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.572 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.572 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.572 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.572 140224 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.572 140224 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.572 140224 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.572 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.572 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.573 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.573 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.573 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.573 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.573 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.573 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.573 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.573 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.573 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.574 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.574 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.574 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.574 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.574 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.574 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.574 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.574 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.574 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.575 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.575 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.575 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.575 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.575 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.575 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.575 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.575 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.575 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.576 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.576 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.576 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.576 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.576 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.576 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.576 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.576 140224 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.576 140224 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.586 140224 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.586 140224 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.586 140224 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.587 140224 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.587 140224 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.600 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 539cfa5a-1c2f-4cb4-97af-2edb819f72fc (UUID: 539cfa5a-1c2f-4cb4-97af-2edb819f72fc) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.625 140224 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.626 140224 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.626 140224 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.626 140224 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.629 140224 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.636 140224 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.642 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '539cfa5a-1c2f-4cb4-97af-2edb819f72fc'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], external_ids={}, name=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, nb_cfg_timestamp=1769159605136, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.643 140224 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fb103e13f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.644 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.644 140224 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.644 140224 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.644 140224 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.649 140224 DEBUG oslo_service.service [-] Started child 140481 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.653 140224 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmphvtcxupn/privsep.sock']#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.654 140481 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-8297695'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.686 140481 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.687 140481 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.687 140481 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.690 140481 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.697 140481 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 23 04:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:42.706 140481 INFO eventlet.wsgi.server [-] (140481) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 23 04:14:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:43 np0005593233 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 23 04:14:43 np0005593233 python3.9[140661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.367 140224 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.368 140224 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmphvtcxupn/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.227 140664 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.232 140664 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.235 140664 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.235 140664 INFO oslo.privsep.daemon [-] privsep daemon running as pid 140664#033[00m
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.370 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[19080a90-f5da-4a9a-a959-583736d8da60]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:14:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:43.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:14:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:14:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:14:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:43.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.971 140664 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.972 140664 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:14:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:43.972 140664 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:14:44 np0005593233 python3.9[140793]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159682.7844167-1428-26029002435549/.source.yaml _original_basename=.xpn5s3_7 follow=False checksum=29c9ae8bd33f53131de391173ae7a464927d83f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:44 np0005593233 systemd[1]: session-47.scope: Deactivated successfully.
Jan 23 04:14:44 np0005593233 systemd[1]: session-47.scope: Consumed 1min 6ms CPU time.
Jan 23 04:14:44 np0005593233 systemd-logind[804]: Session 47 logged out. Waiting for processes to exit.
Jan 23 04:14:44 np0005593233 systemd-logind[804]: Removed session 47.
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.511 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[039a9766-d2ad-4513-bf19-7139ffda306d]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.514 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, column=external_ids, values=({'neutron:ovn-metadata-id': '637163bd-4073-5b8d-8cb5-e8ceaf4adc1c'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.769 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.781 140224 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.781 140224 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.781 140224 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.782 140224 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.782 140224 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.782 140224 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.783 140224 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.783 140224 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.784 140224 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.784 140224 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.784 140224 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.785 140224 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.785 140224 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.786 140224 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.786 140224 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.787 140224 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.787 140224 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.787 140224 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.788 140224 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.788 140224 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.788 140224 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.789 140224 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.789 140224 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.789 140224 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.790 140224 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.790 140224 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.791 140224 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.791 140224 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.792 140224 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.792 140224 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.792 140224 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.793 140224 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.793 140224 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.794 140224 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.794 140224 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.794 140224 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.795 140224 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.795 140224 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.796 140224 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.796 140224 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.796 140224 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.797 140224 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.797 140224 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.797 140224 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.798 140224 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.798 140224 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.798 140224 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.798 140224 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.799 140224 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.799 140224 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.799 140224 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.799 140224 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.800 140224 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.800 140224 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.800 140224 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.800 140224 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.800 140224 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.801 140224 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.801 140224 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.801 140224 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.801 140224 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.801 140224 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.802 140224 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.802 140224 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.802 140224 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.802 140224 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.803 140224 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.803 140224 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.803 140224 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.803 140224 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.804 140224 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.804 140224 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.804 140224 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.804 140224 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.804 140224 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.805 140224 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.805 140224 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.805 140224 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.805 140224 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.806 140224 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.806 140224 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.806 140224 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.806 140224 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.807 140224 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.807 140224 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.807 140224 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.807 140224 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.807 140224 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.808 140224 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.808 140224 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.808 140224 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.808 140224 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.808 140224 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.809 140224 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.809 140224 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.809 140224 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.810 140224 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.810 140224 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.810 140224 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.810 140224 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.810 140224 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.811 140224 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.811 140224 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.811 140224 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.811 140224 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.811 140224 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.812 140224 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.812 140224 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.812 140224 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.812 140224 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.813 140224 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.813 140224 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.813 140224 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.813 140224 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.814 140224 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.814 140224 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.814 140224 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.814 140224 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.815 140224 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.815 140224 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.815 140224 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.815 140224 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.816 140224 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.816 140224 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.816 140224 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.817 140224 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.817 140224 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.817 140224 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.817 140224 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.818 140224 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.818 140224 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.818 140224 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.819 140224 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.819 140224 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.819 140224 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.820 140224 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.820 140224 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.820 140224 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.820 140224 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.821 140224 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.821 140224 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.821 140224 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.822 140224 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.822 140224 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.822 140224 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.822 140224 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.823 140224 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.823 140224 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.823 140224 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.823 140224 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.824 140224 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.824 140224 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.824 140224 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.824 140224 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.824 140224 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.825 140224 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.825 140224 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.825 140224 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.825 140224 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.825 140224 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.826 140224 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.826 140224 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.826 140224 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.826 140224 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.826 140224 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.826 140224 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.826 140224 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.827 140224 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.827 140224 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.827 140224 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.827 140224 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.827 140224 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.827 140224 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.827 140224 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.828 140224 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.828 140224 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.828 140224 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.828 140224 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.828 140224 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.828 140224 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.828 140224 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.829 140224 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.829 140224 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.829 140224 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.829 140224 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.829 140224 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.829 140224 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.830 140224 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.830 140224 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.830 140224 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.830 140224 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.830 140224 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.830 140224 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.831 140224 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.831 140224 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.831 140224 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.831 140224 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.831 140224 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.831 140224 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.831 140224 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.832 140224 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.832 140224 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.832 140224 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.832 140224 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.832 140224 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.832 140224 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.832 140224 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.833 140224 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.833 140224 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.833 140224 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.833 140224 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.833 140224 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.833 140224 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.833 140224 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.834 140224 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.834 140224 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.834 140224 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.834 140224 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.834 140224 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.834 140224 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.834 140224 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.834 140224 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.835 140224 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.835 140224 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.835 140224 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.835 140224 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.835 140224 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.835 140224 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.835 140224 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.836 140224 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.836 140224 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.836 140224 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.836 140224 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.836 140224 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.836 140224 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.836 140224 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.836 140224 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.837 140224 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.837 140224 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.837 140224 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.837 140224 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.837 140224 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.837 140224 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.838 140224 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.838 140224 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.838 140224 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.838 140224 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.838 140224 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.838 140224 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.838 140224 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.839 140224 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.839 140224 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.839 140224 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.839 140224 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.839 140224 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.839 140224 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.839 140224 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.840 140224 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.840 140224 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.840 140224 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.840 140224 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.840 140224 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.840 140224 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.840 140224 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.841 140224 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.841 140224 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.841 140224 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.841 140224 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.841 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.841 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.842 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.842 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.842 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.842 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.842 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.842 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.842 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.843 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.843 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.843 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.843 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.843 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.843 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.843 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.844 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.844 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.844 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.844 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.845 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.845 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.845 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.845 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.845 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.845 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.845 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.846 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.846 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.846 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.846 140224 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.846 140224 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.846 140224 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.847 140224 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.847 140224 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:14:44.847 140224 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 04:14:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:45.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:45.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:47.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:47.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:48 np0005593233 podman[140818]: 2026-01-23 09:14:48.160783666 +0000 UTC m=+0.148544564 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:14:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:49.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:49.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:50 np0005593233 systemd-logind[804]: New session 48 of user zuul.
Jan 23 04:14:50 np0005593233 systemd[1]: Started Session 48 of User zuul.
Jan 23 04:14:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:51.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:51 np0005593233 python3.9[140997]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:14:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:51.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:53 np0005593233 python3.9[141153]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:53.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:53.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:55 np0005593233 python3.9[141317]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:14:55 np0005593233 systemd[1]: Reloading.
Jan 23 04:14:55 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:55 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:55.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:55.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:56 np0005593233 python3.9[141503]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:14:56 np0005593233 network[141520]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:14:56 np0005593233 network[141521]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:14:56 np0005593233 network[141522]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:14:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:14:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:57.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:14:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:57.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:59.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:14:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:59.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:01.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:01 np0005593233 python3.9[141784]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:01.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:02 np0005593233 python3.9[141937]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:03 np0005593233 python3.9[142090]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:03.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:03.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:04 np0005593233 python3.9[142243]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:05 np0005593233 python3.9[142396]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:05.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:05.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:05 np0005593233 python3.9[142549]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:06 np0005593233 python3.9[142702]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:07.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:07.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:08 np0005593233 python3.9[142855]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:09 np0005593233 python3.9[143007]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:09.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:09 np0005593233 python3.9[143159]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:09.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:10 np0005593233 python3.9[143311]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:11 np0005593233 python3.9[143463]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:11.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:11 np0005593233 podman[143587]: 2026-01-23 09:15:11.67105934 +0000 UTC m=+0.097741452 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:15:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:11.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:11 np0005593233 python3.9[143632]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:12 np0005593233 python3.9[143786]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:13 np0005593233 python3.9[143938]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:13.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:13.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:14 np0005593233 python3.9[144090]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:14 np0005593233 python3.9[144242]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:15 np0005593233 python3.9[144394]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:15.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:15.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:16 np0005593233 python3.9[144546]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:16 np0005593233 python3.9[144698]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:17.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:17 np0005593233 python3.9[144850]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:17.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:18 np0005593233 podman[144974]: 2026-01-23 09:15:18.448405434 +0000 UTC m=+0.108706923 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:15:18 np0005593233 python3.9[145024]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:19.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:19.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:19 np0005593233 python3.9[145180]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:15:20 np0005593233 python3.9[145332]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:15:20 np0005593233 systemd[1]: Reloading.
Jan 23 04:15:21 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:15:21 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:15:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:21.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:21.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:22 np0005593233 python3.9[145518]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:22 np0005593233 python3.9[145671]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:23 np0005593233 python3.9[145824]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:23.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:23.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:24 np0005593233 python3.9[145977]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:25 np0005593233 python3.9[146130]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:25.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:25.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:26 np0005593233 python3.9[146283]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:27 np0005593233 python3.9[146436]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:27.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:27.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:28 np0005593233 python3.9[146589]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 23 04:15:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:29.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:29.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:29 np0005593233 python3.9[146742]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:15:31 np0005593233 python3.9[146900]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 04:15:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:31.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:31.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:32 np0005593233 python3.9[147060]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:15:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:33 np0005593233 python3.9[147144]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:15:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:33.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:33.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:35.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:35.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:37.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:37.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:39.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:39.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:41.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:41.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:42 np0005593233 podman[147214]: 2026-01-23 09:15:42.094570257 +0000 UTC m=+0.097058602 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 04:15:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:15:42.579 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:15:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:15:42.580 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:15:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:15:42.580 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:15:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:43.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:43.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:44 np0005593233 podman[147477]: 2026-01-23 09:15:44.021727109 +0000 UTC m=+0.067053048 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:15:44 np0005593233 podman[147477]: 2026-01-23 09:15:44.153337363 +0000 UTC m=+0.198663272 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Jan 23 04:15:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:45.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:15:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:15:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:45.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:47.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:47.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:49 np0005593233 podman[147774]: 2026-01-23 09:15:49.150662069 +0000 UTC m=+0.162430771 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.598359) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749598460, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2659, "num_deletes": 502, "total_data_size": 6244460, "memory_usage": 6333552, "flush_reason": "Manual Compaction"}
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749654663, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 4103421, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12234, "largest_seqno": 14888, "table_properties": {"data_size": 4092973, "index_size": 6302, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 22417, "raw_average_key_size": 18, "raw_value_size": 4070378, "raw_average_value_size": 3366, "num_data_blocks": 281, "num_entries": 1209, "num_filter_entries": 1209, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159502, "oldest_key_time": 1769159502, "file_creation_time": 1769159749, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 56361 microseconds, and 36130 cpu microseconds.
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.654722) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 4103421 bytes OK
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.654742) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.656254) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.656271) EVENT_LOG_v1 {"time_micros": 1769159749656266, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.656299) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 6231897, prev total WAL file size 6231897, number of live WAL files 2.
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.657860) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323531' seq:0, type:0; will stop at (end)
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4007KB)], [24(8424KB)]
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749658133, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 12729764, "oldest_snapshot_seqno": -1}
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4187 keys, 10316288 bytes, temperature: kUnknown
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749771687, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 10316288, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10283462, "index_size": 21260, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10501, "raw_key_size": 103341, "raw_average_key_size": 24, "raw_value_size": 10202851, "raw_average_value_size": 2436, "num_data_blocks": 898, "num_entries": 4187, "num_filter_entries": 4187, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769159749, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.771998) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 10316288 bytes
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.773419) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 112.0 rd, 90.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 8.2 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(5.6) write-amplify(2.5) OK, records in: 5210, records dropped: 1023 output_compression: NoCompression
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.773442) EVENT_LOG_v1 {"time_micros": 1769159749773432, "job": 12, "event": "compaction_finished", "compaction_time_micros": 113630, "compaction_time_cpu_micros": 29137, "output_level": 6, "num_output_files": 1, "total_output_size": 10316288, "num_input_records": 5210, "num_output_records": 4187, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749774053, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749775313, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.657736) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.775418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.775425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.775427) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.775429) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:15:49.775431) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:49.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:49.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:51.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:51.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:53.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:53.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:55.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:15:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:55.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:15:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:15:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:57.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:15:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:57.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:59.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:15:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:59.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:00 np0005593233 kernel: SELinux:  Converting 2773 SID table entries...
Jan 23 04:16:00 np0005593233 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:16:00 np0005593233 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:16:00 np0005593233 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:16:00 np0005593233 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:16:00 np0005593233 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:16:00 np0005593233 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:16:00 np0005593233 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:16:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:01.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:01.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:03.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:03.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:05.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:05.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:07.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:07.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:16:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:16:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:09.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:10 np0005593233 kernel: SELinux:  Converting 2773 SID table entries...
Jan 23 04:16:10 np0005593233 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:16:10 np0005593233 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:16:10 np0005593233 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:16:10 np0005593233 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:16:10 np0005593233 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:16:10 np0005593233 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:16:10 np0005593233 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:16:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:11.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:11.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:12 np0005593233 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 23 04:16:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:13 np0005593233 podman[147871]: 2026-01-23 09:16:13.119809111 +0000 UTC m=+0.095501908 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 04:16:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:13.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:13.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:15.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:15.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:17.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:17.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:19.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:19.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:20 np0005593233 podman[147890]: 2026-01-23 09:16:20.092118122 +0000 UTC m=+0.102875848 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:16:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:21.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:21.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:16:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:23.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:16:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:23.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:16:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:25.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:16:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:25.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:27.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:27.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:29.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:29.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:31.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:31.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:16:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:33.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:16:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:33.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:35.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:35.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:37.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:37.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:39.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:39.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:41.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:41.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:16:42.581 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:16:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:16:42.582 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:16:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:16:42.583 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:16:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:43.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:16:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:43.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:16:44 np0005593233 podman[158674]: 2026-01-23 09:16:44.132960982 +0000 UTC m=+0.069266500 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 23 04:16:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:45.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:45.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:47.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:47.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:49.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:49.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:51 np0005593233 podman[162631]: 2026-01-23 09:16:51.122081001 +0000 UTC m=+0.121186338 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Jan 23 04:16:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:51.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:51.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:53.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:53.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:16:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:55.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:55.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:16:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:16:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:16:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:16:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:16:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:57.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:16:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:16:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:57.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:16:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:59.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:16:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:59.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:01.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:01.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:17:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:17:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:03.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:03.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:05.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:05.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:07.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:07.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:09 np0005593233 kernel: SELinux:  Converting 2774 SID table entries...
Jan 23 04:17:09 np0005593233 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:17:09 np0005593233 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:17:09 np0005593233 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:17:09 np0005593233 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:17:09 np0005593233 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:17:09 np0005593233 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:17:09 np0005593233 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:17:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:09.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:09.974876) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159829975066, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 976, "num_deletes": 250, "total_data_size": 2162020, "memory_usage": 2190840, "flush_reason": "Manual Compaction"}
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 23 04:17:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:09.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159829985305, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 887777, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14894, "largest_seqno": 15864, "table_properties": {"data_size": 884206, "index_size": 1351, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9135, "raw_average_key_size": 19, "raw_value_size": 876562, "raw_average_value_size": 1913, "num_data_blocks": 62, "num_entries": 458, "num_filter_entries": 458, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159749, "oldest_key_time": 1769159749, "file_creation_time": 1769159829, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 10549 microseconds, and 5250 cpu microseconds.
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:09.985473) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 887777 bytes OK
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:09.985534) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:09.987263) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:09.987303) EVENT_LOG_v1 {"time_micros": 1769159829987295, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:09.987337) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 2157146, prev total WAL file size 2157146, number of live WAL files 2.
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:09.988934) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323534' seq:72057594037927935, type:22 .. '6D67727374617400353035' seq:0, type:0; will stop at (end)
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(866KB)], [27(10074KB)]
Jan 23 04:17:09 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159829989146, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 11204065, "oldest_snapshot_seqno": -1}
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4166 keys, 7889814 bytes, temperature: kUnknown
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159830058806, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 7889814, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7860672, "index_size": 17640, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10437, "raw_key_size": 103194, "raw_average_key_size": 24, "raw_value_size": 7783883, "raw_average_value_size": 1868, "num_data_blocks": 741, "num_entries": 4166, "num_filter_entries": 4166, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769159829, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:10.059273) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 7889814 bytes
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:10.060744) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.4 rd, 113.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 9.8 +0.0 blob) out(7.5 +0.0 blob), read-write-amplify(21.5) write-amplify(8.9) OK, records in: 4645, records dropped: 479 output_compression: NoCompression
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:10.060778) EVENT_LOG_v1 {"time_micros": 1769159830060762, "job": 14, "event": "compaction_finished", "compaction_time_micros": 69837, "compaction_time_cpu_micros": 26702, "output_level": 6, "num_output_files": 1, "total_output_size": 7889814, "num_input_records": 4645, "num_output_records": 4166, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159830061303, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159830063323, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:09.988654) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:10.063477) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:10.063486) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:10.063488) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:10.063490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:17:10.063492) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593233 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 23 04:17:10 np0005593233 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 23 04:17:10 np0005593233 dbus-broker-launch[761]: Noticed file-system modification, trigger reload.
Jan 23 04:17:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:11.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:11.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:13.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:13.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:14 np0005593233 podman[165086]: 2026-01-23 09:17:14.472750226 +0000 UTC m=+0.080869162 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 23 04:17:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:15.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:15.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:17.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:17.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:19 np0005593233 systemd[1]: Stopping OpenSSH server daemon...
Jan 23 04:17:19 np0005593233 systemd[1]: sshd.service: Deactivated successfully.
Jan 23 04:17:19 np0005593233 systemd[1]: Stopped OpenSSH server daemon.
Jan 23 04:17:19 np0005593233 systemd[1]: sshd.service: Consumed 2.962s CPU time, read 32.0K from disk, written 0B to disk.
Jan 23 04:17:19 np0005593233 systemd[1]: Stopped target sshd-keygen.target.
Jan 23 04:17:19 np0005593233 systemd[1]: Stopping sshd-keygen.target...
Jan 23 04:17:19 np0005593233 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:17:19 np0005593233 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:17:19 np0005593233 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:17:19 np0005593233 systemd[1]: Reached target sshd-keygen.target.
Jan 23 04:17:19 np0005593233 systemd[1]: Starting OpenSSH server daemon...
Jan 23 04:17:19 np0005593233 systemd[1]: Started OpenSSH server daemon.
Jan 23 04:17:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:19.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:19.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:21 np0005593233 podman[166022]: 2026-01-23 09:17:21.320273186 +0000 UTC m=+0.137214690 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:17:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:21.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:21 np0005593233 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:17:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:21.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:22 np0005593233 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:17:22 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:22 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:22 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:22 np0005593233 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:17:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:23.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:24.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:25.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:26.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:27.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:28.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:29.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:30.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:31 np0005593233 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:17:31 np0005593233 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:17:31 np0005593233 systemd[1]: man-db-cache-update.service: Consumed 11.697s CPU time.
Jan 23 04:17:31 np0005593233 systemd[1]: run-r60b0c04938fd4c4fa9f431b3bd2d15ab.service: Deactivated successfully.
Jan 23 04:17:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:31.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:32.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:33.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:34.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:35 np0005593233 ceph-mgr[81930]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 04:17:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:35.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:36.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:37 np0005593233 python3.9[174690]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:37 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:37 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:37 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:37.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:38.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:39 np0005593233 python3.9[174879]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:39 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:39 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:39 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:39.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:40.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:40 np0005593233 python3.9[175071]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:40 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:40 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:40 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:41 np0005593233 python3.9[175260]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:41 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:41 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:41 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:41.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:42.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:17:42.582 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:17:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:17:42.583 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:17:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:17:42.583 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:17:42 np0005593233 python3.9[175450]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:42 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:43 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:43 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:43.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:44.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:44 np0005593233 python3.9[175641]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:44 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:44 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:44 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:44 np0005593233 podman[175680]: 2026-01-23 09:17:44.710974141 +0000 UTC m=+0.094889946 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:17:45 np0005593233 python3.9[175849]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:45 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:45 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:45 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:46.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:46.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:46 np0005593233 python3.9[176040]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:47 np0005593233 python3.9[176195]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:47 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:47 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:47 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:48.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:48.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:49 np0005593233 python3.9[176385]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:49 np0005593233 systemd[1]: Reloading.
Jan 23 04:17:49 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:49 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:49 np0005593233 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 23 04:17:49 np0005593233 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 23 04:17:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:50.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:50.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:50 np0005593233 python3.9[176579]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:51 np0005593233 python3.9[176734]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:51 np0005593233 podman[176736]: 2026-01-23 09:17:51.504846277 +0000 UTC m=+0.113738198 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 23 04:17:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:52.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:52.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:52 np0005593233 python3.9[176915]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:53 np0005593233 python3.9[177070]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:53 np0005593233 python3.9[177225]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:54.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:17:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:54.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:17:55 np0005593233 python3.9[177380]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:56.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:56.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:56 np0005593233 python3.9[177535]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:57 np0005593233 python3.9[177690]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:17:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:17:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:58.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:17:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:58 np0005593233 python3.9[177845]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:58 np0005593233 python3.9[178000]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:59 np0005593233 python3.9[178155]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:18:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:00.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:00.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:00 np0005593233 python3.9[178310]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:18:01 np0005593233 python3.9[178465]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:18:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:02.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:02.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:02 np0005593233 python3.9[178620]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:18:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:03 np0005593233 python3.9[178875]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:04.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:04.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:04 np0005593233 python3.9[179057]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:04 np0005593233 python3.9[179209]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:18:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:18:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:18:05 np0005593233 python3.9[179361]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:18:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:06.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:18:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:06.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:06 np0005593233 python3.9[179513]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:06 np0005593233 python3.9[179665]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:07 np0005593233 python3.9[179815]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:18:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:08.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:08.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:08 np0005593233 python3.9[179967]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:09 np0005593233 python3.9[180092]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159887.9002683-1647-252664280663256/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:09 np0005593233 python3.9[180244]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:10.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:10.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:10 np0005593233 python3.9[180419]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159889.4704382-1647-213997314414601/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:18:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:18:11 np0005593233 python3.9[180571]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:11 np0005593233 python3.9[180696]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159890.7983794-1647-89771093598696/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:12.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:12.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:12 np0005593233 python3.9[180848]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:13 np0005593233 python3.9[180973]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159892.1582005-1647-80930974745841/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:14.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:14.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:14 np0005593233 python3.9[181125]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:14 np0005593233 python3.9[181250]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159893.5795288-1647-198566493695971/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:14 np0005593233 podman[181251]: 2026-01-23 09:18:14.913906659 +0000 UTC m=+0.077118925 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 23 04:18:15 np0005593233 python3.9[181420]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:16.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:16.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:16 np0005593233 python3.9[181545]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159894.954269-1647-60275168064537/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:16 np0005593233 python3.9[181697]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:17 np0005593233 python3.9[181820]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159896.268056-1647-27999212737597/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:17 np0005593233 python3.9[181972]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:18.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:18.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:18 np0005593233 python3.9[182097]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159897.492647-1647-85417270473800/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:19 np0005593233 python3.9[182249]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 23 04:18:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.003000086s ======
Jan 23 04:18:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:20.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000086s
Jan 23 04:18:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:20.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:20 np0005593233 python3.9[182402]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:21 np0005593233 python3.9[182554]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:21 np0005593233 podman[182678]: 2026-01-23 09:18:21.869154649 +0000 UTC m=+0.101877497 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:18:22 np0005593233 python3.9[182726]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:18:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:22.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:18:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:22.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:22 np0005593233 python3.9[182884]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:23 np0005593233 python3.9[183036]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:24.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:18:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:24.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:18:24 np0005593233 python3.9[183188]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:24 np0005593233 python3.9[183340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:25 np0005593233 python3.9[183492]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:18:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:26.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:18:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:26.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:26 np0005593233 python3.9[183644]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:27 np0005593233 python3.9[183796]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:27 np0005593233 python3.9[183948]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:18:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:28.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:18:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:28.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:28 np0005593233 python3.9[184100]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:29 np0005593233 python3.9[184252]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:29 np0005593233 python3.9[184404]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:30.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:30.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:30 np0005593233 python3.9[184556]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:31 np0005593233 python3.9[184679]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159910.1511648-2310-120891413423551/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:31 np0005593233 python3.9[184831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:32.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:32.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:32 np0005593233 python3.9[184954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159911.457908-2310-221890563910522/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:33 np0005593233 python3.9[185106]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:33 np0005593233 python3.9[185229]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159912.7326272-2310-146445321093759/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:34.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:34.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:34 np0005593233 python3.9[185381]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:35 np0005593233 python3.9[185504]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159914.084423-2310-227865732304805/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:35 np0005593233 python3.9[185656]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:36.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:36.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:36 np0005593233 python3.9[185779]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159915.4301038-2310-116562231875949/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:37 np0005593233 python3.9[185931]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:37 np0005593233 python3.9[186054]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159916.6768582-2310-145858519001992/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:38.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:38.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:38 np0005593233 python3.9[186206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:39 np0005593233 python3.9[186329]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159917.933127-2310-256493541928041/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:39 np0005593233 python3.9[186481]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:40.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:18:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:40.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:18:40 np0005593233 python3.9[186604]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159919.1798966-2310-251973514448019/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:41 np0005593233 python3.9[186756]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:41 np0005593233 python3.9[186879]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159920.51947-2310-164075695584727/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:42.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:42.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:42 np0005593233 python3.9[187031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:18:42.583 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:18:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:18:42.583 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:18:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:18:42.583 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:18:42 np0005593233 python3.9[187154]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159921.824016-2310-43657394457366/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:43 np0005593233 python3.9[187306]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:44.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:18:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:44.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:18:44 np0005593233 python3.9[187429]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159923.0361872-2310-206813651069785/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:44 np0005593233 python3.9[187581]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:45 np0005593233 podman[187632]: 2026-01-23 09:18:45.102520976 +0000 UTC m=+0.101722443 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 04:18:45 np0005593233 python3.9[187723]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159924.3377047-2310-210571889706582/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:18:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:46.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:18:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:46.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:46 np0005593233 python3.9[187875]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:46 np0005593233 python3.9[187998]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159925.572057-2310-141042292285078/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:47 np0005593233 python3.9[188150]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:18:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:48.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:18:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:48.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:48 np0005593233 python3.9[188273]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159926.9712758-2310-162763631968469/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:48 np0005593233 python3.9[188423]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:18:49 np0005593233 python3.9[188578]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 23 04:18:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:50.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:50.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:51 np0005593233 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 23 04:18:52 np0005593233 podman[188690]: 2026-01-23 09:18:52.100562054 +0000 UTC m=+0.104059530 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:18:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:52.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:52.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:52 np0005593233 python3.9[188757]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:53 np0005593233 python3.9[188912]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:53 np0005593233 python3.9[189064]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:18:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:18:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:54.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:18:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:54.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:54 np0005593233 python3.9[189216]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:55 np0005593233 python3.9[189368]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:55 np0005593233 python3.9[189521]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:56.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:56.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:56 np0005593233 python3.9[189673]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:57 np0005593233 python3.9[189825]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:57 np0005593233 python3.9[189977]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:18:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:18:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:58.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:58.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:58 np0005593233 python3.9[190129]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:59 np0005593233 python3.9[190281]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:18:59 np0005593233 systemd[1]: Reloading.
Jan 23 04:18:59 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:18:59 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:18:59 np0005593233 systemd[1]: Starting libvirt logging daemon socket...
Jan 23 04:18:59 np0005593233 systemd[1]: Listening on libvirt logging daemon socket.
Jan 23 04:18:59 np0005593233 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 23 04:18:59 np0005593233 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 23 04:18:59 np0005593233 systemd[1]: Starting libvirt logging daemon...
Jan 23 04:18:59 np0005593233 systemd[1]: Started libvirt logging daemon.
Jan 23 04:19:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:19:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:00.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:00.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:00 np0005593233 python3.9[190475]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:19:00 np0005593233 systemd[1]: Reloading.
Jan 23 04:19:00 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:00 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:00 np0005593233 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 23 04:19:00 np0005593233 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 23 04:19:00 np0005593233 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 23 04:19:00 np0005593233 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 23 04:19:00 np0005593233 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 23 04:19:00 np0005593233 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 23 04:19:00 np0005593233 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 04:19:00 np0005593233 systemd[1]: Started libvirt nodedev daemon.
Jan 23 04:19:01 np0005593233 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 23 04:19:01 np0005593233 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 23 04:19:01 np0005593233 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 23 04:19:01 np0005593233 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 23 04:19:01 np0005593233 python3.9[190698]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:19:01 np0005593233 systemd[1]: Reloading.
Jan 23 04:19:01 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:01 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:19:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:02.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:19:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:02.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:19:02 np0005593233 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 23 04:19:02 np0005593233 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 23 04:19:02 np0005593233 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 23 04:19:02 np0005593233 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 23 04:19:02 np0005593233 systemd[1]: Starting libvirt proxy daemon...
Jan 23 04:19:02 np0005593233 systemd[1]: Started libvirt proxy daemon.
Jan 23 04:19:02 np0005593233 setroubleshoot[190538]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 75cc4bb8-a64e-462c-ab66-fb57e0461773
Jan 23 04:19:02 np0005593233 setroubleshoot[190538]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 23 04:19:03 np0005593233 python3.9[190913]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:19:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:03 np0005593233 systemd[1]: Reloading.
Jan 23 04:19:03 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:03 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:03 np0005593233 systemd[1]: Listening on libvirt locking daemon socket.
Jan 23 04:19:03 np0005593233 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 23 04:19:03 np0005593233 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 23 04:19:03 np0005593233 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 23 04:19:03 np0005593233 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 23 04:19:03 np0005593233 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 23 04:19:03 np0005593233 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 23 04:19:03 np0005593233 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 23 04:19:03 np0005593233 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 23 04:19:03 np0005593233 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 23 04:19:03 np0005593233 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 04:19:03 np0005593233 systemd[1]: Started libvirt QEMU daemon.
Jan 23 04:19:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:19:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:04.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:19:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:04.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:19:04 np0005593233 python3.9[191129]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:19:04 np0005593233 systemd[1]: Reloading.
Jan 23 04:19:04 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:04 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:04 np0005593233 systemd[1]: Starting libvirt secret daemon socket...
Jan 23 04:19:04 np0005593233 systemd[1]: Listening on libvirt secret daemon socket.
Jan 23 04:19:04 np0005593233 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 23 04:19:04 np0005593233 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 23 04:19:04 np0005593233 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 23 04:19:04 np0005593233 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 23 04:19:04 np0005593233 systemd[1]: Starting libvirt secret daemon...
Jan 23 04:19:04 np0005593233 systemd[1]: Started libvirt secret daemon.
Jan 23 04:19:05 np0005593233 python3.9[191341]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:19:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:06.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:19:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:19:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:06.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:06 np0005593233 python3.9[191493]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:19:07 np0005593233 python3.9[191645]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:08.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:08.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:08 np0005593233 python3.9[191799]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:19:09 np0005593233 python3.9[191949]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:09 np0005593233 python3.9[192070]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159948.6382625-3384-235818813716336/.source.xml follow=False _original_basename=secret.xml.j2 checksum=4390443d357de49206cd2f69bdb29495711c4544 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:19:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:10.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:19:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:10.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:10 np0005593233 python3.9[192295]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine e1533653-0a5a-584c-b34b-8689f0d32e77#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:11 np0005593233 python3.9[192515]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:12.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:12.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:12 np0005593233 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 23 04:19:12 np0005593233 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.003s CPU time.
Jan 23 04:19:12 np0005593233 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 23 04:19:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:14 np0005593233 python3.9[192978]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:14.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:14.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:14 np0005593233 python3.9[193130]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:15 np0005593233 podman[193225]: 2026-01-23 09:19:15.286698686 +0000 UTC m=+0.068593831 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 04:19:15 np0005593233 python3.9[193272]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159954.3377235-3549-141420587276188/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:19:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:19:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:19:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:16.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:19:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:19:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:16.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:19:16 np0005593233 python3.9[193425]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:17 np0005593233 python3.9[193577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:17 np0005593233 python3.9[193655]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:19:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:18.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:19:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:18.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:18 np0005593233 python3.9[193807]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:18 np0005593233 python3.9[193885]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.5_cjdtth recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:19 np0005593233 auditd[708]: Audit daemon rotating log files
Jan 23 04:19:19 np0005593233 python3.9[194037]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:20 np0005593233 python3.9[194115]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:20.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:20.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:20 np0005593233 python3.9[194267]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:22.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:22.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:22 np0005593233 podman[194470]: 2026-01-23 09:19:22.376183181 +0000 UTC m=+0.133397322 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:19:22 np0005593233 python3[194471]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:19:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:23 np0005593233 python3.9[194649]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:23 np0005593233 python3.9[194727]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:24.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:24.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:24 np0005593233 python3.9[194879]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.016174) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965016595, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1457, "num_deletes": 257, "total_data_size": 3552582, "memory_usage": 3584840, "flush_reason": "Manual Compaction"}
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965037041, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 2337204, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15870, "largest_seqno": 17321, "table_properties": {"data_size": 2330955, "index_size": 3512, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12158, "raw_average_key_size": 18, "raw_value_size": 2318598, "raw_average_value_size": 3567, "num_data_blocks": 159, "num_entries": 650, "num_filter_entries": 650, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159830, "oldest_key_time": 1769159830, "file_creation_time": 1769159965, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 20931 microseconds, and 8809 cpu microseconds.
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.037174) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 2337204 bytes OK
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.037226) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.039667) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.039706) EVENT_LOG_v1 {"time_micros": 1769159965039698, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.039739) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 3545798, prev total WAL file size 3545798, number of live WAL files 2.
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.041774) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323534' seq:0, type:0; will stop at (end)
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(2282KB)], [30(7704KB)]
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965041955, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 10227018, "oldest_snapshot_seqno": -1}
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4287 keys, 9889629 bytes, temperature: kUnknown
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965134386, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 9889629, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9857869, "index_size": 19938, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 106773, "raw_average_key_size": 24, "raw_value_size": 9777050, "raw_average_value_size": 2280, "num_data_blocks": 833, "num_entries": 4287, "num_filter_entries": 4287, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769159965, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.134695) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 9889629 bytes
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.136250) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 110.5 rd, 106.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 7.5 +0.0 blob) out(9.4 +0.0 blob), read-write-amplify(8.6) write-amplify(4.2) OK, records in: 4816, records dropped: 529 output_compression: NoCompression
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.136266) EVENT_LOG_v1 {"time_micros": 1769159965136258, "job": 16, "event": "compaction_finished", "compaction_time_micros": 92542, "compaction_time_cpu_micros": 45467, "output_level": 6, "num_output_files": 1, "total_output_size": 9889629, "num_input_records": 4816, "num_output_records": 4287, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965136897, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965138383, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.041529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.138457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.138464) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.138466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.138468) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:25.138470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593233 python3.9[195004]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159964.3128765-3816-96658036330081/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:26.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:26.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:26 np0005593233 python3.9[195158]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:27 np0005593233 python3.9[195236]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:27 np0005593233 python3.9[195388]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:28.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:28.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:28 np0005593233 python3.9[195466]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:29 np0005593233 python3.9[195618]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:29 np0005593233 python3.9[195743]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159968.6474097-3933-50365753489864/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:30.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:30.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:30 np0005593233 python3.9[195895]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.882817) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970882870, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 306, "num_deletes": 251, "total_data_size": 165554, "memory_usage": 171480, "flush_reason": "Manual Compaction"}
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970885322, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 108869, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17326, "largest_seqno": 17627, "table_properties": {"data_size": 106898, "index_size": 199, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5054, "raw_average_key_size": 18, "raw_value_size": 103029, "raw_average_value_size": 374, "num_data_blocks": 9, "num_entries": 275, "num_filter_entries": 275, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159965, "oldest_key_time": 1769159965, "file_creation_time": 1769159970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 2521 microseconds, and 858 cpu microseconds.
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.885353) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 108869 bytes OK
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.885366) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.886454) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.886467) EVENT_LOG_v1 {"time_micros": 1769159970886463, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.886476) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 163341, prev total WAL file size 163341, number of live WAL files 2.
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.887112) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(106KB)], [33(9657KB)]
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970887341, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 9998498, "oldest_snapshot_seqno": -1}
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4052 keys, 7956311 bytes, temperature: kUnknown
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970982438, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 7956311, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7927800, "index_size": 17265, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10181, "raw_key_size": 102573, "raw_average_key_size": 25, "raw_value_size": 7852705, "raw_average_value_size": 1937, "num_data_blocks": 712, "num_entries": 4052, "num_filter_entries": 4052, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769159970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.982721) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 7956311 bytes
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.984597) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.1 rd, 83.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 9.4 +0.0 blob) out(7.6 +0.0 blob), read-write-amplify(164.9) write-amplify(73.1) OK, records in: 4562, records dropped: 510 output_compression: NoCompression
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.984671) EVENT_LOG_v1 {"time_micros": 1769159970984617, "job": 18, "event": "compaction_finished", "compaction_time_micros": 95139, "compaction_time_cpu_micros": 17375, "output_level": 6, "num_output_files": 1, "total_output_size": 7956311, "num_input_records": 4562, "num_output_records": 4052, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970984845, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970987260, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.886691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.987395) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.987406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.987413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.987418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:19:30.987421) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:31 np0005593233 python3.9[196047]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:32.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:32.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:32 np0005593233 python3.9[196202]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:33 np0005593233 python3.9[196354]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:34.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:34.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:34 np0005593233 python3.9[196507]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:19:35 np0005593233 python3.9[196661]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:36.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:36.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:36 np0005593233 python3.9[196816]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:37 np0005593233 python3.9[196968]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:38 np0005593233 python3.9[197091]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159976.7379427-4149-108646212961103/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:38.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:38.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:38 np0005593233 python3.9[197243]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:39 np0005593233 python3.9[197366]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159978.3348718-4194-160054707517167/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:40.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:40.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:40 np0005593233 python3.9[197518]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:41 np0005593233 python3.9[197641]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159979.8291144-4240-255431810796172/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:42 np0005593233 python3.9[197793]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:19:42 np0005593233 systemd[1]: Reloading.
Jan 23 04:19:42 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:42 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:42.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:42.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:42 np0005593233 systemd[1]: Reached target edpm_libvirt.target.
Jan 23 04:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:19:42.583 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:19:42.584 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:19:42.584 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:19:43 np0005593233 python3.9[197985]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 04:19:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:43 np0005593233 systemd[1]: Reloading.
Jan 23 04:19:43 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:43 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:43 np0005593233 systemd[1]: Reloading.
Jan 23 04:19:43 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:43 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:44.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:44.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:44 np0005593233 systemd[1]: session-48.scope: Deactivated successfully.
Jan 23 04:19:44 np0005593233 systemd[1]: session-48.scope: Consumed 3min 46.889s CPU time.
Jan 23 04:19:44 np0005593233 systemd-logind[804]: Session 48 logged out. Waiting for processes to exit.
Jan 23 04:19:44 np0005593233 systemd-logind[804]: Removed session 48.
Jan 23 04:19:46 np0005593233 podman[198082]: 2026-01-23 09:19:46.050124711 +0000 UTC m=+0.065997554 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:19:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:46.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:46.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:19:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:48.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:19:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:48.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:50.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:19:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:50.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:19:50 np0005593233 systemd-logind[804]: New session 49 of user zuul.
Jan 23 04:19:50 np0005593233 systemd[1]: Started Session 49 of User zuul.
Jan 23 04:19:51 np0005593233 python3.9[198257]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:19:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:52.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:52.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:53 np0005593233 podman[198385]: 2026-01-23 09:19:53.103028551 +0000 UTC m=+0.118233822 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:19:53 np0005593233 python3.9[198428]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:19:53 np0005593233 network[198452]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:19:53 np0005593233 network[198453]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:19:53 np0005593233 network[198454]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:19:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:54.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:54.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:56.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:56.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:19:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:58.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:19:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:19:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:58.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:58 np0005593233 python3.9[198726]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:19:59 np0005593233 python3.9[198810]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:20:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:00.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:00.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:01 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 04:20:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:02.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:02.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:04.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:04.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:06.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:06.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:06 np0005593233 python3.9[198963]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:20:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:08.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:08.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:10.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:10.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:11 np0005593233 python3.9[199115]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:12.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:12.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:14.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:14.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:15 np0005593233 python3.9[199268]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:20:16 np0005593233 python3.9[199420]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:16.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:20:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:16.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:20:16 np0005593233 podman[199545]: 2026-01-23 09:20:16.678773726 +0000 UTC m=+0.077647921 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 04:20:16 np0005593233 python3.9[199593]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:20:17 np0005593233 python3.9[199717]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769160016.2704716-246-11810408929148/.source.iscsi _original_basename=.x7anpt93 follow=False checksum=9b467c6dfcb179c0f5dc0658c24b26922b317212 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:18.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:18.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:18 np0005593233 python3.9[199869]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:19 np0005593233 python3.9[200021]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:19 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:20:19 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:20:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:20.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:20.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:21 np0005593233 python3.9[200174]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:20:21 np0005593233 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 23 04:20:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:22.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:22.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:23 np0005593233 python3.9[200448]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:20:23 np0005593233 systemd[1]: Reloading.
Jan 23 04:20:23 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:20:23 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:20:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:23 np0005593233 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 04:20:23 np0005593233 systemd[1]: Starting Open-iSCSI...
Jan 23 04:20:23 np0005593233 kernel: Loading iSCSI transport class v2.0-870.
Jan 23 04:20:23 np0005593233 systemd[1]: Started Open-iSCSI.
Jan 23 04:20:23 np0005593233 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 23 04:20:23 np0005593233 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 23 04:20:23 np0005593233 podman[200501]: 2026-01-23 09:20:23.617684764 +0000 UTC m=+0.130858406 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:20:23 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:23 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:24.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:24.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:20:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:20:25 np0005593233 python3.9[200689]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:20:25 np0005593233 network[200706]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:20:25 np0005593233 network[200707]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:20:25 np0005593233 network[200708]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:20:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:20:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:26.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:20:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:26.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:28.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:28.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:30.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:30.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:30 np0005593233 python3.9[200980]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:20:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:32.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:32.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:33 np0005593233 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:20:33 np0005593233 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:20:33 np0005593233 systemd[1]: Reloading.
Jan 23 04:20:33 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:20:33 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:20:34 np0005593233 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:20:34 np0005593233 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:20:34 np0005593233 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:20:34 np0005593233 systemd[1]: run-ra67d3a1cb53b4618a7a134bcd7fc8157.service: Deactivated successfully.
Jan 23 04:20:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:34.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:34.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:35 np0005593233 python3.9[201347]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 04:20:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:36.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:36.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:36 np0005593233 python3.9[201499]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 23 04:20:37 np0005593233 python3.9[201655]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:20:37 np0005593233 python3.9[201778]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769160036.8556173-510-74608165989997/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:38.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:38.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:38 np0005593233 python3.9[201930]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:40 np0005593233 python3.9[202082]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:20:40 np0005593233 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 04:20:40 np0005593233 systemd[1]: Stopped Load Kernel Modules.
Jan 23 04:20:40 np0005593233 systemd[1]: Stopping Load Kernel Modules...
Jan 23 04:20:40 np0005593233 systemd[1]: Starting Load Kernel Modules...
Jan 23 04:20:40 np0005593233 systemd[1]: Finished Load Kernel Modules.
Jan 23 04:20:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:40.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:40.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:41 np0005593233 python3.9[202238]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:41 np0005593233 python3.9[202391]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:20:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:42.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:42.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:20:42.585 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:20:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:20:42.586 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:20:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:20:42.586 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:20:42 np0005593233 python3.9[202543]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:20:43 np0005593233 python3.9[202666]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769160042.3080587-663-201789049666371/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:44 np0005593233 python3.9[202818]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:44.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:20:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:44.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:20:44 np0005593233 python3.9[202971]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:45 np0005593233 python3.9[203123]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:46.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:46.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:46 np0005593233 python3.9[203275]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:47 np0005593233 podman[203323]: 2026-01-23 09:20:47.090272694 +0000 UTC m=+0.092309243 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:20:47 np0005593233 python3.9[203446]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:48 np0005593233 python3.9[203598]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:48.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:48.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:48 np0005593233 python3.9[203750]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:49 np0005593233 python3.9[203902]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:50 np0005593233 python3.9[204054]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:20:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:50.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:50.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:50 np0005593233 python3.9[204208]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:52 np0005593233 python3.9[204361]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:20:52 np0005593233 systemd[1]: Listening on multipathd control socket.
Jan 23 04:20:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:52.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:52.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:53 np0005593233 python3.9[204517]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:20:53 np0005593233 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 23 04:20:53 np0005593233 udevadm[204522]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 23 04:20:53 np0005593233 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 23 04:20:53 np0005593233 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 04:20:53 np0005593233 multipathd[204525]: --------start up--------
Jan 23 04:20:53 np0005593233 multipathd[204525]: read /etc/multipath.conf
Jan 23 04:20:53 np0005593233 multipathd[204525]: path checkers start up
Jan 23 04:20:53 np0005593233 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 04:20:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:54 np0005593233 podman[204609]: 2026-01-23 09:20:54.124024705 +0000 UTC m=+0.128706475 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 04:20:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:54.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:54.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:54 np0005593233 python3.9[204710]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 04:20:55 np0005593233 python3.9[204862]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 23 04:20:55 np0005593233 kernel: Key type psk registered
Jan 23 04:20:55 np0005593233 python3.9[205024]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:20:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:56.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:56.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:56 np0005593233 python3.9[205147]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769160055.4217465-1053-112516629981173/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:57 np0005593233 python3.9[205299]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:58 np0005593233 python3.9[205451]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:20:58 np0005593233 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 04:20:58 np0005593233 systemd[1]: Stopped Load Kernel Modules.
Jan 23 04:20:58 np0005593233 systemd[1]: Stopping Load Kernel Modules...
Jan 23 04:20:58 np0005593233 systemd[1]: Starting Load Kernel Modules...
Jan 23 04:20:58 np0005593233 systemd[1]: Finished Load Kernel Modules.
Jan 23 04:20:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:20:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:20:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:58.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:20:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:58.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:20:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:59 np0005593233 python3.9[205607]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:21:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:21:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:00.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:00.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:00 np0005593233 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 23 04:21:02 np0005593233 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 04:21:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:21:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:02.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:02.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:03 np0005593233 systemd[1]: Reloading.
Jan 23 04:21:03 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:03 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:03 np0005593233 systemd[1]: Reloading.
Jan 23 04:21:03 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:03 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:04 np0005593233 systemd-logind[804]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 04:21:04 np0005593233 systemd-logind[804]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 04:21:04 np0005593233 lvm[205719]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:21:04 np0005593233 lvm[205719]: VG ceph_vg0 finished
Jan 23 04:21:04 np0005593233 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:21:04 np0005593233 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:21:04 np0005593233 systemd[1]: Reloading.
Jan 23 04:21:04 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:04 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:21:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:04.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:04.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:04 np0005593233 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:21:05 np0005593233 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:21:05 np0005593233 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:21:05 np0005593233 systemd[1]: man-db-cache-update.service: Consumed 1.693s CPU time.
Jan 23 04:21:05 np0005593233 systemd[1]: run-r9e00a16e32b649fc88c09cffdafb7dca.service: Deactivated successfully.
Jan 23 04:21:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:21:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:06.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:06.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:06 np0005593233 python3.9[207073]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:21:06 np0005593233 systemd[1]: Stopping Open-iSCSI...
Jan 23 04:21:06 np0005593233 iscsid[200503]: iscsid shutting down.
Jan 23 04:21:06 np0005593233 systemd[1]: iscsid.service: Deactivated successfully.
Jan 23 04:21:06 np0005593233 systemd[1]: Stopped Open-iSCSI.
Jan 23 04:21:06 np0005593233 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 04:21:06 np0005593233 systemd[1]: Starting Open-iSCSI...
Jan 23 04:21:06 np0005593233 systemd[1]: Started Open-iSCSI.
Jan 23 04:21:07 np0005593233 python3.9[207229]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:21:07 np0005593233 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 23 04:21:07 np0005593233 multipathd[204525]: exit (signal)
Jan 23 04:21:07 np0005593233 multipathd[204525]: --------shut down-------
Jan 23 04:21:07 np0005593233 systemd[1]: multipathd.service: Deactivated successfully.
Jan 23 04:21:07 np0005593233 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 23 04:21:07 np0005593233 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 04:21:07 np0005593233 multipathd[207235]: --------start up--------
Jan 23 04:21:07 np0005593233 multipathd[207235]: read /etc/multipath.conf
Jan 23 04:21:07 np0005593233 multipathd[207235]: path checkers start up
Jan 23 04:21:07 np0005593233 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 04:21:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:21:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:08.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:21:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:21:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:08.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:09 np0005593233 python3.9[207392]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:21:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:21:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:10.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:10.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:10 np0005593233 python3.9[207548]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:11 np0005593233 python3.9[207700]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:21:11 np0005593233 systemd[1]: Reloading.
Jan 23 04:21:11 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:11 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:21:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:12.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:21:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:12.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:12 np0005593233 python3.9[207884]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:21:12 np0005593233 network[207901]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:21:12 np0005593233 network[207902]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:21:12 np0005593233 network[207903]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:21:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:13 np0005593233 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 23 04:21:13 np0005593233 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 04:21:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:14.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:14.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:16.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:16.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:17 np0005593233 podman[208150]: 2026-01-23 09:21:17.596214005 +0000 UTC m=+0.057932888 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:21:17 np0005593233 python3.9[208196]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:21:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.8 total, 600.0 interval
Cumulative writes: 6243 writes, 25K keys, 6243 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 6243 writes, 1120 syncs, 5.57 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 449 writes, 666 keys, 449 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s
Interval WAL: 449 writes, 218 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.8 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.7 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f132801610#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.000196 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.8 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f132801610#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 0.000196 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.8 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtabl
Jan 23 04:21:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:18.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:18.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:18 np0005593233 python3.9[208349]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:19 np0005593233 python3.9[208502]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:20 np0005593233 python3.9[208655]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:20.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:20.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:20 np0005593233 python3.9[208808]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:21 np0005593233 python3.9[208961]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:22.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:22.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:22 np0005593233 python3.9[209114]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:23 np0005593233 python3.9[209267]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:24 np0005593233 python3.9[209420]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:24.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:24.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:24 np0005593233 podman[209544]: 2026-01-23 09:21:24.798749076 +0000 UTC m=+0.126880803 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:21:24 np0005593233 python3.9[209592]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:25 np0005593233 python3.9[209749]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:26 np0005593233 python3.9[209901]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:26.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:26.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:26 np0005593233 python3.9[210053]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:27 np0005593233 python3.9[210205]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:28 np0005593233 python3.9[210357]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:28.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:28.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:28 np0005593233 python3.9[210509]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:29 np0005593233 python3.9[210661]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:30.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:30.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:30 np0005593233 python3.9[210813]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:31 np0005593233 python3.9[210965]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:31 np0005593233 python3.9[211123]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:32 np0005593233 python3.9[211425]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:32.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:32.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:32 np0005593233 python3.9[211663]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:21:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:21:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:33 np0005593233 python3.9[211823]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:34.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:34.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:34 np0005593233 python3.9[211975]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:35 np0005593233 python3.9[212127]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
  systemctl disable --now certmonger.service
  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
fi
 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:36 np0005593233 python3.9[212279]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:21:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:36.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:36.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:37 np0005593233 python3.9[212431]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:21:37 np0005593233 systemd[1]: Reloading.
Jan 23 04:21:37 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:37 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:38.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:21:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:38.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:21:38 np0005593233 python3.9[212618]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:39 np0005593233 python3.9[212783]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:39 np0005593233 python3.9[212974]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:40.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:40.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:40 np0005593233 python3.9[213127]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:41 np0005593233 python3.9[213280]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:42 np0005593233 python3.9[213433]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:42.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:42.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:21:42.586 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:21:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:21:42.587 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:21:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:21:42.587 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:21:42 np0005593233 python3.9[213586]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:43 np0005593233 python3.9[213739]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:44.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:21:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:44.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:21:45 np0005593233 python3.9[213892]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:46 np0005593233 python3.9[214044]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:46.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:46.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:46 np0005593233 python3.9[214196]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:47 np0005593233 python3.9[214348]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:47 np0005593233 podman[214349]: 2026-01-23 09:21:47.727052265 +0000 UTC m=+0.068144838 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:21:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:48.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:21:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:48.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:21:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:48 np0005593233 python3.9[214521]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:49 np0005593233 python3.9[214673]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:49 np0005593233 python3.9[214825]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:21:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:50.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:21:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:21:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:50.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:21:50 np0005593233 python3.9[214977]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:51 np0005593233 python3.9[215129]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:51 np0005593233 python3.9[215281]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:52.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:52.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:54.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:54.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:55 np0005593233 podman[215306]: 2026-01-23 09:21:55.103710814 +0000 UTC m=+0.112149306 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 04:21:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:56.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:21:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:56.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:21:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:21:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3436 writes, 18K keys, 3436 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.03 MB/s#012Cumulative WAL: 3436 writes, 3436 syncs, 1.00 writes per sync, written: 0.04 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1346 writes, 6563 keys, 1346 commit groups, 1.0 writes per commit group, ingest: 14.74 MB, 0.02 MB/s#012Interval WAL: 1346 writes, 1346 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     51.5      0.41              0.12         9    0.045       0      0       0.0       0.0#012  L6      1/0    7.59 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.1     87.6     72.7      0.91              0.22         8    0.113     35K   4334       0.0       0.0#012 Sum      1/0    7.59 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.1     60.3     66.1      1.31              0.34        17    0.077     35K   4334       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   5.8     91.3     89.9      0.46              0.17         8    0.058     19K   2541       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     87.6     72.7      0.91              0.22         8    0.113     35K   4334       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     51.8      0.41              0.12         8    0.051       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.021, interval 0.007#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.07 MB/s write, 0.08 GB read, 0.07 MB/s read, 1.3 seconds#012Interval compaction: 0.04 GB write, 0.07 MB/s write, 0.04 GB read, 0.07 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 308.00 MB usage: 4.92 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000125 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(268,4.60 MB,1.49429%) FilterBlock(17,107.73 KB,0.0341589%) IndexBlock(17,214.94 KB,0.0681493%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:21:57 np0005593233 python3.9[215459]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 23 04:21:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:58.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:21:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:21:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:58.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:21:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:58 np0005593233 python3.9[215612]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:21:59 np0005593233 python3.9[215770]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 04:22:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:00.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:00.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:01 np0005593233 systemd-logind[804]: New session 50 of user zuul.
Jan 23 04:22:01 np0005593233 systemd[1]: Started Session 50 of User zuul.
Jan 23 04:22:01 np0005593233 systemd[1]: session-50.scope: Deactivated successfully.
Jan 23 04:22:01 np0005593233 systemd-logind[804]: Session 50 logged out. Waiting for processes to exit.
Jan 23 04:22:01 np0005593233 systemd-logind[804]: Removed session 50.
Jan 23 04:22:02 np0005593233 python3.9[215956]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:02.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:02.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:02 np0005593233 python3.9[216077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160121.6908684-2660-174068547742952/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:03 np0005593233 python3.9[216227]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:03 np0005593233 python3.9[216303]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:04.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:04.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:04 np0005593233 python3.9[216453]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:05 np0005593233 python3.9[216574]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160124.0522985-2660-159427052376386/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:05 np0005593233 python3.9[216724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:06.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:06.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:06 np0005593233 python3.9[216845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160125.3762133-2660-261925865599774/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:07 np0005593233 python3.9[216995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:07 np0005593233 python3.9[217116]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160126.6373577-2660-129784187596621/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:08 np0005593233 python3.9[217266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:08.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:08.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:09 np0005593233 python3.9[217387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160127.7995405-2660-138776492766181/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:09 np0005593233 python3.9[217539]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:10.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:10.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:10 np0005593233 python3.9[217691]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:11 np0005593233 python3.9[217843]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:12 np0005593233 python3.9[217995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:12.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:12.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:12 np0005593233 python3.9[218118]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769160131.560513-2981-42887157521221/.source _original_basename=.ltgoqwu3 follow=False checksum=8eeef7a8efdf34497b756ca8cca54f3f13c99007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 23 04:22:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:13 np0005593233 python3.9[218270]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:14 np0005593233 python3.9[218422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:14.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:14.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:14 np0005593233 python3.9[218543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160133.971195-3059-74024874008939/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:15 np0005593233 python3.9[218693]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:16 np0005593233 python3.9[218814]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160135.1622488-3104-138930815566642/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:16.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:16.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:17 np0005593233 python3.9[218966]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 23 04:22:18 np0005593233 podman[219015]: 2026-01-23 09:22:18.065173437 +0000 UTC m=+0.071997928 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:22:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:18.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:18.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:18 np0005593233 python3.9[219137]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 04:22:20 np0005593233 python3[219289]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 04:22:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:20.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:20.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:22.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:22.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:24.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:24.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:26.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:26.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:28.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:28.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:30.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:30.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:32 np0005593233 ceph-mds[85262]: mds.beacon.cephfs.compute-1.elkrlx missed beacon ack from the monitors
Jan 23 04:22:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:32.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:32.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:33 np0005593233 podman[219360]: 2026-01-23 09:22:33.541422771 +0000 UTC m=+7.838783005 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:22:33 np0005593233 podman[219304]: 2026-01-23 09:22:33.562503554 +0000 UTC m=+13.490640990 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 23 04:22:33 np0005593233 podman[219409]: 2026-01-23 09:22:33.788554133 +0000 UTC m=+0.097401154 container create c95c976c257ed88abeb6943177e7d567dab07e4c48d9959a34b3a23e5557daa2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2)
Jan 23 04:22:33 np0005593233 podman[219409]: 2026-01-23 09:22:33.71987847 +0000 UTC m=+0.028725491 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 23 04:22:33 np0005593233 python3[219289]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 23 04:22:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:34.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:34.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:34 np0005593233 python3.9[219598]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:36 np0005593233 python3.9[219752]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 23 04:22:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:36.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:36.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:37 np0005593233 python3.9[219904]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 04:22:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:38.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:38.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:39 np0005593233 python3[220056]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 04:22:39 np0005593233 podman[220092]: 2026-01-23 09:22:39.278374577 +0000 UTC m=+0.045303905 container create ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:22:39 np0005593233 podman[220092]: 2026-01-23 09:22:39.255320259 +0000 UTC m=+0.022249587 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 23 04:22:39 np0005593233 python3[220056]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 23 04:22:40 np0005593233 python3.9[220414]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:40.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:40.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:22:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:22:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:22:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:22:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:22:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:22:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:22:41 np0005593233 python3.9[220568]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:41 np0005593233 python3.9[220719]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769160161.172894-3392-163715433938092/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:42 np0005593233 python3.9[220795]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:22:42 np0005593233 systemd[1]: Reloading.
Jan 23 04:22:42 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:22:42 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:22:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:42.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:42.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:22:42.586 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:22:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:22:42.588 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:22:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:22:42.589 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:22:43 np0005593233 python3.9[220907]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:22:43 np0005593233 systemd[1]: Reloading.
Jan 23 04:22:43 np0005593233 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:22:43 np0005593233 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:22:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:43 np0005593233 systemd[1]: Starting nova_compute container...
Jan 23 04:22:43 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:22:43 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:43 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:43 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:43 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:43 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:43 np0005593233 podman[220948]: 2026-01-23 09:22:43.841591375 +0000 UTC m=+0.121122472 container init ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3)
Jan 23 04:22:43 np0005593233 podman[220948]: 2026-01-23 09:22:43.849018337 +0000 UTC m=+0.128549414 container start ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:22:43 np0005593233 podman[220948]: nova_compute
Jan 23 04:22:43 np0005593233 nova_compute[220963]: + sudo -E kolla_set_configs
Jan 23 04:22:43 np0005593233 systemd[1]: Started nova_compute container.
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Validating config file
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying service configuration files
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Deleting /etc/ceph
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Creating directory /etc/ceph
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Writing out command to execute
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:43 np0005593233 nova_compute[220963]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 04:22:43 np0005593233 nova_compute[220963]: ++ cat /run_command
Jan 23 04:22:43 np0005593233 nova_compute[220963]: + CMD=nova-compute
Jan 23 04:22:43 np0005593233 nova_compute[220963]: + ARGS=
Jan 23 04:22:43 np0005593233 nova_compute[220963]: + sudo kolla_copy_cacerts
Jan 23 04:22:43 np0005593233 nova_compute[220963]: + [[ ! -n '' ]]
Jan 23 04:22:43 np0005593233 nova_compute[220963]: + . kolla_extend_start
Jan 23 04:22:43 np0005593233 nova_compute[220963]: Running command: 'nova-compute'
Jan 23 04:22:43 np0005593233 nova_compute[220963]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 04:22:43 np0005593233 nova_compute[220963]: + umask 0022
Jan 23 04:22:43 np0005593233 nova_compute[220963]: + exec nova-compute
Jan 23 04:22:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:44.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:44.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:46 np0005593233 nova_compute[220963]: 2026-01-23 09:22:46.146 220967 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:46 np0005593233 nova_compute[220963]: 2026-01-23 09:22:46.147 220967 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:46 np0005593233 nova_compute[220963]: 2026-01-23 09:22:46.147 220967 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:46 np0005593233 nova_compute[220963]: 2026-01-23 09:22:46.147 220967 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 23 04:22:46 np0005593233 python3.9[221125]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:46 np0005593233 nova_compute[220963]: 2026-01-23 09:22:46.285 220967 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:46 np0005593233 nova_compute[220963]: 2026-01-23 09:22:46.310 220967 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:46 np0005593233 nova_compute[220963]: 2026-01-23 09:22:46.311 220967 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 04:22:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:46.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:46.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.042 220967 INFO nova.virt.driver [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 23 04:22:47 np0005593233 python3.9[221279]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.186 220967 INFO nova.compute.provider_config [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.200 220967 DEBUG oslo_concurrency.lockutils [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.200 220967 DEBUG oslo_concurrency.lockutils [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.201 220967 DEBUG oslo_concurrency.lockutils [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.201 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.201 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.201 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.201 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.201 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.202 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.202 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.202 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.202 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.202 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.202 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.202 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.203 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.203 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.203 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.203 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.203 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.203 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.203 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.204 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.204 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.204 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.204 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.204 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.204 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.204 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.204 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.205 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.205 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.205 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.205 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.205 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.205 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.205 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.206 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.206 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.206 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.206 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.206 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.206 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.206 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.207 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.207 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.207 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.207 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.207 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.207 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.207 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.208 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.208 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.208 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.208 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.208 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.208 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.208 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.209 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.209 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.209 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.209 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.209 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.209 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.209 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.210 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.210 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.210 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.210 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.210 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.210 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.210 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.211 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.211 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.211 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.211 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.211 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.211 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.211 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.212 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.212 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.212 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.212 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.212 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.212 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.212 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.213 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.213 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.213 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.213 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.213 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.213 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.213 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.214 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.214 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.214 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.214 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.214 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.214 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.214 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.215 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.215 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.215 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.215 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.215 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.215 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.215 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.216 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.216 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.216 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.216 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.216 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.216 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.216 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.216 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.217 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.217 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.217 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.217 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.217 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.217 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.217 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.218 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.218 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.218 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.218 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.218 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.218 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.218 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.219 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.219 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.219 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.219 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.219 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.219 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.219 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.220 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.220 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.220 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.220 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.220 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.220 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.220 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.221 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.221 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.221 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.221 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.221 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.221 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.221 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.222 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.222 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.222 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.222 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.222 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.222 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.222 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.223 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.223 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.223 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.223 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.223 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.223 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.223 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.224 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.224 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.224 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.224 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.224 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.224 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.225 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.225 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.225 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.225 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.225 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.225 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.226 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.226 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.226 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.226 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.226 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.227 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.227 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.227 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.227 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.227 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.227 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.228 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.228 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.228 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.228 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.228 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.228 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.229 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.229 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.229 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.229 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.229 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.229 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.229 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.230 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.230 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.230 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.230 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.230 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.230 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.231 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.231 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.231 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.231 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.231 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.231 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.232 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.232 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.232 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.232 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.232 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.233 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.233 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.233 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.233 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.233 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.233 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.233 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.234 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.234 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.234 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.234 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.234 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.234 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.234 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.235 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.235 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.235 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.235 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.235 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.235 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.235 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.236 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.236 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.236 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.236 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.236 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.236 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.236 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.237 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.237 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.237 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.237 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.237 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.237 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.237 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.238 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.238 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.238 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.238 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.238 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.238 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.239 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.239 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.239 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.239 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.239 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.240 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.240 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.240 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.240 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.240 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.240 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.241 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.241 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.241 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.241 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.241 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.241 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.242 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.242 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.242 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.242 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.242 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.242 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.243 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.243 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.243 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.243 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.243 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.244 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.244 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.244 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.244 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.244 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.245 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.245 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.245 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.245 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.245 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.245 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.246 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.246 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.246 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.246 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.246 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.247 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.247 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.247 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.247 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.247 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.248 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.248 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.248 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.248 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.249 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.249 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.249 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.249 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.249 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.250 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.250 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.250 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.250 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.251 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.251 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.251 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.251 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.251 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.252 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.252 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.252 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.252 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.253 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.253 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.253 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.253 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.253 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.254 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.254 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.254 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.254 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.254 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.254 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.254 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.255 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.255 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.255 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.255 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.255 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.255 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.256 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.256 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.256 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.256 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.256 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.256 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.257 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.257 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.257 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.257 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.257 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.258 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.258 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.258 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.258 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.258 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.259 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.259 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.259 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.259 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.259 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.259 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.260 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.260 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.260 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.260 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.260 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.260 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.260 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.261 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.261 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.261 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.261 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.261 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.261 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.261 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.262 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.262 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.262 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.262 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.262 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.262 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.262 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.263 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.263 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.263 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.263 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.263 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.263 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.264 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.264 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.264 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.264 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.264 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.264 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.265 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.265 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.265 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.265 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.265 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.265 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.265 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.266 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.266 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.266 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.266 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.266 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.267 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.267 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.267 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.267 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.267 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.268 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.268 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.268 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.268 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.268 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.268 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.269 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.269 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.269 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.269 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.269 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.270 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.270 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.270 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.270 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.270 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.271 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.271 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.271 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.271 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.272 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.272 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.272 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.272 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.272 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.273 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.273 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.273 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.273 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.273 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.273 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.274 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.274 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.274 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.274 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.274 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.275 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.275 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.275 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.275 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.275 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.276 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.276 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.276 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.277 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.277 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.277 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.277 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.277 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.278 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.278 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.278 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.278 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.278 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.279 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.279 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.279 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.279 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.280 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.280 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.280 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.280 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.280 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.280 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.281 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.281 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.281 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.281 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.282 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.282 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.282 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.282 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.282 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.282 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.283 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.283 220967 WARNING oslo_config.cfg [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 04:22:47 np0005593233 nova_compute[220963]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 04:22:47 np0005593233 nova_compute[220963]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 04:22:47 np0005593233 nova_compute[220963]: and ``live_migration_inbound_addr`` respectively.
Jan 23 04:22:47 np0005593233 nova_compute[220963]: ).  Its value may be silently ignored in the future.
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.283 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.283 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.283 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.284 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.284 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.284 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.284 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.284 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.284 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.285 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.285 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.285 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.285 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.285 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.285 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.285 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.286 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.286 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.286 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rbd_secret_uuid        = e1533653-0a5a-584c-b34b-8689f0d32e77 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.286 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.286 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.287 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.287 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.287 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.287 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.287 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.287 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.287 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.288 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.288 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.288 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.288 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.288 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.289 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.289 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.289 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.289 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.289 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.289 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.290 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.290 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.290 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.290 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.290 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.290 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.290 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.291 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.291 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.291 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.291 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.291 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.291 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.292 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.292 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.292 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.292 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.292 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.292 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.293 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.293 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.293 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.293 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.293 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.293 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.294 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.294 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.294 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.294 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.294 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.295 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.295 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.295 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.295 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.295 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.296 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.296 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.296 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.296 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.296 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.296 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.297 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.297 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.297 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.297 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.298 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.298 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.298 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.298 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.298 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.299 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.299 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.299 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.299 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.299 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.300 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.300 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.300 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.300 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.300 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.300 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.301 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.301 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.301 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.301 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.301 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.302 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.302 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.302 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.302 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.302 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.303 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.303 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.303 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.303 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.303 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.304 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.304 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.304 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.304 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.304 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.305 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.305 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.305 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.305 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.305 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.306 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.306 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.306 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.306 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.306 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.307 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.307 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.307 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.307 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.307 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.308 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.308 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.308 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.308 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.308 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.309 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.309 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.309 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.309 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.309 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.310 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.310 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.310 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.310 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.310 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.311 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.311 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.311 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.311 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.312 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.312 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.312 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.312 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.312 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.313 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.313 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.313 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.313 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.313 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.313 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.314 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.314 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.314 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.314 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.314 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.314 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.315 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.315 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.315 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.315 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.315 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.315 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.315 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.316 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.316 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.316 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.316 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.316 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.316 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.317 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.317 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.317 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.317 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.317 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.317 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.317 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.318 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.318 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.318 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.318 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.318 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.318 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.318 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.319 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.319 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.319 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.319 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.319 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.319 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.320 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.320 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.320 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.320 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.320 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.320 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.320 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.321 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.321 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.321 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.321 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.321 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.321 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.322 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.322 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.322 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.322 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.322 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.322 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.322 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.323 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.323 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.323 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.323 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.323 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.323 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.324 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.324 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.324 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.324 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.324 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.324 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.324 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.325 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.325 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.325 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.325 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.325 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.325 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.325 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.326 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.326 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.326 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.326 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.326 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.326 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.327 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.327 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.327 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.327 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.327 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.327 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.328 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.328 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.328 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.328 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.328 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.328 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.328 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.329 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.329 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.329 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.329 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.329 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.330 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.330 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.330 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.330 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.330 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.330 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.331 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.331 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.331 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.331 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.331 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.331 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.331 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.332 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.332 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.332 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.332 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.332 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.332 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.332 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.333 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.333 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.333 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.333 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.333 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.333 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.334 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.334 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.334 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.334 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.334 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.334 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.335 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.335 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.335 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.335 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.335 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.335 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.336 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.336 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.336 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.336 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.336 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.336 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.337 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.337 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.337 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.337 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.338 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.338 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.338 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.338 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.338 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.338 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.339 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.339 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.339 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.339 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.339 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.339 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.339 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.340 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.340 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.340 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.340 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.340 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.340 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.341 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.341 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.341 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.341 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.341 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.341 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.342 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.342 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.342 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.342 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.342 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.342 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.342 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.343 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.343 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.343 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.343 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.343 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.343 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.344 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.344 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.344 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.344 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.344 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.344 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.344 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.345 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.345 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.345 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.345 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.345 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.345 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.346 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.346 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.346 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.346 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.346 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.346 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.346 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.347 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.347 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.347 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.347 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.347 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.347 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.348 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.348 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.348 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.348 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.348 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.348 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.349 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.349 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.349 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.349 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.349 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.350 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.350 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.350 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.350 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.350 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.351 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.351 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.351 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.351 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.351 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.352 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.352 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.352 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.352 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.352 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.353 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.353 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.353 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.353 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.354 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.354 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.354 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.354 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.354 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.355 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.355 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.355 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.355 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.356 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.356 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.356 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.356 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.357 220967 DEBUG oslo_service.service [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.358 220967 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.377 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.378 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.378 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.378 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 23 04:22:47 np0005593233 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 04:22:47 np0005593233 systemd[1]: Started libvirt QEMU daemon.
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.458 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fac7b4977c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.460 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fac7b4977c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.461 220967 INFO nova.virt.libvirt.driver [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.481 220967 WARNING nova.virt.libvirt.driver [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 23 04:22:47 np0005593233 nova_compute[220963]: 2026-01-23 09:22:47.482 220967 DEBUG nova.virt.libvirt.volume.mount [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 23 04:22:48 np0005593233 podman[221414]: 2026-01-23 09:22:48.243293278 +0000 UTC m=+0.066363067 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.400 220967 INFO nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <host>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <uuid>5e159ac4-110b-464c-8264-d020fcde6246</uuid>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <arch>x86_64</arch>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model>EPYC-Rome-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <vendor>AMD</vendor>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <microcode version='16777317'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <signature family='23' model='49' stepping='0'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='x2apic'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='tsc-deadline'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='osxsave'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='hypervisor'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='tsc_adjust'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='spec-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='stibp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='arch-capabilities'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='cmp_legacy'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='topoext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='virt-ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='lbrv'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='tsc-scale'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='vmcb-clean'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='pause-filter'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='pfthreshold'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='svme-addr-chk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='rdctl-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='skip-l1dfl-vmentry'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='mds-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature name='pschange-mc-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <pages unit='KiB' size='4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <pages unit='KiB' size='2048'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <pages unit='KiB' size='1048576'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <power_management>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <suspend_mem/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </power_management>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <iommu support='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <migration_features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <live/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <uri_transports>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <uri_transport>tcp</uri_transport>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <uri_transport>rdma</uri_transport>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </uri_transports>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </migration_features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <topology>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <cells num='1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <cell id='0'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:          <memory unit='KiB'>7864308</memory>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:          <pages unit='KiB' size='4'>1966077</pages>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:          <pages unit='KiB' size='2048'>0</pages>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:          <distances>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:            <sibling id='0' value='10'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:          </distances>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:          <cpus num='8'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:          </cpus>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        </cell>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </cells>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </topology>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <cache>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </cache>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <secmodel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model>selinux</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <doi>0</doi>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </secmodel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <secmodel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model>dac</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <doi>0</doi>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </secmodel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </host>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <guest>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <os_type>hvm</os_type>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <arch name='i686'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <wordsize>32</wordsize>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <domain type='qemu'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <domain type='kvm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </arch>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <pae/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <nonpae/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <acpi default='on' toggle='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <apic default='on' toggle='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <cpuselection/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <deviceboot/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <disksnapshot default='on' toggle='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <externalSnapshot/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </guest>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <guest>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <os_type>hvm</os_type>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <arch name='x86_64'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <wordsize>64</wordsize>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <domain type='qemu'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <domain type='kvm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </arch>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <acpi default='on' toggle='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <apic default='on' toggle='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <cpuselection/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <deviceboot/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <disksnapshot default='on' toggle='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <externalSnapshot/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </guest>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 
Jan 23 04:22:48 np0005593233 nova_compute[220963]: </capabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: #033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.409 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.440 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 04:22:48 np0005593233 nova_compute[220963]: <domainCapabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <domain>kvm</domain>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <arch>i686</arch>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <vcpu max='240'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <iothreads supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <os supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <enum name='firmware'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <loader supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>rom</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pflash</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='readonly'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>yes</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>no</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='secure'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>no</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </loader>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </os>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>on</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>off</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='maximumMigratable'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>on</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>off</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <vendor>AMD</vendor>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='succor'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='custom' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ddpd-u'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sha512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ddpd-u'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sha512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbpb'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbpb'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-128'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-256'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-128'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-256'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='KnightsMill'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512er'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512pf'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512er'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512pf'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tbm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tbm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='athlon'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='athlon-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='core2duo'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='core2duo-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='coreduo'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='coreduo-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='n270'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='n270-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='phenom'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='phenom-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <memoryBacking supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <enum name='sourceType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>file</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>anonymous</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>memfd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </memoryBacking>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <devices>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <disk supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='diskDevice'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>disk</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>cdrom</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>floppy</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>lun</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='bus'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ide</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>fdc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>scsi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>sata</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-non-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </disk>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <graphics supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vnc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>egl-headless</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dbus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </graphics>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <video supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='modelType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vga</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>cirrus</value>
Jan 23 04:22:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>none</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>bochs</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ramfb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </video>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <hostdev supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='mode'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>subsystem</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='startupPolicy'>
Jan 23 04:22:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>default</value>
Jan 23 04:22:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:48.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>mandatory</value>
Jan 23 04:22:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>requisite</value>
Jan 23 04:22:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>optional</value>
Jan 23 04:22:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:48.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='subsysType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pci</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>scsi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='capsType'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='pciBackend'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </hostdev>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <rng supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-non-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>random</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>egd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>builtin</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </rng>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <filesystem supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='driverType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>path</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>handle</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtiofs</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </filesystem>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <tpm supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tpm-tis</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tpm-crb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>emulator</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>external</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendVersion'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>2.0</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </tpm>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <redirdev supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='bus'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </redirdev>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <channel supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pty</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>unix</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </channel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <crypto supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>qemu</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>builtin</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </crypto>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <interface supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>default</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>passt</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </interface>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <panic supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>isa</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>hyperv</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </panic>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <console supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>null</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pty</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dev</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>file</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pipe</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>stdio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>udp</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tcp</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>unix</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>qemu-vdagent</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dbus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </console>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </devices>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <gic supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <genid supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <backup supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <async-teardown supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <s390-pv supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <ps2 supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <tdx supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <sev supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <sgx supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <hyperv supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='features'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>relaxed</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vapic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>spinlocks</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vpindex</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>runtime</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>synic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>stimer</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>reset</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vendor_id</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>frequencies</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>reenlightenment</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tlbflush</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ipi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>avic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>emsr_bitmap</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>xmm_input</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <defaults>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </defaults>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </hyperv>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <launchSecurity supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: </domainCapabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.450 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 04:22:48 np0005593233 nova_compute[220963]: <domainCapabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <domain>kvm</domain>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <arch>i686</arch>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <vcpu max='4096'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <iothreads supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <os supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <enum name='firmware'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <loader supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>rom</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pflash</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='readonly'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>yes</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>no</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='secure'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>no</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </loader>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </os>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>on</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>off</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='maximumMigratable'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>on</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>off</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <vendor>AMD</vendor>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='succor'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='custom' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ddpd-u'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sha512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ddpd-u'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sha512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbpb'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbpb'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-128'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-256'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-128'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-256'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='KnightsMill'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512er'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512pf'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512er'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512pf'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tbm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tbm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='athlon'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='athlon-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='core2duo'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='core2duo-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='coreduo'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='coreduo-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='n270'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='n270-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='phenom'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='phenom-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <memoryBacking supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <enum name='sourceType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>file</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>anonymous</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>memfd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </memoryBacking>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <devices>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <disk supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='diskDevice'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>disk</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>cdrom</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>floppy</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>lun</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='bus'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>fdc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>scsi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>sata</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-non-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </disk>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <graphics supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vnc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>egl-headless</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dbus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </graphics>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <video supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='modelType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vga</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>cirrus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>none</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>bochs</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ramfb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </video>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <hostdev supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='mode'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>subsystem</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='startupPolicy'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>default</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>mandatory</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>requisite</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>optional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='subsysType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pci</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>scsi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='capsType'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='pciBackend'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </hostdev>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <rng supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-non-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>random</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>egd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>builtin</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </rng>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <filesystem supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='driverType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>path</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>handle</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtiofs</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </filesystem>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <tpm supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tpm-tis</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tpm-crb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>emulator</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>external</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendVersion'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>2.0</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </tpm>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <redirdev supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='bus'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </redirdev>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <channel supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pty</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>unix</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </channel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <crypto supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>qemu</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>builtin</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </crypto>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <interface supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>default</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>passt</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </interface>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <panic supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>isa</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>hyperv</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </panic>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <console supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>null</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pty</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dev</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>file</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pipe</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>stdio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>udp</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tcp</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>unix</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>qemu-vdagent</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dbus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </console>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </devices>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <gic supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <genid supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <backup supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <async-teardown supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <s390-pv supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <ps2 supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <tdx supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <sev supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <sgx supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <hyperv supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='features'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>relaxed</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vapic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>spinlocks</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vpindex</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>runtime</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>synic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>stimer</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>reset</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vendor_id</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>frequencies</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>reenlightenment</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tlbflush</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ipi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>avic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>emsr_bitmap</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>xmm_input</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <defaults>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </defaults>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </hyperv>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <launchSecurity supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: </domainCapabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.529 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.538 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 04:22:48 np0005593233 nova_compute[220963]: <domainCapabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <domain>kvm</domain>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <arch>x86_64</arch>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <vcpu max='240'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <iothreads supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <os supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <enum name='firmware'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <loader supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>rom</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pflash</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='readonly'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>yes</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>no</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='secure'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>no</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </loader>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </os>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>on</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>off</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='maximumMigratable'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>on</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>off</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <vendor>AMD</vendor>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='succor'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='custom' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ddpd-u'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sha512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ddpd-u'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sha512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbpb'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbpb'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-128'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-256'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-128'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-256'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='KnightsMill'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512er'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512pf'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512er'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512pf'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tbm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tbm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='athlon'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='athlon-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='core2duo'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='core2duo-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='coreduo'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='coreduo-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='n270'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='n270-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='phenom'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='phenom-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <memoryBacking supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <enum name='sourceType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>file</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>anonymous</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>memfd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </memoryBacking>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <devices>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <disk supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='diskDevice'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>disk</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>cdrom</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>floppy</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>lun</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='bus'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ide</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>fdc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>scsi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>sata</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-non-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </disk>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <graphics supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vnc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>egl-headless</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dbus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </graphics>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <video supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='modelType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vga</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>cirrus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>none</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>bochs</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ramfb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </video>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <hostdev supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='mode'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>subsystem</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='startupPolicy'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>default</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>mandatory</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>requisite</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>optional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='subsysType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pci</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>scsi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='capsType'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='pciBackend'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </hostdev>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <rng supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-non-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>random</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>egd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>builtin</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </rng>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <filesystem supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='driverType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>path</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>handle</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtiofs</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </filesystem>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <tpm supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tpm-tis</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tpm-crb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>emulator</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>external</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendVersion'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>2.0</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </tpm>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <redirdev supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='bus'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </redirdev>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <channel supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pty</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>unix</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </channel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <crypto supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>qemu</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>builtin</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </crypto>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <interface supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>default</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>passt</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </interface>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <panic supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>isa</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>hyperv</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </panic>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <console supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>null</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pty</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dev</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>file</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pipe</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>stdio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>udp</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tcp</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>unix</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>qemu-vdagent</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dbus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </console>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </devices>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <gic supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <genid supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <backup supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <async-teardown supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <s390-pv supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <ps2 supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <tdx supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <sev supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <sgx supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <hyperv supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='features'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>relaxed</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vapic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>spinlocks</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vpindex</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>runtime</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>synic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>stimer</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>reset</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vendor_id</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>frequencies</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>reenlightenment</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tlbflush</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ipi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>avic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>emsr_bitmap</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>xmm_input</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <defaults>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </defaults>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </hyperv>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <launchSecurity supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: </domainCapabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.618 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 04:22:48 np0005593233 nova_compute[220963]: <domainCapabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <domain>kvm</domain>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <arch>x86_64</arch>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <vcpu max='4096'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <iothreads supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <os supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <enum name='firmware'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>efi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <loader supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>rom</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pflash</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='readonly'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>yes</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>no</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='secure'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>yes</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>no</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </loader>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </os>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>on</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>off</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='maximumMigratable'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>on</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>off</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <vendor>AMD</vendor>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='succor'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <mode name='custom' supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ddpd-u'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sha512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ddpd-u'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sha512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm3'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sm4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Denverton-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbpb'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amd-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='auto-ibrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='perfmon-v2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbpb'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='stibp-always-on'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='EPYC-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-128'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-256'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-128'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-256'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx10-512'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='prefetchiti'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Haswell-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='KnightsMill'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512er'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512pf'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512er'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512pf'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tbm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fma4'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tbm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xop'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='amx-tile'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-bf16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-fp16'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bitalg'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrc'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fzrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='la57'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='taa-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ifma'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cmpccxadd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fbsdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='fsrs'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ibrs-all'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='intel-psfd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='lam'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mcdt-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pbrsb-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='psdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='serialize'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vaes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='hle'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='rtm'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512bw'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512cd'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512dq'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512f'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='avx512vl'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='invpcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pcid'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='pku'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='mpx'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='core-capability'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='split-lock-detect'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='cldemote'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='erms'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='gfni'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdir64b'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='movdiri'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='xsaves'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='athlon'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='athlon-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='core2duo'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='core2duo-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='coreduo'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='coreduo-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='n270'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='n270-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='ss'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='phenom'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <blockers model='phenom-v1'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnow'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <feature name='3dnowext'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </blockers>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </mode>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <memoryBacking supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <enum name='sourceType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>file</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>anonymous</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <value>memfd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </memoryBacking>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <devices>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <disk supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='diskDevice'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>disk</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>cdrom</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>floppy</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>lun</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='bus'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>fdc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>scsi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>sata</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-non-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </disk>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <graphics supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vnc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>egl-headless</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dbus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </graphics>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <video supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='modelType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vga</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>cirrus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>none</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>bochs</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ramfb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </video>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <hostdev supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='mode'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>subsystem</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='startupPolicy'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>default</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>mandatory</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>requisite</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>optional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='subsysType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pci</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>scsi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='capsType'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='pciBackend'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </hostdev>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <rng supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtio-non-transitional</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>random</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>egd</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>builtin</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </rng>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <filesystem supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='driverType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>path</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>handle</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>virtiofs</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </filesystem>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <tpm supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tpm-tis</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tpm-crb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>emulator</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>external</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendVersion'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>2.0</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </tpm>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <redirdev supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='bus'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>usb</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </redirdev>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <channel supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pty</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>unix</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </channel>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <crypto supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>qemu</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendModel'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>builtin</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </crypto>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <interface supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='backendType'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>default</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>passt</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </interface>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <panic supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='model'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>isa</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>hyperv</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </panic>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <console supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='type'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>null</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vc</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pty</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dev</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>file</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>pipe</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>stdio</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>udp</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tcp</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>unix</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>qemu-vdagent</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>dbus</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </console>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </devices>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <gic supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <genid supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <backup supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <async-teardown supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <s390-pv supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <ps2 supported='yes'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <tdx supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <sev supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <sgx supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <hyperv supported='yes'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <enum name='features'>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>relaxed</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vapic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>spinlocks</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vpindex</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>runtime</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>synic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>stimer</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>reset</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>vendor_id</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>frequencies</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>reenlightenment</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>tlbflush</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>ipi</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>avic</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>emsr_bitmap</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <value>xmm_input</value>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </enum>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      <defaults>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:      </defaults>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    </hyperv>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:    <launchSecurity supported='no'/>
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  </features>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: </domainCapabilities>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.700 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.701 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.701 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.705 220967 INFO nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Secure Boot support detected#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.707 220967 INFO nova.virt.libvirt.driver [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.708 220967 INFO nova.virt.libvirt.driver [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.723 220967 DEBUG nova.virt.libvirt.driver [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] cpu compare xml: <cpu match="exact">
Jan 23 04:22:48 np0005593233 nova_compute[220963]:  <model>Nehalem</model>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: </cpu>
Jan 23 04:22:48 np0005593233 nova_compute[220963]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.803 220967 DEBUG nova.virt.libvirt.driver [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.858 220967 INFO nova.virt.node [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Determined node identity 929812a2-38ca-4ee7-9f24-090d633cb42b from /var/lib/nova/compute_id#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.876 220967 WARNING nova.compute.manager [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Compute nodes ['929812a2-38ca-4ee7-9f24-090d633cb42b'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 23 04:22:48 np0005593233 nova_compute[220963]: 2026-01-23 09:22:48.959 220967 INFO nova.compute.manager [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.001 220967 WARNING nova.compute.manager [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.001 220967 DEBUG oslo_concurrency.lockutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.001 220967 DEBUG oslo_concurrency.lockutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.003 220967 DEBUG oslo_concurrency.lockutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.003 220967 DEBUG nova.compute.resource_tracker [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.004 220967 DEBUG oslo_concurrency.processutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:49 np0005593233 python3.9[221561]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:22:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/598960811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.440 220967 DEBUG oslo_concurrency.processutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:49 np0005593233 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 04:22:49 np0005593233 systemd[1]: Started libvirt nodedev daemon.
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.799 220967 WARNING nova.virt.libvirt.driver [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.800 220967 DEBUG nova.compute.resource_tracker [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5272MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.801 220967 DEBUG oslo_concurrency.lockutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.801 220967 DEBUG oslo_concurrency.lockutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.815 220967 WARNING nova.compute.resource_tracker [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] No compute node record for compute-1.ctlplane.example.com:929812a2-38ca-4ee7-9f24-090d633cb42b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 929812a2-38ca-4ee7-9f24-090d633cb42b could not be found.#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.832 220967 INFO nova.compute.resource_tracker [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 929812a2-38ca-4ee7-9f24-090d633cb42b#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.878 220967 DEBUG nova.compute.resource_tracker [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:22:49 np0005593233 nova_compute[220963]: 2026-01-23 09:22:49.878 220967 DEBUG nova.compute.resource_tracker [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:22:50 np0005593233 python3.9[221758]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 04:22:50 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:22:50 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:22:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:50.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:50.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:50 np0005593233 nova_compute[220963]: 2026-01-23 09:22:50.674 220967 INFO nova.scheduler.client.report [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] [req-29a7a36d-329e-47f4-8338-415d09470c2d] Created resource provider record via placement API for resource provider with UUID 929812a2-38ca-4ee7-9f24-090d633cb42b and name compute-1.ctlplane.example.com.#033[00m
Jan 23 04:22:50 np0005593233 nova_compute[220963]: 2026-01-23 09:22:50.697 220967 DEBUG oslo_concurrency.processutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:22:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1475833315' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:22:51 np0005593233 python3.9[221954]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.182 220967 DEBUG oslo_concurrency.processutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.189 220967 DEBUG nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 23 04:22:51 np0005593233 nova_compute[220963]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.189 220967 INFO nova.virt.libvirt.host [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] kernel doesn't support AMD SEV#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.190 220967 DEBUG nova.compute.provider_tree [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.191 220967 DEBUG nova.virt.libvirt.driver [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.193 220967 DEBUG nova.virt.libvirt.driver [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Libvirt baseline CPU <cpu>
Jan 23 04:22:51 np0005593233 nova_compute[220963]:  <arch>x86_64</arch>
Jan 23 04:22:51 np0005593233 nova_compute[220963]:  <model>Nehalem</model>
Jan 23 04:22:51 np0005593233 nova_compute[220963]:  <vendor>AMD</vendor>
Jan 23 04:22:51 np0005593233 nova_compute[220963]:  <topology sockets="8" cores="1" threads="1"/>
Jan 23 04:22:51 np0005593233 nova_compute[220963]: </cpu>
Jan 23 04:22:51 np0005593233 nova_compute[220963]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Jan 23 04:22:51 np0005593233 systemd[1]: Stopping nova_compute container...
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.275 220967 DEBUG nova.scheduler.client.report [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Updated inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.275 220967 DEBUG nova.compute.provider_tree [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Updating resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.275 220967 DEBUG nova.compute.provider_tree [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.282 220967 DEBUG oslo_concurrency.lockutils [None req-04e7e556-007f-41b8-964d-fe413fb6eb3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.283 220967 DEBUG oslo_concurrency.lockutils [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.283 220967 DEBUG oslo_concurrency.lockutils [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:22:51 np0005593233 nova_compute[220963]: 2026-01-23 09:22:51.283 220967 DEBUG oslo_concurrency.lockutils [None req-5311f1c1-9caa-4a5d-abb4-b8bbf7dd9b27 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:22:51 np0005593233 virtqemud[221325]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 23 04:22:51 np0005593233 virtqemud[221325]: hostname: compute-1
Jan 23 04:22:51 np0005593233 virtqemud[221325]: End of file while reading data: Input/output error
Jan 23 04:22:51 np0005593233 systemd[1]: libpod-ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5.scope: Deactivated successfully.
Jan 23 04:22:51 np0005593233 systemd[1]: libpod-ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5.scope: Consumed 4.383s CPU time.
Jan 23 04:22:51 np0005593233 podman[221960]: 2026-01-23 09:22:51.711867428 +0000 UTC m=+0.473066269 container died ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 04:22:51 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5-userdata-shm.mount: Deactivated successfully.
Jan 23 04:22:51 np0005593233 systemd[1]: var-lib-containers-storage-overlay-d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56-merged.mount: Deactivated successfully.
Jan 23 04:22:51 np0005593233 podman[221960]: 2026-01-23 09:22:51.789934288 +0000 UTC m=+0.551133089 container cleanup ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 04:22:51 np0005593233 podman[221960]: nova_compute
Jan 23 04:22:51 np0005593233 podman[221991]: nova_compute
Jan 23 04:22:51 np0005593233 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 23 04:22:51 np0005593233 systemd[1]: Stopped nova_compute container.
Jan 23 04:22:51 np0005593233 systemd[1]: Starting nova_compute container...
Jan 23 04:22:52 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:22:52 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:52 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:52 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:52 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:52 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1436018966c2082364001fab03da0af5e897796e35870d2d927182d4515cd56/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:52 np0005593233 podman[222003]: 2026-01-23 09:22:52.09237628 +0000 UTC m=+0.219554274 container init ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 04:22:52 np0005593233 podman[222003]: 2026-01-23 09:22:52.097781655 +0000 UTC m=+0.224959629 container start ea3d12ba607df1cf8cab33a29787dcbdc1f5063270812ac81900c9cb874a89d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:22:52 np0005593233 nova_compute[222017]: + sudo -E kolla_set_configs
Jan 23 04:22:52 np0005593233 podman[222003]: nova_compute
Jan 23 04:22:52 np0005593233 systemd[1]: Started nova_compute container.
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Validating config file
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying service configuration files
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /etc/ceph
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Creating directory /etc/ceph
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Writing out command to execute
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:52 np0005593233 nova_compute[222017]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 04:22:52 np0005593233 nova_compute[222017]: ++ cat /run_command
Jan 23 04:22:52 np0005593233 nova_compute[222017]: + CMD=nova-compute
Jan 23 04:22:52 np0005593233 nova_compute[222017]: + ARGS=
Jan 23 04:22:52 np0005593233 nova_compute[222017]: + sudo kolla_copy_cacerts
Jan 23 04:22:52 np0005593233 nova_compute[222017]: + [[ ! -n '' ]]
Jan 23 04:22:52 np0005593233 nova_compute[222017]: + . kolla_extend_start
Jan 23 04:22:52 np0005593233 nova_compute[222017]: Running command: 'nova-compute'
Jan 23 04:22:52 np0005593233 nova_compute[222017]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 04:22:52 np0005593233 nova_compute[222017]: + umask 0022
Jan 23 04:22:52 np0005593233 nova_compute[222017]: + exec nova-compute
Jan 23 04:22:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:52.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:52.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:54 np0005593233 python3.9[222182]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.212 222021 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.213 222021 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.213 222021 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.213 222021 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 23 04:22:54 np0005593233 systemd[1]: Started libpod-conmon-c95c976c257ed88abeb6943177e7d567dab07e4c48d9959a34b3a23e5557daa2.scope.
Jan 23 04:22:54 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:22:54 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c464c6a6e16d892e3adda08b17d6020220357c22662845967c5960a94d91f6d/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:54 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c464c6a6e16d892e3adda08b17d6020220357c22662845967c5960a94d91f6d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:54 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c464c6a6e16d892e3adda08b17d6020220357c22662845967c5960a94d91f6d/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:54 np0005593233 podman[222207]: 2026-01-23 09:22:54.279207946 +0000 UTC m=+0.125648501 container init c95c976c257ed88abeb6943177e7d567dab07e4c48d9959a34b3a23e5557daa2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:22:54 np0005593233 podman[222207]: 2026-01-23 09:22:54.288881872 +0000 UTC m=+0.135322417 container start c95c976c257ed88abeb6943177e7d567dab07e4c48d9959a34b3a23e5557daa2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:22:54 np0005593233 python3.9[222182]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Applying nova statedir ownership
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 23 04:22:54 np0005593233 nova_compute_init[222228]: INFO:nova_statedir:Nova statedir ownership complete
Jan 23 04:22:54 np0005593233 systemd[1]: libpod-c95c976c257ed88abeb6943177e7d567dab07e4c48d9959a34b3a23e5557daa2.scope: Deactivated successfully.
Jan 23 04:22:54 np0005593233 podman[222229]: 2026-01-23 09:22:54.353400156 +0000 UTC m=+0.030066560 container died c95c976c257ed88abeb6943177e7d567dab07e4c48d9959a34b3a23e5557daa2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.352 222021 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.373 222021 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.374 222021 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 04:22:54 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c95c976c257ed88abeb6943177e7d567dab07e4c48d9959a34b3a23e5557daa2-userdata-shm.mount: Deactivated successfully.
Jan 23 04:22:54 np0005593233 systemd[1]: var-lib-containers-storage-overlay-3c464c6a6e16d892e3adda08b17d6020220357c22662845967c5960a94d91f6d-merged.mount: Deactivated successfully.
Jan 23 04:22:54 np0005593233 podman[222239]: 2026-01-23 09:22:54.419606587 +0000 UTC m=+0.055584089 container cleanup c95c976c257ed88abeb6943177e7d567dab07e4c48d9959a34b3a23e5557daa2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 04:22:54 np0005593233 systemd[1]: libpod-conmon-c95c976c257ed88abeb6943177e7d567dab07e4c48d9959a34b3a23e5557daa2.scope: Deactivated successfully.
Jan 23 04:22:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:22:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:54.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:22:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:54.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.853 222021 INFO nova.virt.driver [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.966 222021 INFO nova.compute.provider_config [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.978 222021 DEBUG oslo_concurrency.lockutils [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.979 222021 DEBUG oslo_concurrency.lockutils [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.979 222021 DEBUG oslo_concurrency.lockutils [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.979 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.980 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.980 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.980 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.980 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.980 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.980 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.981 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.981 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.981 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.981 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.981 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.981 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.981 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.982 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.982 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.982 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.982 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.982 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.982 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.982 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.983 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.983 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.983 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.983 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.983 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.983 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.983 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.984 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.984 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.984 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.984 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.984 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.984 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.985 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.985 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.985 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.985 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.985 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.985 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.986 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.986 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.986 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.986 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.986 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.987 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.987 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.987 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.987 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.987 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.988 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.988 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.988 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.988 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.988 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.988 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.989 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.989 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.989 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.989 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.989 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.989 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.990 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.990 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.990 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.990 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.990 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.990 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.990 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.991 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.991 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.991 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.991 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.991 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.991 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.991 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.992 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.992 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.992 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.992 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.992 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.992 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.993 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.993 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.993 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.993 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.993 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.993 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.993 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.994 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.994 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.994 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.994 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.994 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.994 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.994 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.995 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.995 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.995 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.995 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.995 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.995 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.995 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.995 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.996 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.996 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.996 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.996 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.996 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.996 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.997 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.997 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.997 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.997 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.997 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.997 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.997 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.997 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.998 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.998 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.998 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.998 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.998 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.999 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.999 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.999 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:54 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.999 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:54.999 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.000 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.000 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.000 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.000 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.000 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.001 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.001 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.001 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.001 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.001 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.002 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.002 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.002 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.002 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.002 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.002 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.002 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.003 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.003 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.003 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.003 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.003 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.003 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.004 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.004 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.004 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.004 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.004 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.004 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.005 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.005 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.005 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.005 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.005 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.005 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.005 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.006 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.006 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.006 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.006 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.006 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.006 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.007 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.007 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.007 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.007 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.007 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.007 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.007 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.008 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.008 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.008 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.008 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.008 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.008 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.009 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.009 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.009 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.009 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.009 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.009 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.009 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.010 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.010 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.010 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.010 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.010 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.010 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.010 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.010 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.011 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.011 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.011 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.011 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.011 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.011 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.011 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.012 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.012 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.012 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.012 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.012 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.012 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.013 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.013 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.013 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.013 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.013 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.013 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.013 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.014 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.014 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.014 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.014 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.014 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.014 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.014 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.015 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.015 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.015 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.015 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.015 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.015 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.015 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.016 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.016 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.016 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.016 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.016 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.016 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.017 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.017 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.017 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.017 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.017 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.017 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.018 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.018 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.018 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.018 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.018 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.018 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.019 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.019 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.019 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.019 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.019 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.019 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.019 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.020 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.020 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.020 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.020 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.020 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.020 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.021 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.021 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.021 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.021 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.021 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.021 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.021 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.021 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.022 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.022 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.022 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.022 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.022 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.022 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.022 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.023 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.023 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.023 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.023 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.023 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.023 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.023 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.024 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.024 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.024 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.024 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.024 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.024 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.025 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.025 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.025 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.025 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.025 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.025 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.025 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.025 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.026 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.026 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.026 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.026 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.026 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.026 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.026 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.027 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.027 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.027 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.027 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.027 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.027 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.027 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.028 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.028 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.028 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.028 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.028 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.028 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.028 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.029 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.029 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.029 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.029 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.029 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.029 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.029 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.030 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.030 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.030 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.030 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.030 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.030 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.031 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.031 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.031 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.031 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.031 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.031 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.031 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.032 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.032 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.032 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.032 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.032 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.032 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.033 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.033 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.033 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.033 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.033 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.033 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.033 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.034 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.034 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.034 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.034 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.034 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.034 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.035 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.035 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.035 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.035 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.035 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.035 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.035 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.036 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.036 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.036 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.036 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.036 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.036 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.037 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.037 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.037 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.037 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.037 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.037 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.037 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.037 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.038 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.038 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.038 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.038 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.038 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.038 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.038 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.039 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.039 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.039 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.039 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.039 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.039 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.040 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.040 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.040 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.040 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.040 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.040 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.040 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.041 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.041 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.041 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.041 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.041 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.041 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.041 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.041 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.042 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.042 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.042 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.042 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.042 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.042 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.042 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.043 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.043 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.043 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.043 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.043 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.043 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.043 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.044 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.044 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.044 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.044 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.044 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.044 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.045 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.045 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.045 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.045 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.045 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.045 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.045 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.045 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.046 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.046 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.046 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.046 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.046 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.046 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.046 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.047 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.047 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.047 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.047 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.047 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.047 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.047 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.048 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.048 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.048 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.048 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.048 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.048 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.049 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.049 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.049 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.049 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.049 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.049 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.050 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.050 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.050 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.050 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.050 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.050 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.050 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.050 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.051 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.051 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.051 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.051 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.051 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.051 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.051 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.052 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.052 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.052 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.052 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.052 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.052 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.053 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.053 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.053 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.053 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.053 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.053 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.054 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.054 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.054 222021 WARNING oslo_config.cfg [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 04:22:55 np0005593233 nova_compute[222017]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 04:22:55 np0005593233 nova_compute[222017]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 04:22:55 np0005593233 nova_compute[222017]: and ``live_migration_inbound_addr`` respectively.
Jan 23 04:22:55 np0005593233 nova_compute[222017]: ).  Its value may be silently ignored in the future.#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.054 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.054 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.054 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.055 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.055 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.055 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.055 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.055 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.056 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.056 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.056 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.056 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.056 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.057 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.057 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.057 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.057 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.057 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.057 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rbd_secret_uuid        = e1533653-0a5a-584c-b34b-8689f0d32e77 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.058 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.058 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.058 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.058 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.058 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.058 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.059 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.059 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.059 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.059 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.059 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.059 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.060 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.060 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.060 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.060 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.060 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.061 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.061 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.061 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.061 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.061 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.062 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.062 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.062 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.062 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.062 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.063 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.063 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.063 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.063 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.063 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.063 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.064 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.064 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.064 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.064 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.064 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.064 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.065 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.065 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.065 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.065 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.065 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.066 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.066 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.066 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.066 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.066 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.066 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.067 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.067 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.067 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.067 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.067 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.067 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.068 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.068 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.068 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.068 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.068 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.068 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.069 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.069 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.069 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.069 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.069 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.069 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.069 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.070 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.070 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.070 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.070 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.070 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.070 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.070 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.071 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.071 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.071 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.071 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.071 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.071 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.072 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.072 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.072 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.072 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.072 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.072 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.073 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.073 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.073 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.073 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.073 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.073 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.073 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.074 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.074 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.074 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.074 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.074 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.074 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.075 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.075 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.075 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.075 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.075 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.075 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.075 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.076 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.076 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.076 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.076 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.076 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.076 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.077 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.077 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.077 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.077 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.077 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.077 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.078 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.078 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.078 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.078 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.078 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.078 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.078 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.079 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.079 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.079 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.079 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.079 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.079 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.079 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.080 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.080 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.080 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.080 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.080 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.080 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.081 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.081 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.081 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.081 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.081 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.081 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.081 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.081 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.082 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.082 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.082 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.082 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.082 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.082 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.082 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.083 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.083 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.083 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.083 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.083 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.083 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.083 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.084 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.084 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.084 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.084 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.084 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.084 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.085 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.085 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.085 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.085 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.085 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.085 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.085 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.086 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.086 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.086 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.086 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.086 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.086 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.087 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.087 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.087 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.087 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.087 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.087 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.087 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.088 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.088 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.088 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.088 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.088 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.088 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.089 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.089 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.089 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.089 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.089 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.089 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.089 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.090 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.090 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.090 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.090 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.090 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.090 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.091 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.091 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.091 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.091 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.091 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.091 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.091 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.092 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.092 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.092 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.092 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.092 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.092 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.092 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.093 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.093 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.093 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.093 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.093 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.093 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.093 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.094 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.094 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.094 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.094 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.094 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.094 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.095 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.095 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.095 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.095 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.095 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.095 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.096 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.096 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.096 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.096 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.096 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.096 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.096 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.097 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.097 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.097 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.097 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.097 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.097 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.097 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.098 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.098 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.098 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.098 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.098 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.098 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.098 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.099 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.099 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.099 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.099 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.099 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.099 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.099 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.100 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.100 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.100 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.100 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.100 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.100 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.101 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.101 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.101 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.101 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.101 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.101 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.102 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.102 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.102 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.102 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.102 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.102 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.102 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.102 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.103 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.103 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.103 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.103 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.103 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.103 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.103 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.104 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.104 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.104 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.104 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.104 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.104 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.105 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.105 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.105 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.105 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.105 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.106 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.106 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.106 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.106 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.106 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.107 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.107 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.107 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.107 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.107 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.108 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.108 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.108 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.108 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.109 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.109 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.109 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.109 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.109 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.110 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.110 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.110 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.110 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.110 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.111 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.111 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.111 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.111 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.111 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.112 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.112 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.112 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.112 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.112 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.113 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.113 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.113 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.113 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.113 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.114 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.114 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.114 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.114 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.114 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.115 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.115 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.115 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.115 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.115 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.116 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.116 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.116 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.116 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.116 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.117 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.117 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.117 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.117 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.117 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.118 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.118 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.118 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.118 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.119 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.119 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.119 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.119 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.119 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.120 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.120 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.120 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.121 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.121 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.121 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.121 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.122 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.122 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.122 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.122 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.122 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.123 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.123 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.123 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.123 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.124 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.124 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.124 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.124 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.125 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.125 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.125 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.125 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.125 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.126 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.126 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.126 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.126 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.126 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.127 222021 DEBUG oslo_service.service [None req-fb1557ce-7f72-4e37-8332-8690ab76d47c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.128 222021 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 23 04:22:55 np0005593233 systemd[1]: session-49.scope: Deactivated successfully.
Jan 23 04:22:55 np0005593233 systemd[1]: session-49.scope: Consumed 2min 5.822s CPU time.
Jan 23 04:22:55 np0005593233 systemd-logind[804]: Session 49 logged out. Waiting for processes to exit.
Jan 23 04:22:55 np0005593233 systemd-logind[804]: Removed session 49.
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.154 222021 INFO nova.virt.node [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Determined node identity 929812a2-38ca-4ee7-9f24-090d633cb42b from /var/lib/nova/compute_id#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.155 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.156 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.156 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.156 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.167 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f514b2e2640> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.169 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f514b2e2640> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.170 222021 INFO nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.177 222021 INFO nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <host>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <uuid>5e159ac4-110b-464c-8264-d020fcde6246</uuid>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <arch>x86_64</arch>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model>EPYC-Rome-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <vendor>AMD</vendor>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <microcode version='16777317'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <signature family='23' model='49' stepping='0'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='x2apic'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='tsc-deadline'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='osxsave'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='hypervisor'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='tsc_adjust'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='spec-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='stibp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='arch-capabilities'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='cmp_legacy'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='topoext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='virt-ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='lbrv'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='tsc-scale'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='vmcb-clean'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='pause-filter'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='pfthreshold'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='svme-addr-chk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='rdctl-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='skip-l1dfl-vmentry'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='mds-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature name='pschange-mc-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <pages unit='KiB' size='4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <pages unit='KiB' size='2048'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <pages unit='KiB' size='1048576'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <power_management>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <suspend_mem/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </power_management>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <iommu support='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <migration_features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <live/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <uri_transports>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <uri_transport>tcp</uri_transport>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <uri_transport>rdma</uri_transport>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </uri_transports>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </migration_features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <topology>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <cells num='1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <cell id='0'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:          <memory unit='KiB'>7864308</memory>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:          <pages unit='KiB' size='4'>1966077</pages>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:          <pages unit='KiB' size='2048'>0</pages>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:          <distances>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:            <sibling id='0' value='10'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:          </distances>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:          <cpus num='8'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:          </cpus>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        </cell>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </cells>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </topology>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <cache>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </cache>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <secmodel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model>selinux</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <doi>0</doi>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </secmodel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <secmodel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model>dac</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <doi>0</doi>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </secmodel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </host>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <guest>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <os_type>hvm</os_type>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <arch name='i686'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <wordsize>32</wordsize>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <domain type='qemu'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <domain type='kvm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </arch>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <pae/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <nonpae/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <acpi default='on' toggle='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <apic default='on' toggle='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <cpuselection/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <deviceboot/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <disksnapshot default='on' toggle='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <externalSnapshot/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </guest>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <guest>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <os_type>hvm</os_type>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <arch name='x86_64'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <wordsize>64</wordsize>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <domain type='qemu'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <domain type='kvm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </arch>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <acpi default='on' toggle='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <apic default='on' toggle='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <cpuselection/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <deviceboot/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <disksnapshot default='on' toggle='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <externalSnapshot/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </guest>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 
Jan 23 04:22:55 np0005593233 nova_compute[222017]: </capabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: #033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.185 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.188 222021 DEBUG nova.virt.libvirt.volume.mount [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.188 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 04:22:55 np0005593233 nova_compute[222017]: <domainCapabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <domain>kvm</domain>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <arch>i686</arch>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <vcpu max='4096'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <iothreads supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <os supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <enum name='firmware'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <loader supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>rom</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pflash</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='readonly'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>yes</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>no</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='secure'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>no</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </loader>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>on</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>off</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='maximumMigratable'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>on</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>off</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <vendor>AMD</vendor>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='succor'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='custom' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ddpd-u'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sha512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ddpd-u'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sha512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbpb'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbpb'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-128'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-256'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-128'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-256'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='KnightsMill'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512er'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512pf'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512er'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512pf'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tbm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tbm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='athlon'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='athlon-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='core2duo'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='core2duo-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='coreduo'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='coreduo-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='n270'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='n270-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='phenom'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='phenom-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <memoryBacking supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <enum name='sourceType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>file</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>anonymous</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>memfd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </memoryBacking>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <disk supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='diskDevice'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>disk</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>cdrom</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>floppy</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>lun</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='bus'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>fdc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>scsi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>sata</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-non-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <graphics supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vnc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>egl-headless</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dbus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </graphics>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <video supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='modelType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vga</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>cirrus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>none</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>bochs</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ramfb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <hostdev supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='mode'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>subsystem</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='startupPolicy'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>default</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>mandatory</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>requisite</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>optional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='subsysType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pci</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>scsi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='capsType'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='pciBackend'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </hostdev>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <rng supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-non-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>random</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>egd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>builtin</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <filesystem supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='driverType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>path</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>handle</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtiofs</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </filesystem>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <tpm supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tpm-tis</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tpm-crb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>emulator</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>external</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendVersion'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>2.0</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </tpm>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <redirdev supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='bus'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </redirdev>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <channel supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pty</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>unix</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </channel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <crypto supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>qemu</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>builtin</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </crypto>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <interface supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>default</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>passt</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <panic supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>isa</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>hyperv</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </panic>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <console supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>null</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pty</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dev</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>file</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pipe</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>stdio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>udp</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tcp</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>unix</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>qemu-vdagent</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dbus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </console>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <gic supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <genid supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <backup supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <async-teardown supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <s390-pv supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <ps2 supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <tdx supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <sev supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <sgx supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <hyperv supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='features'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>relaxed</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vapic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>spinlocks</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vpindex</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>runtime</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>synic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>stimer</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>reset</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vendor_id</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>frequencies</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>reenlightenment</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tlbflush</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ipi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>avic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>emsr_bitmap</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>xmm_input</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <defaults>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </defaults>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </hyperv>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <launchSecurity supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: </domainCapabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.197 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 04:22:55 np0005593233 nova_compute[222017]: <domainCapabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <domain>kvm</domain>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <arch>i686</arch>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <vcpu max='240'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <iothreads supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <os supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <enum name='firmware'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <loader supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>rom</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pflash</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='readonly'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>yes</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>no</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='secure'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>no</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </loader>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>on</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>off</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='maximumMigratable'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>on</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>off</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <vendor>AMD</vendor>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='succor'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='custom' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ddpd-u'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sha512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ddpd-u'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sha512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbpb'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbpb'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-128'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-256'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-128'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-256'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='KnightsMill'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512er'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512pf'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512er'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512pf'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tbm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tbm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='athlon'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='athlon-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='core2duo'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='core2duo-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='coreduo'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='coreduo-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='n270'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='n270-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='phenom'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='phenom-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <memoryBacking supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <enum name='sourceType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>file</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>anonymous</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>memfd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </memoryBacking>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <disk supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='diskDevice'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>disk</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>cdrom</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>floppy</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>lun</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='bus'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ide</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>fdc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>scsi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>sata</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-non-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <graphics supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vnc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>egl-headless</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dbus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </graphics>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <video supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='modelType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vga</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>cirrus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>none</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>bochs</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ramfb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <hostdev supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='mode'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>subsystem</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='startupPolicy'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>default</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>mandatory</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>requisite</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>optional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='subsysType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pci</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>scsi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='capsType'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='pciBackend'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </hostdev>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <rng supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-non-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>random</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>egd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>builtin</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <filesystem supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='driverType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>path</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>handle</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtiofs</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </filesystem>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <tpm supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tpm-tis</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tpm-crb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>emulator</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>external</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendVersion'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>2.0</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </tpm>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <redirdev supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='bus'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </redirdev>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <channel supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pty</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>unix</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </channel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <crypto supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>qemu</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>builtin</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </crypto>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <interface supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>default</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>passt</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <panic supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>isa</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>hyperv</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </panic>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <console supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>null</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pty</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dev</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>file</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pipe</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>stdio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>udp</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tcp</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>unix</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>qemu-vdagent</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dbus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </console>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <gic supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <genid supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <backup supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <async-teardown supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <s390-pv supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <ps2 supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <tdx supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <sev supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <sgx supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <hyperv supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='features'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>relaxed</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vapic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>spinlocks</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vpindex</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>runtime</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>synic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>stimer</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>reset</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vendor_id</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>frequencies</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>reenlightenment</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tlbflush</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ipi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>avic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>emsr_bitmap</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>xmm_input</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <defaults>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </defaults>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </hyperv>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <launchSecurity supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: </domainCapabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.265 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.270 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 04:22:55 np0005593233 nova_compute[222017]: <domainCapabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <domain>kvm</domain>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <arch>x86_64</arch>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <vcpu max='240'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <iothreads supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <os supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <enum name='firmware'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <loader supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>rom</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pflash</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='readonly'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>yes</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>no</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='secure'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>no</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </loader>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>on</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>off</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='maximumMigratable'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>on</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>off</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <vendor>AMD</vendor>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='succor'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='custom' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ddpd-u'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sha512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ddpd-u'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sha512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbpb'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbpb'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-128'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-256'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-128'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-256'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='KnightsMill'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512er'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512pf'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512er'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512pf'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tbm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tbm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='athlon'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='athlon-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='core2duo'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='core2duo-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='coreduo'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='coreduo-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='n270'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='n270-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='phenom'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='phenom-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <memoryBacking supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <enum name='sourceType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>file</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>anonymous</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>memfd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </memoryBacking>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <disk supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='diskDevice'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>disk</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>cdrom</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>floppy</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>lun</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='bus'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ide</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>fdc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>scsi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>sata</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-non-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <graphics supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vnc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>egl-headless</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dbus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </graphics>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <video supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='modelType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vga</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>cirrus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>none</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>bochs</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ramfb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <hostdev supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='mode'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>subsystem</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='startupPolicy'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>default</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>mandatory</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>requisite</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>optional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='subsysType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pci</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>scsi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='capsType'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='pciBackend'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </hostdev>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <rng supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-non-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>random</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>egd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>builtin</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <filesystem supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='driverType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>path</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>handle</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtiofs</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </filesystem>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <tpm supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tpm-tis</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tpm-crb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>emulator</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>external</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendVersion'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>2.0</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </tpm>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <redirdev supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='bus'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </redirdev>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <channel supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pty</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>unix</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </channel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <crypto supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>qemu</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>builtin</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </crypto>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <interface supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>default</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>passt</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <panic supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>isa</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>hyperv</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </panic>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <console supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>null</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pty</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dev</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>file</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pipe</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>stdio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>udp</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tcp</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>unix</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>qemu-vdagent</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dbus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </console>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <gic supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <genid supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <backup supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <async-teardown supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <s390-pv supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <ps2 supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <tdx supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <sev supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <sgx supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <hyperv supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='features'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>relaxed</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vapic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>spinlocks</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vpindex</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>runtime</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>synic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>stimer</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>reset</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vendor_id</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>frequencies</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>reenlightenment</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tlbflush</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ipi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>avic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>emsr_bitmap</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>xmm_input</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <defaults>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </defaults>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </hyperv>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <launchSecurity supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: </domainCapabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.352 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 04:22:55 np0005593233 nova_compute[222017]: <domainCapabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <domain>kvm</domain>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <arch>x86_64</arch>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <vcpu max='4096'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <iothreads supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <os supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <enum name='firmware'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>efi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <loader supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>rom</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pflash</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='readonly'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>yes</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>no</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='secure'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>yes</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>no</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </loader>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>on</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>off</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='maximumMigratable'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>on</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>off</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <vendor>AMD</vendor>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='succor'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <mode name='custom' supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ddpd-u'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sha512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ddpd-u'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sha512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm3'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sm4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Denverton-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbpb'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amd-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='auto-ibrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='perfmon-v2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbpb'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='stibp-always-on'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='EPYC-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-128'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-256'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-128'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-256'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx10-512'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='prefetchiti'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Haswell-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='KnightsMill'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512er'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512pf'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512er'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512pf'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tbm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fma4'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tbm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xop'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='amx-tile'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-bf16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-fp16'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bitalg'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrc'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fzrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='la57'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='taa-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ifma'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cmpccxadd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fbsdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='fsrs'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ibrs-all'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='intel-psfd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='lam'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mcdt-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pbrsb-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='psdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='serialize'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vaes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='hle'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='rtm'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512bw'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512cd'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512dq'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512f'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='avx512vl'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='invpcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pcid'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='pku'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='mpx'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='core-capability'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='split-lock-detect'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='cldemote'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='erms'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='gfni'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdir64b'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='movdiri'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='xsaves'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='athlon'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='athlon-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='core2duo'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='core2duo-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='coreduo'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='coreduo-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='n270'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='n270-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='ss'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='phenom'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <blockers model='phenom-v1'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnow'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <feature name='3dnowext'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </blockers>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </mode>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <memoryBacking supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <enum name='sourceType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>file</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>anonymous</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <value>memfd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </memoryBacking>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <disk supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='diskDevice'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>disk</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>cdrom</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>floppy</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>lun</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='bus'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>fdc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>scsi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>sata</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-non-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <graphics supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vnc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>egl-headless</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dbus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </graphics>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <video supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='modelType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vga</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>cirrus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>none</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>bochs</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ramfb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <hostdev supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='mode'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>subsystem</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='startupPolicy'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>default</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>mandatory</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>requisite</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>optional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='subsysType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pci</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>scsi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='capsType'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='pciBackend'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </hostdev>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <rng supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtio-non-transitional</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>random</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>egd</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>builtin</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <filesystem supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='driverType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>path</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>handle</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>virtiofs</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </filesystem>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <tpm supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tpm-tis</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tpm-crb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>emulator</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>external</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendVersion'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>2.0</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </tpm>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <redirdev supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='bus'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>usb</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </redirdev>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <channel supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pty</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>unix</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </channel>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <crypto supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>qemu</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendModel'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>builtin</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </crypto>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <interface supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='backendType'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>default</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>passt</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <panic supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='model'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>isa</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>hyperv</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </panic>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <console supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='type'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>null</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vc</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pty</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dev</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>file</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>pipe</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>stdio</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>udp</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tcp</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>unix</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>qemu-vdagent</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>dbus</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </console>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <gic supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <genid supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <backup supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <async-teardown supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <s390-pv supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <ps2 supported='yes'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <tdx supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <sev supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <sgx supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <hyperv supported='yes'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <enum name='features'>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>relaxed</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vapic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>spinlocks</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vpindex</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>runtime</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>synic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>stimer</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>reset</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>vendor_id</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>frequencies</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>reenlightenment</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>tlbflush</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>ipi</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>avic</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>emsr_bitmap</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <value>xmm_input</value>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </enum>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      <defaults>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:      </defaults>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    </hyperv>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:    <launchSecurity supported='no'/>
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: </domainCapabilities>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.417 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.418 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.418 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.421 222021 INFO nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Secure Boot support detected#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.423 222021 INFO nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.424 222021 INFO nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.433 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] cpu compare xml: <cpu match="exact">
Jan 23 04:22:55 np0005593233 nova_compute[222017]:  <model>Nehalem</model>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: </cpu>
Jan 23 04:22:55 np0005593233 nova_compute[222017]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.436 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.466 222021 INFO nova.virt.node [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Determined node identity 929812a2-38ca-4ee7-9f24-090d633cb42b from /var/lib/nova/compute_id#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.487 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Verified node 929812a2-38ca-4ee7-9f24-090d633cb42b matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.514 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.593 222021 DEBUG oslo_concurrency.lockutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.594 222021 DEBUG oslo_concurrency.lockutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.594 222021 DEBUG oslo_concurrency.lockutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.594 222021 DEBUG nova.compute.resource_tracker [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:22:55 np0005593233 nova_compute[222017]: 2026-01-23 09:22:55.595 222021 DEBUG oslo_concurrency.processutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:56.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:56.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:22:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3208160034' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.030 222021 DEBUG oslo_concurrency.processutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.238 222021 WARNING nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.240 222021 DEBUG nova.compute.resource_tracker [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5264MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.240 222021 DEBUG oslo_concurrency.lockutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.240 222021 DEBUG oslo_concurrency.lockutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.470 222021 DEBUG nova.compute.resource_tracker [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.470 222021 DEBUG nova.compute.resource_tracker [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.499 222021 DEBUG nova.scheduler.client.report [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.567 222021 DEBUG nova.scheduler.client.report [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.567 222021 DEBUG nova.compute.provider_tree [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.586 222021 DEBUG nova.scheduler.client.report [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.600 222021 DEBUG nova.scheduler.client.report [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:22:57 np0005593233 nova_compute[222017]: 2026-01-23 09:22:57.625 222021 DEBUG oslo_concurrency.processutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:22:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/898954787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.117 222021 DEBUG oslo_concurrency.processutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.123 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 23 04:22:58 np0005593233 nova_compute[222017]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.123 222021 INFO nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] kernel doesn't support AMD SEV#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.124 222021 DEBUG nova.compute.provider_tree [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.125 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.128 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Libvirt baseline CPU <cpu>
Jan 23 04:22:58 np0005593233 nova_compute[222017]:  <arch>x86_64</arch>
Jan 23 04:22:58 np0005593233 nova_compute[222017]:  <model>Nehalem</model>
Jan 23 04:22:58 np0005593233 nova_compute[222017]:  <vendor>AMD</vendor>
Jan 23 04:22:58 np0005593233 nova_compute[222017]:  <topology sockets="8" cores="1" threads="1"/>
Jan 23 04:22:58 np0005593233 nova_compute[222017]: </cpu>
Jan 23 04:22:58 np0005593233 nova_compute[222017]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.173 222021 DEBUG nova.scheduler.client.report [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.261 222021 DEBUG nova.compute.provider_tree [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Updating resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.307 222021 DEBUG nova.compute.resource_tracker [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.307 222021 DEBUG oslo_concurrency.lockutils [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.307 222021 DEBUG nova.service [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.385 222021 DEBUG nova.service [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.386 222021 DEBUG nova.servicegroup.drivers.db [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.388 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:22:58 np0005593233 nova_compute[222017]: 2026-01-23 09:22:58.413 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:22:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:22:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:58.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:22:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:22:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:58.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:22:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:23:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:23:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:00.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:23:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:00.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:23:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:23:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:23:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:02.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:23:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:23:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:23:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:02.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:23:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:04 np0005593233 podman[222353]: 2026-01-23 09:23:04.10378815 +0000 UTC m=+0.110674684 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Jan 23 04:23:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:23:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:23:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:04.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:23:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:04.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:23:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:23:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:23:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:06.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:23:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:06.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:23:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:23:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:08.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:23:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:23:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:08.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:11 np0005593233 nova_compute[222017]: 2026-01-23 09:30:11.677 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:11.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:11.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:11 np0005593233 rsyslogd[1009]: imjournal: 3155 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 23 04:30:13 np0005593233 nova_compute[222017]: 2026-01-23 09:30:13.168 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:13.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:13.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:14 np0005593233 nova_compute[222017]: 2026-01-23 09:30:14.354 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "f3277436-85d0-4674-aa69-d7a50448a5d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:14 np0005593233 nova_compute[222017]: 2026-01-23 09:30:14.354 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:14 np0005593233 nova_compute[222017]: 2026-01-23 09:30:14.379 222021 DEBUG nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:30:14 np0005593233 nova_compute[222017]: 2026-01-23 09:30:14.657 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:14 np0005593233 nova_compute[222017]: 2026-01-23 09:30:14.658 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:14 np0005593233 nova_compute[222017]: 2026-01-23 09:30:14.666 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:30:14 np0005593233 nova_compute[222017]: 2026-01-23 09:30:14.666 222021 INFO nova.compute.claims [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:30:14 np0005593233 nova_compute[222017]: 2026-01-23 09:30:14.784 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:30:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1082442477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.259 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.265 222021 DEBUG nova.compute.provider_tree [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.302 222021 DEBUG nova.scheduler.client.report [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.356 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.358 222021 DEBUG nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.431 222021 DEBUG nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.432 222021 DEBUG nova.network.neutron [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.454 222021 INFO nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.478 222021 DEBUG nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.597 222021 DEBUG nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.598 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.598 222021 INFO nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Creating image(s)#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.632 222021 DEBUG nova.storage.rbd_utils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f3277436-85d0-4674-aa69-d7a50448a5d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.668 222021 DEBUG nova.storage.rbd_utils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f3277436-85d0-4674-aa69-d7a50448a5d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.701 222021 DEBUG nova.storage.rbd_utils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f3277436-85d0-4674-aa69-d7a50448a5d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.706 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:15.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:15.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.778 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.779 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.780 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.780 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.814 222021 DEBUG nova.storage.rbd_utils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f3277436-85d0-4674-aa69-d7a50448a5d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:30:15 np0005593233 nova_compute[222017]: 2026-01-23 09:30:15.819 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f3277436-85d0-4674-aa69-d7a50448a5d0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:16 np0005593233 nova_compute[222017]: 2026-01-23 09:30:16.526 222021 DEBUG nova.network.neutron [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:30:16 np0005593233 nova_compute[222017]: 2026-01-23 09:30:16.526 222021 DEBUG nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:30:16 np0005593233 nova_compute[222017]: 2026-01-23 09:30:16.680 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:16 np0005593233 nova_compute[222017]: 2026-01-23 09:30:16.804 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f3277436-85d0-4674-aa69-d7a50448a5d0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.985s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:16 np0005593233 nova_compute[222017]: 2026-01-23 09:30:16.888 222021 DEBUG nova.storage.rbd_utils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] resizing rbd image f3277436-85d0-4674-aa69-d7a50448a5d0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.025 222021 DEBUG nova.objects.instance [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'migration_context' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.049 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.049 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Ensure instance console log exists: /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.050 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.050 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.050 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.052 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.057 222021 WARNING nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.062 222021 DEBUG nova.virt.libvirt.host [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.063 222021 DEBUG nova.virt.libvirt.host [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.067 222021 DEBUG nova.virt.libvirt.host [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.068 222021 DEBUG nova.virt.libvirt.host [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.069 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.070 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.070 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.070 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.070 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.071 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.071 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.071 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.071 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.071 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.071 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.072 222021 DEBUG nova.virt.hardware [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.075 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:30:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/370373190' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.574 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.606 222021 DEBUG nova.storage.rbd_utils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f3277436-85d0-4674-aa69-d7a50448a5d0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:30:17 np0005593233 nova_compute[222017]: 2026-01-23 09:30:17.612 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:17.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:17.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:30:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/664398266' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.087 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.089 222021 DEBUG nova.objects.instance [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'pci_devices' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:18 np0005593233 podman[226038]: 2026-01-23 09:30:18.099102428 +0000 UTC m=+0.108459558 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.110 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <uuid>f3277436-85d0-4674-aa69-d7a50448a5d0</uuid>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <name>instance-00000008</name>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <nova:name>tempest-MigrationsAdminTest-server-1070439771</nova:name>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:30:17</nova:creationTime>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <nova:user uuid="7536fa2e625541fba613dc32a49a4c5b">tempest-MigrationsAdminTest-2056264627-project-member</nova:user>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <nova:project uuid="11def90dfdc14cfe928302bec2835794">tempest-MigrationsAdminTest-2056264627</nova:project>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <entry name="serial">f3277436-85d0-4674-aa69-d7a50448a5d0</entry>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <entry name="uuid">f3277436-85d0-4674-aa69-d7a50448a5d0</entry>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/f3277436-85d0-4674-aa69-d7a50448a5d0_disk">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/f3277436-85d0-4674-aa69-d7a50448a5d0_disk.config">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/console.log" append="off"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:30:18 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:30:18 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:30:18 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:30:18 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.170 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.186 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.186 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.186 222021 INFO nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Using config drive
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.216 222021 DEBUG nova.storage.rbd_utils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f3277436-85d0-4674-aa69-d7a50448a5d0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.523 222021 INFO nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Creating config drive at /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/disk.config
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.529 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn69kujqd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.606 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.664 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn69kujqd" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:30:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.696 222021 DEBUG nova.storage.rbd_utils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f3277436-85d0-4674-aa69-d7a50448a5d0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.701 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/disk.config f3277436-85d0-4674-aa69-d7a50448a5d0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.867 222021 DEBUG oslo_concurrency.processutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/disk.config f3277436-85d0-4674-aa69-d7a50448a5d0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:30:18 np0005593233 nova_compute[222017]: 2026-01-23 09:30:18.869 222021 INFO nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Deleting local config drive /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/disk.config because it was imported into RBD.
Jan 23 04:30:18 np0005593233 systemd-machined[190954]: New machine qemu-3-instance-00000008.
Jan 23 04:30:18 np0005593233 systemd[1]: Started Virtual Machine qemu-3-instance-00000008.
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.420 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160619.419688, f3277436-85d0-4674-aa69-d7a50448a5d0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.421 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] VM Resumed (Lifecycle Event)
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.424 222021 DEBUG nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.425 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.429 222021 INFO nova.virt.libvirt.driver [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance spawned successfully.
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.429 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.455 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.460 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.461 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.461 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.462 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.462 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.463 222021 DEBUG nova.virt.libvirt.driver [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.469 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.515 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.516 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160619.4210172, f3277436-85d0-4674-aa69-d7a50448a5d0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.516 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] VM Started (Lifecycle Event)
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.547 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.551 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.559 222021 INFO nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Took 3.96 seconds to spawn the instance on the hypervisor.
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.560 222021 DEBUG nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.576 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.639 222021 INFO nova.compute.manager [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Took 5.02 seconds to build instance.
Jan 23 04:30:19 np0005593233 nova_compute[222017]: 2026-01-23 09:30:19.658 222021 DEBUG oslo_concurrency.lockutils [None req-244f761d-fae2-4c5b-be3b-65c206a3c7e2 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:30:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:19.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:19.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:21 np0005593233 nova_compute[222017]: 2026-01-23 09:30:21.682 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:21.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:21.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:23 np0005593233 nova_compute[222017]: 2026-01-23 09:30:23.172 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:30:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:23.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:30:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:23.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:24 np0005593233 nova_compute[222017]: 2026-01-23 09:30:24.581 222021 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:30:24 np0005593233 nova_compute[222017]: 2026-01-23 09:30:24.582 222021 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquired lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:30:24 np0005593233 nova_compute[222017]: 2026-01-23 09:30:24.582 222021 DEBUG nova.network.neutron [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:30:24 np0005593233 nova_compute[222017]: 2026-01-23 09:30:24.782 222021 DEBUG nova.network.neutron [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:30:25 np0005593233 nova_compute[222017]: 2026-01-23 09:30:25.307 222021 DEBUG nova.network.neutron [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:30:25 np0005593233 nova_compute[222017]: 2026-01-23 09:30:25.348 222021 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Releasing lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:30:25 np0005593233 nova_compute[222017]: 2026-01-23 09:30:25.483 222021 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 23 04:30:25 np0005593233 nova_compute[222017]: 2026-01-23 09:30:25.484 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Creating file /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/a1ce760d14f34f788ac27c31d60ed090.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 23 04:30:25 np0005593233 nova_compute[222017]: 2026-01-23 09:30:25.485 222021 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/a1ce760d14f34f788ac27c31d60ed090.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:30:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:25.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:25.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:25 np0005593233 nova_compute[222017]: 2026-01-23 09:30:25.929 222021 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/a1ce760d14f34f788ac27c31d60ed090.tmp" returned: 1 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:30:25 np0005593233 nova_compute[222017]: 2026-01-23 09:30:25.930 222021 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/a1ce760d14f34f788ac27c31d60ed090.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 04:30:25 np0005593233 nova_compute[222017]: 2026-01-23 09:30:25.931 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Creating directory /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 23 04:30:25 np0005593233 nova_compute[222017]: 2026-01-23 09:30:25.931 222021 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:30:26 np0005593233 nova_compute[222017]: 2026-01-23 09:30:26.137 222021 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:30:26 np0005593233 nova_compute[222017]: 2026-01-23 09:30:26.143 222021 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 04:30:26 np0005593233 nova_compute[222017]: 2026-01-23 09:30:26.685 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:27.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:30:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:27.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:30:28 np0005593233 nova_compute[222017]: 2026-01-23 09:30:28.175 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:29.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:29.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:31 np0005593233 nova_compute[222017]: 2026-01-23 09:30:31.688 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:31.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:31.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:32 np0005593233 podman[226186]: 2026-01-23 09:30:32.042849405 +0000 UTC m=+0.061507859 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:30:33 np0005593233 nova_compute[222017]: 2026-01-23 09:30:33.180 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:33.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:33.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:35.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:35.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:36 np0005593233 nova_compute[222017]: 2026-01-23 09:30:36.201 222021 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:30:36 np0005593233 nova_compute[222017]: 2026-01-23 09:30:36.692 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.377379) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637377486, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2407, "num_deletes": 251, "total_data_size": 5756428, "memory_usage": 5818688, "flush_reason": "Manual Compaction"}
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637432000, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3768804, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22783, "largest_seqno": 25185, "table_properties": {"data_size": 3758978, "index_size": 6192, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20542, "raw_average_key_size": 20, "raw_value_size": 3739190, "raw_average_value_size": 3735, "num_data_blocks": 274, "num_entries": 1001, "num_filter_entries": 1001, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160432, "oldest_key_time": 1769160432, "file_creation_time": 1769160637, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 54720 microseconds, and 13248 cpu microseconds.
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.432097) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3768804 bytes OK
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.432160) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.442119) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.442167) EVENT_LOG_v1 {"time_micros": 1769160637442158, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.442206) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5745616, prev total WAL file size 5745616, number of live WAL files 2.
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.444612) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3680KB)], [48(7436KB)]
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637444727, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 11384095, "oldest_snapshot_seqno": -1}
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 4868 keys, 9350133 bytes, temperature: kUnknown
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637542323, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 9350133, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9316298, "index_size": 20541, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 122855, "raw_average_key_size": 25, "raw_value_size": 9226811, "raw_average_value_size": 1895, "num_data_blocks": 842, "num_entries": 4868, "num_filter_entries": 4868, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769160637, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.542737) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 9350133 bytes
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.556200) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 116.4 rd, 95.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 7.3 +0.0 blob) out(8.9 +0.0 blob), read-write-amplify(5.5) write-amplify(2.5) OK, records in: 5389, records dropped: 521 output_compression: NoCompression
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.556246) EVENT_LOG_v1 {"time_micros": 1769160637556230, "job": 28, "event": "compaction_finished", "compaction_time_micros": 97763, "compaction_time_cpu_micros": 30326, "output_level": 6, "num_output_files": 1, "total_output_size": 9350133, "num_input_records": 5389, "num_output_records": 4868, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637557243, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637558684, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.444397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.559025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.559037) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.559039) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.559041) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:30:37.559043) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:37.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:37.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:38 np0005593233 nova_compute[222017]: 2026-01-23 09:30:38.183 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:39.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:39.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:41 np0005593233 nova_compute[222017]: 2026-01-23 09:30:41.695 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:41.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:41.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:42 np0005593233 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 23 04:30:42 np0005593233 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Consumed 14.434s CPU time.
Jan 23 04:30:42 np0005593233 systemd-machined[190954]: Machine qemu-3-instance-00000008 terminated.
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.302 222021 INFO nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance shutdown successfully after 16 seconds.#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.309 222021 INFO nova.virt.libvirt.driver [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance destroyed successfully.#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.314 222021 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.315 222021 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.550 222021 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.550 222021 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.568 222021 INFO nova.compute.rpcapi [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.569 222021 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:30:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:30:42.596 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:30:42.597 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:30:42.597 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.600 222021 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "f3277436-85d0-4674-aa69-d7a50448a5d0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.600 222021 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:42 np0005593233 nova_compute[222017]: 2026-01-23 09:30:42.601 222021 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:43 np0005593233 nova_compute[222017]: 2026-01-23 09:30:43.223 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:43.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:43.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:45.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:45.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e146 e146: 3 total, 3 up, 3 in
Jan 23 04:30:46 np0005593233 nova_compute[222017]: 2026-01-23 09:30:46.698 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:47.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:47.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:48 np0005593233 nova_compute[222017]: 2026-01-23 09:30:48.225 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:49 np0005593233 podman[226209]: 2026-01-23 09:30:49.138772897 +0000 UTC m=+0.133033649 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 04:30:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:49.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:49.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:50 np0005593233 nova_compute[222017]: 2026-01-23 09:30:50.478 222021 DEBUG oslo_concurrency.lockutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "f3277436-85d0-4674-aa69-d7a50448a5d0" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:50 np0005593233 nova_compute[222017]: 2026-01-23 09:30:50.478 222021 DEBUG oslo_concurrency.lockutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:50 np0005593233 nova_compute[222017]: 2026-01-23 09:30:50.479 222021 DEBUG nova.compute.manager [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 23 04:30:51 np0005593233 nova_compute[222017]: 2026-01-23 09:30:51.061 222021 DEBUG oslo_concurrency.lockutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:30:51 np0005593233 nova_compute[222017]: 2026-01-23 09:30:51.062 222021 DEBUG oslo_concurrency.lockutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:30:51 np0005593233 nova_compute[222017]: 2026-01-23 09:30:51.062 222021 DEBUG nova.network.neutron [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:30:51 np0005593233 nova_compute[222017]: 2026-01-23 09:30:51.062 222021 DEBUG nova.objects.instance [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'info_cache' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:51 np0005593233 nova_compute[222017]: 2026-01-23 09:30:51.440 222021 DEBUG nova.network.neutron [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:30:51 np0005593233 nova_compute[222017]: 2026-01-23 09:30:51.700 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:51.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:51.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:51 np0005593233 nova_compute[222017]: 2026-01-23 09:30:51.824 222021 DEBUG nova.network.neutron [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:30:51 np0005593233 nova_compute[222017]: 2026-01-23 09:30:51.843 222021 DEBUG oslo_concurrency.lockutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:30:51 np0005593233 nova_compute[222017]: 2026-01-23 09:30:51.844 222021 DEBUG nova.objects.instance [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'migration_context' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:52 np0005593233 nova_compute[222017]: 2026-01-23 09:30:52.147 222021 DEBUG nova.storage.rbd_utils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] removing snapshot(nova-resize) on rbd image(f3277436-85d0-4674-aa69-d7a50448a5d0_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:30:53 np0005593233 nova_compute[222017]: 2026-01-23 09:30:53.228 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e147 e147: 3 total, 3 up, 3 in
Jan 23 04:30:53 np0005593233 nova_compute[222017]: 2026-01-23 09:30:53.518 222021 DEBUG oslo_concurrency.lockutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:53 np0005593233 nova_compute[222017]: 2026-01-23 09:30:53.519 222021 DEBUG oslo_concurrency.lockutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:53.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:53.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:54 np0005593233 nova_compute[222017]: 2026-01-23 09:30:54.056 222021 DEBUG oslo_concurrency.processutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:30:54 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/189163241' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:30:54 np0005593233 nova_compute[222017]: 2026-01-23 09:30:54.617 222021 DEBUG oslo_concurrency.processutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.561s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:54 np0005593233 nova_compute[222017]: 2026-01-23 09:30:54.626 222021 DEBUG nova.compute.provider_tree [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:30:54 np0005593233 nova_compute[222017]: 2026-01-23 09:30:54.646 222021 DEBUG nova.scheduler.client.report [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:30:54 np0005593233 nova_compute[222017]: 2026-01-23 09:30:54.700 222021 DEBUG oslo_concurrency.lockutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 1.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:30:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:30:54 np0005593233 nova_compute[222017]: 2026-01-23 09:30:54.805 222021 INFO nova.scheduler.client.report [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Deleted allocation for migration 0d413dd7-db9d-4d9c-972f-46c175c8f097#033[00m
Jan 23 04:30:55 np0005593233 nova_compute[222017]: 2026-01-23 09:30:55.039 222021 DEBUG oslo_concurrency.lockutils [None req-1e63d9f5-5e48-4ea9-9ba3-7e19fc0e989f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 4.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:55.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:30:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:55.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:56 np0005593233 ovn_controller[130653]: 2026-01-23T09:30:56Z|00035|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 04:30:56 np0005593233 nova_compute[222017]: 2026-01-23 09:30:56.703 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:57 np0005593233 nova_compute[222017]: 2026-01-23 09:30:57.304 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160642.301884, f3277436-85d0-4674-aa69-d7a50448a5d0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:30:57 np0005593233 nova_compute[222017]: 2026-01-23 09:30:57.304 222021 INFO nova.compute.manager [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:30:57 np0005593233 nova_compute[222017]: 2026-01-23 09:30:57.538 222021 DEBUG nova.compute.manager [None req-6d767f85-36be-48df-b1f1-312f2b4c2068 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:30:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:57.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:57.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:58 np0005593233 nova_compute[222017]: 2026-01-23 09:30:58.230 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:59 np0005593233 nova_compute[222017]: 2026-01-23 09:30:59.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:59 np0005593233 nova_compute[222017]: 2026-01-23 09:30:59.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:59 np0005593233 nova_compute[222017]: 2026-01-23 09:30:59.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:59 np0005593233 nova_compute[222017]: 2026-01-23 09:30:59.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:59 np0005593233 nova_compute[222017]: 2026-01-23 09:30:59.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:59 np0005593233 nova_compute[222017]: 2026-01-23 09:30:59.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:59 np0005593233 nova_compute[222017]: 2026-01-23 09:30:59.410 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:30:59 np0005593233 nova_compute[222017]: 2026-01-23 09:30:59.411 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:59.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:30:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:30:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:59.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:30:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:30:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1838908960' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:30:59 np0005593233 nova_compute[222017]: 2026-01-23 09:30:59.853 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:00 np0005593233 nova_compute[222017]: 2026-01-23 09:31:00.061 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:31:00 np0005593233 nova_compute[222017]: 2026-01-23 09:31:00.063 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4980MB free_disk=20.922042846679688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:31:00 np0005593233 nova_compute[222017]: 2026-01-23 09:31:00.063 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:00 np0005593233 nova_compute[222017]: 2026-01-23 09:31:00.064 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:00 np0005593233 nova_compute[222017]: 2026-01-23 09:31:00.224 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:31:00 np0005593233 nova_compute[222017]: 2026-01-23 09:31:00.225 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:31:00 np0005593233 nova_compute[222017]: 2026-01-23 09:31:00.248 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:31:00 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1860995013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:31:00 np0005593233 nova_compute[222017]: 2026-01-23 09:31:00.793 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.545s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:00 np0005593233 nova_compute[222017]: 2026-01-23 09:31:00.800 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:31:01 np0005593233 nova_compute[222017]: 2026-01-23 09:31:01.027 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:31:01 np0005593233 nova_compute[222017]: 2026-01-23 09:31:01.083 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:31:01 np0005593233 nova_compute[222017]: 2026-01-23 09:31:01.083 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:01 np0005593233 nova_compute[222017]: 2026-01-23 09:31:01.707 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:01.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:01.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:02 np0005593233 nova_compute[222017]: 2026-01-23 09:31:02.084 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:02 np0005593233 nova_compute[222017]: 2026-01-23 09:31:02.084 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:02 np0005593233 nova_compute[222017]: 2026-01-23 09:31:02.084 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:02 np0005593233 nova_compute[222017]: 2026-01-23 09:31:02.085 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:31:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:02.217 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:31:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:02.217 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:31:02 np0005593233 nova_compute[222017]: 2026-01-23 09:31:02.219 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:02 np0005593233 nova_compute[222017]: 2026-01-23 09:31:02.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:03 np0005593233 podman[226586]: 2026-01-23 09:31:03.051849644 +0000 UTC m=+0.061097468 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Jan 23 04:31:03 np0005593233 nova_compute[222017]: 2026-01-23 09:31:03.232 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:03 np0005593233 nova_compute[222017]: 2026-01-23 09:31:03.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:03 np0005593233 nova_compute[222017]: 2026-01-23 09:31:03.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:31:03 np0005593233 nova_compute[222017]: 2026-01-23 09:31:03.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:31:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e148 e148: 3 total, 3 up, 3 in
Jan 23 04:31:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:03.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:31:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:03.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:31:03 np0005593233 nova_compute[222017]: 2026-01-23 09:31:03.851 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:31:03 np0005593233 nova_compute[222017]: 2026-01-23 09:31:03.852 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:03 np0005593233 nova_compute[222017]: 2026-01-23 09:31:03.995 222021 DEBUG nova.virt.libvirt.driver [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Creating tmpfile /var/lib/nova/instances/tmpvzlwa358 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 23 04:31:04 np0005593233 nova_compute[222017]: 2026-01-23 09:31:04.185 222021 DEBUG nova.compute.manager [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvzlwa358',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.381 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Acquiring lock "32e0c90d-7129-4fbd-a6b7-9360775df43a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.382 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "32e0c90d-7129-4fbd-a6b7-9360775df43a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.401 222021 DEBUG nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.478 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.479 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.486 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.487 222021 INFO nova.compute.claims [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.684 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:31:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:31:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:05.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:05.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.846 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:05 np0005593233 nova_compute[222017]: 2026-01-23 09:31:05.981 222021 DEBUG nova.compute.manager [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvzlwa358',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='f62791ad-fc40-451f-b02a-ba991f2dbc32',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 23 04:31:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:31:06 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2953194909' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.151 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.157 222021 DEBUG nova.compute.provider_tree [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.203 222021 DEBUG nova.scheduler.client.report [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.228 222021 DEBUG oslo_concurrency.lockutils [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "refresh_cache-f62791ad-fc40-451f-b02a-ba991f2dbc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.229 222021 DEBUG oslo_concurrency.lockutils [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquired lock "refresh_cache-f62791ad-fc40-451f-b02a-ba991f2dbc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.229 222021 DEBUG nova.network.neutron [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.248 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.249 222021 DEBUG nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.597 222021 DEBUG nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.597 222021 DEBUG nova.network.neutron [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.709 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.902 222021 INFO nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.912 222021 DEBUG nova.network.neutron [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.913 222021 DEBUG nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:31:06 np0005593233 nova_compute[222017]: 2026-01-23 09:31:06.948 222021 DEBUG nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.328 222021 DEBUG nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.330 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.330 222021 INFO nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Creating image(s)#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.365 222021 DEBUG nova.storage.rbd_utils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] rbd image 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.414 222021 DEBUG nova.storage.rbd_utils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] rbd image 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.451 222021 DEBUG nova.storage.rbd_utils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] rbd image 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.456 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.522 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.524 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.526 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.526 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.555 222021 DEBUG nova.storage.rbd_utils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] rbd image 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.560 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:07 np0005593233 nova_compute[222017]: 2026-01-23 09:31:07.659 222021 DEBUG nova.network.neutron [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Updating instance_info_cache with network_info: [{"id": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "address": "fa:16:3e:d9:aa:f3", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap857f8a0c-0b", "ovs_interfaceid": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:31:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:07.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:31:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:07.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:31:08 np0005593233 nova_compute[222017]: 2026-01-23 09:31:08.172 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.613s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:08 np0005593233 nova_compute[222017]: 2026-01-23 09:31:08.243 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:08 np0005593233 nova_compute[222017]: 2026-01-23 09:31:08.249 222021 DEBUG nova.storage.rbd_utils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] resizing rbd image 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:31:08 np0005593233 nova_compute[222017]: 2026-01-23 09:31:08.379 222021 DEBUG nova.objects.instance [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lazy-loading 'migration_context' on Instance uuid 32e0c90d-7129-4fbd-a6b7-9360775df43a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:31:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:09.220 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:31:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:09.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:09.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.945 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.945 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Ensure instance console log exists: /var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.946 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.946 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.947 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.949 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.954 222021 WARNING nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.961 222021 DEBUG nova.virt.libvirt.host [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.961 222021 DEBUG nova.virt.libvirt.host [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.964 222021 DEBUG nova.virt.libvirt.host [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.965 222021 DEBUG nova.virt.libvirt.host [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.966 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.967 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.967 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.968 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.968 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.968 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.968 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.969 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.969 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.969 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.970 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.970 222021 DEBUG nova.virt.hardware [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:31:09 np0005593233 nova_compute[222017]: 2026-01-23 09:31:09.974 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.349 222021 DEBUG oslo_concurrency.lockutils [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Releasing lock "refresh_cache-f62791ad-fc40-451f-b02a-ba991f2dbc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.354 222021 DEBUG nova.virt.libvirt.driver [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvzlwa358',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='f62791ad-fc40-451f-b02a-ba991f2dbc32',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.355 222021 DEBUG nova.virt.libvirt.driver [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Creating instance directory: /var/lib/nova/instances/f62791ad-fc40-451f-b02a-ba991f2dbc32 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.356 222021 DEBUG nova.virt.libvirt.driver [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Ensure instance console log exists: /var/lib/nova/instances/f62791ad-fc40-451f-b02a-ba991f2dbc32/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.357 222021 DEBUG nova.virt.libvirt.driver [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.359 222021 DEBUG nova.virt.libvirt.vif [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:30:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1280958077',display_name='tempest-LiveMigrationTest-server-1280958077',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1280958077',id=9,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:30:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c56e53b3339e4e4db30b7a9d330bc380',ramdisk_id='',reservation_id='r-xkzmzoa6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1903931568',owner_user_name='tempest-LiveMigrationTest-1903931568-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:30:58Z,user_data=None,user_id='a43b680a6019491aafe42c0a10e648df',uuid=f62791ad-fc40-451f-b02a-ba991f2dbc32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "address": "fa:16:3e:d9:aa:f3", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap857f8a0c-0b", "ovs_interfaceid": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.360 222021 DEBUG nova.network.os_vif_util [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Converting VIF {"id": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "address": "fa:16:3e:d9:aa:f3", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap857f8a0c-0b", "ovs_interfaceid": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.361 222021 DEBUG nova.network.os_vif_util [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:aa:f3,bridge_name='br-int',has_traffic_filtering=True,id=857f8a0c-0bda-43ca-85aa-7f22568eddc7,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap857f8a0c-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.362 222021 DEBUG os_vif [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:aa:f3,bridge_name='br-int',has_traffic_filtering=True,id=857f8a0c-0bda-43ca-85aa-7f22568eddc7,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap857f8a0c-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.363 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.364 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.365 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.372 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.372 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap857f8a0c-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.373 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap857f8a0c-0b, col_values=(('external_ids', {'iface-id': '857f8a0c-0bda-43ca-85aa-7f22568eddc7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:aa:f3', 'vm-uuid': 'f62791ad-fc40-451f-b02a-ba991f2dbc32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.376 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:10 np0005593233 NetworkManager[48871]: <info>  [1769160670.3777] manager: (tap857f8a0c-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 23 04:31:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:31:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/305041940' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.380 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.386 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.388 222021 INFO os_vif [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:aa:f3,bridge_name='br-int',has_traffic_filtering=True,id=857f8a0c-0bda-43ca-85aa-7f22568eddc7,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap857f8a0c-0b')#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.389 222021 DEBUG nova.virt.libvirt.driver [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.389 222021 DEBUG nova.compute.manager [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvzlwa358',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='f62791ad-fc40-451f-b02a-ba991f2dbc32',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.399 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.431 222021 DEBUG nova.storage.rbd_utils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] rbd image 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.435 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:31:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1177116516' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.923 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:10 np0005593233 nova_compute[222017]: 2026-01-23 09:31:10.925 222021 DEBUG nova.objects.instance [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lazy-loading 'pci_devices' on Instance uuid 32e0c90d-7129-4fbd-a6b7-9360775df43a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:31:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:11.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:11.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:12 np0005593233 nova_compute[222017]: 2026-01-23 09:31:12.204 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <uuid>32e0c90d-7129-4fbd-a6b7-9360775df43a</uuid>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <name>instance-0000000b</name>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerDiagnosticsNegativeTest-server-827553491</nova:name>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:31:09</nova:creationTime>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <nova:user uuid="df813884128e4b66ae3f50d9a6b010f9">tempest-ServerDiagnosticsNegativeTest-1446524109-project-member</nova:user>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <nova:project uuid="5928cc98f1644a1b9b56765e97ea8bbb">tempest-ServerDiagnosticsNegativeTest-1446524109</nova:project>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <entry name="serial">32e0c90d-7129-4fbd-a6b7-9360775df43a</entry>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <entry name="uuid">32e0c90d-7129-4fbd-a6b7-9360775df43a</entry>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/32e0c90d-7129-4fbd-a6b7-9360775df43a_disk">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/32e0c90d-7129-4fbd-a6b7-9360775df43a_disk.config">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a/console.log" append="off"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:31:12 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:31:12 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:31:12 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:31:12 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:31:12 np0005593233 nova_compute[222017]: 2026-01-23 09:31:12.303 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:31:12 np0005593233 nova_compute[222017]: 2026-01-23 09:31:12.303 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:31:12 np0005593233 nova_compute[222017]: 2026-01-23 09:31:12.304 222021 INFO nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Using config drive
Jan 23 04:31:12 np0005593233 nova_compute[222017]: 2026-01-23 09:31:12.331 222021 DEBUG nova.storage.rbd_utils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] rbd image 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:12 np0005593233 nova_compute[222017]: 2026-01-23 09:31:12.875 222021 INFO nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Creating config drive at /var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a/disk.config
Jan 23 04:31:12 np0005593233 nova_compute[222017]: 2026-01-23 09:31:12.882 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4knp9vc_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:13 np0005593233 nova_compute[222017]: 2026-01-23 09:31:13.020 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4knp9vc_" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:13 np0005593233 nova_compute[222017]: 2026-01-23 09:31:13.052 222021 DEBUG nova.storage.rbd_utils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] rbd image 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:13 np0005593233 nova_compute[222017]: 2026-01-23 09:31:13.057 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a/disk.config 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:13 np0005593233 nova_compute[222017]: 2026-01-23 09:31:13.235 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:13 np0005593233 nova_compute[222017]: 2026-01-23 09:31:13.259 222021 DEBUG oslo_concurrency.processutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a/disk.config 32e0c90d-7129-4fbd-a6b7-9360775df43a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.203s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:13 np0005593233 nova_compute[222017]: 2026-01-23 09:31:13.261 222021 INFO nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Deleting local config drive /var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a/disk.config because it was imported into RBD.
Jan 23 04:31:13 np0005593233 systemd-machined[190954]: New machine qemu-4-instance-0000000b.
Jan 23 04:31:13 np0005593233 systemd[1]: Started Virtual Machine qemu-4-instance-0000000b.
Jan 23 04:31:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:13.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:13.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.350 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160674.3493176, 32e0c90d-7129-4fbd-a6b7-9360775df43a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.351 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] VM Resumed (Lifecycle Event)
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.353 222021 DEBUG nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.354 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.358 222021 INFO nova.virt.libvirt.driver [-] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Instance spawned successfully.
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.358 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.380 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.386 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.388 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.389 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.389 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.390 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.390 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.391 222021 DEBUG nova.virt.libvirt.driver [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.421 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.421 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160674.352941, 32e0c90d-7129-4fbd-a6b7-9360775df43a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.422 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] VM Started (Lifecycle Event)
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.508 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.513 222021 INFO nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Took 7.18 seconds to spawn the instance on the hypervisor.
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.514 222021 DEBUG nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.516 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.550 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.599 222021 INFO nova.compute.manager [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Took 9.14 seconds to build instance.
Jan 23 04:31:14 np0005593233 nova_compute[222017]: 2026-01-23 09:31:14.643 222021 DEBUG oslo_concurrency.lockutils [None req-cac99c75-38b0-4d18-b03d-02cceaeb8c4f df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "32e0c90d-7129-4fbd-a6b7-9360775df43a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:15 np0005593233 nova_compute[222017]: 2026-01-23 09:31:15.420 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:15 np0005593233 nova_compute[222017]: 2026-01-23 09:31:15.423 222021 DEBUG nova.network.neutron [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Port 857f8a0c-0bda-43ca-85aa-7f22568eddc7 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 23 04:31:15 np0005593233 nova_compute[222017]: 2026-01-23 09:31:15.425 222021 DEBUG nova.compute.manager [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvzlwa358',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='f62791ad-fc40-451f-b02a-ba991f2dbc32',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 23 04:31:15 np0005593233 systemd[1]: Starting libvirt proxy daemon...
Jan 23 04:31:15 np0005593233 systemd[1]: Started libvirt proxy daemon.
Jan 23 04:31:15 np0005593233 kernel: tap857f8a0c-0b: entered promiscuous mode
Jan 23 04:31:15 np0005593233 systemd-udevd[227021]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:31:15 np0005593233 NetworkManager[48871]: <info>  [1769160675.8093] manager: (tap857f8a0c-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Jan 23 04:31:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:15Z|00036|binding|INFO|Claiming lport 857f8a0c-0bda-43ca-85aa-7f22568eddc7 for this additional chassis.
Jan 23 04:31:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:15Z|00037|binding|INFO|857f8a0c-0bda-43ca-85aa-7f22568eddc7: Claiming fa:16:3e:d9:aa:f3 10.100.0.10
Jan 23 04:31:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:15Z|00038|binding|INFO|Claiming lport 7dc28ada-b6f3-4524-9e75-42c4d4604d63 for this additional chassis.
Jan 23 04:31:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:15Z|00039|binding|INFO|7dc28ada-b6f3-4524-9e75-42c4d4604d63: Claiming fa:16:3e:4b:1d:32 19.80.0.19
Jan 23 04:31:15 np0005593233 nova_compute[222017]: 2026-01-23 09:31:15.811 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:15 np0005593233 nova_compute[222017]: 2026-01-23 09:31:15.815 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:15 np0005593233 nova_compute[222017]: 2026-01-23 09:31:15.818 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:15 np0005593233 NetworkManager[48871]: <info>  [1769160675.8314] device (tap857f8a0c-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:31:15 np0005593233 NetworkManager[48871]: <info>  [1769160675.8329] device (tap857f8a0c-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:31:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:15.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:15.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:15 np0005593233 systemd-machined[190954]: New machine qemu-5-instance-00000009.
Jan 23 04:31:15 np0005593233 systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Jan 23 04:31:15 np0005593233 nova_compute[222017]: 2026-01-23 09:31:15.922 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:15Z|00040|binding|INFO|Setting lport 857f8a0c-0bda-43ca-85aa-7f22568eddc7 ovn-installed in OVS
Jan 23 04:31:15 np0005593233 nova_compute[222017]: 2026-01-23 09:31:15.930 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:15 np0005593233 nova_compute[222017]: 2026-01-23 09:31:15.932 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.742 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160676.741999, f62791ad-fc40-451f-b02a-ba991f2dbc32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.742 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] VM Started (Lifecycle Event)#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.770 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.969 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Acquiring lock "32e0c90d-7129-4fbd-a6b7-9360775df43a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.971 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "32e0c90d-7129-4fbd-a6b7-9360775df43a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.971 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Acquiring lock "32e0c90d-7129-4fbd-a6b7-9360775df43a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.971 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "32e0c90d-7129-4fbd-a6b7-9360775df43a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.972 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "32e0c90d-7129-4fbd-a6b7-9360775df43a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.973 222021 INFO nova.compute.manager [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Terminating instance#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.974 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Acquiring lock "refresh_cache-32e0c90d-7129-4fbd-a6b7-9360775df43a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.974 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Acquired lock "refresh_cache-32e0c90d-7129-4fbd-a6b7-9360775df43a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:31:16 np0005593233 nova_compute[222017]: 2026-01-23 09:31:16.974 222021 DEBUG nova.network.neutron [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:31:17 np0005593233 nova_compute[222017]: 2026-01-23 09:31:17.265 222021 DEBUG nova.network.neutron [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:31:17 np0005593233 nova_compute[222017]: 2026-01-23 09:31:17.280 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160677.2793705, f62791ad-fc40-451f-b02a-ba991f2dbc32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:31:17 np0005593233 nova_compute[222017]: 2026-01-23 09:31:17.281 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:31:17 np0005593233 nova_compute[222017]: 2026-01-23 09:31:17.323 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:31:17 np0005593233 nova_compute[222017]: 2026-01-23 09:31:17.327 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:31:17 np0005593233 nova_compute[222017]: 2026-01-23 09:31:17.358 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 23 04:31:17 np0005593233 nova_compute[222017]: 2026-01-23 09:31:17.713 222021 DEBUG nova.network.neutron [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:31:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:17.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:17.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:31:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.8 total, 600.0 interval#012Cumulative writes: 8725 writes, 34K keys, 8725 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8725 writes, 2034 syncs, 4.29 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2482 writes, 9431 keys, 2482 commit groups, 1.0 writes per commit group, ingest: 11.62 MB, 0.02 MB/s#012Interval WAL: 2482 writes, 914 syncs, 2.72 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 04:31:18 np0005593233 nova_compute[222017]: 2026-01-23 09:31:18.185 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Releasing lock "refresh_cache-32e0c90d-7129-4fbd-a6b7-9360775df43a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:31:18 np0005593233 nova_compute[222017]: 2026-01-23 09:31:18.186 222021 DEBUG nova.compute.manager [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:31:18 np0005593233 nova_compute[222017]: 2026-01-23 09:31:18.237 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:18 np0005593233 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 23 04:31:18 np0005593233 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Consumed 4.944s CPU time.
Jan 23 04:31:18 np0005593233 systemd-machined[190954]: Machine qemu-4-instance-0000000b terminated.
Jan 23 04:31:18 np0005593233 nova_compute[222017]: 2026-01-23 09:31:18.409 222021 INFO nova.virt.libvirt.driver [-] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Instance destroyed successfully.#033[00m
Jan 23 04:31:18 np0005593233 nova_compute[222017]: 2026-01-23 09:31:18.409 222021 DEBUG nova.objects.instance [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lazy-loading 'resources' on Instance uuid 32e0c90d-7129-4fbd-a6b7-9360775df43a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:31:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.188 222021 INFO nova.virt.libvirt.driver [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Deleting instance files /var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a_del#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.189 222021 INFO nova.virt.libvirt.driver [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Deletion of /var/lib/nova/instances/32e0c90d-7129-4fbd-a6b7-9360775df43a_del complete#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.416 222021 INFO nova.compute.manager [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Took 1.23 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.417 222021 DEBUG oslo.service.loopingcall [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.418 222021 DEBUG nova.compute.manager [-] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.418 222021 DEBUG nova.network.neutron [-] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.683 222021 DEBUG nova.network.neutron [-] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.702 222021 DEBUG nova.network.neutron [-] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.728 222021 INFO nova.compute.manager [-] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Took 0.31 seconds to deallocate network for instance.#033[00m
Jan 23 04:31:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:19.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:19.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.881 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.882 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:19 np0005593233 nova_compute[222017]: 2026-01-23 09:31:19.988 222021 DEBUG oslo_concurrency.processutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:20 np0005593233 podman[227128]: 2026-01-23 09:31:20.100804008 +0000 UTC m=+0.103709395 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 04:31:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:31:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1440797754' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:31:20 np0005593233 nova_compute[222017]: 2026-01-23 09:31:20.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:20 np0005593233 nova_compute[222017]: 2026-01-23 09:31:20.425 222021 DEBUG oslo_concurrency.processutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:20 np0005593233 nova_compute[222017]: 2026-01-23 09:31:20.431 222021 DEBUG nova.compute.provider_tree [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:31:20 np0005593233 nova_compute[222017]: 2026-01-23 09:31:20.450 222021 DEBUG nova.scheduler.client.report [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:31:20 np0005593233 nova_compute[222017]: 2026-01-23 09:31:20.478 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:20 np0005593233 nova_compute[222017]: 2026-01-23 09:31:20.809 222021 INFO nova.scheduler.client.report [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Deleted allocations for instance 32e0c90d-7129-4fbd-a6b7-9360775df43a#033[00m
Jan 23 04:31:20 np0005593233 nova_compute[222017]: 2026-01-23 09:31:20.925 222021 DEBUG oslo_concurrency.lockutils [None req-1d50f714-c3e8-4a08-9823-b0810878c1d9 df813884128e4b66ae3f50d9a6b010f9 5928cc98f1644a1b9b56765e97ea8bbb - - default default] Lock "32e0c90d-7129-4fbd-a6b7-9360775df43a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.954s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:21Z|00041|binding|INFO|Claiming lport 857f8a0c-0bda-43ca-85aa-7f22568eddc7 for this chassis.
Jan 23 04:31:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:21Z|00042|binding|INFO|857f8a0c-0bda-43ca-85aa-7f22568eddc7: Claiming fa:16:3e:d9:aa:f3 10.100.0.10
Jan 23 04:31:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:21Z|00043|binding|INFO|Claiming lport 7dc28ada-b6f3-4524-9e75-42c4d4604d63 for this chassis.
Jan 23 04:31:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:21Z|00044|binding|INFO|7dc28ada-b6f3-4524-9e75-42c4d4604d63: Claiming fa:16:3e:4b:1d:32 19.80.0.19
Jan 23 04:31:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:21Z|00045|binding|INFO|Setting lport 857f8a0c-0bda-43ca-85aa-7f22568eddc7 up in Southbound
Jan 23 04:31:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:21Z|00046|binding|INFO|Setting lport 7dc28ada-b6f3-4524-9e75-42c4d4604d63 up in Southbound
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.337 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:aa:f3 10.100.0.10'], port_security=['fa:16:3e:d9:aa:f3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-412021528', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f62791ad-fc40-451f-b02a-ba991f2dbc32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-412021528', 'neutron:project_id': 'c56e53b3339e4e4db30b7a9d330bc380', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'c0c0e09a-b9c3-4a3a-af9e-c3b66e9f8bc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cabb3d88-013b-4542-b789-52d49c567d53, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=857f8a0c-0bda-43ca-85aa-7f22568eddc7) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.340 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:1d:32 19.80.0.19'], port_security=['fa:16:3e:4b:1d:32 19.80.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['857f8a0c-0bda-43ca-85aa-7f22568eddc7'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1183347964', 'neutron:cidrs': '19.80.0.19/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48c9624b-33de-47f9-a720-02dd9028b5ea', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1183347964', 'neutron:project_id': 'c56e53b3339e4e4db30b7a9d330bc380', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c0c0e09a-b9c3-4a3a-af9e-c3b66e9f8bc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=7ac2005c-13d2-4227-8eb4-3d332da8f5d6, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7dc28ada-b6f3-4524-9e75-42c4d4604d63) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.361 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 857f8a0c-0bda-43ca-85aa-7f22568eddc7 in datapath 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 bound to our chassis#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.364 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.382 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd1df8f-b06c-4d12-9143-cbc7eb47f809]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.383 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap385e7a4d-f1 in ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.387 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap385e7a4d-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.387 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5477b908-c40e-44f4-8220-3cdce06b0b42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.388 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2eee6c24-b199-4257-a3ea-e521b299f22b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.406 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[2bf599cd-b079-4020-920d-efe0dd3c14f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.423 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[58bef9d2-53f5-4fac-aa37-0c794be5fb58]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.471 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f477681e-7ba8-47b7-ad25-3360b2bf914b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 NetworkManager[48871]: <info>  [1769160681.4808] manager: (tap385e7a4d-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.480 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[89f47f4b-b227-4ca2-a8e2-c49ccc6a24d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 systemd-udevd[227183]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.526 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[662b33fd-20c7-425b-ad35-e42504c1f43b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.530 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b40b8e18-9eb2-454c-b7aa-d76056ad7b1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 NetworkManager[48871]: <info>  [1769160681.5597] device (tap385e7a4d-f0): carrier: link connected
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.563 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c6b8f3f3-99d5-4025-9ae2-c13d7b2b4321]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 nova_compute[222017]: 2026-01-23 09:31:21.572 222021 INFO nova.compute.manager [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Post operation of migration started#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.580 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bb1af4ab-1660-45c1-ab8f-0f609f28d3a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e7a4d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:a3:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459419, 'reachable_time': 37182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227202, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.601 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[69af1999-d8ea-4bc5-b4a9-9f57664effd6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:a31a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459419, 'tstamp': 459419}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227203, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.620 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d936f65a-94c9-4592-b319-158b0dc3d907]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e7a4d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:a3:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459419, 'reachable_time': 37182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227204, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.649 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[728cb38f-d3ae-4c5e-b1ad-61a0c114a15f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.717 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[063598b3-e11d-4cee-9932-491246925037]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.719 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e7a4d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.719 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.720 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap385e7a4d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:31:21 np0005593233 kernel: tap385e7a4d-f0: entered promiscuous mode
Jan 23 04:31:21 np0005593233 nova_compute[222017]: 2026-01-23 09:31:21.722 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:21 np0005593233 NetworkManager[48871]: <info>  [1769160681.7234] manager: (tap385e7a4d-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 23 04:31:21 np0005593233 nova_compute[222017]: 2026-01-23 09:31:21.724 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.725 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap385e7a4d-f0, col_values=(('external_ids', {'iface-id': '7b93c40e-1f44-4d5a-9bad-e23468f98d69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:31:21 np0005593233 nova_compute[222017]: 2026-01-23 09:31:21.726 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:21Z|00047|binding|INFO|Releasing lport 7b93c40e-1f44-4d5a-9bad-e23468f98d69 from this chassis (sb_readonly=0)
Jan 23 04:31:21 np0005593233 nova_compute[222017]: 2026-01-23 09:31:21.740 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.741 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/385e7a4d-f87e-44c5-9fc0-5a322eecd4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/385e7a4d-f87e-44c5-9fc0-5a322eecd4b4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.742 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6251cb1d-f504-446d-8fcc-9158e3e116ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.743 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/385e7a4d-f87e-44c5-9fc0-5a322eecd4b4.pid.haproxy
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:31:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:21.744 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'env', 'PROCESS_TAG=haproxy-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/385e7a4d-f87e-44c5-9fc0-5a322eecd4b4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:31:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:21.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:21.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:22 np0005593233 podman[227236]: 2026-01-23 09:31:22.120477179 +0000 UTC m=+0.049983165 container create 0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:31:22 np0005593233 systemd[1]: Started libpod-conmon-0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2.scope.
Jan 23 04:31:22 np0005593233 podman[227236]: 2026-01-23 09:31:22.090308851 +0000 UTC m=+0.019814857 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:31:22 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:31:22 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91c39e435e75570e48611ebab9c88856bae214abb7864a9771c9ad0c72717b90/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:31:22 np0005593233 podman[227236]: 2026-01-23 09:31:22.222731812 +0000 UTC m=+0.152237848 container init 0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:31:22 np0005593233 podman[227236]: 2026-01-23 09:31:22.242056605 +0000 UTC m=+0.171562601 container start 0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 04:31:22 np0005593233 nova_compute[222017]: 2026-01-23 09:31:22.246 222021 DEBUG oslo_concurrency.lockutils [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "refresh_cache-f62791ad-fc40-451f-b02a-ba991f2dbc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:31:22 np0005593233 nova_compute[222017]: 2026-01-23 09:31:22.248 222021 DEBUG oslo_concurrency.lockutils [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquired lock "refresh_cache-f62791ad-fc40-451f-b02a-ba991f2dbc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:31:22 np0005593233 nova_compute[222017]: 2026-01-23 09:31:22.248 222021 DEBUG nova.network.neutron [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:31:22 np0005593233 neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4[227252]: [NOTICE]   (227256) : New worker (227258) forked
Jan 23 04:31:22 np0005593233 neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4[227252]: [NOTICE]   (227256) : Loading success.
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.329 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 7dc28ada-b6f3-4524-9e75-42c4d4604d63 in datapath 48c9624b-33de-47f9-a720-02dd9028b5ea unbound from our chassis#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.331 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48c9624b-33de-47f9-a720-02dd9028b5ea#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.345 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[064fce66-bec6-4be1-a89b-0bd7281467b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.347 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48c9624b-31 in ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.348 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48c9624b-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.348 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b405f8f3-8843-4d9f-b5af-013282cbd1b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.349 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a9633904-3035-4c55-b39d-6744f138a7ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.362 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[35b384d9-01c4-452f-a62e-356a0f4ec93a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.389 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[64a3cd06-6dcc-497e-b269-910669a6da4d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.433 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fdeba76e-aece-4b1f-8230-21481af54c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 NetworkManager[48871]: <info>  [1769160682.4412] manager: (tap48c9624b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Jan 23 04:31:22 np0005593233 systemd-udevd[227193]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.441 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[22406430-8fcd-4644-a9c3-c0094e83af35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.483 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e244b9aa-65ad-4cd1-ac6c-6fdc64ab6cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.487 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fa42e440-70b1-458c-bbc4-f9adf5cedbf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 NetworkManager[48871]: <info>  [1769160682.5229] device (tap48c9624b-30): carrier: link connected
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.528 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f4a205-a415-4dbd-b6c2-b84ea02d7f7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.551 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d6f81f15-9b6b-40c5-b0f8-271a73355703]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48c9624b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:8f:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459515, 'reachable_time': 31330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227277, 'error': None, 'target': 'ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.573 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1cceb8-dc5a-4a39-b997-cf2a3b3cf86e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:8fae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459515, 'tstamp': 459515}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227278, 'error': None, 'target': 'ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.597 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1760d746-6a0d-42bb-905b-83446f391b85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48c9624b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:8f:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459515, 'reachable_time': 31330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227279, 'error': None, 'target': 'ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.633 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aad3762c-5555-428b-b5dc-3dbba1353f35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.713 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8eed2191-2205-44c1-8edc-eb8f6bcc08d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.714 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48c9624b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.715 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.715 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48c9624b-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:31:22 np0005593233 NetworkManager[48871]: <info>  [1769160682.7177] manager: (tap48c9624b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 23 04:31:22 np0005593233 nova_compute[222017]: 2026-01-23 09:31:22.717 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:22 np0005593233 kernel: tap48c9624b-30: entered promiscuous mode
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.720 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48c9624b-30, col_values=(('external_ids', {'iface-id': '8e19ba82-19a8-44be-8cf0-66f5e53af8a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:31:22 np0005593233 nova_compute[222017]: 2026-01-23 09:31:22.721 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:31:22Z|00048|binding|INFO|Releasing lport 8e19ba82-19a8-44be-8cf0-66f5e53af8a2 from this chassis (sb_readonly=0)
Jan 23 04:31:22 np0005593233 nova_compute[222017]: 2026-01-23 09:31:22.736 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.738 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48c9624b-33de-47f9-a720-02dd9028b5ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48c9624b-33de-47f9-a720-02dd9028b5ea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.738 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[50a7d3b2-3204-4d4a-bb95-7047178bec09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.739 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-48c9624b-33de-47f9-a720-02dd9028b5ea
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/48c9624b-33de-47f9-a720-02dd9028b5ea.pid.haproxy
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 48c9624b-33de-47f9-a720-02dd9028b5ea
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 04:31:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:22.740 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea', 'env', 'PROCESS_TAG=haproxy-48c9624b-33de-47f9-a720-02dd9028b5ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48c9624b-33de-47f9-a720-02dd9028b5ea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 04:31:23 np0005593233 podman[227311]: 2026-01-23 09:31:23.138264252 +0000 UTC m=+0.061895000 container create 9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:31:23 np0005593233 systemd[1]: Started libpod-conmon-9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6.scope.
Jan 23 04:31:23 np0005593233 podman[227311]: 2026-01-23 09:31:23.104452822 +0000 UTC m=+0.028083600 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:31:23 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:31:23 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a33d0d5827298b94d81918e38e4f2f185514ed892094c19f6d562f6d1836fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:31:23 np0005593233 podman[227311]: 2026-01-23 09:31:23.227503019 +0000 UTC m=+0.151133777 container init 9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:31:23 np0005593233 podman[227311]: 2026-01-23 09:31:23.233605581 +0000 UTC m=+0.157236329 container start 9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 04:31:23 np0005593233 nova_compute[222017]: 2026-01-23 09:31:23.240 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:23 np0005593233 neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea[227326]: [NOTICE]   (227330) : New worker (227332) forked
Jan 23 04:31:23 np0005593233 neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea[227326]: [NOTICE]   (227330) : Loading success.
Jan 23 04:31:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:23.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:23.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:25 np0005593233 nova_compute[222017]: 2026-01-23 09:31:25.425 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:25.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:25.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:26 np0005593233 nova_compute[222017]: 2026-01-23 09:31:26.405 222021 DEBUG nova.network.neutron [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Updating instance_info_cache with network_info: [{"id": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "address": "fa:16:3e:d9:aa:f3", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap857f8a0c-0b", "ovs_interfaceid": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:31:26 np0005593233 nova_compute[222017]: 2026-01-23 09:31:26.451 222021 DEBUG oslo_concurrency.lockutils [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Releasing lock "refresh_cache-f62791ad-fc40-451f-b02a-ba991f2dbc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:31:26 np0005593233 nova_compute[222017]: 2026-01-23 09:31:26.483 222021 DEBUG oslo_concurrency.lockutils [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:31:26 np0005593233 nova_compute[222017]: 2026-01-23 09:31:26.484 222021 DEBUG oslo_concurrency.lockutils [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:31:26 np0005593233 nova_compute[222017]: 2026-01-23 09:31:26.484 222021 DEBUG oslo_concurrency.lockutils [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:26 np0005593233 nova_compute[222017]: 2026-01-23 09:31:26.491 222021 INFO nova.virt.libvirt.driver [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 23 04:31:26 np0005593233 virtqemud[221325]: Domain id=5 name='instance-00000009' uuid=f62791ad-fc40-451f-b02a-ba991f2dbc32 is tainted: custom-monitor
Jan 23 04:31:27 np0005593233 nova_compute[222017]: 2026-01-23 09:31:27.503 222021 INFO nova.virt.libvirt.driver [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 23 04:31:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:27.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:27.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:28 np0005593233 nova_compute[222017]: 2026-01-23 09:31:28.242 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:28 np0005593233 nova_compute[222017]: 2026-01-23 09:31:28.510 222021 INFO nova.virt.libvirt.driver [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 23 04:31:28 np0005593233 nova_compute[222017]: 2026-01-23 09:31:28.516 222021 DEBUG nova.compute.manager [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:31:28 np0005593233 nova_compute[222017]: 2026-01-23 09:31:28.606 222021 DEBUG nova.objects.instance [None req-3587debb-5fd3-47d7-8686-c5801ff95138 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 04:31:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:29.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:29.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:30 np0005593233 nova_compute[222017]: 2026-01-23 09:31:30.430 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:31.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:31.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:33 np0005593233 nova_compute[222017]: 2026-01-23 09:31:33.244 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:33 np0005593233 nova_compute[222017]: 2026-01-23 09:31:33.408 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160678.4066732, 32e0c90d-7129-4fbd-a6b7-9360775df43a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:31:33 np0005593233 nova_compute[222017]: 2026-01-23 09:31:33.409 222021 INFO nova.compute.manager [-] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] VM Stopped (Lifecycle Event)
Jan 23 04:31:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:33.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:33.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:34 np0005593233 podman[227341]: 2026-01-23 09:31:34.059604949 +0000 UTC m=+0.066919881 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 04:31:34 np0005593233 nova_compute[222017]: 2026-01-23 09:31:34.506 222021 DEBUG nova.compute.manager [None req-ed42b6e2-f633-4e26-b27b-a14f99d9b714 - - - - - -] [instance: 32e0c90d-7129-4fbd-a6b7-9360775df43a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:31:35 np0005593233 nova_compute[222017]: 2026-01-23 09:31:35.434 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:35.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:35.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e149 e149: 3 total, 3 up, 3 in
Jan 23 04:31:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:31:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:37.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:31:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:31:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:37.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:38 np0005593233 nova_compute[222017]: 2026-01-23 09:31:38.269 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:31:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:39.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:31:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:39.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:31:40 np0005593233 nova_compute[222017]: 2026-01-23 09:31:40.438 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:41 np0005593233 nova_compute[222017]: 2026-01-23 09:31:41.659 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Acquiring lock "e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:31:41 np0005593233 nova_compute[222017]: 2026-01-23 09:31:41.659 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:31:41 np0005593233 nova_compute[222017]: 2026-01-23 09:31:41.703 222021 DEBUG nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:31:41 np0005593233 nova_compute[222017]: 2026-01-23 09:31:41.800 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:31:41 np0005593233 nova_compute[222017]: 2026-01-23 09:31:41.800 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:31:41 np0005593233 nova_compute[222017]: 2026-01-23 09:31:41.809 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:31:41 np0005593233 nova_compute[222017]: 2026-01-23 09:31:41.810 222021 INFO nova.compute.claims [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Claim successful on node compute-1.ctlplane.example.com
Jan 23 04:31:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:41.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:41.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:41 np0005593233 nova_compute[222017]: 2026-01-23 09:31:41.962 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e150 e150: 3 total, 3 up, 3 in
Jan 23 04:31:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:31:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3872601250' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.464 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.474 222021 DEBUG nova.compute.provider_tree [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.504 222021 DEBUG nova.scheduler.client.report [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.546 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.547 222021 DEBUG nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:31:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:42.598 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:31:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:42.599 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:31:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:31:42.600 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.605 222021 DEBUG nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.606 222021 DEBUG nova.network.neutron [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.632 222021 INFO nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.653 222021 DEBUG nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.813 222021 DEBUG nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.815 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.815 222021 INFO nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Creating image(s)
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.844 222021 DEBUG nova.storage.rbd_utils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] rbd image e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.877 222021 DEBUG nova.storage.rbd_utils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] rbd image e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.911 222021 DEBUG nova.storage.rbd_utils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] rbd image e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.917 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.987 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.988 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.989 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:31:42 np0005593233 nova_compute[222017]: 2026-01-23 09:31:42.990 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:43 np0005593233 nova_compute[222017]: 2026-01-23 09:31:43.018 222021 DEBUG nova.storage.rbd_utils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] rbd image e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:43 np0005593233 nova_compute[222017]: 2026-01-23 09:31:43.023 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:43 np0005593233 nova_compute[222017]: 2026-01-23 09:31:43.270 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:43 np0005593233 nova_compute[222017]: 2026-01-23 09:31:43.325 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.301s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:43 np0005593233 nova_compute[222017]: 2026-01-23 09:31:43.386 222021 DEBUG nova.storage.rbd_utils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] resizing rbd image e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:31:43 np0005593233 nova_compute[222017]: 2026-01-23 09:31:43.493 222021 DEBUG nova.objects.instance [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lazy-loading 'migration_context' on Instance uuid e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:31:43 np0005593233 nova_compute[222017]: 2026-01-23 09:31:43.594 222021 DEBUG nova.network.neutron [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 23 04:31:43 np0005593233 nova_compute[222017]: 2026-01-23 09:31:43.594 222021 DEBUG nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 04:31:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:43.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:31:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:43.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.125 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.126 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Ensure instance console log exists: /var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.126 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.127 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.127 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.129 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.135 222021 WARNING nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.142 222021 DEBUG nova.virt.libvirt.host [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.143 222021 DEBUG nova.virt.libvirt.host [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.147 222021 DEBUG nova.virt.libvirt.host [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.148 222021 DEBUG nova.virt.libvirt.host [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.150 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.150 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.150 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.151 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.151 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.151 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.152 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.152 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.152 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.152 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.153 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.153 222021 DEBUG nova.virt.hardware [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.159 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:31:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/191109387' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.624 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.657 222021 DEBUG nova.storage.rbd_utils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] rbd image e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:44 np0005593233 nova_compute[222017]: 2026-01-23 09:31:44.664 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:31:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3769284428' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.123 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.126 222021 DEBUG nova.objects.instance [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lazy-loading 'pci_devices' on Instance uuid e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.215 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <uuid>e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1</uuid>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <name>instance-0000000c</name>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerExternalEventsTest-server-230237051</nova:name>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:31:44</nova:creationTime>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <nova:user uuid="f5658542eccb4fdbb62e6e57f281272e">tempest-ServerExternalEventsTest-977186534-project-member</nova:user>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <nova:project uuid="6e20561e5ad0466194057af7c0a9f4a9">tempest-ServerExternalEventsTest-977186534</nova:project>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <entry name="serial">e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1</entry>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <entry name="uuid">e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1</entry>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk.config">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1/console.log" append="off"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:31:45 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:31:45 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:31:45 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:31:45 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.443 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.500 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.501 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.501 222021 INFO nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Using config drive
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.525 222021 DEBUG nova.storage.rbd_utils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] rbd image e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.754 222021 INFO nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Creating config drive at /var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1/disk.config
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.760 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxcupp41v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:45.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:45.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.905 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxcupp41v" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.939 222021 DEBUG nova.storage.rbd_utils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] rbd image e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:45 np0005593233 nova_compute[222017]: 2026-01-23 09:31:45.945 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1/disk.config e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:46 np0005593233 nova_compute[222017]: 2026-01-23 09:31:46.138 222021 DEBUG oslo_concurrency.processutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1/disk.config e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:46 np0005593233 nova_compute[222017]: 2026-01-23 09:31:46.140 222021 INFO nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Deleting local config drive /var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1/disk.config because it was imported into RBD.
Jan 23 04:31:46 np0005593233 systemd-machined[190954]: New machine qemu-6-instance-0000000c.
Jan 23 04:31:46 np0005593233 systemd[1]: Started Virtual Machine qemu-6-instance-0000000c.
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.106 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160707.105422, e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.107 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] VM Resumed (Lifecycle Event)
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.110 222021 DEBUG nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.110 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.114 222021 INFO nova.virt.libvirt.driver [-] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Instance spawned successfully.
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.115 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.160 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.164 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.174 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.174 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.175 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.175 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.175 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.176 222021 DEBUG nova.virt.libvirt.driver [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.210 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.210 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160707.1097288, e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.211 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] VM Started (Lifecycle Event)
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.260 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.263 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.273 222021 INFO nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Took 4.46 seconds to spawn the instance on the hypervisor.
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.273 222021 DEBUG nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.350 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.413 222021 INFO nova.compute.manager [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Took 5.65 seconds to build instance.
Jan 23 04:31:47 np0005593233 nova_compute[222017]: 2026-01-23 09:31:47.435 222021 DEBUG oslo_concurrency.lockutils [None req-642bf90c-cc44-4ca8-97aa-fd42ec838bb2 f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 04:31:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:31:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:47.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:31:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:47.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:48 np0005593233 nova_compute[222017]: 2026-01-23 09:31:48.272 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 e151: 3 total, 3 up, 3 in
Jan 23 04:31:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.318 222021 DEBUG nova.compute.manager [None req-d42d3f8a-97c2-485a-a30f-49030024287b aca028dbea9e40a9ade9e68e551fc40e 2423d7d0b0634e6b97f578c7862a502a - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Received event network-changed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.319 222021 DEBUG nova.compute.manager [None req-d42d3f8a-97c2-485a-a30f-49030024287b aca028dbea9e40a9ade9e68e551fc40e 2423d7d0b0634e6b97f578c7862a502a - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Refreshing instance network info cache due to event network-changed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.320 222021 DEBUG oslo_concurrency.lockutils [None req-d42d3f8a-97c2-485a-a30f-49030024287b aca028dbea9e40a9ade9e68e551fc40e 2423d7d0b0634e6b97f578c7862a502a - - default default] Acquiring lock "refresh_cache-e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.320 222021 DEBUG oslo_concurrency.lockutils [None req-d42d3f8a-97c2-485a-a30f-49030024287b aca028dbea9e40a9ade9e68e551fc40e 2423d7d0b0634e6b97f578c7862a502a - - default default] Acquired lock "refresh_cache-e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.320 222021 DEBUG nova.network.neutron [None req-d42d3f8a-97c2-485a-a30f-49030024287b aca028dbea9e40a9ade9e68e551fc40e 2423d7d0b0634e6b97f578c7862a502a - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.702 222021 DEBUG nova.network.neutron [None req-d42d3f8a-97c2-485a-a30f-49030024287b aca028dbea9e40a9ade9e68e551fc40e 2423d7d0b0634e6b97f578c7862a502a - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.835 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Acquiring lock "e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.836 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.836 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Acquiring lock "e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.837 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.837 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.839 222021 INFO nova.compute.manager [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Terminating instance
Jan 23 04:31:49 np0005593233 nova_compute[222017]: 2026-01-23 09:31:49.841 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Acquiring lock "refresh_cache-e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:31:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:49.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:49.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:50 np0005593233 nova_compute[222017]: 2026-01-23 09:31:50.447 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:51 np0005593233 podman[227727]: 2026-01-23 09:31:51.175742994 +0000 UTC m=+0.148912303 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:31:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:51.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:51.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:52 np0005593233 nova_compute[222017]: 2026-01-23 09:31:52.774 222021 DEBUG nova.network.neutron [None req-d42d3f8a-97c2-485a-a30f-49030024287b aca028dbea9e40a9ade9e68e551fc40e 2423d7d0b0634e6b97f578c7862a502a - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:31:53 np0005593233 nova_compute[222017]: 2026-01-23 09:31:53.024 222021 DEBUG oslo_concurrency.lockutils [None req-d42d3f8a-97c2-485a-a30f-49030024287b aca028dbea9e40a9ade9e68e551fc40e 2423d7d0b0634e6b97f578c7862a502a - - default default] Releasing lock "refresh_cache-e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:31:53 np0005593233 nova_compute[222017]: 2026-01-23 09:31:53.025 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Acquired lock "refresh_cache-e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:31:53 np0005593233 nova_compute[222017]: 2026-01-23 09:31:53.025 222021 DEBUG nova.network.neutron [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:31:53 np0005593233 nova_compute[222017]: 2026-01-23 09:31:53.275 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:53.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:53.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:54 np0005593233 nova_compute[222017]: 2026-01-23 09:31:54.123 222021 DEBUG nova.network.neutron [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:31:54 np0005593233 nova_compute[222017]: 2026-01-23 09:31:54.612 222021 DEBUG nova.network.neutron [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:31:54 np0005593233 nova_compute[222017]: 2026-01-23 09:31:54.651 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Releasing lock "refresh_cache-e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:31:54 np0005593233 nova_compute[222017]: 2026-01-23 09:31:54.652 222021 DEBUG nova.compute.manager [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:31:54 np0005593233 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 23 04:31:54 np0005593233 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Consumed 8.557s CPU time.
Jan 23 04:31:54 np0005593233 systemd-machined[190954]: Machine qemu-6-instance-0000000c terminated.
Jan 23 04:31:54 np0005593233 nova_compute[222017]: 2026-01-23 09:31:54.881 222021 INFO nova.virt.libvirt.driver [-] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Instance destroyed successfully.#033[00m
Jan 23 04:31:54 np0005593233 nova_compute[222017]: 2026-01-23 09:31:54.882 222021 DEBUG nova.objects.instance [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lazy-loading 'resources' on Instance uuid e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:31:55 np0005593233 nova_compute[222017]: 2026-01-23 09:31:55.451 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:55.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:55.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:56 np0005593233 nova_compute[222017]: 2026-01-23 09:31:56.196 222021 INFO nova.virt.libvirt.driver [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Deleting instance files /var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_del#033[00m
Jan 23 04:31:56 np0005593233 nova_compute[222017]: 2026-01-23 09:31:56.198 222021 INFO nova.virt.libvirt.driver [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Deletion of /var/lib/nova/instances/e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1_del complete#033[00m
Jan 23 04:31:56 np0005593233 nova_compute[222017]: 2026-01-23 09:31:56.429 222021 INFO nova.compute.manager [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Took 1.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:31:56 np0005593233 nova_compute[222017]: 2026-01-23 09:31:56.430 222021 DEBUG oslo.service.loopingcall [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:31:56 np0005593233 nova_compute[222017]: 2026-01-23 09:31:56.430 222021 DEBUG nova.compute.manager [-] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:31:56 np0005593233 nova_compute[222017]: 2026-01-23 09:31:56.430 222021 DEBUG nova.network.neutron [-] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:31:56 np0005593233 nova_compute[222017]: 2026-01-23 09:31:56.653 222021 DEBUG nova.network.neutron [-] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:31:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:31:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 4882 writes, 26K keys, 4882 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 4882 writes, 4882 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1446 writes, 7165 keys, 1446 commit groups, 1.0 writes per commit group, ingest: 15.41 MB, 0.03 MB/s#012Interval WAL: 1446 writes, 1446 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     57.4      0.54              0.16        14    0.039       0      0       0.0       0.0#012  L6      1/0    8.92 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5     94.7     78.6      1.38              0.36        13    0.106     61K   6860       0.0       0.0#012 Sum      1/0    8.92 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     68.0     72.6      1.92              0.52        27    0.071     61K   6860       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.2     84.5     86.6      0.61              0.18        10    0.061     25K   2526       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     94.7     78.6      1.38              0.36        13    0.106     61K   6860       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     57.7      0.54              0.16        13    0.042       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.030, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.08 MB/s write, 0.13 GB read, 0.07 MB/s read, 1.9 seconds#012Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 12.46 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000274 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(715,11.95 MB,3.93098%) FilterBlock(27,177.23 KB,0.0569344%) IndexBlock(27,340.78 KB,0.109472%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:31:57 np0005593233 nova_compute[222017]: 2026-01-23 09:31:57.318 222021 DEBUG nova.network.neutron [-] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:31:57 np0005593233 nova_compute[222017]: 2026-01-23 09:31:57.345 222021 INFO nova.compute.manager [-] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Took 0.91 seconds to deallocate network for instance.#033[00m
Jan 23 04:31:57 np0005593233 nova_compute[222017]: 2026-01-23 09:31:57.415 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:57 np0005593233 nova_compute[222017]: 2026-01-23 09:31:57.416 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:57 np0005593233 nova_compute[222017]: 2026-01-23 09:31:57.566 222021 DEBUG oslo_concurrency.processutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:57.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:57.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:31:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1751752404' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:31:58 np0005593233 nova_compute[222017]: 2026-01-23 09:31:58.058 222021 DEBUG oslo_concurrency.processutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:58 np0005593233 nova_compute[222017]: 2026-01-23 09:31:58.067 222021 DEBUG nova.compute.provider_tree [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:31:58 np0005593233 nova_compute[222017]: 2026-01-23 09:31:58.278 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:58 np0005593233 nova_compute[222017]: 2026-01-23 09:31:58.377 222021 DEBUG nova.scheduler.client.report [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:31:58 np0005593233 nova_compute[222017]: 2026-01-23 09:31:58.432 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.016s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:58 np0005593233 nova_compute[222017]: 2026-01-23 09:31:58.485 222021 INFO nova.scheduler.client.report [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Deleted allocations for instance e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1#033[00m
Jan 23 04:31:58 np0005593233 nova_compute[222017]: 2026-01-23 09:31:58.589 222021 DEBUG oslo_concurrency.lockutils [None req-27d47cfb-4e5f-4767-a994-45b55d4228fa f5658542eccb4fdbb62e6e57f281272e 6e20561e5ad0466194057af7c0a9f4a9 - - default default] Lock "e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:31:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:59.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:31:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:31:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:59.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:00 np0005593233 nova_compute[222017]: 2026-01-23 09:32:00.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:32:00 np0005593233 nova_compute[222017]: 2026-01-23 09:32:00.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:32:00 np0005593233 nova_compute[222017]: 2026-01-23 09:32:00.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:32:00 np0005593233 nova_compute[222017]: 2026-01-23 09:32:00.456 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:00 np0005593233 nova_compute[222017]: 2026-01-23 09:32:00.507 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:00 np0005593233 nova_compute[222017]: 2026-01-23 09:32:00.507 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:00 np0005593233 nova_compute[222017]: 2026-01-23 09:32:00.508 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:00 np0005593233 nova_compute[222017]: 2026-01-23 09:32:00.508 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:32:00 np0005593233 nova_compute[222017]: 2026-01-23 09:32:00.508 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2593428371' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:01 np0005593233 nova_compute[222017]: 2026-01-23 09:32:01.044 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.535s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:01 np0005593233 nova_compute[222017]: 2026-01-23 09:32:01.703 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:32:01 np0005593233 nova_compute[222017]: 2026-01-23 09:32:01.704 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:32:01 np0005593233 nova_compute[222017]: 2026-01-23 09:32:01.877 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:32:01 np0005593233 nova_compute[222017]: 2026-01-23 09:32:01.878 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4788MB free_disk=20.82556915283203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:32:01 np0005593233 nova_compute[222017]: 2026-01-23 09:32:01.878 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:01 np0005593233 nova_compute[222017]: 2026-01-23 09:32:01.879 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:01.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:01.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:03 np0005593233 nova_compute[222017]: 2026-01-23 09:32:03.280 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:03.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:03.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:03 np0005593233 nova_compute[222017]: 2026-01-23 09:32:03.996 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance f62791ad-fc40-451f-b02a-ba991f2dbc32 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 04:32:04 np0005593233 nova_compute[222017]: 2026-01-23 09:32:04.481 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0fb415e8-9c82-4021-9088-cfd399d453a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Jan 23 04:32:04 np0005593233 nova_compute[222017]: 2026-01-23 09:32:04.482 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 04:32:04 np0005593233 nova_compute[222017]: 2026-01-23 09:32:04.482 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 04:32:04 np0005593233 nova_compute[222017]: 2026-01-23 09:32:04.693 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:05 np0005593233 podman[227840]: 2026-01-23 09:32:05.052886291 +0000 UTC m=+0.063385661 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true)
Jan 23 04:32:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3650967416' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:05 np0005593233 nova_compute[222017]: 2026-01-23 09:32:05.134 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:05 np0005593233 nova_compute[222017]: 2026-01-23 09:32:05.141 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:32:05 np0005593233 nova_compute[222017]: 2026-01-23 09:32:05.461 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:05.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:05.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.121 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "0fb415e8-9c82-4021-9088-cfd399d453a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.122 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.129 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.673 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.674 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.682 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.682 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.682 222021 DEBUG nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.774 222021 DEBUG nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.859 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.860 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.868 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.868 222021 INFO nova.compute.claims [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Claim successful on node compute-1.ctlplane.example.com
Jan 23 04:32:06 np0005593233 nova_compute[222017]: 2026-01-23 09:32:06.883 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.103 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:07 np0005593233 podman[228271]: 2026-01-23 09:32:07.453152764 +0000 UTC m=+0.046795505 container create 334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mahavira, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True)
Jan 23 04:32:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:07 np0005593233 systemd[1]: Started libpod-conmon-334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088.scope.
Jan 23 04:32:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4081176151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:07 np0005593233 podman[228271]: 2026-01-23 09:32:07.429394897 +0000 UTC m=+0.023037698 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:32:07 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.532 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.541 222021 DEBUG nova.compute.provider_tree [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:32:07 np0005593233 podman[228271]: 2026-01-23 09:32:07.550261903 +0000 UTC m=+0.143904654 container init 334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mahavira, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:32:07 np0005593233 podman[228271]: 2026-01-23 09:32:07.559949375 +0000 UTC m=+0.153592096 container start 334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mahavira, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:32:07 np0005593233 podman[228271]: 2026-01-23 09:32:07.564884533 +0000 UTC m=+0.158527284 container attach 334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mahavira, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:32:07 np0005593233 priceless_mahavira[228288]: 167 167
Jan 23 04:32:07 np0005593233 systemd[1]: libpod-334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088.scope: Deactivated successfully.
Jan 23 04:32:07 np0005593233 conmon[228288]: conmon 334ca0a506266fa062c4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088.scope/container/memory.events
Jan 23 04:32:07 np0005593233 podman[228271]: 2026-01-23 09:32:07.571053807 +0000 UTC m=+0.164696558 container died 334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mahavira, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:32:07 np0005593233 systemd[1]: var-lib-containers-storage-overlay-917259efb0c2f371a7ea447c5d479fef9b2dbde84628d67decd1f873744b9711-merged.mount: Deactivated successfully.
Jan 23 04:32:07 np0005593233 podman[228271]: 2026-01-23 09:32:07.616208825 +0000 UTC m=+0.209851546 container remove 334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=priceless_mahavira, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:32:07 np0005593233 systemd[1]: libpod-conmon-334ca0a506266fa062c435e6d5e11c85309864c223b6abfc5e948de0476a3088.scope: Deactivated successfully.
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.684 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.685 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.685 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.686 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.690 222021 DEBUG nova.scheduler.client.report [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.731 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.740 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.742 222021 DEBUG nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.746 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.755 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.755 222021 INFO nova.compute.claims [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Claim successful on node compute-1.ctlplane.example.com
Jan 23 04:32:07 np0005593233 podman[228313]: 2026-01-23 09:32:07.802149529 +0000 UTC m=+0.046669412 container create c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_kapitsa, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.842 222021 DEBUG nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:32:07 np0005593233 nova_compute[222017]: 2026-01-23 09:32:07.843 222021 DEBUG nova.network.neutron [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:32:07 np0005593233 systemd[1]: Started libpod-conmon-c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95.scope.
Jan 23 04:32:07 np0005593233 podman[228313]: 2026-01-23 09:32:07.78081477 +0000 UTC m=+0.025334643 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:32:07 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:32:07 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe925ff4f9914e4f924e26002be9d42f0de0c23426fa9450459c78394aa18810/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:32:07 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe925ff4f9914e4f924e26002be9d42f0de0c23426fa9450459c78394aa18810/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:32:07 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe925ff4f9914e4f924e26002be9d42f0de0c23426fa9450459c78394aa18810/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:32:07 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe925ff4f9914e4f924e26002be9d42f0de0c23426fa9450459c78394aa18810/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:32:07 np0005593233 podman[228313]: 2026-01-23 09:32:07.915997678 +0000 UTC m=+0.160517561 container init c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_kapitsa, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 23 04:32:07 np0005593233 podman[228313]: 2026-01-23 09:32:07.929524798 +0000 UTC m=+0.174044661 container start c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_kapitsa, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 04:32:07 np0005593233 podman[228313]: 2026-01-23 09:32:07.934371924 +0000 UTC m=+0.178891817 container attach c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_kapitsa, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 23 04:32:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:07.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:07.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.281 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.500 222021 INFO nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:32:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.535 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-f62791ad-fc40-451f-b02a-ba991f2dbc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.536 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-f62791ad-fc40-451f-b02a-ba991f2dbc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.536 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.536 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f62791ad-fc40-451f-b02a-ba991f2dbc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.543 222021 DEBUG nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.685 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.720 222021 DEBUG nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.722 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.723 222021 INFO nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Creating image(s)#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.753 222021 DEBUG nova.storage.rbd_utils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 0fb415e8-9c82-4021-9088-cfd399d453a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.788 222021 DEBUG nova.storage.rbd_utils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 0fb415e8-9c82-4021-9088-cfd399d453a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.822 222021 DEBUG nova.storage.rbd_utils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 0fb415e8-9c82-4021-9088-cfd399d453a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.826 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.901 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.902 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.904 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.904 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.934 222021 DEBUG nova.storage.rbd_utils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 0fb415e8-9c82-4021-9088-cfd399d453a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.939 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0fb415e8-9c82-4021-9088-cfd399d453a0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.974 222021 DEBUG nova.network.neutron [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:32:08 np0005593233 nova_compute[222017]: 2026-01-23 09:32:08.974 222021 DEBUG nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:32:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1237163258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.266 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0fb415e8-9c82-4021-9088-cfd399d453a0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.327s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:09.299 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:32:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:09.301 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.304 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.306 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.621s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]: [
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:    {
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        "available": false,
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        "ceph_device": false,
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        "lsm_data": {},
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        "lvs": [],
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        "path": "/dev/sr0",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        "rejected_reasons": [
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "Has a FileSystem",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "Insufficient space (<5GB)"
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        ],
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        "sys_api": {
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "actuators": null,
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "device_nodes": "sr0",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "devname": "sr0",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "human_readable_size": "482.00 KB",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "id_bus": "ata",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "model": "QEMU DVD-ROM",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "nr_requests": "2",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "parent": "/dev/sr0",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "partitions": {},
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "path": "/dev/sr0",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "removable": "1",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "rev": "2.5+",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "ro": "0",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "rotational": "1",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "sas_address": "",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "sas_device_handle": "",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "scheduler_mode": "mq-deadline",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "sectors": 0,
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "sectorsize": "2048",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "size": 493568.0,
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "support_discard": "2048",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "type": "disk",
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:            "vendor": "QEMU"
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:        }
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]:    }
Jan 23 04:32:09 np0005593233 quizzical_kapitsa[228329]: ]
Jan 23 04:32:09 np0005593233 systemd[1]: libpod-c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95.scope: Deactivated successfully.
Jan 23 04:32:09 np0005593233 systemd[1]: libpod-c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95.scope: Consumed 1.406s CPU time.
Jan 23 04:32:09 np0005593233 podman[228313]: 2026-01-23 09:32:09.344111989 +0000 UTC m=+1.588631872 container died c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_kapitsa, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.353 222021 DEBUG nova.storage.rbd_utils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] resizing rbd image 0fb415e8-9c82-4021-9088-cfd399d453a0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:32:09 np0005593233 systemd[1]: var-lib-containers-storage-overlay-fe925ff4f9914e4f924e26002be9d42f0de0c23426fa9450459c78394aa18810-merged.mount: Deactivated successfully.
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.406 222021 DEBUG nova.compute.provider_tree [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.493 222021 DEBUG nova.objects.instance [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'migration_context' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:09 np0005593233 podman[228313]: 2026-01-23 09:32:09.562188916 +0000 UTC m=+1.806708789 container remove c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quizzical_kapitsa, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Jan 23 04:32:09 np0005593233 systemd[1]: libpod-conmon-c4e55411da0d25c54f1848f16ebf44ca11cda560e6ddb0173b4397eb306fcb95.scope: Deactivated successfully.
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.628 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.629 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Ensure instance console log exists: /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.630 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.630 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.630 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.631 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.633 222021 DEBUG nova.scheduler.client.report [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.640 222021 WARNING nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.695 222021 DEBUG nova.virt.libvirt.host [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.696 222021 DEBUG nova.virt.libvirt.host [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.736 222021 DEBUG nova.virt.libvirt.host [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.737 222021 DEBUG nova.virt.libvirt.host [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.739 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.739 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:31:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9e6893c-615e-4884-93d8-c083db8837da',id=16,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1163562706',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.739 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.740 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.740 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.740 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.740 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.740 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.741 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.741 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.741 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.741 222021 DEBUG nova.virt.hardware [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.745 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.773 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.775 222021 DEBUG nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.880 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160714.8799827, e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.881 222021 INFO nova.compute.manager [-] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.896 222021 DEBUG nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.897 222021 DEBUG nova.network.neutron [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.922 222021 DEBUG nova.compute.manager [None req-ae20c649-3a75-43f5-b08e-fd896f7ee5ca - - - - - -] [instance: e0a13c86-2ed4-4bf5-b4c7-6ba521a0dec1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.940 222021 INFO nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:32:09 np0005593233 nova_compute[222017]: 2026-01-23 09:32:09.965 222021 DEBUG nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:32:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:09.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:09.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.015 222021 INFO nova.virt.block_device [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Booting with volume 2c9770c1-d351-43fa-b18d-aaf9291801fe at /dev/vda#033[00m
Jan 23 04:32:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4039454467' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.283 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:10.304 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.310 222021 DEBUG nova.storage.rbd_utils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 0fb415e8-9c82-4021-9088-cfd399d453a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.313 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.421 222021 DEBUG os_brick.utils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.423 222021 INFO oslo.privsep.daemon [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmpcurjqa5i/privsep.sock']#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.465 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:32:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:32:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1565788129' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.756 222021 DEBUG nova.policy [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a43b680a6019491aafe42c0a10e648df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c56e53b3339e4e4db30b7a9d330bc380', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.758 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.761 222021 DEBUG nova.objects.instance [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.785 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <uuid>0fb415e8-9c82-4021-9088-cfd399d453a0</uuid>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <name>instance-0000000e</name>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <nova:name>tempest-MigrationsAdminTest-server-2110965880</nova:name>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:32:09</nova:creationTime>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <nova:flavor name="tempest-test_resize_flavor_-1163562706">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <nova:user uuid="7536fa2e625541fba613dc32a49a4c5b">tempest-MigrationsAdminTest-2056264627-project-member</nova:user>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <nova:project uuid="11def90dfdc14cfe928302bec2835794">tempest-MigrationsAdminTest-2056264627</nova:project>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <entry name="serial">0fb415e8-9c82-4021-9088-cfd399d453a0</entry>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <entry name="uuid">0fb415e8-9c82-4021-9088-cfd399d453a0</entry>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0fb415e8-9c82-4021-9088-cfd399d453a0_disk">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0fb415e8-9c82-4021-9088-cfd399d453a0_disk.config">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/console.log" append="off"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:32:10 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:32:10 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:32:10 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:32:10 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.850 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.850 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.851 222021 INFO nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Using config drive#033[00m
Jan 23 04:32:10 np0005593233 nova_compute[222017]: 2026-01-23 09:32:10.876 222021 DEBUG nova.storage.rbd_utils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 0fb415e8-9c82-4021-9088-cfd399d453a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.174 222021 INFO oslo.privsep.daemon [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.012 229882 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.017 229882 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.020 229882 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.020 229882 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229882#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.179 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[22928367-c2a8-41f3-8a1e-992173d5ed27]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.303 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.324 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.324 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[1aba4b03-83b5-43b5-a37b-8d1e324d86f2]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.326 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.334 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.335 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e7d27d-83f6-4621-aa9a-9a01af982212]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.337 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.350 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.350 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[53394306-8698-4c76-b6a3-e62f44912367]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.353 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[9f729449-94ac-45d5-b036-b21c231c0b0d]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.353 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.375 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.378 222021 DEBUG os_brick.initiator.connectors.lightos [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.378 222021 DEBUG os_brick.initiator.connectors.lightos [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.379 222021 DEBUG os_brick.initiator.connectors.lightos [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.379 222021 DEBUG os_brick.utils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] <== get_connector_properties: return (956ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.379 222021 DEBUG nova.virt.block_device [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updating existing volume attachment record: 0f3f1f70-9837-4df7-bac9-a17bfd4c3a5f _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.607 222021 INFO nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Creating config drive at /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/disk.config
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.616 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6tsk0u8q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.763 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6tsk0u8q" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.810 222021 DEBUG nova.storage.rbd_utils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 0fb415e8-9c82-4021-9088-cfd399d453a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:32:11 np0005593233 nova_compute[222017]: 2026-01-23 09:32:11.818 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/disk.config 0fb415e8-9c82-4021-9088-cfd399d453a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:11.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:11.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.040 222021 DEBUG oslo_concurrency.processutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/disk.config 0fb415e8-9c82-4021-9088-cfd399d453a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.042 222021 INFO nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Deleting local config drive /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/disk.config because it was imported into RBD.
Jan 23 04:32:12 np0005593233 systemd-machined[190954]: New machine qemu-7-instance-0000000e.
Jan 23 04:32:12 np0005593233 systemd[1]: Started Virtual Machine qemu-7-instance-0000000e.
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.265 222021 DEBUG nova.network.neutron [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Successfully created port: edc7d28f-eaba-44b8-9916-f2089618ca70 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.429 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Updating instance_info_cache with network_info: [{"id": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "address": "fa:16:3e:d9:aa:f3", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap857f8a0c-0b", "ovs_interfaceid": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.461 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-f62791ad-fc40-451f-b02a-ba991f2dbc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.461 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.461 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.462 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.462 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.462 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.462 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.669 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160732.6688497, 0fb415e8-9c82-4021-9088-cfd399d453a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.670 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] VM Resumed (Lifecycle Event)
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.672 222021 DEBUG nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.672 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.676 222021 INFO nova.virt.libvirt.driver [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance spawned successfully.
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.676 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.711 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.711 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.711 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.712 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.713 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.713 222021 DEBUG nova.virt.libvirt.driver [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.717 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.721 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.762 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.763 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160732.6690378, 0fb415e8-9c82-4021-9088-cfd399d453a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.763 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] VM Started (Lifecycle Event)
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.787 222021 INFO nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Took 4.07 seconds to spawn the instance on the hypervisor.
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.789 222021 DEBUG nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.799 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.802 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.842 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.879 222021 INFO nova.compute.manager [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Took 6.06 seconds to build instance.
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.921 222021 DEBUG nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.923 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.923 222021 INFO nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Creating image(s)
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.923 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.924 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Ensure instance console log exists: /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.924 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.924 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.925 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:12 np0005593233 nova_compute[222017]: 2026-01-23 09:32:12.925 222021 DEBUG oslo_concurrency.lockutils [None req-b0cbf811-28cf-407b-82dd-7fb9daf9c3e6 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:13 np0005593233 nova_compute[222017]: 2026-01-23 09:32:13.283 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:13 np0005593233 nova_compute[222017]: 2026-01-23 09:32:13.379 222021 DEBUG nova.network.neutron [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Successfully updated port: edc7d28f-eaba-44b8-9916-f2089618ca70 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 04:32:13 np0005593233 nova_compute[222017]: 2026-01-23 09:32:13.720 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:32:13 np0005593233 nova_compute[222017]: 2026-01-23 09:32:13.720 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquired lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:32:13 np0005593233 nova_compute[222017]: 2026-01-23 09:32:13.720 222021 DEBUG nova.network.neutron [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:32:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:13.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:13.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:14 np0005593233 nova_compute[222017]: 2026-01-23 09:32:14.081 222021 DEBUG nova.compute.manager [req-cdbb8614-e951-4c9f-b0b1-299e827ac4ed req-e3af3395-2b77-4c7f-90f6-b215569723e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-changed-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:32:14 np0005593233 nova_compute[222017]: 2026-01-23 09:32:14.081 222021 DEBUG nova.compute.manager [req-cdbb8614-e951-4c9f-b0b1-299e827ac4ed req-e3af3395-2b77-4c7f-90f6-b215569723e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Refreshing instance network info cache due to event network-changed-edc7d28f-eaba-44b8-9916-f2089618ca70. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:32:14 np0005593233 nova_compute[222017]: 2026-01-23 09:32:14.081 222021 DEBUG oslo_concurrency.lockutils [req-cdbb8614-e951-4c9f-b0b1-299e827ac4ed req-e3af3395-2b77-4c7f-90f6-b215569723e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:32:14 np0005593233 nova_compute[222017]: 2026-01-23 09:32:14.434 222021 DEBUG nova.network.neutron [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:32:15 np0005593233 nova_compute[222017]: 2026-01-23 09:32:15.470 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:15.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:15.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.368 222021 DEBUG nova.network.neutron [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updating instance_info_cache with network_info: [{"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.399 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Releasing lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.399 222021 DEBUG nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Instance network_info: |[{"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.400 222021 DEBUG oslo_concurrency.lockutils [req-cdbb8614-e951-4c9f-b0b1-299e827ac4ed req-e3af3395-2b77-4c7f-90f6-b215569723e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.400 222021 DEBUG nova.network.neutron [req-cdbb8614-e951-4c9f-b0b1-299e827ac4ed req-e3af3395-2b77-4c7f-90f6-b215569723e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Refreshing network info cache for port edc7d28f-eaba-44b8-9916-f2089618ca70 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.403 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Start _get_guest_xml network_info=[{"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-2c9770c1-d351-43fa-b18d-aaf9291801fe', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '2c9770c1-d351-43fa-b18d-aaf9291801fe', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '54a1ad4e-6fc9-42dc-aa4c-99d3f1297520', 'attached_at': '', 'detached_at': '', 'volume_id': '2c9770c1-d351-43fa-b18d-aaf9291801fe', 'serial': '2c9770c1-d351-43fa-b18d-aaf9291801fe'}, 'delete_on_termination': True, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '0f3f1f70-9837-4df7-bac9-a17bfd4c3a5f', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.409 222021 WARNING nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.415 222021 DEBUG nova.virt.libvirt.host [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.416 222021 DEBUG nova.virt.libvirt.host [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.429 222021 DEBUG nova.virt.libvirt.host [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.430 222021 DEBUG nova.virt.libvirt.host [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.431 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.431 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.431 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.432 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.432 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.432 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.432 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.432 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.433 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.433 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.433 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.433 222021 DEBUG nova.virt.hardware [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.463 222021 DEBUG nova.storage.rbd_utils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] rbd image 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.467 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:16 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2109513688' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.931 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.932 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.933 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.934 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.993 222021 DEBUG nova.virt.libvirt.vif [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:32:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1815710456',display_name='tempest-LiveMigrationTest-server-1815710456',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1815710456',id=15,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c56e53b3339e4e4db30b7a9d330bc380',ramdisk_id='',reservation_id='r-mz7qsyn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-1903931568',owner_user_name='tempest-LiveMigrationTest-1903931568-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:32:10Z,user_data=None,user_id='a43b680a6019491aafe42c0a10e648df',uuid=54a1ad4e-6fc9-42dc-aa4c-99d3f1297520,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.994 222021 DEBUG nova.network.os_vif_util [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Converting VIF {"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.995 222021 DEBUG nova.network.os_vif_util [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:32:16 np0005593233 nova_compute[222017]: 2026-01-23 09:32:16.996 222021 DEBUG nova.objects.instance [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lazy-loading 'pci_devices' on Instance uuid 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.017 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <uuid>54a1ad4e-6fc9-42dc-aa4c-99d3f1297520</uuid>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <name>instance-0000000f</name>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <nova:name>tempest-LiveMigrationTest-server-1815710456</nova:name>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:32:16</nova:creationTime>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <nova:user uuid="a43b680a6019491aafe42c0a10e648df">tempest-LiveMigrationTest-1903931568-project-member</nova:user>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <nova:project uuid="c56e53b3339e4e4db30b7a9d330bc380">tempest-LiveMigrationTest-1903931568</nova:project>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <nova:port uuid="edc7d28f-eaba-44b8-9916-f2089618ca70">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <entry name="serial">54a1ad4e-6fc9-42dc-aa4c-99d3f1297520</entry>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <entry name="uuid">54a1ad4e-6fc9-42dc-aa4c-99d3f1297520</entry>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_disk.config">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-2c9770c1-d351-43fa-b18d-aaf9291801fe">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <serial>2c9770c1-d351-43fa-b18d-aaf9291801fe</serial>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:c7:78:59"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <target dev="tapedc7d28f-ea"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520/console.log" append="off"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:32:17 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:32:17 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:32:17 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:32:17 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.019 222021 DEBUG nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Preparing to wait for external event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.019 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.020 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.020 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.021 222021 DEBUG nova.virt.libvirt.vif [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:32:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1815710456',display_name='tempest-LiveMigrationTest-server-1815710456',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1815710456',id=15,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c56e53b3339e4e4db30b7a9d330bc380',ramdisk_id='',reservation_id='r-mz7qsyn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-1903931568',owner_user_name='tempest-LiveMigrationTest-1903931568-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:32:10Z,user_data=None,user_id='a43b680a6019491aafe42c0a10e648df',uuid=54a1ad4e-6fc9-42dc-aa4c-99d3f1297520,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.021 222021 DEBUG nova.network.os_vif_util [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Converting VIF {"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.022 222021 DEBUG nova.network.os_vif_util [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.023 222021 DEBUG os_vif [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.023 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.024 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.025 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.028 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.028 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapedc7d28f-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.029 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapedc7d28f-ea, col_values=(('external_ids', {'iface-id': 'edc7d28f-eaba-44b8-9916-f2089618ca70', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c7:78:59', 'vm-uuid': '54a1ad4e-6fc9-42dc-aa4c-99d3f1297520'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.031 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:17 np0005593233 NetworkManager[48871]: <info>  [1769160737.0345] manager: (tapedc7d28f-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.034 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.040 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.041 222021 INFO os_vif [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea')#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.237 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.238 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.238 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] No VIF found with MAC fa:16:3e:c7:78:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.238 222021 INFO nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Using config drive#033[00m
Jan 23 04:32:17 np0005593233 nova_compute[222017]: 2026-01-23 09:32:17.283 222021 DEBUG nova.storage.rbd_utils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] rbd image 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:17.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:17.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.029 222021 INFO nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Creating config drive at /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520/disk.config#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.035 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx53o8uou execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.165 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx53o8uou" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.212 222021 DEBUG nova.storage.rbd_utils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] rbd image 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.218 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520/disk.config 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.286 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.437 222021 DEBUG oslo_concurrency.processutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520/disk.config 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.219s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.439 222021 INFO nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Deleting local config drive /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520/disk.config because it was imported into RBD.#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.478 222021 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.479 222021 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.479 222021 DEBUG nova.network.neutron [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:32:18 np0005593233 NetworkManager[48871]: <info>  [1769160738.4883] manager: (tapedc7d28f-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Jan 23 04:32:18 np0005593233 kernel: tapedc7d28f-ea: entered promiscuous mode
Jan 23 04:32:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:18Z|00049|binding|INFO|Claiming lport edc7d28f-eaba-44b8-9916-f2089618ca70 for this chassis.
Jan 23 04:32:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:18Z|00050|binding|INFO|edc7d28f-eaba-44b8-9916-f2089618ca70: Claiming fa:16:3e:c7:78:59 10.100.0.14
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.494 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.506 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:78:59 10.100.0.14'], port_security=['fa:16:3e:c7:78:59 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '54a1ad4e-6fc9-42dc-aa4c-99d3f1297520', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c56e53b3339e4e4db30b7a9d330bc380', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c0c0e09a-b9c3-4a3a-af9e-c3b66e9f8bc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cabb3d88-013b-4542-b789-52d49c567d53, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=edc7d28f-eaba-44b8-9916-f2089618ca70) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.509 140224 INFO neutron.agent.ovn.metadata.agent [-] Port edc7d28f-eaba-44b8-9916-f2089618ca70 in datapath 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 bound to our chassis#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.513 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4#033[00m
Jan 23 04:32:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:18Z|00051|binding|INFO|Setting lport edc7d28f-eaba-44b8-9916-f2089618ca70 ovn-installed in OVS
Jan 23 04:32:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:18Z|00052|binding|INFO|Setting lport edc7d28f-eaba-44b8-9916-f2089618ca70 up in Southbound
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.522 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.533 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:18 np0005593233 systemd-machined[190954]: New machine qemu-8-instance-0000000f.
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.536 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[908aca2e-2b4e-4ca6-bbfb-df5aed3642c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:18 np0005593233 systemd-udevd[230153]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:32:18 np0005593233 systemd[1]: Started Virtual Machine qemu-8-instance-0000000f.
Jan 23 04:32:18 np0005593233 NetworkManager[48871]: <info>  [1769160738.5628] device (tapedc7d28f-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:32:18 np0005593233 NetworkManager[48871]: <info>  [1769160738.5635] device (tapedc7d28f-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.590 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[441ca703-9d91-4a2a-891b-d2284be2156d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.597 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e67ebb7c-46f9-42a0-8501-736a489c19ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.631 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c7485e-cd2a-4e31-b314-a40774ed31d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.648 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[62f1dda3-07e0-419a-9b76-c604ef898254]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e7a4d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:a3:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 5, 'rx_bytes': 1162, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459419, 'reachable_time': 37182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230165, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.662 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8a135421-166f-46c1-a516-8a2b631b122f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap385e7a4d-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459431, 'tstamp': 459431}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230167, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap385e7a4d-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459434, 'tstamp': 459434}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230167, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.665 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e7a4d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.667 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.669 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap385e7a4d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.669 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.669 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap385e7a4d-f0, col_values=(('external_ids', {'iface-id': '7b93c40e-1f44-4d5a-9bad-e23468f98d69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:18.670 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.682 222021 DEBUG nova.network.neutron [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:32:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.835 222021 DEBUG nova.compute.manager [req-f4e877d8-9ba8-4dac-b820-84ecd57b29b2 req-fa574e42-49ee-4d07-bcc3-396139ec01d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.836 222021 DEBUG oslo_concurrency.lockutils [req-f4e877d8-9ba8-4dac-b820-84ecd57b29b2 req-fa574e42-49ee-4d07-bcc3-396139ec01d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.836 222021 DEBUG oslo_concurrency.lockutils [req-f4e877d8-9ba8-4dac-b820-84ecd57b29b2 req-fa574e42-49ee-4d07-bcc3-396139ec01d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.836 222021 DEBUG oslo_concurrency.lockutils [req-f4e877d8-9ba8-4dac-b820-84ecd57b29b2 req-fa574e42-49ee-4d07-bcc3-396139ec01d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.836 222021 DEBUG nova.compute.manager [req-f4e877d8-9ba8-4dac-b820-84ecd57b29b2 req-fa574e42-49ee-4d07-bcc3-396139ec01d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Processing event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.967 222021 DEBUG nova.network.neutron [req-cdbb8614-e951-4c9f-b0b1-299e827ac4ed req-e3af3395-2b77-4c7f-90f6-b215569723e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updated VIF entry in instance network info cache for port edc7d28f-eaba-44b8-9916-f2089618ca70. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:32:18 np0005593233 nova_compute[222017]: 2026-01-23 09:32:18.967 222021 DEBUG nova.network.neutron [req-cdbb8614-e951-4c9f-b0b1-299e827ac4ed req-e3af3395-2b77-4c7f-90f6-b215569723e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updating instance_info_cache with network_info: [{"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.004 222021 DEBUG oslo_concurrency.lockutils [req-cdbb8614-e951-4c9f-b0b1-299e827ac4ed req-e3af3395-2b77-4c7f-90f6-b215569723e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.013 222021 DEBUG nova.network.neutron [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.038 222021 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.161 222021 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.161 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Creating file /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/34ef351643654aab968caf7baf4e0407.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.162 222021 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/34ef351643654aab968caf7baf4e0407.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.353 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160739.352799, 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.354 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] VM Started (Lifecycle Event)#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.357 222021 DEBUG nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.366 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.371 222021 INFO nova.virt.libvirt.driver [-] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Instance spawned successfully.#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.371 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.377 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.380 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.391 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.391 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.391 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.392 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.392 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.392 222021 DEBUG nova.virt.libvirt.driver [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.399 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.399 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160739.3531735, 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.400 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.434 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.440 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160739.365531, 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.441 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.484 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.488 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.499 222021 INFO nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Took 6.58 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.500 222021 DEBUG nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.534 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.551 222021 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/34ef351643654aab968caf7baf4e0407.tmp" returned: 1 in 0.389s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.552 222021 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/34ef351643654aab968caf7baf4e0407.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.552 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Creating directory /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.553 222021 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.583 222021 INFO nova.compute.manager [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Took 12.75 seconds to build instance.#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.603 222021 DEBUG oslo_concurrency.lockutils [None req-8691330d-a474-4bce-b8c6-a828e8075a25 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.753 222021 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0" returned: 0 in 0.200s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:19 np0005593233 nova_compute[222017]: 2026-01-23 09:32:19.757 222021 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:32:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:19.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:32:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:19.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:32:20 np0005593233 nova_compute[222017]: 2026-01-23 09:32:20.975 222021 DEBUG nova.compute.manager [req-c28943e6-ed56-4aa1-ba15-af6b3a3d90df req-8f8efdc6-fc72-43dd-9925-bc07d71c9052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:20 np0005593233 nova_compute[222017]: 2026-01-23 09:32:20.976 222021 DEBUG oslo_concurrency.lockutils [req-c28943e6-ed56-4aa1-ba15-af6b3a3d90df req-8f8efdc6-fc72-43dd-9925-bc07d71c9052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:20 np0005593233 nova_compute[222017]: 2026-01-23 09:32:20.976 222021 DEBUG oslo_concurrency.lockutils [req-c28943e6-ed56-4aa1-ba15-af6b3a3d90df req-8f8efdc6-fc72-43dd-9925-bc07d71c9052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:20 np0005593233 nova_compute[222017]: 2026-01-23 09:32:20.977 222021 DEBUG oslo_concurrency.lockutils [req-c28943e6-ed56-4aa1-ba15-af6b3a3d90df req-8f8efdc6-fc72-43dd-9925-bc07d71c9052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:20 np0005593233 nova_compute[222017]: 2026-01-23 09:32:20.977 222021 DEBUG nova.compute.manager [req-c28943e6-ed56-4aa1-ba15-af6b3a3d90df req-8f8efdc6-fc72-43dd-9925-bc07d71c9052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:32:20 np0005593233 nova_compute[222017]: 2026-01-23 09:32:20.977 222021 WARNING nova.compute.manager [req-c28943e6-ed56-4aa1-ba15-af6b3a3d90df req-8f8efdc6-fc72-43dd-9925-bc07d71c9052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received unexpected event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:32:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:21.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:22.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:22 np0005593233 nova_compute[222017]: 2026-01-23 09:32:22.033 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:22 np0005593233 podman[230212]: 2026-01-23 09:32:22.125576641 +0000 UTC m=+0.127677238 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:32:23 np0005593233 nova_compute[222017]: 2026-01-23 09:32:23.287 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:23.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:24.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:25 np0005593233 nova_compute[222017]: 2026-01-23 09:32:25.637 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Check if temp file /var/lib/nova/instances/tmpc8knh3fr exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 23 04:32:25 np0005593233 nova_compute[222017]: 2026-01-23 09:32:25.638 222021 DEBUG nova.compute.manager [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc8knh3fr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='54a1ad4e-6fc9-42dc-aa4c-99d3f1297520',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 23 04:32:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:25.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:26.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:27 np0005593233 nova_compute[222017]: 2026-01-23 09:32:27.036 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:27.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:28.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:28 np0005593233 nova_compute[222017]: 2026-01-23 09:32:28.290 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:29 np0005593233 nova_compute[222017]: 2026-01-23 09:32:29.810 222021 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:32:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:29.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:30.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:32.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:32.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:32 np0005593233 nova_compute[222017]: 2026-01-23 09:32:32.039 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:32 np0005593233 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 23 04:32:32 np0005593233 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Consumed 14.394s CPU time.
Jan 23 04:32:32 np0005593233 systemd-machined[190954]: Machine qemu-7-instance-0000000e terminated.
Jan 23 04:32:32 np0005593233 nova_compute[222017]: 2026-01-23 09:32:32.837 222021 INFO nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 04:32:32 np0005593233 nova_compute[222017]: 2026-01-23 09:32:32.845 222021 INFO nova.virt.libvirt.driver [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance destroyed successfully.#033[00m
Jan 23 04:32:32 np0005593233 nova_compute[222017]: 2026-01-23 09:32:32.850 222021 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:32:32 np0005593233 nova_compute[222017]: 2026-01-23 09:32:32.851 222021 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:32:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:32Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c7:78:59 10.100.0.14
Jan 23 04:32:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:32Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c7:78:59 10.100.0.14
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.291 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.669 222021 DEBUG nova.compute.manager [req-9493c7bd-6a7d-4a4b-81a7-5d042bdf55c9 req-96777062-be95-48b7-8e04-a2ce63f0c8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.670 222021 DEBUG oslo_concurrency.lockutils [req-9493c7bd-6a7d-4a4b-81a7-5d042bdf55c9 req-96777062-be95-48b7-8e04-a2ce63f0c8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.670 222021 DEBUG oslo_concurrency.lockutils [req-9493c7bd-6a7d-4a4b-81a7-5d042bdf55c9 req-96777062-be95-48b7-8e04-a2ce63f0c8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.671 222021 DEBUG oslo_concurrency.lockutils [req-9493c7bd-6a7d-4a4b-81a7-5d042bdf55c9 req-96777062-be95-48b7-8e04-a2ce63f0c8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.671 222021 DEBUG nova.compute.manager [req-9493c7bd-6a7d-4a4b-81a7-5d042bdf55c9 req-96777062-be95-48b7-8e04-a2ce63f0c8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.671 222021 DEBUG nova.compute.manager [req-9493c7bd-6a7d-4a4b-81a7-5d042bdf55c9 req-96777062-be95-48b7-8e04-a2ce63f0c8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:32:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.998 222021 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "0fb415e8-9c82-4021-9088-cfd399d453a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.998 222021 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:33 np0005593233 nova_compute[222017]: 2026-01-23 09:32:33.999 222021 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:32:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:34.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:32:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:34.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:35 np0005593233 ceph-mgr[81930]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.747 222021 INFO nova.compute.manager [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Took 8.54 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.748 222021 DEBUG nova.compute.manager [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.765 222021 DEBUG nova.compute.manager [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.766 222021 DEBUG oslo_concurrency.lockutils [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.767 222021 DEBUG oslo_concurrency.lockutils [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.767 222021 DEBUG oslo_concurrency.lockutils [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.768 222021 DEBUG nova.compute.manager [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.768 222021 WARNING nova.compute.manager [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received unexpected event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.769 222021 DEBUG nova.compute.manager [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-changed-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.769 222021 DEBUG nova.compute.manager [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Refreshing instance network info cache due to event network-changed-edc7d28f-eaba-44b8-9916-f2089618ca70. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.770 222021 DEBUG oslo_concurrency.lockutils [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.770 222021 DEBUG oslo_concurrency.lockutils [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.771 222021 DEBUG nova.network.neutron [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Refreshing network info cache for port edc7d28f-eaba-44b8-9916-f2089618ca70 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.801 222021 DEBUG nova.compute.manager [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc8knh3fr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='54a1ad4e-6fc9-42dc-aa4c-99d3f1297520',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=Migration(6d5a9d9d-3672-4056-a4a9-72cc0752bcae),old_vol_attachment_ids={2c9770c1-d351-43fa-b18d-aaf9291801fe='0f3f1f70-9837-4df7-bac9-a17bfd4c3a5f'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.806 222021 DEBUG nova.objects.instance [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lazy-loading 'migration_context' on Instance uuid 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.809 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.813 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.813 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.833 222021 DEBUG nova.virt.libvirt.migration [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Find same serial number: pos=1, serial=2c9770c1-d351-43fa-b18d-aaf9291801fe _update_volume_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:242#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.835 222021 DEBUG nova.virt.libvirt.vif [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:32:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1815710456',display_name='tempest-LiveMigrationTest-server-1815710456',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1815710456',id=15,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:32:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c56e53b3339e4e4db30b7a9d330bc380',ramdisk_id='',reservation_id='r-mz7qsyn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-1903931568',o
wner_user_name='tempest-LiveMigrationTest-1903931568-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:32:19Z,user_data=None,user_id='a43b680a6019491aafe42c0a10e648df',uuid=54a1ad4e-6fc9-42dc-aa4c-99d3f1297520,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.836 222021 DEBUG nova.network.os_vif_util [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Converting VIF {"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.837 222021 DEBUG nova.network.os_vif_util [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.837 222021 DEBUG nova.virt.libvirt.migration [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updating guest XML with vif config: <interface type="ethernet">
Jan 23 04:32:35 np0005593233 nova_compute[222017]:  <mac address="fa:16:3e:c7:78:59"/>
Jan 23 04:32:35 np0005593233 nova_compute[222017]:  <model type="virtio"/>
Jan 23 04:32:35 np0005593233 nova_compute[222017]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:32:35 np0005593233 nova_compute[222017]:  <mtu size="1442"/>
Jan 23 04:32:35 np0005593233 nova_compute[222017]:  <target dev="tapedc7d28f-ea"/>
Jan 23 04:32:35 np0005593233 nova_compute[222017]: </interface>
Jan 23 04:32:35 np0005593233 nova_compute[222017]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 23 04:32:35 np0005593233 nova_compute[222017]: 2026-01-23 09:32:35.838 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 23 04:32:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:36.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:36.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e152 e152: 3 total, 3 up, 3 in
Jan 23 04:32:36 np0005593233 podman[230241]: 2026-01-23 09:32:36.084967219 +0000 UTC m=+0.080198614 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:32:36 np0005593233 nova_compute[222017]: 2026-01-23 09:32:36.317 222021 DEBUG nova.virt.libvirt.migration [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 04:32:36 np0005593233 nova_compute[222017]: 2026-01-23 09:32:36.317 222021 INFO nova.virt.libvirt.migration [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 23 04:32:36 np0005593233 nova_compute[222017]: 2026-01-23 09:32:36.570 222021 INFO nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.041 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.074 222021 DEBUG nova.virt.libvirt.migration [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.076 222021 DEBUG nova.virt.libvirt.migration [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.461 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160757.4599755, 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.461 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.495 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.501 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.543 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.579 222021 DEBUG nova.virt.libvirt.migration [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.580 222021 DEBUG nova.virt.libvirt.migration [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 23 04:32:37 np0005593233 kernel: tapedc7d28f-ea (unregistering): left promiscuous mode
Jan 23 04:32:37 np0005593233 NetworkManager[48871]: <info>  [1769160757.6551] device (tapedc7d28f-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:32:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:37Z|00053|binding|INFO|Releasing lport edc7d28f-eaba-44b8-9916-f2089618ca70 from this chassis (sb_readonly=0)
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.671 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:37Z|00054|binding|INFO|Setting lport edc7d28f-eaba-44b8-9916-f2089618ca70 down in Southbound
Jan 23 04:32:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:37Z|00055|binding|INFO|Removing iface tapedc7d28f-ea ovn-installed in OVS
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.702 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:37 np0005593233 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 23 04:32:37 np0005593233 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000f.scope: Consumed 14.568s CPU time.
Jan 23 04:32:37 np0005593233 systemd-machined[190954]: Machine qemu-8-instance-0000000f terminated.
Jan 23 04:32:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:37Z|00056|binding|INFO|Releasing lport 8e19ba82-19a8-44be-8cf0-66f5e53af8a2 from this chassis (sb_readonly=0)
Jan 23 04:32:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:32:37Z|00057|binding|INFO|Releasing lport 7b93c40e-1f44-4d5a-9bad-e23468f98d69 from this chassis (sb_readonly=0)
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.729 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:78:59 10.100.0.14'], port_security=['fa:16:3e:c7:78:59 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd80bc768-e67f-4e48-bcf3-42912cda98f1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '54a1ad4e-6fc9-42dc-aa4c-99d3f1297520', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c56e53b3339e4e4db30b7a9d330bc380', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c0c0e09a-b9c3-4a3a-af9e-c3b66e9f8bc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cabb3d88-013b-4542-b789-52d49c567d53, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=edc7d28f-eaba-44b8-9916-f2089618ca70) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.730 140224 INFO neutron.agent.ovn.metadata.agent [-] Port edc7d28f-eaba-44b8-9916-f2089618ca70 in datapath 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 unbound from our chassis#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.731 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.749 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a2abfb-75fe-40b6-bd97-7cd528b6a8d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.784 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8ee40c5e-51a8-42f4-a1cd-befd28df5a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.788 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c06ac34d-674a-42a1-88be-4fa0dd012801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:37 np0005593233 virtqemud[221325]: Unable to get XATTR trusted.libvirt.security.ref_selinux on volumes/volume-2c9770c1-d351-43fa-b18d-aaf9291801fe: No such file or directory
Jan 23 04:32:37 np0005593233 virtqemud[221325]: Unable to get XATTR trusted.libvirt.security.ref_dac on volumes/volume-2c9770c1-d351-43fa-b18d-aaf9291801fe: No such file or directory
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.820 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[df633b44-81d7-48fc-bca8-cca7714ee74c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.833 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.842 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.846 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[074688ae-9c8f-4ded-818e-384623ff171c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e7a4d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:a3:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1162, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459419, 'reachable_time': 37182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230277, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.849 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.849 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.849 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.868 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b43880-0fc8-495c-a21d-80a6066d6303]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap385e7a4d-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459431, 'tstamp': 459431}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230284, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap385e7a4d-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459434, 'tstamp': 459434}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230284, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.871 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e7a4d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.873 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:37 np0005593233 nova_compute[222017]: 2026-01-23 09:32:37.878 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.879 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap385e7a4d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.879 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.879 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap385e7a4d-f0, col_values=(('external_ids', {'iface-id': '7b93c40e-1f44-4d5a-9bad-e23468f98d69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:37.879 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:32:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:32:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:38.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:32:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:38.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:38 np0005593233 nova_compute[222017]: 2026-01-23 09:32:38.083 222021 DEBUG nova.virt.libvirt.guest [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '54a1ad4e-6fc9-42dc-aa4c-99d3f1297520' (instance-0000000f) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 23 04:32:38 np0005593233 nova_compute[222017]: 2026-01-23 09:32:38.084 222021 INFO nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Migration operation has completed#033[00m
Jan 23 04:32:38 np0005593233 nova_compute[222017]: 2026-01-23 09:32:38.084 222021 INFO nova.compute.manager [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] _post_live_migration() is started..#033[00m
Jan 23 04:32:38 np0005593233 nova_compute[222017]: 2026-01-23 09:32:38.293 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.248 222021 DEBUG nova.compute.manager [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.249 222021 DEBUG oslo_concurrency.lockutils [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.250 222021 DEBUG oslo_concurrency.lockutils [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.250 222021 DEBUG oslo_concurrency.lockutils [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.251 222021 DEBUG nova.compute.manager [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.251 222021 DEBUG nova.compute.manager [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.252 222021 DEBUG nova.compute.manager [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.252 222021 DEBUG oslo_concurrency.lockutils [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.253 222021 DEBUG oslo_concurrency.lockutils [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.253 222021 DEBUG oslo_concurrency.lockutils [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.254 222021 DEBUG nova.compute.manager [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:32:39 np0005593233 nova_compute[222017]: 2026-01-23 09:32:39.254 222021 WARNING nova.compute.manager [req-c6000ec0-895b-4e21-a1fe-010d70de42ac req-037cdd6e-419a-43f1-a765-4d0deb340b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received unexpected event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:32:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:40.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:40.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.054 222021 DEBUG nova.network.neutron [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updated VIF entry in instance network info cache for port edc7d28f-eaba-44b8-9916-f2089618ca70. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.055 222021 DEBUG nova.network.neutron [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updating instance_info_cache with network_info: [{"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.091 222021 DEBUG nova.compute.manager [req-ecd7da13-c80d-4972-ac8a-7c0ce9e3a963 req-ad1106d9-31ba-480a-8c17-eefbbb8fa00f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.092 222021 DEBUG oslo_concurrency.lockutils [req-ecd7da13-c80d-4972-ac8a-7c0ce9e3a963 req-ad1106d9-31ba-480a-8c17-eefbbb8fa00f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.092 222021 DEBUG oslo_concurrency.lockutils [req-ecd7da13-c80d-4972-ac8a-7c0ce9e3a963 req-ad1106d9-31ba-480a-8c17-eefbbb8fa00f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.093 222021 DEBUG oslo_concurrency.lockutils [req-ecd7da13-c80d-4972-ac8a-7c0ce9e3a963 req-ad1106d9-31ba-480a-8c17-eefbbb8fa00f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.093 222021 DEBUG nova.compute.manager [req-ecd7da13-c80d-4972-ac8a-7c0ce9e3a963 req-ad1106d9-31ba-480a-8c17-eefbbb8fa00f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.093 222021 DEBUG nova.compute.manager [req-ecd7da13-c80d-4972-ac8a-7c0ce9e3a963 req-ad1106d9-31ba-480a-8c17-eefbbb8fa00f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.096 222021 DEBUG oslo_concurrency.lockutils [req-7e581bbb-5114-4d80-a6cb-3321744e190f req-ec5c55dc-3212-41f6-ae22-4a4a550583c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.232 222021 DEBUG nova.network.neutron [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Activated binding for port edc7d28f-eaba-44b8-9916-f2089618ca70 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.233 222021 DEBUG nova.compute.manager [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.234 222021 DEBUG nova.virt.libvirt.vif [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:32:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1815710456',display_name='tempest-LiveMigrationTest-server-1815710456',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1815710456',id=15,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:32:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c56e53b3339e4e4db30b7a9d330bc380',ramdisk_id='',reservation_id='r-mz7qsyn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-1903931568',owner_user_name='tempest-LiveMigrationTest-1903931568-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:32:24Z,user_data=None,user_id='a43b680a6019491aafe42c0a10e648df',uuid=54a1ad4e-6fc9-42dc-aa4c-99d3f1297520,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.235 222021 DEBUG nova.network.os_vif_util [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Converting VIF {"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.236 222021 DEBUG nova.network.os_vif_util [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.237 222021 DEBUG os_vif [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.240 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.241 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapedc7d28f-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.243 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.246 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.250 222021 INFO os_vif [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea')#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.250 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.251 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.251 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.252 222021 DEBUG nova.compute.manager [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.253 222021 INFO nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Deleting instance files /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_del#033[00m
Jan 23 04:32:40 np0005593233 nova_compute[222017]: 2026-01-23 09:32:40.253 222021 INFO nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Deletion of /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_del complete#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.397 222021 DEBUG nova.compute.manager [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.397 222021 DEBUG oslo_concurrency.lockutils [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.397 222021 DEBUG oslo_concurrency.lockutils [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.398 222021 DEBUG oslo_concurrency.lockutils [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.398 222021 DEBUG nova.compute.manager [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.398 222021 WARNING nova.compute.manager [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received unexpected event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.398 222021 DEBUG nova.compute.manager [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.399 222021 DEBUG oslo_concurrency.lockutils [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.399 222021 DEBUG oslo_concurrency.lockutils [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.399 222021 DEBUG oslo_concurrency.lockutils [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.399 222021 DEBUG nova.compute.manager [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:32:41 np0005593233 nova_compute[222017]: 2026-01-23 09:32:41.399 222021 WARNING nova.compute.manager [req-6049068b-fbe7-48b0-9348-0d5b5c8c805c req-d36a845a-57b9-4881-9502-5eb530542a42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received unexpected event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:32:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:42.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:42.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:42.599 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:42.600 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:32:42.601 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.295 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.376 222021 INFO nova.compute.manager [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Swapping old allocation on dict_keys(['929812a2-38ca-4ee7-9f24-090d633cb42b']) held by migration 47980d7d-34af-4e57-9dbf-4f58fd30ae9c for instance#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.428 222021 DEBUG nova.scheduler.client.report [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Overwriting current allocation {'allocations': {'89873210-bee9-46e9-9f9d-0cd7a156c3a8': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 12}}, 'project_id': '11def90dfdc14cfe928302bec2835794', 'user_id': '7536fa2e625541fba613dc32a49a4c5b', 'consumer_generation': 1} on consumer 0fb415e8-9c82-4021-9088-cfd399d453a0 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.602 222021 DEBUG nova.compute.manager [req-ec411157-f755-41ac-92cd-aac0bb30d6d1 req-46ef9b3d-1874-421f-8f80-fd8b62317304 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.603 222021 DEBUG oslo_concurrency.lockutils [req-ec411157-f755-41ac-92cd-aac0bb30d6d1 req-46ef9b3d-1874-421f-8f80-fd8b62317304 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.605 222021 DEBUG oslo_concurrency.lockutils [req-ec411157-f755-41ac-92cd-aac0bb30d6d1 req-46ef9b3d-1874-421f-8f80-fd8b62317304 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.605 222021 DEBUG oslo_concurrency.lockutils [req-ec411157-f755-41ac-92cd-aac0bb30d6d1 req-46ef9b3d-1874-421f-8f80-fd8b62317304 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.605 222021 DEBUG nova.compute.manager [req-ec411157-f755-41ac-92cd-aac0bb30d6d1 req-46ef9b3d-1874-421f-8f80-fd8b62317304 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.606 222021 WARNING nova.compute.manager [req-ec411157-f755-41ac-92cd-aac0bb30d6d1 req-46ef9b3d-1874-421f-8f80-fd8b62317304 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received unexpected event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:32:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.807 222021 DEBUG oslo_concurrency.lockutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.808 222021 DEBUG oslo_concurrency.lockutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:32:43 np0005593233 nova_compute[222017]: 2026-01-23 09:32:43.808 222021 DEBUG nova.network.neutron [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:32:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:44.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:44.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:44 np0005593233 nova_compute[222017]: 2026-01-23 09:32:44.062 222021 DEBUG nova.network.neutron [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:32:45 np0005593233 nova_compute[222017]: 2026-01-23 09:32:45.246 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:45 np0005593233 nova_compute[222017]: 2026-01-23 09:32:45.703 222021 DEBUG nova.network.neutron [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:32:45 np0005593233 nova_compute[222017]: 2026-01-23 09:32:45.732 222021 DEBUG oslo_concurrency.lockutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:32:45 np0005593233 nova_compute[222017]: 2026-01-23 09:32:45.734 222021 DEBUG nova.virt.libvirt.driver [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 23 04:32:45 np0005593233 nova_compute[222017]: 2026-01-23 09:32:45.831 222021 DEBUG nova.storage.rbd_utils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rolling back rbd image(0fb415e8-9c82-4021-9088-cfd399d453a0_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 23 04:32:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:46.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:46.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.110 222021 DEBUG nova.storage.rbd_utils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] removing snapshot(nova-resize) on rbd image(0fb415e8-9c82-4021-9088-cfd399d453a0_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:32:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e153 e153: 3 total, 3 up, 3 in
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.224 222021 DEBUG nova.virt.libvirt.driver [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.229 222021 WARNING nova.virt.libvirt.driver [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.238 222021 DEBUG nova.virt.libvirt.host [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.239 222021 DEBUG nova.virt.libvirt.host [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.243 222021 DEBUG nova.virt.libvirt.host [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.244 222021 DEBUG nova.virt.libvirt.host [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.245 222021 DEBUG nova.virt.libvirt.driver [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.245 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:31:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c9e6893c-615e-4884-93d8-c083db8837da',id=16,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1163562706',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.246 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.246 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.247 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.247 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.248 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.248 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.248 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.248 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.249 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.249 222021 DEBUG nova.virt.hardware [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.249 222021 DEBUG nova.objects.instance [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.269 222021 DEBUG oslo_concurrency.processutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/972677850' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.740 222021 DEBUG oslo_concurrency.processutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:46 np0005593233 nova_compute[222017]: 2026-01-23 09:32:46.779 222021 DEBUG oslo_concurrency.processutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.040 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.041 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.041 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.076 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.077 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.077 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.077 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.078 222021 DEBUG oslo_concurrency.processutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2217005123' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.233 222021 DEBUG oslo_concurrency.processutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.239 222021 DEBUG nova.virt.libvirt.driver [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <uuid>0fb415e8-9c82-4021-9088-cfd399d453a0</uuid>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <name>instance-0000000e</name>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <nova:name>tempest-MigrationsAdminTest-server-2110965880</nova:name>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:32:46</nova:creationTime>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <nova:flavor name="tempest-test_resize_flavor_-1163562706">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <nova:user uuid="7536fa2e625541fba613dc32a49a4c5b">tempest-MigrationsAdminTest-2056264627-project-member</nova:user>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <nova:project uuid="11def90dfdc14cfe928302bec2835794">tempest-MigrationsAdminTest-2056264627</nova:project>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <entry name="serial">0fb415e8-9c82-4021-9088-cfd399d453a0</entry>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <entry name="uuid">0fb415e8-9c82-4021-9088-cfd399d453a0</entry>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0fb415e8-9c82-4021-9088-cfd399d453a0_disk">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0fb415e8-9c82-4021-9088-cfd399d453a0_disk.config">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/console.log" append="off"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <input type="keyboard" bus="usb"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:32:47 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:32:47 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:32:47 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:32:47 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:32:47 np0005593233 systemd-machined[190954]: New machine qemu-9-instance-0000000e.
Jan 23 04:32:47 np0005593233 systemd[1]: Started Virtual Machine qemu-9-instance-0000000e.
Jan 23 04:32:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4122582786' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.546 222021 DEBUG oslo_concurrency.processutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.586 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160752.5855188, 0fb415e8-9c82-4021-9088-cfd399d453a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.587 222021 INFO nova.compute.manager [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.615 222021 DEBUG nova.compute.manager [None req-23bdbc13-c8df-4cfb-a789-cb9263f711a5 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.639 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.639 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.643 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.644 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.860 222021 WARNING nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.861 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4693MB free_disk=20.75955581665039GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.862 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.862 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:47 np0005593233 nova_compute[222017]: 2026-01-23 09:32:47.976 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Migration for instance 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 23 04:32:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:48.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:48.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.064 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.093 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Instance f62791ad-fc40-451f-b02a-ba991f2dbc32 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.094 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Instance 0fb415e8-9c82-4021-9088-cfd399d453a0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.094 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Migration 6d5a9d9d-3672-4056-a4a9-72cc0752bcae is active on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.094 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.095 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.264 222021 DEBUG oslo_concurrency.processutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.298 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.655 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160768.6545238, 0fb415e8-9c82-4021-9088-cfd399d453a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.656 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.661 222021 DEBUG nova.compute.manager [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.665 222021 INFO nova.virt.libvirt.driver [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance running successfully.#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.665 222021 DEBUG nova.virt.libvirt.driver [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.736 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.744 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.788 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.789 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160768.656258, 0fb415e8-9c82-4021-9088-cfd399d453a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.789 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] VM Started (Lifecycle Event)#033[00m
Jan 23 04:32:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.820 222021 INFO nova.compute.manager [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updating instance to original state: 'active'#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.828 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.834 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.871 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 23 04:32:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3689296051' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.903 222021 DEBUG oslo_concurrency.processutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.639s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.910 222021 DEBUG nova.compute.provider_tree [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.937 222021 DEBUG nova.scheduler.client.report [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.986 222021 DEBUG nova.compute.resource_tracker [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.986 222021 DEBUG oslo_concurrency.lockutils [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:48 np0005593233 nova_compute[222017]: 2026-01-23 09:32:48.994 222021 INFO nova.compute.manager [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Jan 23 04:32:49 np0005593233 nova_compute[222017]: 2026-01-23 09:32:49.112 222021 INFO nova.scheduler.client.report [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Deleted allocation for migration 6d5a9d9d-3672-4056-a4a9-72cc0752bcae#033[00m
Jan 23 04:32:49 np0005593233 nova_compute[222017]: 2026-01-23 09:32:49.112 222021 DEBUG nova.virt.libvirt.driver [None req-4686ec0f-b2f4-43d4-9b84-cf40e2256e17 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 23 04:32:49 np0005593233 nova_compute[222017]: 2026-01-23 09:32:49.729 222021 DEBUG nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Creating tmpfile /var/lib/nova/instances/tmpo7nk37a9 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 23 04:32:49 np0005593233 nova_compute[222017]: 2026-01-23 09:32:49.732 222021 DEBUG nova.compute.manager [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo7nk37a9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 23 04:32:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:50.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:50.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:50 np0005593233 nova_compute[222017]: 2026-01-23 09:32:50.250 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:52.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:52.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:52 np0005593233 nova_compute[222017]: 2026-01-23 09:32:52.047 222021 DEBUG nova.compute.manager [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo7nk37a9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='54a1ad4e-6fc9-42dc-aa4c-99d3f1297520',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 23 04:32:52 np0005593233 nova_compute[222017]: 2026-01-23 09:32:52.106 222021 DEBUG oslo_concurrency.lockutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:32:52 np0005593233 nova_compute[222017]: 2026-01-23 09:32:52.106 222021 DEBUG oslo_concurrency.lockutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquired lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:32:52 np0005593233 nova_compute[222017]: 2026-01-23 09:32:52.107 222021 DEBUG nova.network.neutron [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:32:52 np0005593233 nova_compute[222017]: 2026-01-23 09:32:52.847 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160757.8464081, 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:52 np0005593233 nova_compute[222017]: 2026-01-23 09:32:52.848 222021 INFO nova.compute.manager [-] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:32:52 np0005593233 nova_compute[222017]: 2026-01-23 09:32:52.881 222021 DEBUG nova.compute.manager [None req-5f4f76a8-129b-4990-ad7a-ee2894d022d8 - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:53 np0005593233 podman[230505]: 2026-01-23 09:32:53.102622485 +0000 UTC m=+0.109084456 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 04:32:53 np0005593233 nova_compute[222017]: 2026-01-23 09:32:53.300 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e154 e154: 3 total, 3 up, 3 in
Jan 23 04:32:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:32:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:54.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:32:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:32:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:54.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:32:54 np0005593233 nova_compute[222017]: 2026-01-23 09:32:54.769 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:54 np0005593233 nova_compute[222017]: 2026-01-23 09:32:54.770 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:54 np0005593233 nova_compute[222017]: 2026-01-23 09:32:54.791 222021 DEBUG nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:32:54 np0005593233 nova_compute[222017]: 2026-01-23 09:32:54.881 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:54 np0005593233 nova_compute[222017]: 2026-01-23 09:32:54.882 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:54 np0005593233 nova_compute[222017]: 2026-01-23 09:32:54.891 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:32:54 np0005593233 nova_compute[222017]: 2026-01-23 09:32:54.892 222021 INFO nova.compute.claims [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.087 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.161 222021 DEBUG nova.network.neutron [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updating instance_info_cache with network_info: [{"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.217 222021 DEBUG oslo_concurrency.lockutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Releasing lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.220 222021 DEBUG os_brick.utils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.221 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.236 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.237 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[e83a79b6-55bf-44d4-9923-7b23e00e7d2b]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.239 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.247 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.247 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c827ebdd-2676-4580-92cf-35de5634f93d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.250 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.254 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.261 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.261 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[98e46b53-ad91-4b63-abde-580ea4d12fed]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.263 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c69a7308-81bb-46fc-bc57-e444ad78623a]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.263 222021 DEBUG oslo_concurrency.processutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.285 222021 DEBUG oslo_concurrency.processutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] CMD "nvme version" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.288 222021 DEBUG os_brick.initiator.connectors.lightos [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.289 222021 DEBUG os_brick.initiator.connectors.lightos [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.289 222021 DEBUG os_brick.initiator.connectors.lightos [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.290 222021 DEBUG os_brick.utils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] <== get_connector_properties: return (69ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:32:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2430134281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.533 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.540 222021 DEBUG nova.compute.provider_tree [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.566 222021 DEBUG nova.scheduler.client.report [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.609 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.609 222021 DEBUG nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.754 222021 DEBUG nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.755 222021 DEBUG nova.network.neutron [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.865 222021 INFO nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:32:55 np0005593233 nova_compute[222017]: 2026-01-23 09:32:55.899 222021 DEBUG nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:32:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:56.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:56.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.107 222021 DEBUG nova.network.neutron [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.107 222021 DEBUG nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.285 222021 DEBUG nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.287 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.288 222021 INFO nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Creating image(s)#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.318 222021 DEBUG nova.storage.rbd_utils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.350 222021 DEBUG nova.storage.rbd_utils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.388 222021 DEBUG nova.storage.rbd_utils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.393 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.449 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.451 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.452 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.452 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.481 222021 DEBUG nova.storage.rbd_utils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.486 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.760 222021 DEBUG nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo7nk37a9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='54a1ad4e-6fc9-42dc-aa4c-99d3f1297520',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={2c9770c1-d351-43fa-b18d-aaf9291801fe='27ad6cc5-03f7-4ff3-b319-d6a2b0f2119f'},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.762 222021 DEBUG nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Creating instance directory: /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.763 222021 DEBUG nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Ensure instance console log exists: /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.765 222021 DEBUG nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Connecting volumes before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10901#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.772 222021 DEBUG nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.775 222021 DEBUG nova.virt.libvirt.vif [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:32:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1815710456',display_name='tempest-LiveMigrationTest-server-1815710456',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1815710456',id=15,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:32:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c56e53b3339e4e4db30b7a9d330bc380',ramdisk_id='',reservation_id='r-mz7qsyn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-1903931568',owner_user_name='tempest-LiveMigrationTest-1903931568-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:32:46Z,user_data=None,user_id='a43b680a6019491aafe42c0a10e648df',uuid=54a1ad4e-6fc9-42dc-aa4c-99d3f1297520,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.777 222021 DEBUG nova.network.os_vif_util [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Converting VIF {"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.779 222021 DEBUG nova.network.os_vif_util [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.780 222021 DEBUG os_vif [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.782 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.784 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.785 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.792 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.793 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapedc7d28f-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.794 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapedc7d28f-ea, col_values=(('external_ids', {'iface-id': 'edc7d28f-eaba-44b8-9916-f2089618ca70', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c7:78:59', 'vm-uuid': '54a1ad4e-6fc9-42dc-aa4c-99d3f1297520'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:32:56 np0005593233 NetworkManager[48871]: <info>  [1769160776.7988] manager: (tapedc7d28f-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.808 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.811 222021 INFO os_vif [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea')
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.816 222021 DEBUG nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 23 04:32:56 np0005593233 nova_compute[222017]: 2026-01-23 09:32:56.817 222021 DEBUG nova.compute.manager [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo7nk37a9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='54a1ad4e-6fc9-42dc-aa4c-99d3f1297520',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={2c9770c1-d351-43fa-b18d-aaf9291801fe='27ad6cc5-03f7-4ff3-b319-d6a2b0f2119f'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.031 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.100 222021 DEBUG nova.storage.rbd_utils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] resizing rbd image 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.205 222021 DEBUG nova.objects.instance [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'migration_context' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.238 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.239 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Ensure instance console log exists: /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.240 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.240 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.241 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.242 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.246 222021 WARNING nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.251 222021 DEBUG nova.virt.libvirt.host [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.252 222021 DEBUG nova.virt.libvirt.host [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.254 222021 DEBUG nova.virt.libvirt.host [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.255 222021 DEBUG nova.virt.libvirt.host [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.256 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.256 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.257 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.257 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.257 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.258 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.258 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.258 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.259 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.259 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.259 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.259 222021 DEBUG nova.virt.hardware [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.262 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/230036688' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.706 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.741 222021 DEBUG nova.storage.rbd_utils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:32:57 np0005593233 nova_compute[222017]: 2026-01-23 09:32:57.748 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:58.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:32:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:58.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/517985940' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.225 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.228 222021 DEBUG nova.objects.instance [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'pci_devices' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.249 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <uuid>07c9ba0d-ab3d-4079-ab97-46e91de4911a</uuid>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <name>instance-00000012</name>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <nova:name>tempest-MigrationsAdminTest-server-2012645134</nova:name>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:32:57</nova:creationTime>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <nova:user uuid="7536fa2e625541fba613dc32a49a4c5b">tempest-MigrationsAdminTest-2056264627-project-member</nova:user>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <nova:project uuid="11def90dfdc14cfe928302bec2835794">tempest-MigrationsAdminTest-2056264627</nova:project>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <entry name="serial">07c9ba0d-ab3d-4079-ab97-46e91de4911a</entry>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <entry name="uuid">07c9ba0d-ab3d-4079-ab97-46e91de4911a</entry>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk.config">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/console.log" append="off"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:32:58 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:32:58 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:32:58 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:32:58 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.302 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.349 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.350 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.350 222021 INFO nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Using config drive
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.380 222021 DEBUG nova.storage.rbd_utils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.389 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.429 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.430 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.431 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid f62791ad-fc40-451f-b02a-ba991f2dbc32 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "0fb415e8-9c82-4021-9088-cfd399d453a0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.432 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.432 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.433 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "f62791ad-fc40-451f-b02a-ba991f2dbc32" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.433 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.488 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.491 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.744 222021 INFO nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Creating config drive at /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/disk.config
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.752 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp45x8n58b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.891 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp45x8n58b" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.937 222021 DEBUG nova.storage.rbd_utils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:32:58 np0005593233 nova_compute[222017]: 2026-01-23 09:32:58.943 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/disk.config 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.107 222021 DEBUG oslo_concurrency.processutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/disk.config 07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.109 222021 INFO nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Deleting local config drive /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/disk.config because it was imported into RBD.
Jan 23 04:32:59 np0005593233 systemd-machined[190954]: New machine qemu-10-instance-00000012.
Jan 23 04:32:59 np0005593233 systemd[1]: Started Virtual Machine qemu-10-instance-00000012.
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.411 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.598 222021 DEBUG nova.network.neutron [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Port edc7d28f-eaba-44b8-9916-f2089618ca70 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.640 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160779.6402106, 07c9ba0d-ab3d-4079-ab97-46e91de4911a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.641 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] VM Resumed (Lifecycle Event)
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.644 222021 DEBUG nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.645 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.649 222021 INFO nova.virt.libvirt.driver [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance spawned successfully.
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.650 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.674 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.677 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.678 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.678 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.678 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.679 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.679 222021 DEBUG nova.virt.libvirt.driver [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.684 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.714 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.715 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160779.64366, 07c9ba0d-ab3d-4079-ab97-46e91de4911a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.715 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] VM Started (Lifecycle Event)
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.766 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.772 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.796 222021 INFO nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Took 3.51 seconds to spawn the instance on the hypervisor.
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.796 222021 DEBUG nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.814 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.923 222021 DEBUG nova.compute.manager [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo7nk37a9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='54a1ad4e-6fc9-42dc-aa4c-99d3f1297520',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={2c9770c1-d351-43fa-b18d-aaf9291801fe='27ad6cc5-03f7-4ff3-b319-d6a2b0f2119f'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.956 222021 INFO nova.compute.manager [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Took 5.11 seconds to build instance.
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.979 222021 DEBUG oslo_concurrency.lockutils [None req-d3ca7ec8-7c5a-435d-9b98-eb5644899854 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.979 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.979 222021 INFO nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:32:59 np0005593233 nova_compute[222017]: 2026-01-23 09:32:59.980 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:33:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:00.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:00.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:00 np0005593233 kernel: tapedc7d28f-ea: entered promiscuous mode
Jan 23 04:33:00 np0005593233 systemd-udevd[230904]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:33:00 np0005593233 NetworkManager[48871]: <info>  [1769160780.1392] manager: (tapedc7d28f-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Jan 23 04:33:00 np0005593233 nova_compute[222017]: 2026-01-23 09:33:00.142 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:00Z|00058|binding|INFO|Claiming lport edc7d28f-eaba-44b8-9916-f2089618ca70 for this additional chassis.
Jan 23 04:33:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:00Z|00059|binding|INFO|edc7d28f-eaba-44b8-9916-f2089618ca70: Claiming fa:16:3e:c7:78:59 10.100.0.14
Jan 23 04:33:00 np0005593233 NetworkManager[48871]: <info>  [1769160780.1559] device (tapedc7d28f-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:33:00 np0005593233 NetworkManager[48871]: <info>  [1769160780.1585] device (tapedc7d28f-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:33:00 np0005593233 nova_compute[222017]: 2026-01-23 09:33:00.160 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:00Z|00060|binding|INFO|Setting lport edc7d28f-eaba-44b8-9916-f2089618ca70 ovn-installed in OVS
Jan 23 04:33:00 np0005593233 nova_compute[222017]: 2026-01-23 09:33:00.163 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:00 np0005593233 systemd-machined[190954]: New machine qemu-11-instance-0000000f.
Jan 23 04:33:00 np0005593233 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Jan 23 04:33:01 np0005593233 nova_compute[222017]: 2026-01-23 09:33:01.796 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:02.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:02.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.140 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160782.1397333, 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.140 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] VM Started (Lifecycle Event)
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.342 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.425 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.425 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.425 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.425 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.426 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.463 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.463 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.464 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.464 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.464 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.817 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160782.816504, 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.817 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] VM Resumed (Lifecycle Event)
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.898 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.901 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:33:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:33:02 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1424490116' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.957 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Jan 23 04:33:02 np0005593233 nova_compute[222017]: 2026-01-23 09:33:02.975 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:33:02 np0005593233 systemd[1]: Starting dnf makecache...
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.202 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.203 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.209 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.211 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000000f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:33:03 np0005593233 dnf[230992]: Metadata cache refreshed recently.
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.216 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000012 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.217 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000012 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.221 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.221 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:33:03 np0005593233 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 04:33:03 np0005593233 systemd[1]: Finished dnf makecache.
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.306 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.421 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.422 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4229MB free_disk=20.749217987060547GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.518 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Migration for instance 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.565 222021 INFO nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updating resource usage from migration b15a37ef-d8e5-44a3-85e1-37de93fc8299#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.565 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Starting to track incoming migration b15a37ef-d8e5-44a3-85e1-37de93fc8299 with flavor 68d42077-c749-4366-ba3e-07758debb02d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.565 222021 INFO nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Updating resource usage from migration 36643aa1-ddbd-427a-beef-052fd4db42bf#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.704 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance f62791ad-fc40-451f-b02a-ba991f2dbc32 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.705 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0fb415e8-9c82-4021-9088-cfd399d453a0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.798 222021 WARNING nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.798 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Migration 36643aa1-ddbd-427a-beef-052fd4db42bf is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.798 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:33:03 np0005593233 nova_compute[222017]: 2026-01-23 09:33:03.798 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:33:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:04.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:04.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:04 np0005593233 nova_compute[222017]: 2026-01-23 09:33:04.247 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:33:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1740544057' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:33:04 np0005593233 nova_compute[222017]: 2026-01-23 09:33:04.717 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:04 np0005593233 nova_compute[222017]: 2026-01-23 09:33:04.726 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:33:04 np0005593233 nova_compute[222017]: 2026-01-23 09:33:04.764 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:33:04 np0005593233 nova_compute[222017]: 2026-01-23 09:33:04.808 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:33:04 np0005593233 nova_compute[222017]: 2026-01-23 09:33:04.809 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:05 np0005593233 nova_compute[222017]: 2026-01-23 09:33:05.498 222021 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:33:05 np0005593233 nova_compute[222017]: 2026-01-23 09:33:05.499 222021 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquired lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:33:05 np0005593233 nova_compute[222017]: 2026-01-23 09:33:05.499 222021 DEBUG nova.network.neutron [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:33:05 np0005593233 nova_compute[222017]: 2026-01-23 09:33:05.741 222021 DEBUG nova.network.neutron [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:33:05 np0005593233 nova_compute[222017]: 2026-01-23 09:33:05.769 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:05 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:05Z|00061|binding|INFO|Claiming lport edc7d28f-eaba-44b8-9916-f2089618ca70 for this chassis.
Jan 23 04:33:05 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:05Z|00062|binding|INFO|edc7d28f-eaba-44b8-9916-f2089618ca70: Claiming fa:16:3e:c7:78:59 10.100.0.14
Jan 23 04:33:05 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:05Z|00063|binding|INFO|Setting lport edc7d28f-eaba-44b8-9916-f2089618ca70 up in Southbound
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.836 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:78:59 10.100.0.14'], port_security=['fa:16:3e:c7:78:59 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '54a1ad4e-6fc9-42dc-aa4c-99d3f1297520', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c56e53b3339e4e4db30b7a9d330bc380', 'neutron:revision_number': '21', 'neutron:security_group_ids': 'c0c0e09a-b9c3-4a3a-af9e-c3b66e9f8bc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cabb3d88-013b-4542-b789-52d49c567d53, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=edc7d28f-eaba-44b8-9916-f2089618ca70) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.838 140224 INFO neutron.agent.ovn.metadata.agent [-] Port edc7d28f-eaba-44b8-9916-f2089618ca70 in datapath 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 bound to our chassis#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.839 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4#033[00m
Jan 23 04:33:05 np0005593233 nova_compute[222017]: 2026-01-23 09:33:05.843 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:05 np0005593233 nova_compute[222017]: 2026-01-23 09:33:05.843 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:33:05 np0005593233 nova_compute[222017]: 2026-01-23 09:33:05.843 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.859 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9a316f-c3b5-4807-896b-43fe00b9a141]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.897 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7917391f-a306-4a22-8b19-fb873b5b7248]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.902 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2d91c323-130f-4f24-997f-ceedd77501d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.928 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[19a7efbc-8408-4f71-9b2f-be1c462fede4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.946 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d17fb07e-9ef4-434f-aa00-c31ccf3922d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e7a4d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:a3:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 25, 'tx_packets': 9, 'rx_bytes': 1330, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 25, 'tx_packets': 9, 'rx_bytes': 1330, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459419, 'reachable_time': 37182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231021, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.961 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[005f2d3c-f150-4637-b49a-9a14f5e6cf81]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap385e7a4d-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459431, 'tstamp': 459431}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231022, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap385e7a4d-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459434, 'tstamp': 459434}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231022, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.964 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e7a4d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.967 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap385e7a4d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.967 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:33:05 np0005593233 nova_compute[222017]: 2026-01-23 09:33:05.967 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.968 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap385e7a4d-f0, col_values=(('external_ids', {'iface-id': '7b93c40e-1f44-4d5a-9bad-e23468f98d69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:05.968 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:33:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.003000084s ======
Jan 23 04:33:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:06.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000084s
Jan 23 04:33:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:06.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.183 222021 INFO nova.compute.manager [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Post operation of migration started#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.748 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.749 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.750 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.750 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.754 222021 DEBUG nova.network.neutron [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.786 222021 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Releasing lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.798 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.972 222021 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.972 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Creating file /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/395fa08e6d094b9b8492fc18cc79f331.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 04:33:06 np0005593233 nova_compute[222017]: 2026-01-23 09:33:06.973 222021 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/395fa08e6d094b9b8492fc18cc79f331.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.113 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.141 222021 DEBUG oslo_concurrency.lockutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.142 222021 DEBUG oslo_concurrency.lockutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquired lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.142 222021 DEBUG nova.network.neutron [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:33:07 np0005593233 podman[231023]: 2026-01-23 09:33:07.144822498 +0000 UTC m=+0.129606872 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.435 222021 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/395fa08e6d094b9b8492fc18cc79f331.tmp" returned: 1 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.436 222021 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/395fa08e6d094b9b8492fc18cc79f331.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.436 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Creating directory /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.436 222021 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:07 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.640 222021 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a" returned: 0 in 0.203s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.648 222021 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.913 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.941 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.942 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.943 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.943 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:07 np0005593233 nova_compute[222017]: 2026-01-23 09:33:07.943 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:33:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:08.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:08.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:08 np0005593233 nova_compute[222017]: 2026-01-23 09:33:08.308 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:08 np0005593233 nova_compute[222017]: 2026-01-23 09:33:08.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:08 np0005593233 nova_compute[222017]: 2026-01-23 09:33:08.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:08 np0005593233 nova_compute[222017]: 2026-01-23 09:33:08.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:33:08 np0005593233 nova_compute[222017]: 2026-01-23 09:33:08.413 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:33:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:09 np0005593233 nova_compute[222017]: 2026-01-23 09:33:09.792 222021 DEBUG nova.network.neutron [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updating instance_info_cache with network_info: [{"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:33:09 np0005593233 nova_compute[222017]: 2026-01-23 09:33:09.819 222021 DEBUG oslo_concurrency.lockutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Releasing lock "refresh_cache-54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:33:09 np0005593233 nova_compute[222017]: 2026-01-23 09:33:09.842 222021 DEBUG oslo_concurrency.lockutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:09 np0005593233 nova_compute[222017]: 2026-01-23 09:33:09.844 222021 DEBUG oslo_concurrency.lockutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:09 np0005593233 nova_compute[222017]: 2026-01-23 09:33:09.844 222021 DEBUG oslo_concurrency.lockutils [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:09 np0005593233 nova_compute[222017]: 2026-01-23 09:33:09.849 222021 INFO nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 23 04:33:09 np0005593233 virtqemud[221325]: Domain id=11 name='instance-0000000f' uuid=54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 is tainted: custom-monitor
Jan 23 04:33:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:10.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:33:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:10.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:33:10 np0005593233 nova_compute[222017]: 2026-01-23 09:33:10.858 222021 INFO nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 23 04:33:11 np0005593233 nova_compute[222017]: 2026-01-23 09:33:11.801 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:11 np0005593233 nova_compute[222017]: 2026-01-23 09:33:11.866 222021 INFO nova.virt.libvirt.driver [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 23 04:33:11 np0005593233 nova_compute[222017]: 2026-01-23 09:33:11.870 222021 DEBUG nova.compute.manager [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:33:11 np0005593233 nova_compute[222017]: 2026-01-23 09:33:11.917 222021 DEBUG nova.objects.instance [None req-9959f827-27bc-4670-9623-69661ee17463 5a5194678c634c8fb09b5397d1ed31fe 985865a35e144fc6b78d4b87561eb207 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:33:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:12.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:12.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:13 np0005593233 nova_compute[222017]: 2026-01-23 09:33:13.310 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:13 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:13Z|00064|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 23 04:33:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:14.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:14.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.776 222021 DEBUG oslo_concurrency.lockutils [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.777 222021 DEBUG oslo_concurrency.lockutils [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.778 222021 DEBUG oslo_concurrency.lockutils [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.778 222021 DEBUG oslo_concurrency.lockutils [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.778 222021 DEBUG oslo_concurrency.lockutils [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.781 222021 INFO nova.compute.manager [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Terminating instance#033[00m
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.783 222021 DEBUG nova.compute.manager [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:33:15 np0005593233 kernel: tapedc7d28f-ea (unregistering): left promiscuous mode
Jan 23 04:33:15 np0005593233 NetworkManager[48871]: <info>  [1769160795.8467] device (tapedc7d28f-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:33:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:15Z|00065|binding|INFO|Releasing lport edc7d28f-eaba-44b8-9916-f2089618ca70 from this chassis (sb_readonly=0)
Jan 23 04:33:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:15Z|00066|binding|INFO|Setting lport edc7d28f-eaba-44b8-9916-f2089618ca70 down in Southbound
Jan 23 04:33:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:15Z|00067|binding|INFO|Removing iface tapedc7d28f-ea ovn-installed in OVS
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.856 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.858 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.868 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:78:59 10.100.0.14'], port_security=['fa:16:3e:c7:78:59 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '54a1ad4e-6fc9-42dc-aa4c-99d3f1297520', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c56e53b3339e4e4db30b7a9d330bc380', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'c0c0e09a-b9c3-4a3a-af9e-c3b66e9f8bc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cabb3d88-013b-4542-b789-52d49c567d53, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=edc7d28f-eaba-44b8-9916-f2089618ca70) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.870 140224 INFO neutron.agent.ovn.metadata.agent [-] Port edc7d28f-eaba-44b8-9916-f2089618ca70 in datapath 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 unbound from our chassis#033[00m
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.871 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4#033[00m
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.874 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.891 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b795f65b-74da-4e78-87ba-d9e14ab01c0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:15 np0005593233 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 23 04:33:15 np0005593233 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 3.177s CPU time.
Jan 23 04:33:15 np0005593233 systemd-machined[190954]: Machine qemu-11-instance-0000000f terminated.
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.927 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[acd83c52-c9ad-40f2-8f47-c7c3dfa7f473]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.930 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[10767975-2eb8-487b-9347-8c6accca65d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.965 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[37cd7d77-e5ed-42b1-9751-b42bf1fd70cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.981 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b43935-6c77-4888-ba71-fb1e269c1ff5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap385e7a4d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:a3:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 40, 'tx_packets': 11, 'rx_bytes': 1960, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 40, 'tx_packets': 11, 'rx_bytes': 1960, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459419, 'reachable_time': 37182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231056, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.995 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fa566c38-5487-43e4-b84d-eacc0e439d20]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap385e7a4d-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459431, 'tstamp': 459431}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231057, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap385e7a4d-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459434, 'tstamp': 459434}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231057, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:15.996 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e7a4d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:15 np0005593233 nova_compute[222017]: 2026-01-23 09:33:15.998 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.005 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:16.005 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap385e7a4d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:16.006 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:33:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:16.006 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap385e7a4d-f0, col_values=(('external_ids', {'iface-id': '7b93c40e-1f44-4d5a-9bad-e23468f98d69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:16.006 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.032 222021 INFO nova.virt.libvirt.driver [-] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Instance destroyed successfully.#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.032 222021 DEBUG nova.objects.instance [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lazy-loading 'resources' on Instance uuid 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:16.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.099 222021 DEBUG nova.virt.libvirt.vif [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:32:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1815710456',display_name='tempest-LiveMigrationTest-server-1815710456',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1815710456',id=15,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:32:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c56e53b3339e4e4db30b7a9d330bc380',ramdisk_id='',reservation_id='r-mz7qsyn0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveMigrationTest-1903931568',
owner_user_name='tempest-LiveMigrationTest-1903931568-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:33:11Z,user_data=None,user_id='a43b680a6019491aafe42c0a10e648df',uuid=54a1ad4e-6fc9-42dc-aa4c-99d3f1297520,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.100 222021 DEBUG nova.network.os_vif_util [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Converting VIF {"id": "edc7d28f-eaba-44b8-9916-f2089618ca70", "address": "fa:16:3e:c7:78:59", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapedc7d28f-ea", "ovs_interfaceid": "edc7d28f-eaba-44b8-9916-f2089618ca70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.101 222021 DEBUG nova.network.os_vif_util [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.101 222021 DEBUG os_vif [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.104 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.104 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapedc7d28f-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.106 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.108 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.111 222021 INFO os_vif [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:78:59,bridge_name='br-int',has_traffic_filtering=True,id=edc7d28f-eaba-44b8-9916-f2089618ca70,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapedc7d28f-ea')#033[00m
Jan 23 04:33:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:16.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.378 222021 INFO nova.virt.libvirt.driver [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Deleting instance files /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_del#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.379 222021 INFO nova.virt.libvirt.driver [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Deletion of /var/lib/nova/instances/54a1ad4e-6fc9-42dc-aa4c-99d3f1297520_del complete#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.514 222021 INFO nova.compute.manager [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Took 0.73 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.515 222021 DEBUG oslo.service.loopingcall [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.515 222021 DEBUG nova.compute.manager [-] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:33:16 np0005593233 nova_compute[222017]: 2026-01-23 09:33:16.515 222021 DEBUG nova.network.neutron [-] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.698 222021 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:33:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:17.965 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.966 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:17.966 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.986 222021 DEBUG nova.compute.manager [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.987 222021 DEBUG oslo_concurrency.lockutils [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.987 222021 DEBUG oslo_concurrency.lockutils [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.987 222021 DEBUG oslo_concurrency.lockutils [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.988 222021 DEBUG nova.compute.manager [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.988 222021 DEBUG nova.compute.manager [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-unplugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.989 222021 DEBUG nova.compute.manager [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.989 222021 DEBUG oslo_concurrency.lockutils [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.989 222021 DEBUG oslo_concurrency.lockutils [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.989 222021 DEBUG oslo_concurrency.lockutils [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.990 222021 DEBUG nova.compute.manager [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] No waiting events found dispatching network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:33:17 np0005593233 nova_compute[222017]: 2026-01-23 09:33:17.990 222021 WARNING nova.compute.manager [req-74c66f86-d177-4b4e-bf35-28c3db403f4a req-29cf73a1-0a8a-4ba2-a753-a3ee1ee4d647 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received unexpected event network-vif-plugged-edc7d28f-eaba-44b8-9916-f2089618ca70 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:33:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:33:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:33:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:33:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:18.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:33:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:18.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:18 np0005593233 nova_compute[222017]: 2026-01-23 09:33:18.312 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:18 np0005593233 nova_compute[222017]: 2026-01-23 09:33:18.630 222021 DEBUG nova.network.neutron [-] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:33:18 np0005593233 nova_compute[222017]: 2026-01-23 09:33:18.673 222021 INFO nova.compute.manager [-] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Took 2.16 seconds to deallocate network for instance.#033[00m
Jan 23 04:33:18 np0005593233 nova_compute[222017]: 2026-01-23 09:33:18.793 222021 DEBUG nova.compute.manager [req-a34394d6-6afd-4022-961e-4cc8bff3cfbe req-deeb0f04-1f14-47dc-be73-c31eaf6db0f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Received event network-vif-deleted-edc7d28f-eaba-44b8-9916-f2089618ca70 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:33:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:19 np0005593233 nova_compute[222017]: 2026-01-23 09:33:19.080 222021 INFO nova.compute.manager [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Took 0.41 seconds to detach 1 volumes for instance.#033[00m
Jan 23 04:33:19 np0005593233 nova_compute[222017]: 2026-01-23 09:33:19.082 222021 DEBUG nova.compute.manager [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Deleting volume: 2c9770c1-d351-43fa-b18d-aaf9291801fe _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 23 04:33:19 np0005593233 nova_compute[222017]: 2026-01-23 09:33:19.490 222021 DEBUG oslo_concurrency.lockutils [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:19 np0005593233 nova_compute[222017]: 2026-01-23 09:33:19.491 222021 DEBUG oslo_concurrency.lockutils [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:19 np0005593233 nova_compute[222017]: 2026-01-23 09:33:19.498 222021 DEBUG oslo_concurrency.lockutils [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:19 np0005593233 nova_compute[222017]: 2026-01-23 09:33:19.541 222021 INFO nova.scheduler.client.report [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Deleted allocations for instance 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520#033[00m
Jan 23 04:33:19 np0005593233 nova_compute[222017]: 2026-01-23 09:33:19.620 222021 DEBUG oslo_concurrency.lockutils [None req-dc0912a0-2429-4fe1-9193-c456e04f59de a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "54a1ad4e-6fc9-42dc-aa4c-99d3f1297520" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:20 np0005593233 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 23 04:33:20 np0005593233 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000012.scope: Consumed 15.144s CPU time.
Jan 23 04:33:20 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:33:20 np0005593233 systemd-machined[190954]: Machine qemu-10-instance-00000012 terminated.
Jan 23 04:33:20 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:33:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.004000112s ======
Jan 23 04:33:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:20.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000112s
Jan 23 04:33:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:20.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.714 222021 INFO nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.720 222021 INFO nova.virt.libvirt.driver [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance destroyed successfully.#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.723 222021 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] skipping disk for instance-00000012 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.723 222021 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] skipping disk for instance-00000012 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.856 222021 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.856 222021 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.856 222021 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.913 222021 DEBUG oslo_concurrency.lockutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "f62791ad-fc40-451f-b02a-ba991f2dbc32" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.913 222021 DEBUG oslo_concurrency.lockutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.913 222021 DEBUG oslo_concurrency.lockutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "f62791ad-fc40-451f-b02a-ba991f2dbc32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.913 222021 DEBUG oslo_concurrency.lockutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.913 222021 DEBUG oslo_concurrency.lockutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.914 222021 INFO nova.compute.manager [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Terminating instance#033[00m
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.915 222021 DEBUG nova.compute.manager [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:33:20 np0005593233 kernel: tap857f8a0c-0b (unregistering): left promiscuous mode
Jan 23 04:33:20 np0005593233 NetworkManager[48871]: <info>  [1769160800.9739] device (tap857f8a0c-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:33:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:20Z|00068|binding|INFO|Releasing lport 857f8a0c-0bda-43ca-85aa-7f22568eddc7 from this chassis (sb_readonly=0)
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.987 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:20Z|00069|binding|INFO|Setting lport 857f8a0c-0bda-43ca-85aa-7f22568eddc7 down in Southbound
Jan 23 04:33:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:20Z|00070|binding|INFO|Releasing lport 7dc28ada-b6f3-4524-9e75-42c4d4604d63 from this chassis (sb_readonly=0)
Jan 23 04:33:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:20Z|00071|binding|INFO|Setting lport 7dc28ada-b6f3-4524-9e75-42c4d4604d63 down in Southbound
Jan 23 04:33:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:20Z|00072|binding|INFO|Removing iface tap857f8a0c-0b ovn-installed in OVS
Jan 23 04:33:20 np0005593233 nova_compute[222017]: 2026-01-23 09:33:20.990 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:20Z|00073|binding|INFO|Releasing lport 8e19ba82-19a8-44be-8cf0-66f5e53af8a2 from this chassis (sb_readonly=0)
Jan 23 04:33:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:33:20Z|00074|binding|INFO|Releasing lport 7b93c40e-1f44-4d5a-9bad-e23468f98d69 from this chassis (sb_readonly=0)
Jan 23 04:33:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:20.995 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:aa:f3 10.100.0.10'], port_security=['fa:16:3e:d9:aa:f3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-412021528', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f62791ad-fc40-451f-b02a-ba991f2dbc32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-412021528', 'neutron:project_id': 'c56e53b3339e4e4db30b7a9d330bc380', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'c0c0e09a-b9c3-4a3a-af9e-c3b66e9f8bc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cabb3d88-013b-4542-b789-52d49c567d53, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=857f8a0c-0bda-43ca-85aa-7f22568eddc7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:33:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:20.997 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:1d:32 19.80.0.19'], port_security=['fa:16:3e:4b:1d:32 19.80.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['857f8a0c-0bda-43ca-85aa-7f22568eddc7'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1183347964', 'neutron:cidrs': '19.80.0.19/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48c9624b-33de-47f9-a720-02dd9028b5ea', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1183347964', 'neutron:project_id': 'c56e53b3339e4e4db30b7a9d330bc380', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c0c0e09a-b9c3-4a3a-af9e-c3b66e9f8bc1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=7ac2005c-13d2-4227-8eb4-3d332da8f5d6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7dc28ada-b6f3-4524-9e75-42c4d4604d63) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:33:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:20.998 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 857f8a0c-0bda-43ca-85aa-7f22568eddc7 in datapath 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 unbound from our chassis#033[00m
Jan 23 04:33:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:20.999 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 385e7a4d-f87e-44c5-9fc0-5a322eecd4b4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:20.999 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[50101d77-40c9-481c-87bb-e4d5b5c96e3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.000 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 namespace which is not needed anymore#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.019 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 23 04:33:21 np0005593233 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 7.078s CPU time.
Jan 23 04:33:21 np0005593233 systemd-machined[190954]: Machine qemu-5-instance-00000009 terminated.
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.091 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.105 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 NetworkManager[48871]: <info>  [1769160801.1384] manager: (tap857f8a0c-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.152 222021 INFO nova.virt.libvirt.driver [-] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Instance destroyed successfully.#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.153 222021 DEBUG nova.objects.instance [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lazy-loading 'resources' on Instance uuid f62791ad-fc40-451f-b02a-ba991f2dbc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4[227252]: [NOTICE]   (227256) : haproxy version is 2.8.14-c23fe91
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4[227252]: [NOTICE]   (227256) : path to executable is /usr/sbin/haproxy
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4[227252]: [WARNING]  (227256) : Exiting Master process...
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4[227252]: [ALERT]    (227256) : Current worker (227258) exited with code 143 (Terminated)
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4[227252]: [WARNING]  (227256) : All workers exited. Exiting... (0)
Jan 23 04:33:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:33:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:33:21 np0005593233 systemd[1]: libpod-0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2.scope: Deactivated successfully.
Jan 23 04:33:21 np0005593233 podman[231246]: 2026-01-23 09:33:21.192100274 +0000 UTC m=+0.053729190 container died 0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:33:21 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2-userdata-shm.mount: Deactivated successfully.
Jan 23 04:33:21 np0005593233 systemd[1]: var-lib-containers-storage-overlay-91c39e435e75570e48611ebab9c88856bae214abb7864a9771c9ad0c72717b90-merged.mount: Deactivated successfully.
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.230 222021 DEBUG nova.virt.libvirt.vif [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:30:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1280958077',display_name='tempest-LiveMigrationTest-server-1280958077',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1280958077',id=9,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:30:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c56e53b3339e4e4db30b7a9d330bc380',ramdisk_id='',reservation_id='r-xkzmzoa6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1903931568',owner_user_name='tempest-LiveMigrationTest-1903931568-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:31:28Z,user_data=None,user_id='a43b680a6019491aafe42c0a10e648df',uuid=f62791ad-fc40-451f-b02a-ba991f2dbc32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "address": "fa:16:3e:d9:aa:f3", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap857f8a0c-0b", "ovs_interfaceid": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.230 222021 DEBUG nova.network.os_vif_util [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Converting VIF {"id": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "address": "fa:16:3e:d9:aa:f3", "network": {"id": "385e7a4d-f87e-44c5-9fc0-5a322eecd4b4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1143816535-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c56e53b3339e4e4db30b7a9d330bc380", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap857f8a0c-0b", "ovs_interfaceid": "857f8a0c-0bda-43ca-85aa-7f22568eddc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.231 222021 DEBUG nova.network.os_vif_util [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d9:aa:f3,bridge_name='br-int',has_traffic_filtering=True,id=857f8a0c-0bda-43ca-85aa-7f22568eddc7,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap857f8a0c-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.231 222021 DEBUG os_vif [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:aa:f3,bridge_name='br-int',has_traffic_filtering=True,id=857f8a0c-0bda-43ca-85aa-7f22568eddc7,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap857f8a0c-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.233 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.234 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap857f8a0c-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:21 np0005593233 podman[231246]: 2026-01-23 09:33:21.234112365 +0000 UTC m=+0.095741281 container cleanup 0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.235 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.237 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.240 222021 INFO os_vif [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:aa:f3,bridge_name='br-int',has_traffic_filtering=True,id=857f8a0c-0bda-43ca-85aa-7f22568eddc7,network=Network(385e7a4d-f87e-44c5-9fc0-5a322eecd4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap857f8a0c-0b')#033[00m
Jan 23 04:33:21 np0005593233 systemd[1]: libpod-conmon-0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2.scope: Deactivated successfully.
Jan 23 04:33:21 np0005593233 podman[231292]: 2026-01-23 09:33:21.313562427 +0000 UTC m=+0.050412138 container remove 0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.323 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6b4b45b8-437f-4592-a543-5b5a63807921]: (4, ('Fri Jan 23 09:33:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 (0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2)\n0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2\nFri Jan 23 09:33:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 (0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2)\n0095c1abecda1161470f269ede9d2b35ed0bdda2e8bb14496f71b44bba3340a2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.326 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[166ef1a5-575c-41eb-8881-42a1f9071d1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.327 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap385e7a4d-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.329 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 kernel: tap385e7a4d-f0: left promiscuous mode
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.345 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.348 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8e4a8780-2273-4912-b914-def6db5102a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.362 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2da94e6d-48fb-4052-8cd0-57e6b207a54f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.364 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[45ebc955-27ca-4e44-8d96-dacb8808f8fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.381 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b02f3740-2fca-4e00-ad9d-6a4e69ce890d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459409, 'reachable_time': 42435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231318, 'error': None, 'target': 'ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 systemd[1]: run-netns-ovnmeta\x2d385e7a4d\x2df87e\x2d44c5\x2d9fc0\x2d5a322eecd4b4.mount: Deactivated successfully.
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.386 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-385e7a4d-f87e-44c5-9fc0-5a322eecd4b4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.386 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[560441ec-038a-4d46-b02e-bc1006a6410c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.387 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 7dc28ada-b6f3-4524-9e75-42c4d4604d63 in datapath 48c9624b-33de-47f9-a720-02dd9028b5ea unbound from our chassis#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.388 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48c9624b-33de-47f9-a720-02dd9028b5ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.389 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b8a114a5-994b-4931-b0f2-464013f33930]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.389 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea namespace which is not needed anymore#033[00m
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea[227326]: [NOTICE]   (227330) : haproxy version is 2.8.14-c23fe91
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea[227326]: [NOTICE]   (227330) : path to executable is /usr/sbin/haproxy
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea[227326]: [WARNING]  (227330) : Exiting Master process...
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea[227326]: [WARNING]  (227330) : Exiting Master process...
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea[227326]: [ALERT]    (227330) : Current worker (227332) exited with code 143 (Terminated)
Jan 23 04:33:21 np0005593233 neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea[227326]: [WARNING]  (227330) : All workers exited. Exiting... (0)
Jan 23 04:33:21 np0005593233 systemd[1]: libpod-9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6.scope: Deactivated successfully.
Jan 23 04:33:21 np0005593233 podman[231336]: 2026-01-23 09:33:21.54503993 +0000 UTC m=+0.049683337 container died 9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:33:21 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6-userdata-shm.mount: Deactivated successfully.
Jan 23 04:33:21 np0005593233 systemd[1]: var-lib-containers-storage-overlay-87a33d0d5827298b94d81918e38e4f2f185514ed892094c19f6d562f6d1836fa-merged.mount: Deactivated successfully.
Jan 23 04:33:21 np0005593233 podman[231336]: 2026-01-23 09:33:21.587172124 +0000 UTC m=+0.091815531 container cleanup 9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.592 222021 DEBUG nova.compute.manager [req-e31a7e27-903d-40ce-9e92-fb385c107806 req-727937a8-e1aa-4a6e-9c9e-c9211ba0fa78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Received event network-vif-unplugged-857f8a0c-0bda-43ca-85aa-7f22568eddc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.593 222021 DEBUG oslo_concurrency.lockutils [req-e31a7e27-903d-40ce-9e92-fb385c107806 req-727937a8-e1aa-4a6e-9c9e-c9211ba0fa78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f62791ad-fc40-451f-b02a-ba991f2dbc32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.593 222021 DEBUG oslo_concurrency.lockutils [req-e31a7e27-903d-40ce-9e92-fb385c107806 req-727937a8-e1aa-4a6e-9c9e-c9211ba0fa78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.593 222021 DEBUG oslo_concurrency.lockutils [req-e31a7e27-903d-40ce-9e92-fb385c107806 req-727937a8-e1aa-4a6e-9c9e-c9211ba0fa78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.593 222021 DEBUG nova.compute.manager [req-e31a7e27-903d-40ce-9e92-fb385c107806 req-727937a8-e1aa-4a6e-9c9e-c9211ba0fa78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] No waiting events found dispatching network-vif-unplugged-857f8a0c-0bda-43ca-85aa-7f22568eddc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.593 222021 DEBUG nova.compute.manager [req-e31a7e27-903d-40ce-9e92-fb385c107806 req-727937a8-e1aa-4a6e-9c9e-c9211ba0fa78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Received event network-vif-unplugged-857f8a0c-0bda-43ca-85aa-7f22568eddc7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:33:21 np0005593233 systemd[1]: libpod-conmon-9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6.scope: Deactivated successfully.
Jan 23 04:33:21 np0005593233 podman[231368]: 2026-01-23 09:33:21.650255576 +0000 UTC m=+0.042285079 container remove 9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.656 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[374c339e-a64c-4a55-8c59-4a0e6ca930a6]: (4, ('Fri Jan 23 09:33:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea (9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6)\n9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6\nFri Jan 23 09:33:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea (9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6)\n9c5349bb5eb0e82f3b55dc0e75d7e9980f5f67fda127c8cf2423f65286f0ade6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.657 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a4aea7e0-1209-4448-9c39-3c45fb38da34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.658 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48c9624b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.659 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 kernel: tap48c9624b-30: left promiscuous mode
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.661 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.664 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd30b76-9ee0-41e8-b6be-5ae36ead9d11]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.683 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[108086a8-bf06-4b52-b788-a206bd22d0b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.684 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0dffad-2ec2-42c2-828e-664f31c41c05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.699 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1e75f571-9f54-4ef3-b9c1-3b447d53a755]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459505, 'reachable_time': 39565, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231383, 'error': None, 'target': 'ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.701 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48c9624b-33de-47f9-a720-02dd9028b5ea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:21.701 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[48db03ac-d73f-4332-9780-93caeaed6f29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.716 222021 INFO nova.virt.libvirt.driver [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Deleting instance files /var/lib/nova/instances/f62791ad-fc40-451f-b02a-ba991f2dbc32_del#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.717 222021 INFO nova.virt.libvirt.driver [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Deletion of /var/lib/nova/instances/f62791ad-fc40-451f-b02a-ba991f2dbc32_del complete#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.795 222021 INFO nova.compute.manager [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Took 0.88 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.795 222021 DEBUG oslo.service.loopingcall [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.795 222021 DEBUG nova.compute.manager [-] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:33:21 np0005593233 nova_compute[222017]: 2026-01-23 09:33:21.796 222021 DEBUG nova.network.neutron [-] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:33:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:33:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:22.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:33:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:22.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:22 np0005593233 systemd[1]: run-netns-ovnmeta\x2d48c9624b\x2d33de\x2d47f9\x2da720\x2d02dd9028b5ea.mount: Deactivated successfully.
Jan 23 04:33:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e155 e155: 3 total, 3 up, 3 in
Jan 23 04:33:23 np0005593233 nova_compute[222017]: 2026-01-23 09:33:23.315 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:23 np0005593233 nova_compute[222017]: 2026-01-23 09:33:23.792 222021 DEBUG nova.compute.manager [req-49ee81d6-fbed-4acd-9546-2b38049caabf req-30d932ca-3aaa-467d-9c07-9314301fe953 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Received event network-vif-plugged-857f8a0c-0bda-43ca-85aa-7f22568eddc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:33:23 np0005593233 nova_compute[222017]: 2026-01-23 09:33:23.792 222021 DEBUG oslo_concurrency.lockutils [req-49ee81d6-fbed-4acd-9546-2b38049caabf req-30d932ca-3aaa-467d-9c07-9314301fe953 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f62791ad-fc40-451f-b02a-ba991f2dbc32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:23 np0005593233 nova_compute[222017]: 2026-01-23 09:33:23.792 222021 DEBUG oslo_concurrency.lockutils [req-49ee81d6-fbed-4acd-9546-2b38049caabf req-30d932ca-3aaa-467d-9c07-9314301fe953 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:23 np0005593233 nova_compute[222017]: 2026-01-23 09:33:23.793 222021 DEBUG oslo_concurrency.lockutils [req-49ee81d6-fbed-4acd-9546-2b38049caabf req-30d932ca-3aaa-467d-9c07-9314301fe953 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:23 np0005593233 nova_compute[222017]: 2026-01-23 09:33:23.793 222021 DEBUG nova.compute.manager [req-49ee81d6-fbed-4acd-9546-2b38049caabf req-30d932ca-3aaa-467d-9c07-9314301fe953 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] No waiting events found dispatching network-vif-plugged-857f8a0c-0bda-43ca-85aa-7f22568eddc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:33:23 np0005593233 nova_compute[222017]: 2026-01-23 09:33:23.793 222021 WARNING nova.compute.manager [req-49ee81d6-fbed-4acd-9546-2b38049caabf req-30d932ca-3aaa-467d-9c07-9314301fe953 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Received unexpected event network-vif-plugged-857f8a0c-0bda-43ca-85aa-7f22568eddc7 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:33:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:24.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:24 np0005593233 podman[231384]: 2026-01-23 09:33:24.150412895 +0000 UTC m=+0.153886193 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:33:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:24.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:24 np0005593233 nova_compute[222017]: 2026-01-23 09:33:24.418 222021 DEBUG nova.network.neutron [-] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:33:24 np0005593233 nova_compute[222017]: 2026-01-23 09:33:24.449 222021 INFO nova.compute.manager [-] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Took 2.65 seconds to deallocate network for instance.#033[00m
Jan 23 04:33:24 np0005593233 nova_compute[222017]: 2026-01-23 09:33:24.511 222021 DEBUG oslo_concurrency.lockutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:24 np0005593233 nova_compute[222017]: 2026-01-23 09:33:24.512 222021 DEBUG oslo_concurrency.lockutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:24 np0005593233 nova_compute[222017]: 2026-01-23 09:33:24.619 222021 DEBUG oslo_concurrency.processutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:24.969 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:33:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3942486669' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:33:25 np0005593233 nova_compute[222017]: 2026-01-23 09:33:25.106 222021 DEBUG oslo_concurrency.processutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:25 np0005593233 nova_compute[222017]: 2026-01-23 09:33:25.113 222021 DEBUG nova.compute.provider_tree [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:33:25 np0005593233 nova_compute[222017]: 2026-01-23 09:33:25.136 222021 DEBUG nova.scheduler.client.report [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:33:25 np0005593233 nova_compute[222017]: 2026-01-23 09:33:25.209 222021 DEBUG oslo_concurrency.lockutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:25 np0005593233 nova_compute[222017]: 2026-01-23 09:33:25.272 222021 INFO nova.scheduler.client.report [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Deleted allocations for instance f62791ad-fc40-451f-b02a-ba991f2dbc32#033[00m
Jan 23 04:33:25 np0005593233 nova_compute[222017]: 2026-01-23 09:33:25.404 222021 DEBUG oslo_concurrency.lockutils [None req-e0bceeda-edc0-4299-bd73-3c3c9b38dac4 a43b680a6019491aafe42c0a10e648df c56e53b3339e4e4db30b7a9d330bc380 - - default default] Lock "f62791ad-fc40-451f-b02a-ba991f2dbc32" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.491s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:26.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:26.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:26 np0005593233 nova_compute[222017]: 2026-01-23 09:33:26.237 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:28.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:28.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:28 np0005593233 nova_compute[222017]: 2026-01-23 09:33:28.437 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:30.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:30.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:31 np0005593233 nova_compute[222017]: 2026-01-23 09:33:31.031 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160796.0301507, 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:33:31 np0005593233 nova_compute[222017]: 2026-01-23 09:33:31.032 222021 INFO nova.compute.manager [-] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:33:31 np0005593233 nova_compute[222017]: 2026-01-23 09:33:31.077 222021 DEBUG nova.compute.manager [None req-b2123e3e-37e9-4eb8-b578-2418eef08026 - - - - - -] [instance: 54a1ad4e-6fc9-42dc-aa4c-99d3f1297520] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:33:31 np0005593233 nova_compute[222017]: 2026-01-23 09:33:31.133 222021 INFO nova.compute.manager [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Swapping old allocation on dict_keys(['929812a2-38ca-4ee7-9f24-090d633cb42b']) held by migration 36643aa1-ddbd-427a-beef-052fd4db42bf for instance#033[00m
Jan 23 04:33:31 np0005593233 nova_compute[222017]: 2026-01-23 09:33:31.191 222021 DEBUG nova.scheduler.client.report [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Overwriting current allocation {'allocations': {'89873210-bee9-46e9-9f9d-0cd7a156c3a8': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, 'generation': 13}}, 'project_id': '11def90dfdc14cfe928302bec2835794', 'user_id': '7536fa2e625541fba613dc32a49a4c5b', 'consumer_generation': 1} on consumer 07c9ba0d-ab3d-4079-ab97-46e91de4911a move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 23 04:33:31 np0005593233 nova_compute[222017]: 2026-01-23 09:33:31.239 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:31 np0005593233 nova_compute[222017]: 2026-01-23 09:33:31.779 222021 DEBUG oslo_concurrency.lockutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:33:31 np0005593233 nova_compute[222017]: 2026-01-23 09:33:31.780 222021 DEBUG oslo_concurrency.lockutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:33:31 np0005593233 nova_compute[222017]: 2026-01-23 09:33:31.780 222021 DEBUG nova.network.neutron [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:33:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:32.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.027000755s ======
Jan 23 04:33:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:32.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.027000755s
Jan 23 04:33:32 np0005593233 nova_compute[222017]: 2026-01-23 09:33:32.491 222021 DEBUG nova.network.neutron [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:33:33 np0005593233 nova_compute[222017]: 2026-01-23 09:33:33.487 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:33 np0005593233 nova_compute[222017]: 2026-01-23 09:33:33.829 222021 DEBUG nova.network.neutron [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:33:33 np0005593233 nova_compute[222017]: 2026-01-23 09:33:33.913 222021 DEBUG oslo_concurrency.lockutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:33:33 np0005593233 nova_compute[222017]: 2026-01-23 09:33:33.914 222021 DEBUG nova.virt.libvirt.driver [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 23 04:33:33 np0005593233 nova_compute[222017]: 2026-01-23 09:33:33.985 222021 DEBUG nova.storage.rbd_utils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rolling back rbd image(07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 23 04:33:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.096 222021 DEBUG nova.storage.rbd_utils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] removing snapshot(nova-resize) on rbd image(07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:33:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:34.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:34.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e156 e156: 3 total, 3 up, 3 in
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.902 222021 DEBUG nova.virt.libvirt.driver [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.905 222021 WARNING nova.virt.libvirt.driver [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.912 222021 DEBUG nova.virt.libvirt.host [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.913 222021 DEBUG nova.virt.libvirt.host [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.916 222021 DEBUG nova.virt.libvirt.host [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.916 222021 DEBUG nova.virt.libvirt.host [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.917 222021 DEBUG nova.virt.libvirt.driver [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.918 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.918 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.918 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.918 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.919 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.919 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.919 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.919 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.920 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.920 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.920 222021 DEBUG nova.virt.hardware [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.920 222021 DEBUG nova.objects.instance [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:34 np0005593233 nova_compute[222017]: 2026-01-23 09:33:34.945 222021 DEBUG oslo_concurrency.processutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:35 np0005593233 nova_compute[222017]: 2026-01-23 09:33:35.183 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160800.1820464, 07c9ba0d-ab3d-4079-ab97-46e91de4911a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:33:35 np0005593233 nova_compute[222017]: 2026-01-23 09:33:35.183 222021 INFO nova.compute.manager [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:33:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:33:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1734406312' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:33:35 np0005593233 nova_compute[222017]: 2026-01-23 09:33:35.411 222021 DEBUG oslo_concurrency.processutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:35 np0005593233 nova_compute[222017]: 2026-01-23 09:33:35.468 222021 DEBUG oslo_concurrency.processutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:33:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/363487294' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:33:35 np0005593233 nova_compute[222017]: 2026-01-23 09:33:35.947 222021 DEBUG oslo_concurrency.processutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:35 np0005593233 nova_compute[222017]: 2026-01-23 09:33:35.953 222021 DEBUG nova.virt.libvirt.driver [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <uuid>07c9ba0d-ab3d-4079-ab97-46e91de4911a</uuid>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <name>instance-00000012</name>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <nova:name>tempest-MigrationsAdminTest-server-2012645134</nova:name>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:33:34</nova:creationTime>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <nova:user uuid="7536fa2e625541fba613dc32a49a4c5b">tempest-MigrationsAdminTest-2056264627-project-member</nova:user>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <nova:project uuid="11def90dfdc14cfe928302bec2835794">tempest-MigrationsAdminTest-2056264627</nova:project>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <entry name="serial">07c9ba0d-ab3d-4079-ab97-46e91de4911a</entry>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <entry name="uuid">07c9ba0d-ab3d-4079-ab97-46e91de4911a</entry>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk.config">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/console.log" append="off"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <input type="keyboard" bus="usb"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:33:35 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:33:35 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:33:35 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:33:35 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:33:36 np0005593233 systemd-machined[190954]: New machine qemu-12-instance-00000012.
Jan 23 04:33:36 np0005593233 systemd[1]: Started Virtual Machine qemu-12-instance-00000012.
Jan 23 04:33:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:36.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:36 np0005593233 nova_compute[222017]: 2026-01-23 09:33:36.151 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160801.15019, f62791ad-fc40-451f-b02a-ba991f2dbc32 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:33:36 np0005593233 nova_compute[222017]: 2026-01-23 09:33:36.152 222021 INFO nova.compute.manager [-] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:33:36 np0005593233 nova_compute[222017]: 2026-01-23 09:33:36.243 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:36.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:36 np0005593233 nova_compute[222017]: 2026-01-23 09:33:36.537 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for 07c9ba0d-ab3d-4079-ab97-46e91de4911a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 04:33:36 np0005593233 nova_compute[222017]: 2026-01-23 09:33:36.538 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160816.5373151, 07c9ba0d-ab3d-4079-ab97-46e91de4911a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:33:36 np0005593233 nova_compute[222017]: 2026-01-23 09:33:36.538 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:33:36 np0005593233 nova_compute[222017]: 2026-01-23 09:33:36.542 222021 DEBUG nova.compute.manager [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:33:36 np0005593233 nova_compute[222017]: 2026-01-23 09:33:36.547 222021 INFO nova.virt.libvirt.driver [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance running successfully.#033[00m
Jan 23 04:33:36 np0005593233 nova_compute[222017]: 2026-01-23 09:33:36.548 222021 DEBUG nova.virt.libvirt.driver [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.342 222021 DEBUG nova.compute.manager [None req-953c5268-003d-42df-949c-a7ca0f5448ae - - - - - -] [instance: f62791ad-fc40-451f-b02a-ba991f2dbc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.343 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.346 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.354 222021 DEBUG nova.compute.manager [None req-44f9cd83-f111-4846-a9eb-32004697a6cc - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.426 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.426 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160816.5380743, 07c9ba0d-ab3d-4079-ab97-46e91de4911a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.426 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] VM Started (Lifecycle Event)#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.450 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.453 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.487 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 23 04:33:37 np0005593233 nova_compute[222017]: 2026-01-23 09:33:37.500 222021 INFO nova.compute.manager [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Updating instance to original state: 'active'#033[00m
Jan 23 04:33:38 np0005593233 podman[231657]: 2026-01-23 09:33:38.074955775 +0000 UTC m=+0.071121249 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:33:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:38.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:38.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:38 np0005593233 nova_compute[222017]: 2026-01-23 09:33:38.491 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:40.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:40.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.060794) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821060888, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2443, "num_deletes": 254, "total_data_size": 5788205, "memory_usage": 5856336, "flush_reason": "Manual Compaction"}
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821099090, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3726777, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25190, "largest_seqno": 27628, "table_properties": {"data_size": 3716845, "index_size": 6234, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21599, "raw_average_key_size": 20, "raw_value_size": 3696614, "raw_average_value_size": 3557, "num_data_blocks": 274, "num_entries": 1039, "num_filter_entries": 1039, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160639, "oldest_key_time": 1769160639, "file_creation_time": 1769160821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 38353 microseconds, and 8977 cpu microseconds.
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.099154) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3726777 bytes OK
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.099179) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.101238) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.101256) EVENT_LOG_v1 {"time_micros": 1769160821101250, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.101276) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 5777114, prev total WAL file size 5777114, number of live WAL files 2.
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.103009) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3639KB)], [51(9130KB)]
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821103224, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 13076910, "oldest_snapshot_seqno": -1}
Jan 23 04:33:41 np0005593233 nova_compute[222017]: 2026-01-23 09:33:41.245 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5380 keys, 11100931 bytes, temperature: kUnknown
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821249345, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 11100931, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11062317, "index_size": 24040, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13509, "raw_key_size": 134480, "raw_average_key_size": 24, "raw_value_size": 10962607, "raw_average_value_size": 2037, "num_data_blocks": 991, "num_entries": 5380, "num_filter_entries": 5380, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769160821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.249793) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 11100931 bytes
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.251839) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 89.4 rd, 75.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 8.9 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(6.5) write-amplify(3.0) OK, records in: 5907, records dropped: 527 output_compression: NoCompression
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.251870) EVENT_LOG_v1 {"time_micros": 1769160821251856, "job": 30, "event": "compaction_finished", "compaction_time_micros": 146214, "compaction_time_cpu_micros": 41451, "output_level": 6, "num_output_files": 1, "total_output_size": 11100931, "num_input_records": 5907, "num_output_records": 5380, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821253073, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821255247, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.102598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.255347) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.255364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.255369) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.255374) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:41.255378) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:42.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:42.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:42.600 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:42.601 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:33:42.601 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e157 e157: 3 total, 3 up, 3 in
Jan 23 04:33:43 np0005593233 nova_compute[222017]: 2026-01-23 09:33:43.493 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.505352) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823505456, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 291, "num_deletes": 256, "total_data_size": 92417, "memory_usage": 99096, "flush_reason": "Manual Compaction"}
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823508395, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 60856, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27633, "largest_seqno": 27919, "table_properties": {"data_size": 58936, "index_size": 148, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4706, "raw_average_key_size": 16, "raw_value_size": 55011, "raw_average_value_size": 197, "num_data_blocks": 7, "num_entries": 278, "num_filter_entries": 278, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160822, "oldest_key_time": 1769160822, "file_creation_time": 1769160823, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 3055 microseconds, and 1041 cpu microseconds.
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.508427) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 60856 bytes OK
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.508443) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.509702) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.509729) EVENT_LOG_v1 {"time_micros": 1769160823509721, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.509752) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 90232, prev total WAL file size 90232, number of live WAL files 2.
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.510304) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353033' seq:72057594037927935, type:22 .. '6C6F676D00373534' seq:0, type:0; will stop at (end)
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(59KB)], [54(10MB)]
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823510434, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 11161787, "oldest_snapshot_seqno": -1}
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5134 keys, 11075532 bytes, temperature: kUnknown
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823617781, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 11075532, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11037826, "index_size": 23772, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12869, "raw_key_size": 130545, "raw_average_key_size": 25, "raw_value_size": 10941723, "raw_average_value_size": 2131, "num_data_blocks": 976, "num_entries": 5134, "num_filter_entries": 5134, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769160823, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.618343) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 11075532 bytes
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.620373) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.8 rd, 103.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 10.6 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(365.4) write-amplify(182.0) OK, records in: 5658, records dropped: 524 output_compression: NoCompression
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.620404) EVENT_LOG_v1 {"time_micros": 1769160823620390, "job": 32, "event": "compaction_finished", "compaction_time_micros": 107553, "compaction_time_cpu_micros": 35539, "output_level": 6, "num_output_files": 1, "total_output_size": 11075532, "num_input_records": 5658, "num_output_records": 5134, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823620605, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823623305, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.510135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.623399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.623409) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.623411) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.623413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:33:43.623415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:44.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:44.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:45 np0005593233 nova_compute[222017]: 2026-01-23 09:33:45.253 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:45 np0005593233 nova_compute[222017]: 2026-01-23 09:33:45.254 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:45 np0005593233 nova_compute[222017]: 2026-01-23 09:33:45.254 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:45 np0005593233 nova_compute[222017]: 2026-01-23 09:33:45.254 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:45 np0005593233 nova_compute[222017]: 2026-01-23 09:33:45.255 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:45 np0005593233 nova_compute[222017]: 2026-01-23 09:33:45.256 222021 INFO nova.compute.manager [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Terminating instance#033[00m
Jan 23 04:33:45 np0005593233 nova_compute[222017]: 2026-01-23 09:33:45.258 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:33:45 np0005593233 nova_compute[222017]: 2026-01-23 09:33:45.258 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:33:45 np0005593233 nova_compute[222017]: 2026-01-23 09:33:45.258 222021 DEBUG nova.network.neutron [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:33:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:46.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:46 np0005593233 nova_compute[222017]: 2026-01-23 09:33:46.246 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:46.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:46 np0005593233 nova_compute[222017]: 2026-01-23 09:33:46.815 222021 DEBUG nova.network.neutron [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:33:47 np0005593233 nova_compute[222017]: 2026-01-23 09:33:47.986 222021 DEBUG nova.network.neutron [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:33:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:48.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:48 np0005593233 nova_compute[222017]: 2026-01-23 09:33:48.192 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:33:48 np0005593233 nova_compute[222017]: 2026-01-23 09:33:48.193 222021 DEBUG nova.compute.manager [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:33:48 np0005593233 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 23 04:33:48 np0005593233 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Consumed 12.207s CPU time.
Jan 23 04:33:48 np0005593233 systemd-machined[190954]: Machine qemu-12-instance-00000012 terminated.
Jan 23 04:33:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:48.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:48 np0005593233 nova_compute[222017]: 2026-01-23 09:33:48.418 222021 INFO nova.virt.libvirt.driver [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance destroyed successfully.#033[00m
Jan 23 04:33:48 np0005593233 nova_compute[222017]: 2026-01-23 09:33:48.419 222021 DEBUG nova.objects.instance [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'resources' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:48 np0005593233 nova_compute[222017]: 2026-01-23 09:33:48.528 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:49 np0005593233 nova_compute[222017]: 2026-01-23 09:33:49.002 222021 INFO nova.virt.libvirt.driver [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Deleting instance files /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a_del#033[00m
Jan 23 04:33:49 np0005593233 nova_compute[222017]: 2026-01-23 09:33:49.003 222021 INFO nova.virt.libvirt.driver [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Deletion of /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a_del complete#033[00m
Jan 23 04:33:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:49 np0005593233 nova_compute[222017]: 2026-01-23 09:33:49.828 222021 INFO nova.compute.manager [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Took 1.63 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:33:49 np0005593233 nova_compute[222017]: 2026-01-23 09:33:49.828 222021 DEBUG oslo.service.loopingcall [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:33:49 np0005593233 nova_compute[222017]: 2026-01-23 09:33:49.829 222021 DEBUG nova.compute.manager [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:33:49 np0005593233 nova_compute[222017]: 2026-01-23 09:33:49.829 222021 DEBUG nova.network.neutron [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:33:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:50.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:50.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:50 np0005593233 nova_compute[222017]: 2026-01-23 09:33:50.404 222021 DEBUG nova.network.neutron [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:33:50 np0005593233 nova_compute[222017]: 2026-01-23 09:33:50.426 222021 DEBUG nova.network.neutron [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:33:50 np0005593233 nova_compute[222017]: 2026-01-23 09:33:50.456 222021 INFO nova.compute.manager [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Took 0.63 seconds to deallocate network for instance.#033[00m
Jan 23 04:33:50 np0005593233 nova_compute[222017]: 2026-01-23 09:33:50.514 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:50 np0005593233 nova_compute[222017]: 2026-01-23 09:33:50.515 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:50 np0005593233 nova_compute[222017]: 2026-01-23 09:33:50.628 222021 DEBUG oslo_concurrency.processutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:33:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/62338276' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:33:51 np0005593233 nova_compute[222017]: 2026-01-23 09:33:51.095 222021 DEBUG oslo_concurrency.processutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:51 np0005593233 nova_compute[222017]: 2026-01-23 09:33:51.103 222021 DEBUG nova.compute.provider_tree [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:33:51 np0005593233 nova_compute[222017]: 2026-01-23 09:33:51.249 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:52.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:52.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:53 np0005593233 nova_compute[222017]: 2026-01-23 09:33:53.143 222021 DEBUG nova.scheduler.client.report [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:33:53 np0005593233 nova_compute[222017]: 2026-01-23 09:33:53.259 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:53 np0005593233 nova_compute[222017]: 2026-01-23 09:33:53.439 222021 INFO nova.scheduler.client.report [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Deleted allocations for instance 07c9ba0d-ab3d-4079-ab97-46e91de4911a#033[00m
Jan 23 04:33:53 np0005593233 nova_compute[222017]: 2026-01-23 09:33:53.531 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:54.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:54.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:54 np0005593233 nova_compute[222017]: 2026-01-23 09:33:54.640 222021 DEBUG oslo_concurrency.lockutils [None req-8b22b155-2fe6-4e95-8424-f1008ab7d29f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "07c9ba0d-ab3d-4079-ab97-46e91de4911a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:55 np0005593233 podman[231721]: 2026-01-23 09:33:55.261055872 +0000 UTC m=+0.185015359 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:33:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:56.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:56 np0005593233 nova_compute[222017]: 2026-01-23 09:33:56.251 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e158 e158: 3 total, 3 up, 3 in
Jan 23 04:33:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e159 e159: 3 total, 3 up, 3 in
Jan 23 04:33:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:58.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:33:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:33:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:58.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:33:58 np0005593233 nova_compute[222017]: 2026-01-23 09:33:58.534 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e160 e160: 3 total, 3 up, 3 in
Jan 23 04:33:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:00.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:00.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.387 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "0fb415e8-9c82-4021-9088-cfd399d453a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.388 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.388 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "0fb415e8-9c82-4021-9088-cfd399d453a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.389 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.389 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.390 222021 INFO nova.compute.manager [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Terminating instance#033[00m
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.391 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.392 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.392 222021 DEBUG nova.network.neutron [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:34:00 np0005593233 nova_compute[222017]: 2026-01-23 09:34:00.829 222021 DEBUG nova.network.neutron [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:34:01 np0005593233 nova_compute[222017]: 2026-01-23 09:34:01.253 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:02 np0005593233 nova_compute[222017]: 2026-01-23 09:34:02.111 222021 DEBUG nova.network.neutron [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:34:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:02.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:02 np0005593233 nova_compute[222017]: 2026-01-23 09:34:02.233 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:34:02 np0005593233 nova_compute[222017]: 2026-01-23 09:34:02.234 222021 DEBUG nova.compute.manager [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:34:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:02.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:02 np0005593233 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 23 04:34:02 np0005593233 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000e.scope: Consumed 15.708s CPU time.
Jan 23 04:34:02 np0005593233 systemd-machined[190954]: Machine qemu-9-instance-0000000e terminated.
Jan 23 04:34:02 np0005593233 nova_compute[222017]: 2026-01-23 09:34:02.413 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:02 np0005593233 nova_compute[222017]: 2026-01-23 09:34:02.413 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:02 np0005593233 nova_compute[222017]: 2026-01-23 09:34:02.463 222021 INFO nova.virt.libvirt.driver [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance destroyed successfully.#033[00m
Jan 23 04:34:02 np0005593233 nova_compute[222017]: 2026-01-23 09:34:02.464 222021 DEBUG nova.objects.instance [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'resources' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.416 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160828.4159012, 07c9ba0d-ab3d-4079-ab97-46e91de4911a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.417 222021 INFO nova.compute.manager [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.433 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.433 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.434 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.434 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.435 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.504 222021 DEBUG nova.compute.manager [None req-d331e34f-fe36-49d1-86e2-cb6255076478 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.537 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e161 e161: 3 total, 3 up, 3 in
Jan 23 04:34:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:34:03 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/940264932' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:34:03 np0005593233 nova_compute[222017]: 2026-01-23 09:34:03.882 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.002 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.003 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000000e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:34:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:04.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.222 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.223 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4861MB free_disk=20.80617904663086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.223 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.224 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.241 222021 INFO nova.virt.libvirt.driver [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Deleting instance files /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0_del#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.242 222021 INFO nova.virt.libvirt.driver [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Deletion of /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0_del complete#033[00m
Jan 23 04:34:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:04.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.422 222021 INFO nova.compute.manager [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Took 2.19 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.424 222021 DEBUG oslo.service.loopingcall [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.424 222021 DEBUG nova.compute.manager [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.425 222021 DEBUG nova.network.neutron [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.450 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0fb415e8-9c82-4021-9088-cfd399d453a0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.451 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.452 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.585 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.629 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.630 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.737 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.804 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:34:04 np0005593233 nova_compute[222017]: 2026-01-23 09:34:04.911 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:34:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1325590943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:34:05 np0005593233 nova_compute[222017]: 2026-01-23 09:34:05.398 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:05 np0005593233 nova_compute[222017]: 2026-01-23 09:34:05.406 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:34:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:06.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:06 np0005593233 nova_compute[222017]: 2026-01-23 09:34:06.256 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:06.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:06 np0005593233 nova_compute[222017]: 2026-01-23 09:34:06.855 222021 DEBUG nova.network.neutron [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:34:07 np0005593233 nova_compute[222017]: 2026-01-23 09:34:07.373 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:34:07 np0005593233 nova_compute[222017]: 2026-01-23 09:34:07.379 222021 DEBUG nova.network.neutron [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:34:07 np0005593233 nova_compute[222017]: 2026-01-23 09:34:07.442 222021 INFO nova.compute.manager [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Took 3.02 seconds to deallocate network for instance.#033[00m
Jan 23 04:34:07 np0005593233 nova_compute[222017]: 2026-01-23 09:34:07.845 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:34:07 np0005593233 nova_compute[222017]: 2026-01-23 09:34:07.846 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:08.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:08 np0005593233 nova_compute[222017]: 2026-01-23 09:34:08.287 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:08 np0005593233 nova_compute[222017]: 2026-01-23 09:34:08.289 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:08.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:08 np0005593233 nova_compute[222017]: 2026-01-23 09:34:08.468 222021 DEBUG oslo_concurrency.processutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:08 np0005593233 nova_compute[222017]: 2026-01-23 09:34:08.544 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:08 np0005593233 nova_compute[222017]: 2026-01-23 09:34:08.848 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:08 np0005593233 nova_compute[222017]: 2026-01-23 09:34:08.849 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:34:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:34:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2443167423' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:34:08 np0005593233 nova_compute[222017]: 2026-01-23 09:34:08.988 222021 DEBUG oslo_concurrency.processutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:08 np0005593233 nova_compute[222017]: 2026-01-23 09:34:08.998 222021 DEBUG nova.compute.provider_tree [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:34:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:09 np0005593233 podman[231835]: 2026-01-23 09:34:09.077796139 +0000 UTC m=+0.082103547 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.110 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.112 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.112 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.113 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.113 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.113 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.122 222021 DEBUG nova.scheduler.client.report [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.204 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.385 222021 INFO nova.scheduler.client.report [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Deleted allocations for instance 0fb415e8-9c82-4021-9088-cfd399d453a0#033[00m
Jan 23 04:34:09 np0005593233 nova_compute[222017]: 2026-01-23 09:34:09.627 222021 DEBUG oslo_concurrency.lockutils [None req-3453b5c7-ecea-4c52-bad7-c5ebaa74e259 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "0fb415e8-9c82-4021-9088-cfd399d453a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:10.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:10.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:10 np0005593233 nova_compute[222017]: 2026-01-23 09:34:10.644 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:11 np0005593233 nova_compute[222017]: 2026-01-23 09:34:11.258 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:12.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:12.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:13 np0005593233 nova_compute[222017]: 2026-01-23 09:34:13.546 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:14.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:14.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:16.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:16 np0005593233 nova_compute[222017]: 2026-01-23 09:34:16.259 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:16.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:17 np0005593233 nova_compute[222017]: 2026-01-23 09:34:17.462 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160842.4612734, 0fb415e8-9c82-4021-9088-cfd399d453a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:34:17 np0005593233 nova_compute[222017]: 2026-01-23 09:34:17.463 222021 INFO nova.compute.manager [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:34:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:18.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:18.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:18 np0005593233 nova_compute[222017]: 2026-01-23 09:34:18.548 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:20.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:20.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:21 np0005593233 nova_compute[222017]: 2026-01-23 09:34:21.021 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:34:21.023 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:34:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:34:21.023 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:34:21 np0005593233 nova_compute[222017]: 2026-01-23 09:34:21.076 222021 DEBUG nova.compute.manager [None req-88c640e6-d59f-40c9-87d5-7597fb805c98 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:34:21 np0005593233 nova_compute[222017]: 2026-01-23 09:34:21.260 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:22.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:22.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:23 np0005593233 nova_compute[222017]: 2026-01-23 09:34:23.551 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:34:24.026 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:34:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:24.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:24.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:25 np0005593233 ovn_controller[130653]: 2026-01-23T09:34:25Z|00075|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 04:34:26 np0005593233 podman[231859]: 2026-01-23 09:34:26.004062547 +0000 UTC m=+0.138112001 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 04:34:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:26.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:26 np0005593233 nova_compute[222017]: 2026-01-23 09:34:26.262 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:26.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:28.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:34:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:34:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:34:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:34:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:28.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:28 np0005593233 nova_compute[222017]: 2026-01-23 09:34:28.554 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:30.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:30.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:31 np0005593233 nova_compute[222017]: 2026-01-23 09:34:31.264 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:34:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:32.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:34:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:32.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:33 np0005593233 nova_compute[222017]: 2026-01-23 09:34:33.557 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:34:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:34.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:34:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:34.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:34:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:34:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:36.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:36 np0005593233 nova_compute[222017]: 2026-01-23 09:34:36.266 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:36.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:38.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:38.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e162 e162: 3 total, 3 up, 3 in
Jan 23 04:34:38 np0005593233 nova_compute[222017]: 2026-01-23 09:34:38.560 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:40 np0005593233 podman[232068]: 2026-01-23 09:34:40.088093831 +0000 UTC m=+0.099891418 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 23 04:34:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:40.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:40.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:41 np0005593233 nova_compute[222017]: 2026-01-23 09:34:41.309 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:42.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:42.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:42 np0005593233 nova_compute[222017]: 2026-01-23 09:34:42.534 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "3aa76906-f3e3-4e71-9465-92984a1b0b47" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:42 np0005593233 nova_compute[222017]: 2026-01-23 09:34:42.535 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "3aa76906-f3e3-4e71-9465-92984a1b0b47" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:42 np0005593233 nova_compute[222017]: 2026-01-23 09:34:42.577 222021 DEBUG nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:34:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:34:42.601 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:34:42.602 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:34:42.602 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:42 np0005593233 nova_compute[222017]: 2026-01-23 09:34:42.698 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:42 np0005593233 nova_compute[222017]: 2026-01-23 09:34:42.699 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:42 np0005593233 nova_compute[222017]: 2026-01-23 09:34:42.709 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:34:42 np0005593233 nova_compute[222017]: 2026-01-23 09:34:42.710 222021 INFO nova.compute.claims [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:34:42 np0005593233 nova_compute[222017]: 2026-01-23 09:34:42.881 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:34:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3914020323' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:34:43 np0005593233 nova_compute[222017]: 2026-01-23 09:34:43.340 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:43 np0005593233 nova_compute[222017]: 2026-01-23 09:34:43.349 222021 DEBUG nova.compute.provider_tree [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:34:43 np0005593233 nova_compute[222017]: 2026-01-23 09:34:43.562 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e163 e163: 3 total, 3 up, 3 in
Jan 23 04:34:43 np0005593233 nova_compute[222017]: 2026-01-23 09:34:43.730 222021 DEBUG nova.scheduler.client.report [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:34:43 np0005593233 nova_compute[222017]: 2026-01-23 09:34:43.797 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:43 np0005593233 nova_compute[222017]: 2026-01-23 09:34:43.798 222021 DEBUG nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:34:43 np0005593233 nova_compute[222017]: 2026-01-23 09:34:43.899 222021 DEBUG nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:34:43 np0005593233 nova_compute[222017]: 2026-01-23 09:34:43.900 222021 DEBUG nova.network.neutron [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:34:43 np0005593233 nova_compute[222017]: 2026-01-23 09:34:43.948 222021 INFO nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.002 222021 DEBUG nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:34:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.176 222021 DEBUG nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.177 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.178 222021 INFO nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Creating image(s)#033[00m
Jan 23 04:34:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:34:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:44.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.215 222021 DEBUG nova.storage.rbd_utils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.241 222021 DEBUG nova.storage.rbd_utils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.264 222021 DEBUG nova.storage.rbd_utils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.268 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.325 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.326 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.327 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.327 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.354 222021 DEBUG nova.storage.rbd_utils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.357 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:44.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.716 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.359s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.794 222021 DEBUG nova.storage.rbd_utils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] resizing rbd image 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.894 222021 DEBUG nova.objects.instance [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lazy-loading 'migration_context' on Instance uuid 3aa76906-f3e3-4e71-9465-92984a1b0b47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.922 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.922 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Ensure instance console log exists: /var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.923 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.923 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.924 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.937 222021 DEBUG nova.network.neutron [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.937 222021 DEBUG nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.939 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.945 222021 WARNING nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.950 222021 DEBUG nova.virt.libvirt.host [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.952 222021 DEBUG nova.virt.libvirt.host [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.954 222021 DEBUG nova.virt.libvirt.host [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.955 222021 DEBUG nova.virt.libvirt.host [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.957 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.957 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.958 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.958 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.958 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.959 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.959 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.959 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.959 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.960 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.960 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.960 222021 DEBUG nova.virt.hardware [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:34:44 np0005593233 nova_compute[222017]: 2026-01-23 09:34:44.964 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:34:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/985572231' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:34:45 np0005593233 nova_compute[222017]: 2026-01-23 09:34:45.447 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:45 np0005593233 nova_compute[222017]: 2026-01-23 09:34:45.479 222021 DEBUG nova.storage.rbd_utils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:45 np0005593233 nova_compute[222017]: 2026-01-23 09:34:45.485 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:34:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1849903170' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:34:45 np0005593233 nova_compute[222017]: 2026-01-23 09:34:45.942 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:45 np0005593233 nova_compute[222017]: 2026-01-23 09:34:45.944 222021 DEBUG nova.objects.instance [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lazy-loading 'pci_devices' on Instance uuid 3aa76906-f3e3-4e71-9465-92984a1b0b47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:34:45 np0005593233 nova_compute[222017]: 2026-01-23 09:34:45.968 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <uuid>3aa76906-f3e3-4e71-9465-92984a1b0b47</uuid>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <name>instance-00000014</name>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <nova:name>tempest-LiveMigrationNegativeTest-server-133987843</nova:name>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:34:44</nova:creationTime>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <nova:user uuid="3e17ce3f8d5246daad6b3964a2b6df05">tempest-LiveMigrationNegativeTest-202193021-project-member</nova:user>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <nova:project uuid="ab9c85124a434b1390041a9ca5c05ddd">tempest-LiveMigrationNegativeTest-202193021</nova:project>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <entry name="serial">3aa76906-f3e3-4e71-9465-92984a1b0b47</entry>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <entry name="uuid">3aa76906-f3e3-4e71-9465-92984a1b0b47</entry>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3aa76906-f3e3-4e71-9465-92984a1b0b47_disk">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3aa76906-f3e3-4e71-9465-92984a1b0b47_disk.config">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47/console.log" append="off"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:34:45 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:34:45 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:34:45 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:34:45 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:34:46 np0005593233 nova_compute[222017]: 2026-01-23 09:34:46.113 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:34:46 np0005593233 nova_compute[222017]: 2026-01-23 09:34:46.114 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:34:46 np0005593233 nova_compute[222017]: 2026-01-23 09:34:46.114 222021 INFO nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Using config drive#033[00m
Jan 23 04:34:46 np0005593233 nova_compute[222017]: 2026-01-23 09:34:46.144 222021 DEBUG nova.storage.rbd_utils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:46.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:46 np0005593233 nova_compute[222017]: 2026-01-23 09:34:46.311 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:46.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:46 np0005593233 nova_compute[222017]: 2026-01-23 09:34:46.920 222021 INFO nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Creating config drive at /var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47/disk.config#033[00m
Jan 23 04:34:46 np0005593233 nova_compute[222017]: 2026-01-23 09:34:46.926 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptnyf6v2p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.063 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptnyf6v2p" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.095 222021 DEBUG nova.storage.rbd_utils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.100 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47/disk.config 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.305 222021 DEBUG oslo_concurrency.processutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47/disk.config 3aa76906-f3e3-4e71-9465-92984a1b0b47_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.307 222021 INFO nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Deleting local config drive /var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47/disk.config because it was imported into RBD.#033[00m
Jan 23 04:34:47 np0005593233 systemd-machined[190954]: New machine qemu-13-instance-00000014.
Jan 23 04:34:47 np0005593233 systemd[1]: Started Virtual Machine qemu-13-instance-00000014.
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.897 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160887.897354, 3aa76906-f3e3-4e71-9465-92984a1b0b47 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.899 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.902 222021 DEBUG nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.903 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.906 222021 INFO nova.virt.libvirt.driver [-] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Instance spawned successfully.#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.907 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.961 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:34:47 np0005593233 nova_compute[222017]: 2026-01-23 09:34:47.965 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.006 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.007 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.007 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.008 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.008 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.009 222021 DEBUG nova.virt.libvirt.driver [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.053 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.054 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160887.8983183, 3aa76906-f3e3-4e71-9465-92984a1b0b47 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.055 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] VM Started (Lifecycle Event)#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.134 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.138 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:34:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:48.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.224 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.233 222021 INFO nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Took 4.06 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.234 222021 DEBUG nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.321 222021 INFO nova.compute.manager [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Took 5.67 seconds to build instance.#033[00m
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.350 222021 DEBUG oslo_concurrency.lockutils [None req-5ac0ecd2-bb2b-49c6-8467-7cab40574a31 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "3aa76906-f3e3-4e71-9465-92984a1b0b47" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:48.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:48 np0005593233 nova_compute[222017]: 2026-01-23 09:34:48.563 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:50.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:50.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:51 np0005593233 nova_compute[222017]: 2026-01-23 09:34:51.349 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:52.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:52.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:53 np0005593233 nova_compute[222017]: 2026-01-23 09:34:53.580 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:54.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:54.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:56.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:56 np0005593233 nova_compute[222017]: 2026-01-23 09:34:56.351 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:56.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:57 np0005593233 podman[232456]: 2026-01-23 09:34:57.112714757 +0000 UTC m=+0.114493043 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:34:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e164 e164: 3 total, 3 up, 3 in
Jan 23 04:34:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e165 e165: 3 total, 3 up, 3 in
Jan 23 04:34:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:34:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:58.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:34:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:34:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:58.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:58 np0005593233 nova_compute[222017]: 2026-01-23 09:34:58.580 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e166 e166: 3 total, 3 up, 3 in
Jan 23 04:35:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:00.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:00.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:01 np0005593233 nova_compute[222017]: 2026-01-23 09:35:01.355 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:02.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:02.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:03 np0005593233 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.444 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.444 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.444 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.444 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.445 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e167 e167: 3 total, 3 up, 3 in
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.582 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:03 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2989760290' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:03 np0005593233 nova_compute[222017]: 2026-01-23 09:35:03.923 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.045 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.045 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000014 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:35:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:04.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.245 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.247 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4694MB free_disk=20.87649154663086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.247 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.248 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.386 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 3aa76906-f3e3-4e71-9465-92984a1b0b47 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.387 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.387 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:35:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:04.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.472 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1920496545' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.926 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:04 np0005593233 nova_compute[222017]: 2026-01-23 09:35:04.933 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:35:05 np0005593233 nova_compute[222017]: 2026-01-23 09:35:05.188 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:35:05 np0005593233 nova_compute[222017]: 2026-01-23 09:35:05.402 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:35:05 np0005593233 nova_compute[222017]: 2026-01-23 09:35:05.402 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:06.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:06 np0005593233 nova_compute[222017]: 2026-01-23 09:35:06.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:06 np0005593233 nova_compute[222017]: 2026-01-23 09:35:06.402 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:06 np0005593233 nova_compute[222017]: 2026-01-23 09:35:06.402 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:35:06 np0005593233 nova_compute[222017]: 2026-01-23 09:35:06.402 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:35:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:06.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:06 np0005593233 nova_compute[222017]: 2026-01-23 09:35:06.972 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-3aa76906-f3e3-4e71-9465-92984a1b0b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:35:06 np0005593233 nova_compute[222017]: 2026-01-23 09:35:06.973 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-3aa76906-f3e3-4e71-9465-92984a1b0b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:35:06 np0005593233 nova_compute[222017]: 2026-01-23 09:35:06.973 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:35:06 np0005593233 nova_compute[222017]: 2026-01-23 09:35:06.973 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3aa76906-f3e3-4e71-9465-92984a1b0b47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:07 np0005593233 nova_compute[222017]: 2026-01-23 09:35:07.247 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:07 np0005593233 nova_compute[222017]: 2026-01-23 09:35:07.786 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:07 np0005593233 nova_compute[222017]: 2026-01-23 09:35:07.807 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-3aa76906-f3e3-4e71-9465-92984a1b0b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:35:07 np0005593233 nova_compute[222017]: 2026-01-23 09:35:07.807 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:35:07 np0005593233 nova_compute[222017]: 2026-01-23 09:35:07.807 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:07 np0005593233 nova_compute[222017]: 2026-01-23 09:35:07.808 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:07 np0005593233 nova_compute[222017]: 2026-01-23 09:35:07.808 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:07 np0005593233 nova_compute[222017]: 2026-01-23 09:35:07.808 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:35:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:08.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:08 np0005593233 nova_compute[222017]: 2026-01-23 09:35:08.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:08.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:08 np0005593233 nova_compute[222017]: 2026-01-23 09:35:08.611 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:09 np0005593233 nova_compute[222017]: 2026-01-23 09:35:09.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:10.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:10 np0005593233 nova_compute[222017]: 2026-01-23 09:35:10.415 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:10.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:10 np0005593233 nova_compute[222017]: 2026-01-23 09:35:10.901 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Acquiring lock "641f6008-576e-4221-a1d8-33ddfca6d069" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:10 np0005593233 nova_compute[222017]: 2026-01-23 09:35:10.902 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lock "641f6008-576e-4221-a1d8-33ddfca6d069" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:10 np0005593233 nova_compute[222017]: 2026-01-23 09:35:10.902 222021 INFO nova.compute.manager [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Unshelving#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.015 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.016 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.021 222021 DEBUG nova.objects.instance [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lazy-loading 'pci_requests' on Instance uuid 641f6008-576e-4221-a1d8-33ddfca6d069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.037 222021 DEBUG nova.objects.instance [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lazy-loading 'numa_topology' on Instance uuid 641f6008-576e-4221-a1d8-33ddfca6d069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:11 np0005593233 podman[232526]: 2026-01-23 09:35:11.053184698 +0000 UTC m=+0.063074902 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.062 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.063 222021 INFO nova.compute.claims [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.263 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4289192520' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.728 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.737 222021 DEBUG nova.compute.provider_tree [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.757 222021 DEBUG nova.scheduler.client.report [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:35:11 np0005593233 nova_compute[222017]: 2026-01-23 09:35:11.788 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.059 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Acquiring lock "refresh_cache-641f6008-576e-4221-a1d8-33ddfca6d069" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.059 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Acquired lock "refresh_cache-641f6008-576e-4221-a1d8-33ddfca6d069" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.059 222021 DEBUG nova.network.neutron [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:35:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:12.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.436 222021 DEBUG nova.network.neutron [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:12.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.807 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "3aa76906-f3e3-4e71-9465-92984a1b0b47" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.807 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "3aa76906-f3e3-4e71-9465-92984a1b0b47" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.808 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "3aa76906-f3e3-4e71-9465-92984a1b0b47-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.808 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "3aa76906-f3e3-4e71-9465-92984a1b0b47-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.808 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "3aa76906-f3e3-4e71-9465-92984a1b0b47-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.810 222021 INFO nova.compute.manager [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Terminating instance#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.811 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "refresh_cache-3aa76906-f3e3-4e71-9465-92984a1b0b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.811 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquired lock "refresh_cache-3aa76906-f3e3-4e71-9465-92984a1b0b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:35:12 np0005593233 nova_compute[222017]: 2026-01-23 09:35:12.811 222021 DEBUG nova.network.neutron [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.062 222021 DEBUG nova.network.neutron [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.082 222021 DEBUG nova.network.neutron [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.089 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Releasing lock "refresh_cache-641f6008-576e-4221-a1d8-33ddfca6d069" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.091 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.092 222021 INFO nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Creating image(s)#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.133 222021 DEBUG nova.storage.rbd_utils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] rbd image 641f6008-576e-4221-a1d8-33ddfca6d069_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.141 222021 DEBUG nova.objects.instance [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lazy-loading 'trusted_certs' on Instance uuid 641f6008-576e-4221-a1d8-33ddfca6d069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.185 222021 DEBUG nova.storage.rbd_utils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] rbd image 641f6008-576e-4221-a1d8-33ddfca6d069_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.211 222021 DEBUG nova.storage.rbd_utils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] rbd image 641f6008-576e-4221-a1d8-33ddfca6d069_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.215 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Acquiring lock "d2242beabde53e1403d12ae5962c05ab7844aef5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.216 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lock "d2242beabde53e1403d12ae5962c05ab7844aef5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.614 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.751 222021 DEBUG nova.network.neutron [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.766 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Releasing lock "refresh_cache-3aa76906-f3e3-4e71-9465-92984a1b0b47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.767 222021 DEBUG nova.compute.manager [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.841 222021 DEBUG nova.virt.libvirt.imagebackend [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/cb273736-b860-41f6-b2fe-c26bfa105d90/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/cb273736-b860-41f6-b2fe-c26bfa105d90/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 04:35:13 np0005593233 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 23 04:35:13 np0005593233 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000014.scope: Consumed 14.300s CPU time.
Jan 23 04:35:13 np0005593233 systemd-machined[190954]: Machine qemu-13-instance-00000014 terminated.
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.911 222021 DEBUG nova.virt.libvirt.imagebackend [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/cb273736-b860-41f6-b2fe-c26bfa105d90/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.912 222021 DEBUG nova.storage.rbd_utils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] cloning images/cb273736-b860-41f6-b2fe-c26bfa105d90@snap to None/641f6008-576e-4221-a1d8-33ddfca6d069_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.998 222021 INFO nova.virt.libvirt.driver [-] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Instance destroyed successfully.#033[00m
Jan 23 04:35:13 np0005593233 nova_compute[222017]: 2026-01-23 09:35:13.999 222021 DEBUG nova.objects.instance [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lazy-loading 'resources' on Instance uuid 3aa76906-f3e3-4e71-9465-92984a1b0b47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.051 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lock "d2242beabde53e1403d12ae5962c05ab7844aef5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.196 222021 DEBUG nova.objects.instance [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lazy-loading 'migration_context' on Instance uuid 641f6008-576e-4221-a1d8-33ddfca6d069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:14.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.270 222021 DEBUG nova.storage.rbd_utils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] flattening vms/641f6008-576e-4221-a1d8-33ddfca6d069_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 04:35:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:14.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.706 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Image rbd:vms/641f6008-576e-4221-a1d8-33ddfca6d069_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.706 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.707 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Ensure instance console log exists: /var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.707 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.708 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.708 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.710 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T09:34:42Z,direct_url=<?>,disk_format='raw',id=cb273736-b860-41f6-b2fe-c26bfa105d90,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1701597104-shelved',owner='307173cd6ebb4dd5ad3883dedac0271e',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T09:35:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.714 222021 WARNING nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.722 222021 DEBUG nova.virt.libvirt.host [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.722 222021 DEBUG nova.virt.libvirt.host [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.726 222021 DEBUG nova.virt.libvirt.host [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.727 222021 DEBUG nova.virt.libvirt.host [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.728 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.728 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T09:34:42Z,direct_url=<?>,disk_format='raw',id=cb273736-b860-41f6-b2fe-c26bfa105d90,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1701597104-shelved',owner='307173cd6ebb4dd5ad3883dedac0271e',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T09:35:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.729 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.729 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.730 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.730 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.730 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.731 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.731 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.731 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.732 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.732 222021 DEBUG nova.virt.hardware [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.732 222021 DEBUG nova.objects.instance [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lazy-loading 'vcpu_model' on Instance uuid 641f6008-576e-4221-a1d8-33ddfca6d069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.756 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.887 222021 INFO nova.virt.libvirt.driver [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Deleting instance files /var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47_del#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.889 222021 INFO nova.virt.libvirt.driver [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Deletion of /var/lib/nova/instances/3aa76906-f3e3-4e71-9465-92984a1b0b47_del complete#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.942 222021 INFO nova.compute.manager [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Took 1.17 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.943 222021 DEBUG oslo.service.loopingcall [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.944 222021 DEBUG nova.compute.manager [-] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:35:14 np0005593233 nova_compute[222017]: 2026-01-23 09:35:14.944 222021 DEBUG nova.network.neutron [-] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:35:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.195 222021 DEBUG nova.network.neutron [-] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2383709975' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.213 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.240 222021 DEBUG nova.storage.rbd_utils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] rbd image 641f6008-576e-4221-a1d8-33ddfca6d069_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.244 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.272 222021 DEBUG nova.network.neutron [-] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.292 222021 INFO nova.compute.manager [-] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Took 0.35 seconds to deallocate network for instance.#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.359 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.360 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:35:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2861372310' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.683 222021 DEBUG oslo_concurrency.processutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.709 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.711 222021 DEBUG nova.objects.instance [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lazy-loading 'pci_devices' on Instance uuid 641f6008-576e-4221-a1d8-33ddfca6d069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.730 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <uuid>641f6008-576e-4221-a1d8-33ddfca6d069</uuid>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <name>instance-00000013</name>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1701597104</nova:name>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:35:14</nova:creationTime>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <nova:user uuid="5874ba32b4a94f68aaa43252721d2fb0">tempest-UnshelveToHostMultiNodesTest-1879363435-project-member</nova:user>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <nova:project uuid="307173cd6ebb4dd5ad3883dedac0271e">tempest-UnshelveToHostMultiNodesTest-1879363435</nova:project>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="cb273736-b860-41f6-b2fe-c26bfa105d90"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <entry name="serial">641f6008-576e-4221-a1d8-33ddfca6d069</entry>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <entry name="uuid">641f6008-576e-4221-a1d8-33ddfca6d069</entry>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/641f6008-576e-4221-a1d8-33ddfca6d069_disk">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/641f6008-576e-4221-a1d8-33ddfca6d069_disk.config">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069/console.log" append="off"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <input type="keyboard" bus="usb"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:35:15 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:35:15 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:35:15 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:35:15 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.855 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.856 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.857 222021 INFO nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Using config drive#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.891 222021 DEBUG nova.storage.rbd_utils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] rbd image 641f6008-576e-4221-a1d8-33ddfca6d069_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:15 np0005593233 nova_compute[222017]: 2026-01-23 09:35:15.920 222021 DEBUG nova.objects.instance [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lazy-loading 'ec2_ids' on Instance uuid 641f6008-576e-4221-a1d8-33ddfca6d069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.025 222021 DEBUG nova.objects.instance [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lazy-loading 'keypairs' on Instance uuid 641f6008-576e-4221-a1d8-33ddfca6d069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:16 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1990495954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.158 222021 DEBUG oslo_concurrency.processutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.166 222021 DEBUG nova.compute.provider_tree [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.193 222021 DEBUG nova.scheduler.client.report [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.227 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.867s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:16.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.261 222021 INFO nova.scheduler.client.report [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Deleted allocations for instance 3aa76906-f3e3-4e71-9465-92984a1b0b47#033[00m
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.364 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.419 222021 DEBUG oslo_concurrency.lockutils [None req-3851fbd0-9132-48ac-bb66-94bc5ab781e2 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "3aa76906-f3e3-4e71-9465-92984a1b0b47" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:16.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.967 222021 INFO nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Creating config drive at /var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069/disk.config#033[00m
Jan 23 04:35:16 np0005593233 nova_compute[222017]: 2026-01-23 09:35:16.973 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsy5ipe3e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.109 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsy5ipe3e" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.140 222021 DEBUG nova.storage.rbd_utils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] rbd image 641f6008-576e-4221-a1d8-33ddfca6d069_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.145 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069/disk.config 641f6008-576e-4221-a1d8-33ddfca6d069_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.325 222021 DEBUG oslo_concurrency.processutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069/disk.config 641f6008-576e-4221-a1d8-33ddfca6d069_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.180s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.327 222021 INFO nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Deleting local config drive /var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069/disk.config because it was imported into RBD.#033[00m
Jan 23 04:35:17 np0005593233 systemd-machined[190954]: New machine qemu-14-instance-00000013.
Jan 23 04:35:17 np0005593233 systemd[1]: Started Virtual Machine qemu-14-instance-00000013.
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.842 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160917.8415616, 641f6008-576e-4221-a1d8-33ddfca6d069 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.844 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.849 222021 DEBUG nova.compute.manager [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.849 222021 DEBUG nova.virt.libvirt.driver [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.854 222021 INFO nova.virt.libvirt.driver [-] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Instance spawned successfully.#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.874 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.878 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.902 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.902 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160917.842679, 641f6008-576e-4221-a1d8-33ddfca6d069 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.902 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] VM Started (Lifecycle Event)#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.951 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.956 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:35:17 np0005593233 nova_compute[222017]: 2026-01-23 09:35:17.991 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:35:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:18.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:18.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:18 np0005593233 nova_compute[222017]: 2026-01-23 09:35:18.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e168 e168: 3 total, 3 up, 3 in
Jan 23 04:35:20 np0005593233 nova_compute[222017]: 2026-01-23 09:35:20.032 222021 DEBUG nova.compute.manager [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:20 np0005593233 nova_compute[222017]: 2026-01-23 09:35:20.124 222021 DEBUG oslo_concurrency.lockutils [None req-efbc229f-f9ad-47e7-8fdb-989e23df2d26 86b66718f3a54282b1d7a2f58d62706f 782083282cf74a109e9cc81fd3a64fef - - default default] Lock "641f6008-576e-4221-a1d8-33ddfca6d069" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 9.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:20.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:20.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:21 np0005593233 nova_compute[222017]: 2026-01-23 09:35:21.370 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:22.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:22.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.264 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Acquiring lock "641f6008-576e-4221-a1d8-33ddfca6d069" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.265 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Lock "641f6008-576e-4221-a1d8-33ddfca6d069" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.266 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Acquiring lock "641f6008-576e-4221-a1d8-33ddfca6d069-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.266 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Lock "641f6008-576e-4221-a1d8-33ddfca6d069-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.266 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Lock "641f6008-576e-4221-a1d8-33ddfca6d069-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.267 222021 INFO nova.compute.manager [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Terminating instance#033[00m
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.268 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Acquiring lock "refresh_cache-641f6008-576e-4221-a1d8-33ddfca6d069" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.269 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Acquired lock "refresh_cache-641f6008-576e-4221-a1d8-33ddfca6d069" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.269 222021 DEBUG nova.network.neutron [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:35:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 e169: 3 total, 3 up, 3 in
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.619 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:23 np0005593233 nova_compute[222017]: 2026-01-23 09:35:23.896 222021 DEBUG nova.network.neutron [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.003000085s ======
Jan 23 04:35:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:24.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000085s
Jan 23 04:35:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:24.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:24 np0005593233 nova_compute[222017]: 2026-01-23 09:35:24.637 222021 DEBUG nova.network.neutron [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:24 np0005593233 nova_compute[222017]: 2026-01-23 09:35:24.664 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Releasing lock "refresh_cache-641f6008-576e-4221-a1d8-33ddfca6d069" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:35:24 np0005593233 nova_compute[222017]: 2026-01-23 09:35:24.665 222021 DEBUG nova.compute.manager [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:35:24 np0005593233 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 23 04:35:24 np0005593233 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Consumed 7.684s CPU time.
Jan 23 04:35:24 np0005593233 systemd-machined[190954]: Machine qemu-14-instance-00000013 terminated.
Jan 23 04:35:24 np0005593233 nova_compute[222017]: 2026-01-23 09:35:24.891 222021 INFO nova.virt.libvirt.driver [-] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Instance destroyed successfully.#033[00m
Jan 23 04:35:24 np0005593233 nova_compute[222017]: 2026-01-23 09:35:24.892 222021 DEBUG nova.objects.instance [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Lazy-loading 'resources' on Instance uuid 641f6008-576e-4221-a1d8-33ddfca6d069 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:24 np0005593233 nova_compute[222017]: 2026-01-23 09:35:24.948 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:35:24.949 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:35:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:35:24.951 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:35:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:26.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:26 np0005593233 nova_compute[222017]: 2026-01-23 09:35:26.373 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:26.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:26 np0005593233 nova_compute[222017]: 2026-01-23 09:35:26.864 222021 INFO nova.virt.libvirt.driver [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Deleting instance files /var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069_del#033[00m
Jan 23 04:35:26 np0005593233 nova_compute[222017]: 2026-01-23 09:35:26.865 222021 INFO nova.virt.libvirt.driver [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Deletion of /var/lib/nova/instances/641f6008-576e-4221-a1d8-33ddfca6d069_del complete#033[00m
Jan 23 04:35:26 np0005593233 nova_compute[222017]: 2026-01-23 09:35:26.932 222021 INFO nova.compute.manager [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Took 2.27 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:35:26 np0005593233 nova_compute[222017]: 2026-01-23 09:35:26.933 222021 DEBUG oslo.service.loopingcall [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:35:26 np0005593233 nova_compute[222017]: 2026-01-23 09:35:26.934 222021 DEBUG nova.compute.manager [-] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:35:26 np0005593233 nova_compute[222017]: 2026-01-23 09:35:26.934 222021 DEBUG nova.network.neutron [-] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:35:27 np0005593233 nova_compute[222017]: 2026-01-23 09:35:27.470 222021 DEBUG nova.network.neutron [-] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:27 np0005593233 nova_compute[222017]: 2026-01-23 09:35:27.492 222021 DEBUG nova.network.neutron [-] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:27 np0005593233 nova_compute[222017]: 2026-01-23 09:35:27.520 222021 INFO nova.compute.manager [-] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Took 0.59 seconds to deallocate network for instance.#033[00m
Jan 23 04:35:27 np0005593233 nova_compute[222017]: 2026-01-23 09:35:27.599 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:27 np0005593233 nova_compute[222017]: 2026-01-23 09:35:27.600 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:27 np0005593233 nova_compute[222017]: 2026-01-23 09:35:27.713 222021 DEBUG oslo_concurrency.processutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:28 np0005593233 podman[233044]: 2026-01-23 09:35:28.106653193 +0000 UTC m=+0.115393559 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:35:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:28 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2864154721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:28 np0005593233 nova_compute[222017]: 2026-01-23 09:35:28.267 222021 DEBUG oslo_concurrency.processutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.554s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:28.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:28 np0005593233 nova_compute[222017]: 2026-01-23 09:35:28.279 222021 DEBUG nova.compute.provider_tree [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:35:28 np0005593233 nova_compute[222017]: 2026-01-23 09:35:28.310 222021 DEBUG nova.scheduler.client.report [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:35:28 np0005593233 nova_compute[222017]: 2026-01-23 09:35:28.351 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:28 np0005593233 nova_compute[222017]: 2026-01-23 09:35:28.413 222021 INFO nova.scheduler.client.report [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Deleted allocations for instance 641f6008-576e-4221-a1d8-33ddfca6d069#033[00m
Jan 23 04:35:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:28.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:28 np0005593233 nova_compute[222017]: 2026-01-23 09:35:28.503 222021 DEBUG oslo_concurrency.lockutils [None req-d729d716-8319-4a1f-a5de-1f031f1fa872 5874ba32b4a94f68aaa43252721d2fb0 307173cd6ebb4dd5ad3883dedac0271e - - default default] Lock "641f6008-576e-4221-a1d8-33ddfca6d069" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:28 np0005593233 nova_compute[222017]: 2026-01-23 09:35:28.621 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:28 np0005593233 nova_compute[222017]: 2026-01-23 09:35:28.996 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160913.995001, 3aa76906-f3e3-4e71-9465-92984a1b0b47 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:35:28 np0005593233 nova_compute[222017]: 2026-01-23 09:35:28.997 222021 INFO nova.compute.manager [-] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:35:29 np0005593233 nova_compute[222017]: 2026-01-23 09:35:29.023 222021 DEBUG nova.compute.manager [None req-61b7b0a9-6c19-45e8-9833-ea29fbdb71e4 - - - - - -] [instance: 3aa76906-f3e3-4e71-9465-92984a1b0b47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:35:29.955 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:35:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:30.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:30.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:31 np0005593233 nova_compute[222017]: 2026-01-23 09:35:31.375 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:32.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:32.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:33 np0005593233 nova_compute[222017]: 2026-01-23 09:35:33.622 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:34.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:34.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:34 np0005593233 nova_compute[222017]: 2026-01-23 09:35:34.638 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Acquiring lock "b3560f1a-604c-41a4-8b5f-74a9b3da9364" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:34 np0005593233 nova_compute[222017]: 2026-01-23 09:35:34.639 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "b3560f1a-604c-41a4-8b5f-74a9b3da9364" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:34 np0005593233 nova_compute[222017]: 2026-01-23 09:35:34.664 222021 DEBUG nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:35:34 np0005593233 nova_compute[222017]: 2026-01-23 09:35:34.755 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:34 np0005593233 nova_compute[222017]: 2026-01-23 09:35:34.756 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:34 np0005593233 nova_compute[222017]: 2026-01-23 09:35:34.767 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:35:34 np0005593233 nova_compute[222017]: 2026-01-23 09:35:34.767 222021 INFO nova.compute.claims [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:35:34 np0005593233 nova_compute[222017]: 2026-01-23 09:35:34.889 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/209005317' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.378 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.385 222021 DEBUG nova.compute.provider_tree [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.403 222021 DEBUG nova.scheduler.client.report [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.437 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.438 222021 DEBUG nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.506 222021 DEBUG nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.508 222021 DEBUG nova.network.neutron [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.533 222021 INFO nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.554 222021 DEBUG nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.890 222021 DEBUG nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.891 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.891 222021 INFO nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Creating image(s)#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.917 222021 DEBUG nova.storage.rbd_utils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] rbd image b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.948 222021 DEBUG nova.storage.rbd_utils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] rbd image b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.980 222021 DEBUG nova.storage.rbd_utils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] rbd image b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:35 np0005593233 nova_compute[222017]: 2026-01-23 09:35:35.984 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:36 np0005593233 nova_compute[222017]: 2026-01-23 09:35:36.058 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:36 np0005593233 nova_compute[222017]: 2026-01-23 09:35:36.060 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:36 np0005593233 nova_compute[222017]: 2026-01-23 09:35:36.061 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:36 np0005593233 nova_compute[222017]: 2026-01-23 09:35:36.062 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:36 np0005593233 nova_compute[222017]: 2026-01-23 09:35:36.102 222021 DEBUG nova.storage.rbd_utils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] rbd image b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:36 np0005593233 nova_compute[222017]: 2026-01-23 09:35:36.108 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:36.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:36 np0005593233 nova_compute[222017]: 2026-01-23 09:35:36.290 222021 DEBUG nova.network.neutron [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:35:36 np0005593233 nova_compute[222017]: 2026-01-23 09:35:36.291 222021 DEBUG nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:35:36 np0005593233 nova_compute[222017]: 2026-01-23 09:35:36.378 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:36.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:35:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:35:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.030 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.922s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.119 222021 DEBUG nova.storage.rbd_utils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] resizing rbd image b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.300 222021 DEBUG nova.objects.instance [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lazy-loading 'migration_context' on Instance uuid b3560f1a-604c-41a4-8b5f-74a9b3da9364 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.325 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.325 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Ensure instance console log exists: /var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.325 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.326 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.326 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.327 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.332 222021 WARNING nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.336 222021 DEBUG nova.virt.libvirt.host [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.336 222021 DEBUG nova.virt.libvirt.host [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.339 222021 DEBUG nova.virt.libvirt.host [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.340 222021 DEBUG nova.virt.libvirt.host [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.341 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.341 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.341 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.341 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.342 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.342 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.342 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.342 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.343 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.343 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.343 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.343 222021 DEBUG nova.virt.hardware [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.346 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:35:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3710165472' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.823 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.848 222021 DEBUG nova.storage.rbd_utils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] rbd image b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:37 np0005593233 nova_compute[222017]: 2026-01-23 09:35:37.854 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:35:38 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/448918089' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:35:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:38.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.310 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.313 222021 DEBUG nova.objects.instance [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lazy-loading 'pci_devices' on Instance uuid b3560f1a-604c-41a4-8b5f-74a9b3da9364 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.335 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <uuid>b3560f1a-604c-41a4-8b5f-74a9b3da9364</uuid>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <name>instance-00000016</name>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerDiagnosticsTest-server-482582491</nova:name>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:35:37</nova:creationTime>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <nova:user uuid="ff2903590e2e4503a3059891aebf39aa">tempest-ServerDiagnosticsTest-1448215485-project-member</nova:user>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <nova:project uuid="f8cc5d9202194541a6be69de6e0d0f91">tempest-ServerDiagnosticsTest-1448215485</nova:project>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <entry name="serial">b3560f1a-604c-41a4-8b5f-74a9b3da9364</entry>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <entry name="uuid">b3560f1a-604c-41a4-8b5f-74a9b3da9364</entry>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk.config">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364/console.log" append="off"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:35:38 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:35:38 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:35:38 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:35:38 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.409 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.410 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.410 222021 INFO nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Using config drive#033[00m
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.438 222021 DEBUG nova.storage.rbd_utils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] rbd image b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:38.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.624 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.828 222021 INFO nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Creating config drive at /var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364/disk.config#033[00m
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.834 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4oz2i4dk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:38 np0005593233 nova_compute[222017]: 2026-01-23 09:35:38.964 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4oz2i4dk" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.007 222021 DEBUG nova.storage.rbd_utils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] rbd image b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.013 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364/disk.config b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.201 222021 DEBUG oslo_concurrency.processutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364/disk.config b3560f1a-604c-41a4-8b5f-74a9b3da9364_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.202 222021 INFO nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Deleting local config drive /var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364/disk.config because it was imported into RBD.#033[00m
Jan 23 04:35:39 np0005593233 systemd-machined[190954]: New machine qemu-15-instance-00000016.
Jan 23 04:35:39 np0005593233 systemd[1]: Started Virtual Machine qemu-15-instance-00000016.
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.890 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160924.8893075, 641f6008-576e-4221-a1d8-33ddfca6d069 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.893 222021 INFO nova.compute.manager [-] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.939 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160939.9390545, b3560f1a-604c-41a4-8b5f-74a9b3da9364 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.940 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.947 222021 DEBUG nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.947 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.954 222021 INFO nova.virt.libvirt.driver [-] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Instance spawned successfully.#033[00m
Jan 23 04:35:39 np0005593233 nova_compute[222017]: 2026-01-23 09:35:39.955 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.007 222021 DEBUG nova.compute.manager [None req-1fa33e55-cbc8-4715-b740-d46153d9efb4 - - - - - -] [instance: 641f6008-576e-4221-a1d8-33ddfca6d069] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:40.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:40.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.574 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.578 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.578 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.579 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.579 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.580 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.580 222021 DEBUG nova.virt.libvirt.driver [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.586 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.675 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.675 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160939.9405873, b3560f1a-604c-41a4-8b5f-74a9b3da9364 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.676 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] VM Started (Lifecycle Event)#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.723 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.727 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.733 222021 INFO nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Took 4.84 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.734 222021 DEBUG nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.762 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.798 222021 INFO nova.compute.manager [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Took 6.07 seconds to build instance.#033[00m
Jan 23 04:35:40 np0005593233 nova_compute[222017]: 2026-01-23 09:35:40.814 222021 DEBUG oslo_concurrency.lockutils [None req-30676052-f182-43e2-9237-1377334df616 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "b3560f1a-604c-41a4-8b5f-74a9b3da9364" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:41 np0005593233 nova_compute[222017]: 2026-01-23 09:35:41.380 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:42 np0005593233 podman[233571]: 2026-01-23 09:35:42.112971966 +0000 UTC m=+0.113188586 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:35:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:42.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:42.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:35:42.602 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:35:42.603 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:35:42.603 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.669 222021 DEBUG nova.compute.manager [None req-9f1cf232-c181-4308-97b6-c73162b6f10a 0a8187bab0d54550b260ee7b812a4a0f ac48328d305a4b859c797259ac40457d - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.673 222021 INFO nova.compute.manager [None req-9f1cf232-c181-4308-97b6-c73162b6f10a 0a8187bab0d54550b260ee7b812a4a0f ac48328d305a4b859c797259ac40457d - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Retrieving diagnostics#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.952 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Acquiring lock "b3560f1a-604c-41a4-8b5f-74a9b3da9364" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.953 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "b3560f1a-604c-41a4-8b5f-74a9b3da9364" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.953 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Acquiring lock "b3560f1a-604c-41a4-8b5f-74a9b3da9364-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.953 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "b3560f1a-604c-41a4-8b5f-74a9b3da9364-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.954 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "b3560f1a-604c-41a4-8b5f-74a9b3da9364-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.955 222021 INFO nova.compute.manager [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Terminating instance#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.956 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Acquiring lock "refresh_cache-b3560f1a-604c-41a4-8b5f-74a9b3da9364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.956 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Acquired lock "refresh_cache-b3560f1a-604c-41a4-8b5f-74a9b3da9364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:35:42 np0005593233 nova_compute[222017]: 2026-01-23 09:35:42.956 222021 DEBUG nova.network.neutron [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:35:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:35:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:35:43 np0005593233 nova_compute[222017]: 2026-01-23 09:35:43.098 222021 DEBUG nova.network.neutron [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:43 np0005593233 nova_compute[222017]: 2026-01-23 09:35:43.625 222021 DEBUG nova.network.neutron [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:43 np0005593233 nova_compute[222017]: 2026-01-23 09:35:43.627 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:43 np0005593233 nova_compute[222017]: 2026-01-23 09:35:43.643 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Releasing lock "refresh_cache-b3560f1a-604c-41a4-8b5f-74a9b3da9364" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:35:43 np0005593233 nova_compute[222017]: 2026-01-23 09:35:43.644 222021 DEBUG nova.compute.manager [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:35:43 np0005593233 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 23 04:35:43 np0005593233 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000016.scope: Consumed 4.511s CPU time.
Jan 23 04:35:43 np0005593233 systemd-machined[190954]: Machine qemu-15-instance-00000016 terminated.
Jan 23 04:35:43 np0005593233 nova_compute[222017]: 2026-01-23 09:35:43.870 222021 INFO nova.virt.libvirt.driver [-] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Instance destroyed successfully.#033[00m
Jan 23 04:35:43 np0005593233 nova_compute[222017]: 2026-01-23 09:35:43.871 222021 DEBUG nova.objects.instance [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lazy-loading 'resources' on Instance uuid b3560f1a-604c-41a4-8b5f-74a9b3da9364 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:44.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.461 222021 INFO nova.virt.libvirt.driver [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Deleting instance files /var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364_del#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.462 222021 INFO nova.virt.libvirt.driver [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Deletion of /var/lib/nova/instances/b3560f1a-604c-41a4-8b5f-74a9b3da9364_del complete#033[00m
Jan 23 04:35:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:44.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.559 222021 INFO nova.compute.manager [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Took 0.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.559 222021 DEBUG oslo.service.loopingcall [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.559 222021 DEBUG nova.compute.manager [-] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.560 222021 DEBUG nova.network.neutron [-] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.726 222021 DEBUG nova.network.neutron [-] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.777 222021 DEBUG nova.network.neutron [-] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.802 222021 INFO nova.compute.manager [-] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Took 0.24 seconds to deallocate network for instance.#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.868 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.869 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:44 np0005593233 nova_compute[222017]: 2026-01-23 09:35:44.926 222021 DEBUG oslo_concurrency.processutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4204549930' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:45 np0005593233 nova_compute[222017]: 2026-01-23 09:35:45.415 222021 DEBUG oslo_concurrency.processutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:45 np0005593233 nova_compute[222017]: 2026-01-23 09:35:45.426 222021 DEBUG nova.compute.provider_tree [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:35:45 np0005593233 nova_compute[222017]: 2026-01-23 09:35:45.829 222021 DEBUG nova.scheduler.client.report [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:35:46 np0005593233 nova_compute[222017]: 2026-01-23 09:35:46.012 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:46 np0005593233 nova_compute[222017]: 2026-01-23 09:35:46.114 222021 INFO nova.scheduler.client.report [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Deleted allocations for instance b3560f1a-604c-41a4-8b5f-74a9b3da9364#033[00m
Jan 23 04:35:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:46.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:46 np0005593233 nova_compute[222017]: 2026-01-23 09:35:46.325 222021 DEBUG oslo_concurrency.lockutils [None req-63ad4204-7725-423d-b4b5-473149f90425 ff2903590e2e4503a3059891aebf39aa f8cc5d9202194541a6be69de6e0d0f91 - - default default] Lock "b3560f1a-604c-41a4-8b5f-74a9b3da9364" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:46 np0005593233 nova_compute[222017]: 2026-01-23 09:35:46.382 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:46.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:48.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:48.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:48 np0005593233 nova_compute[222017]: 2026-01-23 09:35:48.627 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:50.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:50.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:51 np0005593233 nova_compute[222017]: 2026-01-23 09:35:51.385 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:52.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:35:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:52.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:35:53 np0005593233 nova_compute[222017]: 2026-01-23 09:35:53.629 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:54.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:35:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:54.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:35:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:56.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:56 np0005593233 nova_compute[222017]: 2026-01-23 09:35:56.387 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:56.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:58.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.482 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Acquiring lock "04b2feca-0698-43c8-8038-9b6df4df60c8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.482 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "04b2feca-0698-43c8-8038-9b6df4df60c8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.510 222021 DEBUG nova.compute.manager [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:35:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:35:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:58.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.612 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.613 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.621 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.622 222021 INFO nova.compute.claims [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.709 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.768 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.867 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160943.8666635, b3560f1a-604c-41a4-8b5f-74a9b3da9364 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.868 222021 INFO nova.compute.manager [-] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:35:58 np0005593233 nova_compute[222017]: 2026-01-23 09:35:58.904 222021 DEBUG nova.compute.manager [None req-a1f475bf-dda1-4072-98b7-6c01185407e3 - - - - - -] [instance: b3560f1a-604c-41a4-8b5f-74a9b3da9364] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:59 np0005593233 podman[233705]: 2026-01-23 09:35:59.097836913 +0000 UTC m=+0.110731107 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 23 04:35:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/61243799' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.221 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.228 222021 DEBUG nova.compute.provider_tree [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.245 222021 DEBUG nova.scheduler.client.report [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.278 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.279 222021 DEBUG nova.compute.manager [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.345 222021 DEBUG nova.compute.manager [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.368 222021 INFO nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.390 222021 DEBUG nova.compute.manager [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.524 222021 DEBUG nova.compute.manager [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.526 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.526 222021 INFO nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Creating image(s)#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.560 222021 DEBUG nova.storage.rbd_utils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] rbd image 04b2feca-0698-43c8-8038-9b6df4df60c8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.603 222021 DEBUG nova.storage.rbd_utils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] rbd image 04b2feca-0698-43c8-8038-9b6df4df60c8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.636 222021 DEBUG nova.storage.rbd_utils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] rbd image 04b2feca-0698-43c8-8038-9b6df4df60c8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.640 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.713 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.714 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.715 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.716 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.743 222021 DEBUG nova.storage.rbd_utils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] rbd image 04b2feca-0698-43c8-8038-9b6df4df60c8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:35:59 np0005593233 nova_compute[222017]: 2026-01-23 09:35:59.747 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 04b2feca-0698-43c8-8038-9b6df4df60c8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:36:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:00.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:36:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:00.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:01 np0005593233 nova_compute[222017]: 2026-01-23 09:36:01.389 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:02.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:02 np0005593233 nova_compute[222017]: 2026-01-23 09:36:02.444 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 04b2feca-0698-43c8-8038-9b6df4df60c8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.697s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:02 np0005593233 nova_compute[222017]: 2026-01-23 09:36:02.514 222021 DEBUG nova.storage.rbd_utils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] resizing rbd image 04b2feca-0698-43c8-8038-9b6df4df60c8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:36:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:02.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.710 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.802 222021 DEBUG nova.objects.instance [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lazy-loading 'migration_context' on Instance uuid 04b2feca-0698-43c8-8038-9b6df4df60c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.821 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.822 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Ensure instance console log exists: /var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.823 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.823 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.824 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.827 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.834 222021 WARNING nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.842 222021 DEBUG nova.virt.libvirt.host [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.843 222021 DEBUG nova.virt.libvirt.host [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.847 222021 DEBUG nova.virt.libvirt.host [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.848 222021 DEBUG nova.virt.libvirt.host [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.850 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.850 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.850 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.851 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.851 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.851 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.851 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.851 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.852 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.852 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.852 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.852 222021 DEBUG nova.virt.hardware [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:36:03 np0005593233 nova_compute[222017]: 2026-01-23 09:36:03.855 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:36:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1621803970' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:36:04 np0005593233 nova_compute[222017]: 2026-01-23 09:36:04.320 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:36:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:04.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:36:04 np0005593233 nova_compute[222017]: 2026-01-23 09:36:04.356 222021 DEBUG nova.storage.rbd_utils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] rbd image 04b2feca-0698-43c8-8038-9b6df4df60c8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:04 np0005593233 nova_compute[222017]: 2026-01-23 09:36:04.362 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:04 np0005593233 nova_compute[222017]: 2026-01-23 09:36:04.395 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:04 np0005593233 nova_compute[222017]: 2026-01-23 09:36:04.396 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:04 np0005593233 nova_compute[222017]: 2026-01-23 09:36:04.396 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:04.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:36:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2059073713' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:36:04 np0005593233 nova_compute[222017]: 2026-01-23 09:36:04.854 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:04 np0005593233 nova_compute[222017]: 2026-01-23 09:36:04.859 222021 DEBUG nova.objects.instance [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lazy-loading 'pci_devices' on Instance uuid 04b2feca-0698-43c8-8038-9b6df4df60c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:36:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:06.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:06 np0005593233 nova_compute[222017]: 2026-01-23 09:36:06.393 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:06.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.219 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.220 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.220 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.220 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.221 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.255 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <uuid>04b2feca-0698-43c8-8038-9b6df4df60c8</uuid>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <name>instance-00000019</name>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerDiagnosticsV248Test-server-911203850</nova:name>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:36:03</nova:creationTime>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <nova:user uuid="179bf12e8ac3448fa951b47a68b334fe">tempest-ServerDiagnosticsV248Test-753019702-project-member</nova:user>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <nova:project uuid="e080af52572142c0b58598e9b4ef1c8b">tempest-ServerDiagnosticsV248Test-753019702</nova:project>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <entry name="serial">04b2feca-0698-43c8-8038-9b6df4df60c8</entry>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <entry name="uuid">04b2feca-0698-43c8-8038-9b6df4df60c8</entry>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/04b2feca-0698-43c8-8038-9b6df4df60c8_disk">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/04b2feca-0698-43c8-8038-9b6df4df60c8_disk.config">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8/console.log" append="off"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:36:07 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:36:07 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:36:07 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:36:07 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.377 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.378 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.378 222021 INFO nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Using config drive#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.409 222021 DEBUG nova.storage.rbd_utils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] rbd image 04b2feca-0698-43c8-8038-9b6df4df60c8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:36:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1880463220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.747 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.824 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000019 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.824 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000019 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.987 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.988 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4811MB free_disk=20.94662857055664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.988 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:07 np0005593233 nova_compute[222017]: 2026-01-23 09:36:07.988 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.075 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 04b2feca-0698-43c8-8038-9b6df4df60c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.076 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.076 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.127 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:08.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.378 222021 INFO nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Creating config drive at /var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8/disk.config#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.385 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn2zrfhrs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.516 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn2zrfhrs" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:08.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.550 222021 DEBUG nova.storage.rbd_utils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] rbd image 04b2feca-0698-43c8-8038-9b6df4df60c8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:36:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2291957616' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.556 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8/disk.config 04b2feca-0698-43c8-8038-9b6df4df60c8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.598 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.608 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.638 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.675 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.675 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.711 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.780 222021 DEBUG oslo_concurrency.processutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8/disk.config 04b2feca-0698-43c8-8038-9b6df4df60c8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.224s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:08 np0005593233 nova_compute[222017]: 2026-01-23 09:36:08.781 222021 INFO nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Deleting local config drive /var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8/disk.config because it was imported into RBD.#033[00m
Jan 23 04:36:08 np0005593233 systemd-machined[190954]: New machine qemu-16-instance-00000019.
Jan 23 04:36:08 np0005593233 systemd[1]: Started Virtual Machine qemu-16-instance-00000019.
Jan 23 04:36:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.525 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160969.5248072, 04b2feca-0698-43c8-8038-9b6df4df60c8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.526 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.529 222021 DEBUG nova.compute.manager [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.530 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.535 222021 INFO nova.virt.libvirt.driver [-] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Instance spawned successfully.#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.536 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.568 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.573 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.585 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.586 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.586 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.586 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.587 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.587 222021 DEBUG nova.virt.libvirt.driver [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.600 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.600 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769160969.5288982, 04b2feca-0698-43c8-8038-9b6df4df60c8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.600 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] VM Started (Lifecycle Event)#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.626 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.634 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.663 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.673 222021 INFO nova.compute.manager [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Took 10.15 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.673 222021 DEBUG nova.compute.manager [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.746 222021 INFO nova.compute.manager [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Took 11.17 seconds to build instance.#033[00m
Jan 23 04:36:09 np0005593233 nova_compute[222017]: 2026-01-23 09:36:09.765 222021 DEBUG oslo_concurrency.lockutils [None req-fa52042a-0b6f-4806-aea5-3ad91a20ac39 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "04b2feca-0698-43c8-8038-9b6df4df60c8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:10.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:10.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:10 np0005593233 nova_compute[222017]: 2026-01-23 09:36:10.691 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:10 np0005593233 nova_compute[222017]: 2026-01-23 09:36:10.692 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:36:10 np0005593233 nova_compute[222017]: 2026-01-23 09:36:10.693 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:36:11 np0005593233 nova_compute[222017]: 2026-01-23 09:36:11.066 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-04b2feca-0698-43c8-8038-9b6df4df60c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:36:11 np0005593233 nova_compute[222017]: 2026-01-23 09:36:11.067 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-04b2feca-0698-43c8-8038-9b6df4df60c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:36:11 np0005593233 nova_compute[222017]: 2026-01-23 09:36:11.067 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:36:11 np0005593233 nova_compute[222017]: 2026-01-23 09:36:11.067 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 04b2feca-0698-43c8-8038-9b6df4df60c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:36:11 np0005593233 nova_compute[222017]: 2026-01-23 09:36:11.271 222021 DEBUG nova.compute.manager [None req-7826d3bf-a2e2-43a4-bba8-5a6fdc081a6a 0ae33cdfc78c4d3b8cc24e886c10c243 592c9ef4a42d41d0afe1c68da1491d75 - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:11 np0005593233 nova_compute[222017]: 2026-01-23 09:36:11.275 222021 INFO nova.compute.manager [None req-7826d3bf-a2e2-43a4-bba8-5a6fdc081a6a 0ae33cdfc78c4d3b8cc24e886c10c243 592c9ef4a42d41d0afe1c68da1491d75 - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Retrieving diagnostics#033[00m
Jan 23 04:36:11 np0005593233 nova_compute[222017]: 2026-01-23 09:36:11.395 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:11 np0005593233 nova_compute[222017]: 2026-01-23 09:36:11.686 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:36:12 np0005593233 nova_compute[222017]: 2026-01-23 09:36:12.122 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:36:12 np0005593233 nova_compute[222017]: 2026-01-23 09:36:12.188 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-04b2feca-0698-43c8-8038-9b6df4df60c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:36:12 np0005593233 nova_compute[222017]: 2026-01-23 09:36:12.189 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:36:12 np0005593233 nova_compute[222017]: 2026-01-23 09:36:12.189 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:12 np0005593233 nova_compute[222017]: 2026-01-23 09:36:12.189 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:12 np0005593233 nova_compute[222017]: 2026-01-23 09:36:12.190 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:12 np0005593233 nova_compute[222017]: 2026-01-23 09:36:12.190 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:36:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:12.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:12.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:13 np0005593233 podman[234121]: 2026-01-23 09:36:13.089498551 +0000 UTC m=+0.078696477 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:36:13 np0005593233 nova_compute[222017]: 2026-01-23 09:36:13.741 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:13 np0005593233 nova_compute[222017]: 2026-01-23 09:36:13.878 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:14.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:14.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:36:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/792013636' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:36:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:16.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:16 np0005593233 nova_compute[222017]: 2026-01-23 09:36:16.397 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:16.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:18.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:18.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:18 np0005593233 nova_compute[222017]: 2026-01-23 09:36:18.772 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:36:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:20.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:36:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:20.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.398 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.544 222021 DEBUG nova.compute.manager [None req-fc2728a9-c2f7-42e5-8b28-fc85846d7253 0ae33cdfc78c4d3b8cc24e886c10c243 592c9ef4a42d41d0afe1c68da1491d75 - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.548 222021 INFO nova.compute.manager [None req-fc2728a9-c2f7-42e5-8b28-fc85846d7253 0ae33cdfc78c4d3b8cc24e886c10c243 592c9ef4a42d41d0afe1c68da1491d75 - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Retrieving diagnostics#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.845 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Acquiring lock "04b2feca-0698-43c8-8038-9b6df4df60c8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.846 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "04b2feca-0698-43c8-8038-9b6df4df60c8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.846 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Acquiring lock "04b2feca-0698-43c8-8038-9b6df4df60c8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.847 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "04b2feca-0698-43c8-8038-9b6df4df60c8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.847 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "04b2feca-0698-43c8-8038-9b6df4df60c8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.848 222021 INFO nova.compute.manager [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Terminating instance#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.849 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Acquiring lock "refresh_cache-04b2feca-0698-43c8-8038-9b6df4df60c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.849 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Acquired lock "refresh_cache-04b2feca-0698-43c8-8038-9b6df4df60c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:36:21 np0005593233 nova_compute[222017]: 2026-01-23 09:36:21.849 222021 DEBUG nova.network.neutron [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:36:22 np0005593233 nova_compute[222017]: 2026-01-23 09:36:22.090 222021 DEBUG nova.network.neutron [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:36:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:22.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:22 np0005593233 nova_compute[222017]: 2026-01-23 09:36:22.443 222021 DEBUG nova.network.neutron [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:36:22 np0005593233 nova_compute[222017]: 2026-01-23 09:36:22.463 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Releasing lock "refresh_cache-04b2feca-0698-43c8-8038-9b6df4df60c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:36:22 np0005593233 nova_compute[222017]: 2026-01-23 09:36:22.464 222021 DEBUG nova.compute.manager [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:36:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:22.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:22 np0005593233 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 23 04:36:22 np0005593233 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000019.scope: Consumed 13.368s CPU time.
Jan 23 04:36:22 np0005593233 systemd-machined[190954]: Machine qemu-16-instance-00000019 terminated.
Jan 23 04:36:22 np0005593233 nova_compute[222017]: 2026-01-23 09:36:22.896 222021 INFO nova.virt.libvirt.driver [-] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Instance destroyed successfully.#033[00m
Jan 23 04:36:22 np0005593233 nova_compute[222017]: 2026-01-23 09:36:22.897 222021 DEBUG nova.objects.instance [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lazy-loading 'resources' on Instance uuid 04b2feca-0698-43c8-8038-9b6df4df60c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:36:23 np0005593233 nova_compute[222017]: 2026-01-23 09:36:23.628 222021 INFO nova.virt.libvirt.driver [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Deleting instance files /var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8_del#033[00m
Jan 23 04:36:23 np0005593233 nova_compute[222017]: 2026-01-23 09:36:23.629 222021 INFO nova.virt.libvirt.driver [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Deletion of /var/lib/nova/instances/04b2feca-0698-43c8-8038-9b6df4df60c8_del complete#033[00m
Jan 23 04:36:23 np0005593233 nova_compute[222017]: 2026-01-23 09:36:23.825 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:23 np0005593233 nova_compute[222017]: 2026-01-23 09:36:23.908 222021 INFO nova.compute.manager [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Took 1.44 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:36:23 np0005593233 nova_compute[222017]: 2026-01-23 09:36:23.909 222021 DEBUG oslo.service.loopingcall [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:36:23 np0005593233 nova_compute[222017]: 2026-01-23 09:36:23.910 222021 DEBUG nova.compute.manager [-] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:36:23 np0005593233 nova_compute[222017]: 2026-01-23 09:36:23.911 222021 DEBUG nova.network.neutron [-] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:36:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.112 222021 DEBUG nova.network.neutron [-] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.148 222021 DEBUG nova.network.neutron [-] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.180 222021 INFO nova.compute.manager [-] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Took 0.27 seconds to deallocate network for instance.#033[00m
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.280 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.280 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:36:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:24.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.397 222021 DEBUG oslo_concurrency.processutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:24.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:36:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4044474463' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.899 222021 DEBUG oslo_concurrency.processutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.911 222021 DEBUG nova.compute.provider_tree [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.945 222021 DEBUG nova.scheduler.client.report [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:36:24 np0005593233 nova_compute[222017]: 2026-01-23 09:36:24.973 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:25 np0005593233 nova_compute[222017]: 2026-01-23 09:36:25.031 222021 INFO nova.scheduler.client.report [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Deleted allocations for instance 04b2feca-0698-43c8-8038-9b6df4df60c8#033[00m
Jan 23 04:36:25 np0005593233 nova_compute[222017]: 2026-01-23 09:36:25.146 222021 DEBUG oslo_concurrency.lockutils [None req-1fc4ef00-0f54-4466-bf72-9cc551a17859 179bf12e8ac3448fa951b47a68b334fe e080af52572142c0b58598e9b4ef1c8b - - default default] Lock "04b2feca-0698-43c8-8038-9b6df4df60c8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:26.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:26 np0005593233 nova_compute[222017]: 2026-01-23 09:36:26.400 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:26.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:28.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:28.440 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:36:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:28.441 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:36:28 np0005593233 nova_compute[222017]: 2026-01-23 09:36:28.497 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:28.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:28 np0005593233 nova_compute[222017]: 2026-01-23 09:36:28.828 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:30 np0005593233 podman[234185]: 2026-01-23 09:36:30.091726979 +0000 UTC m=+0.096711178 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:36:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:30.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:30.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:31 np0005593233 nova_compute[222017]: 2026-01-23 09:36:31.402 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:31.443 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:32.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:32.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:33 np0005593233 nova_compute[222017]: 2026-01-23 09:36:33.899 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:34.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:34.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:36.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:36 np0005593233 nova_compute[222017]: 2026-01-23 09:36:36.404 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:36.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.189 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.190 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.236 222021 DEBUG nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.354 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.354 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.361 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.362 222021 INFO nova.compute.claims [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.630 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.894 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160982.892536, 04b2feca-0698-43c8-8038-9b6df4df60c8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.896 222021 INFO nova.compute.manager [-] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:36:37 np0005593233 nova_compute[222017]: 2026-01-23 09:36:37.933 222021 DEBUG nova.compute.manager [None req-5e8dbc5a-a309-4ada-8ce8-8ffa37a690a6 - - - - - -] [instance: 04b2feca-0698-43c8-8038-9b6df4df60c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:36:38 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3380063672' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.068 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.075 222021 DEBUG nova.compute.provider_tree [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.099 222021 DEBUG nova.scheduler.client.report [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.376 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.376 222021 DEBUG nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:36:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:38.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.440 222021 DEBUG nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.440 222021 DEBUG nova.network.neutron [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.462 222021 INFO nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.490 222021 DEBUG nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.594 222021 DEBUG nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.596 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.596 222021 INFO nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Creating image(s)#033[00m
Jan 23 04:36:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:36:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:38.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.631 222021 DEBUG nova.storage.rbd_utils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.665 222021 DEBUG nova.storage.rbd_utils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.703 222021 DEBUG nova.storage.rbd_utils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.711 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.812 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.814 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.815 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.815 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.848 222021 DEBUG nova.storage.rbd_utils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.852 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:38 np0005593233 nova_compute[222017]: 2026-01-23 09:36:38.901 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:39 np0005593233 nova_compute[222017]: 2026-01-23 09:36:39.198 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.346s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:39 np0005593233 nova_compute[222017]: 2026-01-23 09:36:39.265 222021 DEBUG nova.storage.rbd_utils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] resizing rbd image f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:36:39 np0005593233 nova_compute[222017]: 2026-01-23 09:36:39.377 222021 DEBUG nova.objects.instance [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lazy-loading 'migration_context' on Instance uuid f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:36:39 np0005593233 nova_compute[222017]: 2026-01-23 09:36:39.406 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:36:39 np0005593233 nova_compute[222017]: 2026-01-23 09:36:39.407 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Ensure instance console log exists: /var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:36:39 np0005593233 nova_compute[222017]: 2026-01-23 09:36:39.408 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:39 np0005593233 nova_compute[222017]: 2026-01-23 09:36:39.408 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:39 np0005593233 nova_compute[222017]: 2026-01-23 09:36:39.408 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:40 np0005593233 nova_compute[222017]: 2026-01-23 09:36:40.204 222021 DEBUG nova.policy [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '726bd44b7ec443a0a4b8b632b06c622e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f68f8c2203944c9a6e44a6756c8b4b9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:36:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:40.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:40.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:41 np0005593233 nova_compute[222017]: 2026-01-23 09:36:41.405 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:42 np0005593233 nova_compute[222017]: 2026-01-23 09:36:42.224 222021 DEBUG nova.network.neutron [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Successfully created port: 257f01a3-c4a8-4e7f-a76b-ad302970586d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:36:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:42.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:42.603 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:42.604 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:42.604 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:42.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:43 np0005593233 podman[234424]: 2026-01-23 09:36:43.251089815 +0000 UTC m=+0.065001368 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:36:43 np0005593233 podman[234589]: 2026-01-23 09:36:43.869189573 +0000 UTC m=+0.078159891 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 23 04:36:43 np0005593233 nova_compute[222017]: 2026-01-23 09:36:43.945 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:44 np0005593233 podman[234589]: 2026-01-23 09:36:44.082609626 +0000 UTC m=+0.291579934 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 23 04:36:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:44.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e170 e170: 3 total, 3 up, 3 in
Jan 23 04:36:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:44.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:44 np0005593233 nova_compute[222017]: 2026-01-23 09:36:44.926 222021 DEBUG nova.network.neutron [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Successfully updated port: 257f01a3-c4a8-4e7f-a76b-ad302970586d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:36:45 np0005593233 nova_compute[222017]: 2026-01-23 09:36:45.120 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:36:45 np0005593233 nova_compute[222017]: 2026-01-23 09:36:45.120 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquired lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:36:45 np0005593233 nova_compute[222017]: 2026-01-23 09:36:45.120 222021 DEBUG nova.network.neutron [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:36:45 np0005593233 nova_compute[222017]: 2026-01-23 09:36:45.140 222021 DEBUG nova.compute.manager [req-3f1a0750-edfd-4b21-a6e3-253b1e2473c2 req-0095ce09-1758-476e-90aa-7944238fc7df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received event network-changed-257f01a3-c4a8-4e7f-a76b-ad302970586d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:45 np0005593233 nova_compute[222017]: 2026-01-23 09:36:45.140 222021 DEBUG nova.compute.manager [req-3f1a0750-edfd-4b21-a6e3-253b1e2473c2 req-0095ce09-1758-476e-90aa-7944238fc7df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Refreshing instance network info cache due to event network-changed-257f01a3-c4a8-4e7f-a76b-ad302970586d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:36:45 np0005593233 nova_compute[222017]: 2026-01-23 09:36:45.141 222021 DEBUG oslo_concurrency.lockutils [req-3f1a0750-edfd-4b21-a6e3-253b1e2473c2 req-0095ce09-1758-476e-90aa-7944238fc7df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:36:45 np0005593233 nova_compute[222017]: 2026-01-23 09:36:45.616 222021 DEBUG nova.network.neutron [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:36:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:46.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:46 np0005593233 nova_compute[222017]: 2026-01-23 09:36:46.409 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:36:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:46.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:36:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:36:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:46 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:36:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:36:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:48.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:36:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:36:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:48.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:36:48 np0005593233 nova_compute[222017]: 2026-01-23 09:36:48.946 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.101 222021 DEBUG nova.network.neutron [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Updating instance_info_cache with network_info: [{"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.178 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Releasing lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.178 222021 DEBUG nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Instance network_info: |[{"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.179 222021 DEBUG oslo_concurrency.lockutils [req-3f1a0750-edfd-4b21-a6e3-253b1e2473c2 req-0095ce09-1758-476e-90aa-7944238fc7df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.179 222021 DEBUG nova.network.neutron [req-3f1a0750-edfd-4b21-a6e3-253b1e2473c2 req-0095ce09-1758-476e-90aa-7944238fc7df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Refreshing network info cache for port 257f01a3-c4a8-4e7f-a76b-ad302970586d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.181 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Start _get_guest_xml network_info=[{"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.187 222021 WARNING nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.193 222021 DEBUG nova.virt.libvirt.host [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.194 222021 DEBUG nova.virt.libvirt.host [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:36:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e171 e171: 3 total, 3 up, 3 in
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.198 222021 DEBUG nova.virt.libvirt.host [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.199 222021 DEBUG nova.virt.libvirt.host [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.200 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.201 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.201 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.202 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.203 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.203 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.203 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.203 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.204 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.204 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.204 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.205 222021 DEBUG nova.virt.hardware [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.209 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:50.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:50.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:36:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3915464398' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.663 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.701 222021 DEBUG nova.storage.rbd_utils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:50 np0005593233 nova_compute[222017]: 2026-01-23 09:36:50.707 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:36:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2913508663' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.145 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.148 222021 DEBUG nova.virt.libvirt.vif [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:36:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1382474509',display_name='tempest-VolumesAdminNegativeTest-server-1382474509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-1382474509',id=28,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBELMwIbefZ+NS3vAuDYfwJRSTH1W1yIexIU189B5pmxU68Lx1yAkpZGNPjvBY4+9UGdkpB7hvqIyqFzykx4Z32orQ2CI/JRl2wMEWokCsnS+gbhMgRg9ToE9ME9CgCqyig==',key_name='tempest-keypair-863383197',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f68f8c2203944c9a6e44a6756c8b4b9',ramdisk_id='',reservation_id='r-m3to0xya',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-905168495',owner_user_name='tempest-VolumesAdminNegativeTest-905168495-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:36:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='726bd44b7ec443a0a4b8b632b06c622e',uuid=f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.148 222021 DEBUG nova.network.os_vif_util [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converting VIF {"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.149 222021 DEBUG nova.network.os_vif_util [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:b1:af,bridge_name='br-int',has_traffic_filtering=True,id=257f01a3-c4a8-4e7f-a76b-ad302970586d,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap257f01a3-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.151 222021 DEBUG nova.objects.instance [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lazy-loading 'pci_devices' on Instance uuid f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.206 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <uuid>f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc</uuid>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <name>instance-0000001c</name>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <nova:name>tempest-VolumesAdminNegativeTest-server-1382474509</nova:name>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:36:50</nova:creationTime>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <nova:user uuid="726bd44b7ec443a0a4b8b632b06c622e">tempest-VolumesAdminNegativeTest-905168495-project-member</nova:user>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <nova:project uuid="9f68f8c2203944c9a6e44a6756c8b4b9">tempest-VolumesAdminNegativeTest-905168495</nova:project>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <nova:port uuid="257f01a3-c4a8-4e7f-a76b-ad302970586d">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <entry name="serial">f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc</entry>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <entry name="uuid">f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc</entry>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk.config">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:e1:b1:af"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <target dev="tap257f01a3-c4"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc/console.log" append="off"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:36:51 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:36:51 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:36:51 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:36:51 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.208 222021 DEBUG nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Preparing to wait for external event network-vif-plugged-257f01a3-c4a8-4e7f-a76b-ad302970586d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.209 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.210 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.210 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.211 222021 DEBUG nova.virt.libvirt.vif [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:36:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1382474509',display_name='tempest-VolumesAdminNegativeTest-server-1382474509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-1382474509',id=28,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBELMwIbefZ+NS3vAuDYfwJRSTH1W1yIexIU189B5pmxU68Lx1yAkpZGNPjvBY4+9UGdkpB7hvqIyqFzykx4Z32orQ2CI/JRl2wMEWokCsnS+gbhMgRg9ToE9ME9CgCqyig==',key_name='tempest-keypair-863383197',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f68f8c2203944c9a6e44a6756c8b4b9',ramdisk_id='',reservation_id='r-m3to0xya',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-905168495',owner_user_name='tempest-VolumesAdminNegativeTest-905168495-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:36:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='726bd44b7ec443a0a4b8b632b06c622e',uuid=f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.211 222021 DEBUG nova.network.os_vif_util [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converting VIF {"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.212 222021 DEBUG nova.network.os_vif_util [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:b1:af,bridge_name='br-int',has_traffic_filtering=True,id=257f01a3-c4a8-4e7f-a76b-ad302970586d,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap257f01a3-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.213 222021 DEBUG os_vif [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:b1:af,bridge_name='br-int',has_traffic_filtering=True,id=257f01a3-c4a8-4e7f-a76b-ad302970586d,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap257f01a3-c4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.213 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.214 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.215 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.219 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.219 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap257f01a3-c4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.220 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap257f01a3-c4, col_values=(('external_ids', {'iface-id': '257f01a3-c4a8-4e7f-a76b-ad302970586d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e1:b1:af', 'vm-uuid': 'f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.221 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:51 np0005593233 NetworkManager[48871]: <info>  [1769161011.2241] manager: (tap257f01a3-c4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.226 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.233 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.237 222021 INFO os_vif [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:b1:af,bridge_name='br-int',has_traffic_filtering=True,id=257f01a3-c4a8-4e7f-a76b-ad302970586d,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap257f01a3-c4')#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.531 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.531 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.531 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No VIF found with MAC fa:16:3e:e1:b1:af, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.532 222021 INFO nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Using config drive#033[00m
Jan 23 04:36:51 np0005593233 nova_compute[222017]: 2026-01-23 09:36:51.557 222021 DEBUG nova.storage.rbd_utils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:52.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:36:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:52.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:36:53 np0005593233 nova_compute[222017]: 2026-01-23 09:36:53.982 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:54.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:54.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:55 np0005593233 nova_compute[222017]: 2026-01-23 09:36:55.135 222021 INFO nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Creating config drive at /var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc/disk.config#033[00m
Jan 23 04:36:55 np0005593233 nova_compute[222017]: 2026-01-23 09:36:55.144 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppj_tlyg4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:55 np0005593233 nova_compute[222017]: 2026-01-23 09:36:55.178 222021 DEBUG nova.network.neutron [req-3f1a0750-edfd-4b21-a6e3-253b1e2473c2 req-0095ce09-1758-476e-90aa-7944238fc7df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Updated VIF entry in instance network info cache for port 257f01a3-c4a8-4e7f-a76b-ad302970586d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:36:55 np0005593233 nova_compute[222017]: 2026-01-23 09:36:55.180 222021 DEBUG nova.network.neutron [req-3f1a0750-edfd-4b21-a6e3-253b1e2473c2 req-0095ce09-1758-476e-90aa-7944238fc7df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Updating instance_info_cache with network_info: [{"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:36:55 np0005593233 nova_compute[222017]: 2026-01-23 09:36:55.241 222021 DEBUG oslo_concurrency.lockutils [req-3f1a0750-edfd-4b21-a6e3-253b1e2473c2 req-0095ce09-1758-476e-90aa-7944238fc7df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:36:55 np0005593233 nova_compute[222017]: 2026-01-23 09:36:55.289 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppj_tlyg4" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:55 np0005593233 nova_compute[222017]: 2026-01-23 09:36:55.325 222021 DEBUG nova.storage.rbd_utils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:55 np0005593233 nova_compute[222017]: 2026-01-23 09:36:55.330 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc/disk.config f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:55.908574) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161015908714, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2643, "num_deletes": 505, "total_data_size": 5484025, "memory_usage": 5571632, "flush_reason": "Manual Compaction"}
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161015950741, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 3579232, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27924, "largest_seqno": 30562, "table_properties": {"data_size": 3568832, "index_size": 6126, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3205, "raw_key_size": 25701, "raw_average_key_size": 20, "raw_value_size": 3545803, "raw_average_value_size": 2820, "num_data_blocks": 266, "num_entries": 1257, "num_filter_entries": 1257, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160824, "oldest_key_time": 1769160824, "file_creation_time": 1769161015, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 42228 microseconds, and 11657 cpu microseconds.
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:55.950844) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 3579232 bytes OK
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:55.950886) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:55.952845) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:55.952869) EVENT_LOG_v1 {"time_micros": 1769161015952863, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:55.952939) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 5471223, prev total WAL file size 5471223, number of live WAL files 2.
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:55.954891) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(3495KB)], [57(10MB)]
Jan 23 04:36:55 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161015955073, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 14654764, "oldest_snapshot_seqno": -1}
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5364 keys, 8870509 bytes, temperature: kUnknown
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161016065653, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 8870509, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8834573, "index_size": 21426, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 136819, "raw_average_key_size": 25, "raw_value_size": 8737717, "raw_average_value_size": 1628, "num_data_blocks": 864, "num_entries": 5364, "num_filter_entries": 5364, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161015, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:56.066257) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 8870509 bytes
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:56.072383) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.2 rd, 80.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 10.6 +0.0 blob) out(8.5 +0.0 blob), read-write-amplify(6.6) write-amplify(2.5) OK, records in: 6391, records dropped: 1027 output_compression: NoCompression
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:56.072446) EVENT_LOG_v1 {"time_micros": 1769161016072425, "job": 34, "event": "compaction_finished", "compaction_time_micros": 110894, "compaction_time_cpu_micros": 22652, "output_level": 6, "num_output_files": 1, "total_output_size": 8870509, "num_input_records": 6391, "num_output_records": 5364, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161016074142, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161016076941, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:55.954631) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:56.077223) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:56.077233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:56.077236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:56.077239) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:36:56.077241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.091 222021 DEBUG oslo_concurrency.processutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc/disk.config f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.761s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.092 222021 INFO nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Deleting local config drive /var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc/disk.config because it was imported into RBD.#033[00m
Jan 23 04:36:56 np0005593233 kernel: tap257f01a3-c4: entered promiscuous mode
Jan 23 04:36:56 np0005593233 NetworkManager[48871]: <info>  [1769161016.1536] manager: (tap257f01a3-c4): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Jan 23 04:36:56 np0005593233 ovn_controller[130653]: 2026-01-23T09:36:56Z|00076|binding|INFO|Claiming lport 257f01a3-c4a8-4e7f-a76b-ad302970586d for this chassis.
Jan 23 04:36:56 np0005593233 ovn_controller[130653]: 2026-01-23T09:36:56Z|00077|binding|INFO|257f01a3-c4a8-4e7f-a76b-ad302970586d: Claiming fa:16:3e:e1:b1:af 10.100.0.6
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.164 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:56 np0005593233 systemd-udevd[235022]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:36:56 np0005593233 systemd-machined[190954]: New machine qemu-17-instance-0000001c.
Jan 23 04:36:56 np0005593233 NetworkManager[48871]: <info>  [1769161016.2083] device (tap257f01a3-c4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:36:56 np0005593233 NetworkManager[48871]: <info>  [1769161016.2094] device (tap257f01a3-c4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.227 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:b1:af 10.100.0.6'], port_security=['fa:16:3e:e1:b1:af 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f68f8c2203944c9a6e44a6756c8b4b9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cace19e3-4515-4611-845a-54d5fb8f6a17', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2448bc-0bf3-4fe3-aeb3-04d125f323ad, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=257f01a3-c4a8-4e7f-a76b-ad302970586d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.228 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 257f01a3-c4a8-4e7f-a76b-ad302970586d in datapath ef05741c-2d3e-419c-adbb-a2a3bca97f59 bound to our chassis#033[00m
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.227 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.229 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef05741c-2d3e-419c-adbb-a2a3bca97f59#033[00m
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.231 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:56 np0005593233 systemd[1]: Started Virtual Machine qemu-17-instance-0000001c.
Jan 23 04:36:56 np0005593233 ovn_controller[130653]: 2026-01-23T09:36:56Z|00078|binding|INFO|Setting lport 257f01a3-c4a8-4e7f-a76b-ad302970586d ovn-installed in OVS
Jan 23 04:36:56 np0005593233 ovn_controller[130653]: 2026-01-23T09:36:56Z|00079|binding|INFO|Setting lport 257f01a3-c4a8-4e7f-a76b-ad302970586d up in Southbound
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.239 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.253 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6c8738-68d3-4209-aa71-1372245277e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.254 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef05741c-21 in ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.258 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef05741c-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.258 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[79afdc95-432f-4d9b-b300-e7cd04b850fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.260 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ded2d72b-8e3a-4e73-8582-6dfab72aa359]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.282 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[74d483d3-6012-417c-b433-72dcc5b6ba4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.303 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ceadae73-29e9-4ecb-98d1-27a42e0aa679]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.340 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6987e6dc-7537-4173-a9a8-52ebe2e2f3d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.347 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9748ad6b-6309-40bc-a8de-bf44bca6b481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 NetworkManager[48871]: <info>  [1769161016.3496] manager: (tapef05741c-20): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.395 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ede14f-58cc-4c2e-b795-9cc1e9c5d40e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.399 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9b009de1-5e21-4930-bb27-3f19ecbdd13b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:56.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:56 np0005593233 NetworkManager[48871]: <info>  [1769161016.4324] device (tapef05741c-20): carrier: link connected
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.440 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e3def496-3088-4c97-8d6a-d561988b0c47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.456 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2645e696-89c1-43cb-bdd7-63fac32d412f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef05741c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:3d:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492906, 'reachable_time': 37105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235057, 'error': None, 'target': 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.477 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[34d88ed8-bc60-42af-a1b5-367d7f32161f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8a:3dba'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492906, 'tstamp': 492906}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235059, 'error': None, 'target': 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.499 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d6df1b0d-6704-480e-8f04-b61619dfd66a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef05741c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:3d:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492906, 'reachable_time': 37105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235060, 'error': None, 'target': 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.535 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[69d3d4f5-bc58-41d8-81e7-9332b0ab1ae5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.610 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[88ea48a0-6069-49fd-9da7-08f364edcb60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.612 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef05741c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.613 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.613 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef05741c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:56 np0005593233 NetworkManager[48871]: <info>  [1769161016.6165] manager: (tapef05741c-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.615 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:56 np0005593233 kernel: tapef05741c-20: entered promiscuous mode
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.620 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.621 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef05741c-20, col_values=(('external_ids', {'iface-id': 'f8b32530-de7e-473a-a3e9-65c6259bc8bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:56 np0005593233 ovn_controller[130653]: 2026-01-23T09:36:56Z|00080|binding|INFO|Releasing lport f8b32530-de7e-473a-a3e9-65c6259bc8bb from this chassis (sb_readonly=0)
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.623 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:56.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.639 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.641 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef05741c-2d3e-419c-adbb-a2a3bca97f59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef05741c-2d3e-419c-adbb-a2a3bca97f59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.642 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c017b716-ab1b-4616-a99f-3d0a3e094d29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.643 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-ef05741c-2d3e-419c-adbb-a2a3bca97f59
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/ef05741c-2d3e-419c-adbb-a2a3bca97f59.pid.haproxy
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID ef05741c-2d3e-419c-adbb-a2a3bca97f59
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:36:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:36:56.644 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'env', 'PROCESS_TAG=haproxy-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef05741c-2d3e-419c-adbb-a2a3bca97f59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.715 222021 DEBUG nova.compute.manager [req-d8b44e5e-e634-4c98-9e74-2c1ef33e579e req-7317fd06-a872-407c-bf1d-d47762f3dd93 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received event network-vif-plugged-257f01a3-c4a8-4e7f-a76b-ad302970586d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.716 222021 DEBUG oslo_concurrency.lockutils [req-d8b44e5e-e634-4c98-9e74-2c1ef33e579e req-7317fd06-a872-407c-bf1d-d47762f3dd93 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.716 222021 DEBUG oslo_concurrency.lockutils [req-d8b44e5e-e634-4c98-9e74-2c1ef33e579e req-7317fd06-a872-407c-bf1d-d47762f3dd93 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.717 222021 DEBUG oslo_concurrency.lockutils [req-d8b44e5e-e634-4c98-9e74-2c1ef33e579e req-7317fd06-a872-407c-bf1d-d47762f3dd93 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:56 np0005593233 nova_compute[222017]: 2026-01-23 09:36:56.717 222021 DEBUG nova.compute.manager [req-d8b44e5e-e634-4c98-9e74-2c1ef33e579e req-7317fd06-a872-407c-bf1d-d47762f3dd93 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Processing event network-vif-plugged-257f01a3-c4a8-4e7f-a76b-ad302970586d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:36:57 np0005593233 podman[235125]: 2026-01-23 09:36:57.151590781 +0000 UTC m=+0.072755318 container create d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 04:36:57 np0005593233 systemd[1]: Started libpod-conmon-d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70.scope.
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.200 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161017.2001822, f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.202 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] VM Started (Lifecycle Event)#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.205 222021 DEBUG nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:36:57 np0005593233 podman[235125]: 2026-01-23 09:36:57.113551701 +0000 UTC m=+0.034716228 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.210 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.215 222021 INFO nova.virt.libvirt.driver [-] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Instance spawned successfully.#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.215 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:36:57 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:36:57 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8039dbf9572d51a7410ec776136cc29f907bad6c0072fdaf910ea7acc90d896/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.238 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.244 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.245 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.245 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.246 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.246 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.247 222021 DEBUG nova.virt.libvirt.driver [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:57 np0005593233 podman[235125]: 2026-01-23 09:36:57.254367791 +0000 UTC m=+0.175532348 container init d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.254 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:36:57 np0005593233 podman[235125]: 2026-01-23 09:36:57.259586229 +0000 UTC m=+0.180750756 container start d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:36:57 np0005593233 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[235146]: [NOTICE]   (235150) : New worker (235152) forked
Jan 23 04:36:57 np0005593233 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[235146]: [NOTICE]   (235150) : Loading success.
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.306 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.307 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161017.201445, f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.307 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.388 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.398 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161017.209196, f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.399 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.403 222021 INFO nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Took 18.81 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.403 222021 DEBUG nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.459 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.463 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.488 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.499 222021 INFO nova.compute.manager [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Took 20.20 seconds to build instance.#033[00m
Jan 23 04:36:57 np0005593233 nova_compute[222017]: 2026-01-23 09:36:57.535 222021 DEBUG oslo_concurrency.lockutils [None req-7663a1b3-eedf-449f-84c3-b93343c65334 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:36:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:58.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:36:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:36:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:58.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e172 e172: 3 total, 3 up, 3 in
Jan 23 04:36:58 np0005593233 nova_compute[222017]: 2026-01-23 09:36:58.984 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:59 np0005593233 nova_compute[222017]: 2026-01-23 09:36:59.183 222021 DEBUG nova.compute.manager [req-4e7d8ead-ea56-41f8-a9d6-647fc0ee684f req-b87ede39-342b-429a-b198-8fa501d81fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received event network-vif-plugged-257f01a3-c4a8-4e7f-a76b-ad302970586d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:59 np0005593233 nova_compute[222017]: 2026-01-23 09:36:59.184 222021 DEBUG oslo_concurrency.lockutils [req-4e7d8ead-ea56-41f8-a9d6-647fc0ee684f req-b87ede39-342b-429a-b198-8fa501d81fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:59 np0005593233 nova_compute[222017]: 2026-01-23 09:36:59.185 222021 DEBUG oslo_concurrency.lockutils [req-4e7d8ead-ea56-41f8-a9d6-647fc0ee684f req-b87ede39-342b-429a-b198-8fa501d81fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:59 np0005593233 nova_compute[222017]: 2026-01-23 09:36:59.185 222021 DEBUG oslo_concurrency.lockutils [req-4e7d8ead-ea56-41f8-a9d6-647fc0ee684f req-b87ede39-342b-429a-b198-8fa501d81fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:59 np0005593233 nova_compute[222017]: 2026-01-23 09:36:59.186 222021 DEBUG nova.compute.manager [req-4e7d8ead-ea56-41f8-a9d6-647fc0ee684f req-b87ede39-342b-429a-b198-8fa501d81fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] No waiting events found dispatching network-vif-plugged-257f01a3-c4a8-4e7f-a76b-ad302970586d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:59 np0005593233 nova_compute[222017]: 2026-01-23 09:36:59.186 222021 WARNING nova.compute.manager [req-4e7d8ead-ea56-41f8-a9d6-647fc0ee684f req-b87ede39-342b-429a-b198-8fa501d81fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received unexpected event network-vif-plugged-257f01a3-c4a8-4e7f-a76b-ad302970586d for instance with vm_state active and task_state None.#033[00m
Jan 23 04:37:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:00.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:00.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e173 e173: 3 total, 3 up, 3 in
Jan 23 04:37:01 np0005593233 podman[235161]: 2026-01-23 09:37:01.089450253 +0000 UTC m=+0.101132644 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:37:01 np0005593233 nova_compute[222017]: 2026-01-23 09:37:01.229 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:02.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:02.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:03 np0005593233 nova_compute[222017]: 2026-01-23 09:37:03.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.022 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <info>  [1769161024.1565] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/42)
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <info>  [1769161024.1572] device (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <warn>  [1769161024.1574] device (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <info>  [1769161024.1586] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/43)
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <info>  [1769161024.1593] device (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <warn>  [1769161024.1594] device (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <info>  [1769161024.1603] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <info>  [1769161024.1612] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <info>  [1769161024.1617] device (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 04:37:04 np0005593233 NetworkManager[48871]: <info>  [1769161024.1620] device (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.367 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:04 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:04Z|00081|binding|INFO|Releasing lport f8b32530-de7e-473a-a3e9-65c6259bc8bb from this chassis (sb_readonly=0)
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.394 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:04.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:04.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.888 222021 DEBUG nova.compute.manager [req-6c316cbf-2055-4736-8ac0-b35162cfb847 req-750b31e5-93d0-42d1-87b3-9d4f6ee22ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received event network-changed-257f01a3-c4a8-4e7f-a76b-ad302970586d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.889 222021 DEBUG nova.compute.manager [req-6c316cbf-2055-4736-8ac0-b35162cfb847 req-750b31e5-93d0-42d1-87b3-9d4f6ee22ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Refreshing instance network info cache due to event network-changed-257f01a3-c4a8-4e7f-a76b-ad302970586d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.889 222021 DEBUG oslo_concurrency.lockutils [req-6c316cbf-2055-4736-8ac0-b35162cfb847 req-750b31e5-93d0-42d1-87b3-9d4f6ee22ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.890 222021 DEBUG oslo_concurrency.lockutils [req-6c316cbf-2055-4736-8ac0-b35162cfb847 req-750b31e5-93d0-42d1-87b3-9d4f6ee22ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:04 np0005593233 nova_compute[222017]: 2026-01-23 09:37:04.890 222021 DEBUG nova.network.neutron [req-6c316cbf-2055-4736-8ac0-b35162cfb847 req-750b31e5-93d0-42d1-87b3-9d4f6ee22ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Refreshing network info cache for port 257f01a3-c4a8-4e7f-a76b-ad302970586d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:37:05 np0005593233 nova_compute[222017]: 2026-01-23 09:37:05.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:05 np0005593233 nova_compute[222017]: 2026-01-23 09:37:05.420 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:05 np0005593233 nova_compute[222017]: 2026-01-23 09:37:05.421 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:05 np0005593233 nova_compute[222017]: 2026-01-23 09:37:05.421 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:05 np0005593233 nova_compute[222017]: 2026-01-23 09:37:05.421 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:37:05 np0005593233 nova_compute[222017]: 2026-01-23 09:37:05.422 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1658821012' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:05 np0005593233 nova_compute[222017]: 2026-01-23 09:37:05.908 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.017 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000001c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.018 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000001c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.231 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.234 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4615MB free_disk=20.921863555908203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.235 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.235 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.236 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.372 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.373 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.374 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:37:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:06.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.442 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:06.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:06 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2537849796' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.920 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.930 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:37:06 np0005593233 nova_compute[222017]: 2026-01-23 09:37:06.964 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:37:07 np0005593233 nova_compute[222017]: 2026-01-23 09:37:07.004 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:37:07 np0005593233 nova_compute[222017]: 2026-01-23 09:37:07.006 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:07 np0005593233 nova_compute[222017]: 2026-01-23 09:37:07.497 222021 DEBUG nova.network.neutron [req-6c316cbf-2055-4736-8ac0-b35162cfb847 req-750b31e5-93d0-42d1-87b3-9d4f6ee22ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Updated VIF entry in instance network info cache for port 257f01a3-c4a8-4e7f-a76b-ad302970586d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:37:07 np0005593233 nova_compute[222017]: 2026-01-23 09:37:07.499 222021 DEBUG nova.network.neutron [req-6c316cbf-2055-4736-8ac0-b35162cfb847 req-750b31e5-93d0-42d1-87b3-9d4f6ee22ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Updating instance_info_cache with network_info: [{"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:07 np0005593233 nova_compute[222017]: 2026-01-23 09:37:07.525 222021 DEBUG oslo_concurrency.lockutils [req-6c316cbf-2055-4736-8ac0-b35162cfb847 req-750b31e5-93d0-42d1-87b3-9d4f6ee22ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:07 np0005593233 nova_compute[222017]: 2026-01-23 09:37:07.589 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:08.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:08.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:09 np0005593233 nova_compute[222017]: 2026-01-23 09:37:09.008 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:09 np0005593233 nova_compute[222017]: 2026-01-23 09:37:09.008 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:37:09 np0005593233 nova_compute[222017]: 2026-01-23 09:37:09.008 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:37:09 np0005593233 nova_compute[222017]: 2026-01-23 09:37:09.023 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:09 np0005593233 nova_compute[222017]: 2026-01-23 09:37:09.277 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:09 np0005593233 nova_compute[222017]: 2026-01-23 09:37:09.278 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:09 np0005593233 nova_compute[222017]: 2026-01-23 09:37:09.278 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:37:09 np0005593233 nova_compute[222017]: 2026-01-23 09:37:09.279 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:10.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:10.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:11 np0005593233 nova_compute[222017]: 2026-01-23 09:37:11.240 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:11 np0005593233 nova_compute[222017]: 2026-01-23 09:37:11.865 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Updating instance_info_cache with network_info: [{"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:11 np0005593233 nova_compute[222017]: 2026-01-23 09:37:11.897 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:11 np0005593233 nova_compute[222017]: 2026-01-23 09:37:11.898 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:37:11 np0005593233 nova_compute[222017]: 2026-01-23 09:37:11.898 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:11 np0005593233 nova_compute[222017]: 2026-01-23 09:37:11.899 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:11 np0005593233 nova_compute[222017]: 2026-01-23 09:37:11.899 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:11 np0005593233 nova_compute[222017]: 2026-01-23 09:37:11.899 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:37:12 np0005593233 nova_compute[222017]: 2026-01-23 09:37:12.079 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:12Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e1:b1:af 10.100.0.6
Jan 23 04:37:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:12Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e1:b1:af 10.100.0.6
Jan 23 04:37:12 np0005593233 nova_compute[222017]: 2026-01-23 09:37:12.270 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:12.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:12.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:14 np0005593233 nova_compute[222017]: 2026-01-23 09:37:14.025 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:14 np0005593233 podman[235235]: 2026-01-23 09:37:14.054056826 +0000 UTC m=+0.065809820 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:37:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:14.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:14 np0005593233 nova_compute[222017]: 2026-01-23 09:37:14.438 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:14.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:16 np0005593233 nova_compute[222017]: 2026-01-23 09:37:16.242 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:16.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:16.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:17.254 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:17 np0005593233 nova_compute[222017]: 2026-01-23 09:37:17.255 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:17.257 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:37:18 np0005593233 nova_compute[222017]: 2026-01-23 09:37:18.326 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:18 np0005593233 nova_compute[222017]: 2026-01-23 09:37:18.327 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:18 np0005593233 nova_compute[222017]: 2026-01-23 09:37:18.366 222021 DEBUG nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:37:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:18.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:18 np0005593233 nova_compute[222017]: 2026-01-23 09:37:18.591 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:18 np0005593233 nova_compute[222017]: 2026-01-23 09:37:18.592 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:18 np0005593233 nova_compute[222017]: 2026-01-23 09:37:18.600 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:37:18 np0005593233 nova_compute[222017]: 2026-01-23 09:37:18.601 222021 INFO nova.compute.claims [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:37:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:18.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:18 np0005593233 nova_compute[222017]: 2026-01-23 09:37:18.804 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1506459437' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.313 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.322 222021 DEBUG nova.compute.provider_tree [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.347 222021 DEBUG nova.scheduler.client.report [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.375 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.376 222021 DEBUG nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.476 222021 DEBUG nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.477 222021 DEBUG nova.network.neutron [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.511 222021 INFO nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.534 222021 DEBUG nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.658 222021 DEBUG nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.659 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.660 222021 INFO nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Creating image(s)#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.689 222021 DEBUG nova.storage.rbd_utils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.720 222021 DEBUG nova.storage.rbd_utils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.752 222021 DEBUG nova.storage.rbd_utils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.758 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.835 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.837 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.838 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.838 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.866 222021 DEBUG nova.storage.rbd_utils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:19 np0005593233 nova_compute[222017]: 2026-01-23 09:37:19.870 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:20 np0005593233 nova_compute[222017]: 2026-01-23 09:37:20.297 222021 DEBUG nova.policy [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '191a72cfd0a841e9806246e07eb62fa6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:37:20 np0005593233 nova_compute[222017]: 2026-01-23 09:37:20.331 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:20 np0005593233 nova_compute[222017]: 2026-01-23 09:37:20.423 222021 DEBUG nova.storage.rbd_utils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] resizing rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:37:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:20.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:20 np0005593233 nova_compute[222017]: 2026-01-23 09:37:20.548 222021 DEBUG nova.objects.instance [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'migration_context' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:20 np0005593233 nova_compute[222017]: 2026-01-23 09:37:20.573 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:37:20 np0005593233 nova_compute[222017]: 2026-01-23 09:37:20.574 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Ensure instance console log exists: /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:37:20 np0005593233 nova_compute[222017]: 2026-01-23 09:37:20.575 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:20 np0005593233 nova_compute[222017]: 2026-01-23 09:37:20.575 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:20 np0005593233 nova_compute[222017]: 2026-01-23 09:37:20.575 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:20.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:21 np0005593233 nova_compute[222017]: 2026-01-23 09:37:21.246 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:21 np0005593233 nova_compute[222017]: 2026-01-23 09:37:21.950 222021 DEBUG nova.network.neutron [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Successfully created port: 541053d6-d3d0-4da0-9b9c-630177f53234 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:37:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:22.260 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:22 np0005593233 nova_compute[222017]: 2026-01-23 09:37:22.378 222021 DEBUG oslo_concurrency.lockutils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:22 np0005593233 nova_compute[222017]: 2026-01-23 09:37:22.379 222021 DEBUG oslo_concurrency.lockutils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:22 np0005593233 nova_compute[222017]: 2026-01-23 09:37:22.422 222021 DEBUG nova.objects.instance [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lazy-loading 'flavor' on Instance uuid f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:22.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:22 np0005593233 nova_compute[222017]: 2026-01-23 09:37:22.500 222021 DEBUG oslo_concurrency.lockutils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:22.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:22 np0005593233 nova_compute[222017]: 2026-01-23 09:37:22.829 222021 DEBUG oslo_concurrency.lockutils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:22 np0005593233 nova_compute[222017]: 2026-01-23 09:37:22.830 222021 DEBUG oslo_concurrency.lockutils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:22 np0005593233 nova_compute[222017]: 2026-01-23 09:37:22.830 222021 INFO nova.compute.manager [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Attaching volume 8bf8b5e1-14c0-4be1-9734-e2610e0b9950 to /dev/vdb#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.057 222021 DEBUG os_brick.utils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.061 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.083 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.083 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[ba66d544-1729-4447-8903-27b48c4fd856]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.086 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.091 222021 DEBUG nova.network.neutron [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Successfully updated port: 541053d6-d3d0-4da0-9b9c-630177f53234 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.098 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.098 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e96246-19c2-4c86-91e9-8c4f72ac71ed]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.101 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.112 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.113 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[76849484-b863-4c90-8a5f-862fa8d970ed]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.115 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[f827b48d-a72f-4981-a0b9-8dca9fcd2f8a]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.115 222021 DEBUG oslo_concurrency.processutils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.149 222021 DEBUG oslo_concurrency.processutils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "nvme version" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.152 222021 DEBUG os_brick.initiator.connectors.lightos [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.152 222021 DEBUG os_brick.initiator.connectors.lightos [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.153 222021 DEBUG os_brick.initiator.connectors.lightos [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.153 222021 DEBUG os_brick.utils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] <== get_connector_properties: return (94ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.153 222021 DEBUG nova.virt.block_device [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Updating existing volume attachment record: 3ca13d06-d1a5-443f-8d44-c57d0ef6d59b _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.455 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "refresh_cache-45a72d2f-6d73-4d45-873b-96eff48e3d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.455 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquired lock "refresh_cache-45a72d2f-6d73-4d45-873b-96eff48e3d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.457 222021 DEBUG nova.network.neutron [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.674 222021 DEBUG nova.compute.manager [req-a142655f-e8ec-4120-a557-c925e48de8bb req-842c19bd-edce-4ac7-b6c3-2a9b92e912c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-changed-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.675 222021 DEBUG nova.compute.manager [req-a142655f-e8ec-4120-a557-c925e48de8bb req-842c19bd-edce-4ac7-b6c3-2a9b92e912c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Refreshing instance network info cache due to event network-changed-541053d6-d3d0-4da0-9b9c-630177f53234. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.675 222021 DEBUG oslo_concurrency.lockutils [req-a142655f-e8ec-4120-a557-c925e48de8bb req-842c19bd-edce-4ac7-b6c3-2a9b92e912c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-45a72d2f-6d73-4d45-873b-96eff48e3d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:23 np0005593233 nova_compute[222017]: 2026-01-23 09:37:23.856 222021 DEBUG nova.network.neutron [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:37:24 np0005593233 nova_compute[222017]: 2026-01-23 09:37:24.028 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1291090198' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:24.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:24 np0005593233 nova_compute[222017]: 2026-01-23 09:37:24.603 222021 DEBUG nova.objects.instance [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lazy-loading 'flavor' on Instance uuid f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:24 np0005593233 nova_compute[222017]: 2026-01-23 09:37:24.642 222021 DEBUG nova.virt.libvirt.driver [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Attempting to attach volume 8bf8b5e1-14c0-4be1-9734-e2610e0b9950 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 04:37:24 np0005593233 nova_compute[222017]: 2026-01-23 09:37:24.644 222021 DEBUG nova.virt.libvirt.guest [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 04:37:24 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:37:24 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-8bf8b5e1-14c0-4be1-9734-e2610e0b9950">
Jan 23 04:37:24 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:24 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:24 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:24 np0005593233 nova_compute[222017]:  </source>
Jan 23 04:37:24 np0005593233 nova_compute[222017]:  <auth username="openstack">
Jan 23 04:37:24 np0005593233 nova_compute[222017]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:37:24 np0005593233 nova_compute[222017]:  </auth>
Jan 23 04:37:24 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:37:24 np0005593233 nova_compute[222017]:  <serial>8bf8b5e1-14c0-4be1-9734-e2610e0b9950</serial>
Jan 23 04:37:24 np0005593233 nova_compute[222017]: </disk>
Jan 23 04:37:24 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:37:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:24.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:24 np0005593233 nova_compute[222017]: 2026-01-23 09:37:24.807 222021 DEBUG nova.virt.libvirt.driver [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:24 np0005593233 nova_compute[222017]: 2026-01-23 09:37:24.808 222021 DEBUG nova.virt.libvirt.driver [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:24 np0005593233 nova_compute[222017]: 2026-01-23 09:37:24.808 222021 DEBUG nova.virt.libvirt.driver [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:24 np0005593233 nova_compute[222017]: 2026-01-23 09:37:24.808 222021 DEBUG nova.virt.libvirt.driver [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No VIF found with MAC fa:16:3e:e1:b1:af, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.066 222021 DEBUG oslo_concurrency.lockutils [None req-7b3c9e95-af06-4278-bde3-0e4fbf688f46 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.724 222021 DEBUG nova.network.neutron [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Updating instance_info_cache with network_info: [{"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.752 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Releasing lock "refresh_cache-45a72d2f-6d73-4d45-873b-96eff48e3d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.753 222021 DEBUG nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance network_info: |[{"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.755 222021 DEBUG oslo_concurrency.lockutils [req-a142655f-e8ec-4120-a557-c925e48de8bb req-842c19bd-edce-4ac7-b6c3-2a9b92e912c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-45a72d2f-6d73-4d45-873b-96eff48e3d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.755 222021 DEBUG nova.network.neutron [req-a142655f-e8ec-4120-a557-c925e48de8bb req-842c19bd-edce-4ac7-b6c3-2a9b92e912c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Refreshing network info cache for port 541053d6-d3d0-4da0-9b9c-630177f53234 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.759 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Start _get_guest_xml network_info=[{"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.765 222021 WARNING nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.771 222021 DEBUG nova.virt.libvirt.host [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.771 222021 DEBUG nova.virt.libvirt.host [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.775 222021 DEBUG nova.virt.libvirt.host [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.775 222021 DEBUG nova.virt.libvirt.host [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.777 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.777 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.778 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.778 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.778 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.778 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.779 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.779 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.779 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.779 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.780 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.780 222021 DEBUG nova.virt.hardware [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:37:25 np0005593233 nova_compute[222017]: 2026-01-23 09:37:25.782 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1621761364' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.250 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.260 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.292 222021 DEBUG nova.storage.rbd_utils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.297 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:26.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e174 e174: 3 total, 3 up, 3 in
Jan 23 04:37:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:26.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/945830052' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.809 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.811 222021 DEBUG nova.virt.libvirt.vif [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1151494829',display_name='tempest-ServersAdminTestJSON-server-1151494829',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1151494829',id=30,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-0lc5vfd4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-11
67530593-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:37:19Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=45a72d2f-6d73-4d45-873b-96eff48e3d22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.812 222021 DEBUG nova.network.os_vif_util [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.813 222021 DEBUG nova.network.os_vif_util [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.815 222021 DEBUG nova.objects.instance [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'pci_devices' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.852 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <uuid>45a72d2f-6d73-4d45-873b-96eff48e3d22</uuid>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <name>instance-0000001e</name>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersAdminTestJSON-server-1151494829</nova:name>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:37:25</nova:creationTime>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <nova:user uuid="191a72cfd0a841e9806246e07eb62fa6">tempest-ServersAdminTestJSON-1167530593-project-member</nova:user>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <nova:project uuid="1a5f46b255cd4387bd3e4c0acaa39466">tempest-ServersAdminTestJSON-1167530593</nova:project>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <nova:port uuid="541053d6-d3d0-4da0-9b9c-630177f53234">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <entry name="serial">45a72d2f-6d73-4d45-873b-96eff48e3d22</entry>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <entry name="uuid">45a72d2f-6d73-4d45-873b-96eff48e3d22</entry>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/45a72d2f-6d73-4d45-873b-96eff48e3d22_disk">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:ab:99:47"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <target dev="tap541053d6-d3"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/console.log" append="off"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:37:26 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:37:26 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:37:26 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:37:26 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.852 222021 DEBUG nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Preparing to wait for external event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.853 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.853 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.853 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.854 222021 DEBUG nova.virt.libvirt.vif [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1151494829',display_name='tempest-ServersAdminTestJSON-server-1151494829',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1151494829',id=30,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-0lc5vfd4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminT
estJSON-1167530593-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:37:19Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=45a72d2f-6d73-4d45-873b-96eff48e3d22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.854 222021 DEBUG nova.network.os_vif_util [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.855 222021 DEBUG nova.network.os_vif_util [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.855 222021 DEBUG os_vif [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.856 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.856 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.857 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.861 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.861 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap541053d6-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.862 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap541053d6-d3, col_values=(('external_ids', {'iface-id': '541053d6-d3d0-4da0-9b9c-630177f53234', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:99:47', 'vm-uuid': '45a72d2f-6d73-4d45-873b-96eff48e3d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:26 np0005593233 NetworkManager[48871]: <info>  [1769161046.8644] manager: (tap541053d6-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.863 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.866 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.873 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.874 222021 INFO os_vif [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3')#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.940 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.941 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.941 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No VIF found with MAC fa:16:3e:ab:99:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.942 222021 INFO nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Using config drive#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.975 222021 DEBUG nova.storage.rbd_utils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.983 222021 DEBUG oslo_concurrency.lockutils [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:26 np0005593233 nova_compute[222017]: 2026-01-23 09:37:26.984 222021 DEBUG oslo_concurrency.lockutils [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.011 222021 INFO nova.compute.manager [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Detaching volume 8bf8b5e1-14c0-4be1-9734-e2610e0b9950#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.377 222021 INFO nova.virt.block_device [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Attempting to driver detach volume 8bf8b5e1-14c0-4be1-9734-e2610e0b9950 from mountpoint /dev/vdb#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.390 222021 DEBUG nova.virt.libvirt.driver [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Attempting to detach device vdb from instance f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.390 222021 DEBUG nova.virt.libvirt.guest [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-8bf8b5e1-14c0-4be1-9734-e2610e0b9950">
Jan 23 04:37:27 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  </source>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <serial>8bf8b5e1-14c0-4be1-9734-e2610e0b9950</serial>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]: </disk>
Jan 23 04:37:27 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.398 222021 INFO nova.virt.libvirt.driver [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Successfully detached device vdb from instance f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc from the persistent domain config.#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.399 222021 DEBUG nova.virt.libvirt.driver [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.399 222021 DEBUG nova.virt.libvirt.guest [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-8bf8b5e1-14c0-4be1-9734-e2610e0b9950">
Jan 23 04:37:27 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  </source>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <serial>8bf8b5e1-14c0-4be1-9734-e2610e0b9950</serial>
Jan 23 04:37:27 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 04:37:27 np0005593233 nova_compute[222017]: </disk>
Jan 23 04:37:27 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.454 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Received event <DeviceRemovedEvent: 1769161047.4539979, f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.456 222021 DEBUG nova.virt.libvirt.driver [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.458 222021 INFO nova.virt.libvirt.driver [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Successfully detached device vdb from instance f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc from the live domain config.#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.740 222021 INFO nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Creating config drive at /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.747 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp549_d333 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.879 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp549_d333" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.917 222021 DEBUG nova.storage.rbd_utils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:27 np0005593233 nova_compute[222017]: 2026-01-23 09:37:27.922 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.027 222021 DEBUG nova.objects.instance [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lazy-loading 'flavor' on Instance uuid f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.211 222021 DEBUG oslo_concurrency.lockutils [None req-1da11ebf-e6e6-443e-ab87-3b91edf83319 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.255 222021 DEBUG oslo_concurrency.processutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.333s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.256 222021 INFO nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deleting local config drive /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config because it was imported into RBD.#033[00m
Jan 23 04:37:28 np0005593233 kernel: tap541053d6-d3: entered promiscuous mode
Jan 23 04:37:28 np0005593233 NetworkManager[48871]: <info>  [1769161048.3208] manager: (tap541053d6-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.321 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:28Z|00082|binding|INFO|Claiming lport 541053d6-d3d0-4da0-9b9c-630177f53234 for this chassis.
Jan 23 04:37:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:28Z|00083|binding|INFO|541053d6-d3d0-4da0-9b9c-630177f53234: Claiming fa:16:3e:ab:99:47 10.100.0.9
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.329 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:99:47 10.100.0.9'], port_security=['fa:16:3e:ab:99:47 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45a72d2f-6d73-4d45-873b-96eff48e3d22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=541053d6-d3d0-4da0-9b9c-630177f53234) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.330 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 541053d6-d3d0-4da0-9b9c-630177f53234 in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c bound to our chassis#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.332 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c#033[00m
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.338 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:28Z|00084|binding|INFO|Setting lport 541053d6-d3d0-4da0-9b9c-630177f53234 ovn-installed in OVS
Jan 23 04:37:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:28Z|00085|binding|INFO|Setting lport 541053d6-d3d0-4da0-9b9c-630177f53234 up in Southbound
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.341 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.350 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e23476be-cb57-44b0-966e-95ace4885c82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.351 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f2b13ad-71 in ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.353 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f2b13ad-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.353 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[817d1140-2fb6-4b40-b3dd-587abb78165b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.354 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[63ca84d3-6af1-465d-b9f2-54ecd00820ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 systemd-udevd[235606]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.370 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[93c4469c-030b-4b6c-ab70-f274a33ca0a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 systemd-machined[190954]: New machine qemu-18-instance-0000001e.
Jan 23 04:37:28 np0005593233 NetworkManager[48871]: <info>  [1769161048.3805] device (tap541053d6-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:37:28 np0005593233 NetworkManager[48871]: <info>  [1769161048.3810] device (tap541053d6-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:37:28 np0005593233 systemd[1]: Started Virtual Machine qemu-18-instance-0000001e.
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.401 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b9d35a-bd7c-409f-b614-1fe7abfb764d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.436 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2268b2c8-18e4-4cf2-bc7b-591d437094cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 NetworkManager[48871]: <info>  [1769161048.4456] manager: (tap1f2b13ad-70): new Veth device (/org/freedesktop/NetworkManager/Devices/48)
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.444 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fe649062-c14a-400e-b05c-e8727eda42b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:28.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.483 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7504d19c-b4cb-4044-9bc1-fdbf70d2b4cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.486 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1ccbc9-0f6d-4dac-9501-b1c182ec5af8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 NetworkManager[48871]: <info>  [1769161048.5171] device (tap1f2b13ad-70): carrier: link connected
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.526 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c91d3dd8-7592-40e2-b343-0cd0bfedbde1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.547 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[133d2cb9-04ec-487a-9f66-6055508d684a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496114, 'reachable_time': 19661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235639, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.568 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1d366b77-1a18-467e-8c63-533fe91ab55c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec8:78b8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496114, 'tstamp': 496114}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235640, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.589 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ca304660-740f-49f2-86db-b2f26f994d51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496114, 'reachable_time': 19661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235641, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.625 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6581e054-4b1e-492d-b375-ecd66d243f02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:28.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.697 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc60e9b-ddb5-4459-b597-54a4968ecdc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.698 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.699 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.699 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f2b13ad-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:28 np0005593233 NetworkManager[48871]: <info>  [1769161048.7017] manager: (tap1f2b13ad-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.701 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:28 np0005593233 kernel: tap1f2b13ad-70: entered promiscuous mode
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.705 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.709 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f2b13ad-70, col_values=(('external_ids', {'iface-id': '5880c863-f7b0-4399-b221-f31849823320'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.711 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:28Z|00086|binding|INFO|Releasing lport 5880c863-f7b0-4399-b221-f31849823320 from this chassis (sb_readonly=0)
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.711 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.713 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.715 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[15f7a9b4-0968-4dff-a747-2b734e737f83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.716 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c.pid.haproxy
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:37:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:28.717 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'env', 'PROCESS_TAG=haproxy-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:37:28 np0005593233 nova_compute[222017]: 2026-01-23 09:37:28.728 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.068 222021 DEBUG nova.network.neutron [req-a142655f-e8ec-4120-a557-c925e48de8bb req-842c19bd-edce-4ac7-b6c3-2a9b92e912c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Updated VIF entry in instance network info cache for port 541053d6-d3d0-4da0-9b9c-630177f53234. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.069 222021 DEBUG nova.network.neutron [req-a142655f-e8ec-4120-a557-c925e48de8bb req-842c19bd-edce-4ac7-b6c3-2a9b92e912c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Updating instance_info_cache with network_info: [{"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.072 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.093 222021 DEBUG oslo_concurrency.lockutils [req-a142655f-e8ec-4120-a557-c925e48de8bb req-842c19bd-edce-4ac7-b6c3-2a9b92e912c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-45a72d2f-6d73-4d45-873b-96eff48e3d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:29 np0005593233 podman[235673]: 2026-01-23 09:37:29.164036621 +0000 UTC m=+0.061322403 container create e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 04:37:29 np0005593233 systemd[1]: Started libpod-conmon-e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b.scope.
Jan 23 04:37:29 np0005593233 podman[235673]: 2026-01-23 09:37:29.135666115 +0000 UTC m=+0.032951917 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:37:29 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:37:29 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/109f64ecd9a400035d256309eb8b4e240eb18a41fdf2e1d28dbebed98f2b5ec3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:37:29 np0005593233 podman[235673]: 2026-01-23 09:37:29.276560367 +0000 UTC m=+0.173846189 container init e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:37:29 np0005593233 podman[235673]: 2026-01-23 09:37:29.28616869 +0000 UTC m=+0.183454492 container start e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 04:37:29 np0005593233 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[235688]: [NOTICE]   (235692) : New worker (235694) forked
Jan 23 04:37:29 np0005593233 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[235688]: [NOTICE]   (235692) : Loading success.
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.717 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161049.7168355, 45a72d2f-6d73-4d45-873b-96eff48e3d22 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.718 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] VM Started (Lifecycle Event)#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.751 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.755 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161049.7171423, 45a72d2f-6d73-4d45-873b-96eff48e3d22 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.756 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.786 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.790 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:37:29 np0005593233 nova_compute[222017]: 2026-01-23 09:37:29.827 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:37:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:30.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:30.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.395 222021 DEBUG nova.compute.manager [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.396 222021 DEBUG oslo_concurrency.lockutils [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.396 222021 DEBUG oslo_concurrency.lockutils [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.396 222021 DEBUG oslo_concurrency.lockutils [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.396 222021 DEBUG nova.compute.manager [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Processing event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.397 222021 DEBUG nova.compute.manager [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.397 222021 DEBUG oslo_concurrency.lockutils [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.397 222021 DEBUG oslo_concurrency.lockutils [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.397 222021 DEBUG oslo_concurrency.lockutils [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.397 222021 DEBUG nova.compute.manager [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.397 222021 WARNING nova.compute.manager [req-9fe562f6-8fcb-4ad9-9709-4ec3816b465c req-b138ad2b-44da-4bcd-aba9-53028eec6289 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.398 222021 DEBUG nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.402 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161051.4017613, 45a72d2f-6d73-4d45-873b-96eff48e3d22 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.402 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.409 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.412 222021 INFO nova.virt.libvirt.driver [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance spawned successfully.#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.413 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.458 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.466 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.470 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.471 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.473 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.474 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.475 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.476 222021 DEBUG nova.virt.libvirt.driver [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:31 np0005593233 nova_compute[222017]: 2026-01-23 09:37:31.864 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:32 np0005593233 podman[235745]: 2026-01-23 09:37:32.119536347 +0000 UTC m=+0.114278238 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 23 04:37:32 np0005593233 nova_compute[222017]: 2026-01-23 09:37:32.356 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:37:32 np0005593233 nova_compute[222017]: 2026-01-23 09:37:32.438 222021 INFO nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Took 12.78 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:37:32 np0005593233 nova_compute[222017]: 2026-01-23 09:37:32.438 222021 DEBUG nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:32.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:32 np0005593233 nova_compute[222017]: 2026-01-23 09:37:32.521 222021 INFO nova.compute.manager [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Took 13.97 seconds to build instance.#033[00m
Jan 23 04:37:32 np0005593233 nova_compute[222017]: 2026-01-23 09:37:32.544 222021 DEBUG oslo_concurrency.lockutils [None req-2a4e4c96-da74-4fb6-907b-9d17d4c6644e 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:32.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 e175: 3 total, 3 up, 3 in
Jan 23 04:37:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:34 np0005593233 nova_compute[222017]: 2026-01-23 09:37:34.075 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:34.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:34.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:36.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:36.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:36 np0005593233 nova_compute[222017]: 2026-01-23 09:37:36.866 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:38.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:39 np0005593233 nova_compute[222017]: 2026-01-23 09:37:39.077 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:40 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/25512992' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:40.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:40.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:41 np0005593233 nova_compute[222017]: 2026-01-23 09:37:41.868 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:42.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:42.638 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:42.639 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:42.640 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:42.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:42 np0005593233 nova_compute[222017]: 2026-01-23 09:37:42.960 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "cc1d9141-e8bc-42a4-9690-50bc53d25998" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:42 np0005593233 nova_compute[222017]: 2026-01-23 09:37:42.960 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:42 np0005593233 nova_compute[222017]: 2026-01-23 09:37:42.981 222021 DEBUG nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.101 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.101 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.113 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.114 222021 INFO nova.compute.claims [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.334 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/661313113' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.839 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.847 222021 DEBUG nova.compute.provider_tree [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.866 222021 DEBUG nova.scheduler.client.report [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.893 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.792s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.895 222021 DEBUG nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.951 222021 DEBUG nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.952 222021 DEBUG nova.network.neutron [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:37:43 np0005593233 nova_compute[222017]: 2026-01-23 09:37:43.990 222021 INFO nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.019 222021 DEBUG nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:37:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.178 222021 DEBUG nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.181 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.181 222021 INFO nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Creating image(s)#033[00m
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.218 222021 DEBUG nova.storage.rbd_utils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image cc1d9141-e8bc-42a4-9690-50bc53d25998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.254 222021 DEBUG nova.storage.rbd_utils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image cc1d9141-e8bc-42a4-9690-50bc53d25998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.292 222021 DEBUG nova.storage.rbd_utils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image cc1d9141-e8bc-42a4-9690-50bc53d25998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.300 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:44Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:99:47 10.100.0.9
Jan 23 04:37:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:44Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:99:47 10.100.0.9
Jan 23 04:37:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:37:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2201441979' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:37:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:37:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2201441979' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.378 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.380 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.381 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.381 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.409 222021 DEBUG nova.storage.rbd_utils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image cc1d9141-e8bc-42a4-9690-50bc53d25998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.414 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cc1d9141-e8bc-42a4-9690-50bc53d25998_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:37:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.679 222021 DEBUG nova.policy [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '191a72cfd0a841e9806246e07eb62fa6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 04:37:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:44.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:44.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.752 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:37:44 np0005593233 nova_compute[222017]: 2026-01-23 09:37:44.965 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cc1d9141-e8bc-42a4-9690-50bc53d25998_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:37:45 np0005593233 nova_compute[222017]: 2026-01-23 09:37:45.069 222021 DEBUG nova.storage.rbd_utils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] resizing rbd image cc1d9141-e8bc-42a4-9690-50bc53d25998_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:37:45 np0005593233 podman[235888]: 2026-01-23 09:37:45.072146692 +0000 UTC m=+0.072785816 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:37:45 np0005593233 nova_compute[222017]: 2026-01-23 09:37:45.250 222021 DEBUG nova.objects.instance [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'migration_context' on Instance uuid cc1d9141-e8bc-42a4-9690-50bc53d25998 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:37:45 np0005593233 nova_compute[222017]: 2026-01-23 09:37:45.277 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:37:45 np0005593233 nova_compute[222017]: 2026-01-23 09:37:45.278 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Ensure instance console log exists: /var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:37:45 np0005593233 nova_compute[222017]: 2026-01-23 09:37:45.278 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:37:45 np0005593233 nova_compute[222017]: 2026-01-23 09:37:45.279 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:37:45 np0005593233 nova_compute[222017]: 2026-01-23 09:37:45.279 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:37:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:46.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:46.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:46 np0005593233 nova_compute[222017]: 2026-01-23 09:37:46.871 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:37:47 np0005593233 nova_compute[222017]: 2026-01-23 09:37:47.303 222021 DEBUG nova.network.neutron [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Successfully created port: bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 04:37:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:48.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:48.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:49 np0005593233 nova_compute[222017]: 2026-01-23 09:37:49.083 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:37:49 np0005593233 nova_compute[222017]: 2026-01-23 09:37:49.806 222021 DEBUG nova.network.neutron [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Successfully updated port: bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 04:37:49 np0005593233 nova_compute[222017]: 2026-01-23 09:37:49.825 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "refresh_cache-cc1d9141-e8bc-42a4-9690-50bc53d25998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:37:49 np0005593233 nova_compute[222017]: 2026-01-23 09:37:49.826 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquired lock "refresh_cache-cc1d9141-e8bc-42a4-9690-50bc53d25998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:37:49 np0005593233 nova_compute[222017]: 2026-01-23 09:37:49.826 222021 DEBUG nova.network.neutron [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:37:50 np0005593233 nova_compute[222017]: 2026-01-23 09:37:50.015 222021 DEBUG nova.network.neutron [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:37:50 np0005593233 nova_compute[222017]: 2026-01-23 09:37:50.528 222021 DEBUG nova.compute.manager [req-f32b61f1-ef63-4611-93b5-5b3b8bdf827b req-80190171-a248-420a-a29a-86f829c1e0ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Received event network-changed-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:37:50 np0005593233 nova_compute[222017]: 2026-01-23 09:37:50.528 222021 DEBUG nova.compute.manager [req-f32b61f1-ef63-4611-93b5-5b3b8bdf827b req-80190171-a248-420a-a29a-86f829c1e0ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Refreshing instance network info cache due to event network-changed-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:37:50 np0005593233 nova_compute[222017]: 2026-01-23 09:37:50.529 222021 DEBUG oslo_concurrency.lockutils [req-f32b61f1-ef63-4611-93b5-5b3b8bdf827b req-80190171-a248-420a-a29a-86f829c1e0ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-cc1d9141-e8bc-42a4-9690-50bc53d25998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:37:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:50.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:50.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:51 np0005593233 nova_compute[222017]: 2026-01-23 09:37:51.873 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.195 222021 DEBUG nova.network.neutron [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Updating instance_info_cache with network_info: [{"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.228 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Releasing lock "refresh_cache-cc1d9141-e8bc-42a4-9690-50bc53d25998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.229 222021 DEBUG nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Instance network_info: |[{"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.230 222021 DEBUG oslo_concurrency.lockutils [req-f32b61f1-ef63-4611-93b5-5b3b8bdf827b req-80190171-a248-420a-a29a-86f829c1e0ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-cc1d9141-e8bc-42a4-9690-50bc53d25998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.230 222021 DEBUG nova.network.neutron [req-f32b61f1-ef63-4611-93b5-5b3b8bdf827b req-80190171-a248-420a-a29a-86f829c1e0ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Refreshing network info cache for port bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.237 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Start _get_guest_xml network_info=[{"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.245 222021 WARNING nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.251 222021 DEBUG nova.virt.libvirt.host [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.252 222021 DEBUG nova.virt.libvirt.host [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.256 222021 DEBUG nova.virt.libvirt.host [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.257 222021 DEBUG nova.virt.libvirt.host [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.258 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.259 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.259 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.259 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.260 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.260 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.260 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.260 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.261 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.261 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.261 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.261 222021 DEBUG nova.virt.hardware [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.264 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:37:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:52.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:52.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2924278092' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.802 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.537s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.840 222021 DEBUG nova.storage.rbd_utils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image cc1d9141-e8bc-42a4-9690-50bc53d25998_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:52 np0005593233 nova_compute[222017]: 2026-01-23 09:37:52.845 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.132 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4227296551' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.299 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.301 222021 DEBUG nova.virt.libvirt.vif [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-2129961106',display_name='tempest-ServersAdminTestJSON-server-2129961106',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-2129961106',id=33,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-ob9oijh4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:37:44Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=cc1d9141-e8bc-42a4-9690-50bc53d25998,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.302 222021 DEBUG nova.network.os_vif_util [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.303 222021 DEBUG nova.network.os_vif_util [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:c9:5e,bridge_name='br-int',has_traffic_filtering=True,id=bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc45bc9a-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.305 222021 DEBUG nova.objects.instance [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'pci_devices' on Instance uuid cc1d9141-e8bc-42a4-9690-50bc53d25998 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.354 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <uuid>cc1d9141-e8bc-42a4-9690-50bc53d25998</uuid>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <name>instance-00000021</name>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersAdminTestJSON-server-2129961106</nova:name>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:37:52</nova:creationTime>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <nova:user uuid="191a72cfd0a841e9806246e07eb62fa6">tempest-ServersAdminTestJSON-1167530593-project-member</nova:user>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <nova:project uuid="1a5f46b255cd4387bd3e4c0acaa39466">tempest-ServersAdminTestJSON-1167530593</nova:project>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <nova:port uuid="bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <entry name="serial">cc1d9141-e8bc-42a4-9690-50bc53d25998</entry>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <entry name="uuid">cc1d9141-e8bc-42a4-9690-50bc53d25998</entry>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/cc1d9141-e8bc-42a4-9690-50bc53d25998_disk">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/cc1d9141-e8bc-42a4-9690-50bc53d25998_disk.config">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:24:c9:5e"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <target dev="tapbc45bc9a-0c"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998/console.log" append="off"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:37:53 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:37:53 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:37:53 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:37:53 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.356 222021 DEBUG nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Preparing to wait for external event network-vif-plugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.356 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.357 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.357 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.358 222021 DEBUG nova.virt.libvirt.vif [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-2129961106',display_name='tempest-ServersAdminTestJSON-server-2129961106',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-2129961106',id=33,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-ob9oijh4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:37:44Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=cc1d9141-e8bc-42a4-9690-50bc53d25998,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.358 222021 DEBUG nova.network.os_vif_util [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.359 222021 DEBUG nova.network.os_vif_util [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:c9:5e,bridge_name='br-int',has_traffic_filtering=True,id=bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc45bc9a-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.360 222021 DEBUG os_vif [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:c9:5e,bridge_name='br-int',has_traffic_filtering=True,id=bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc45bc9a-0c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.361 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.362 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.366 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.366 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc45bc9a-0c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.367 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbc45bc9a-0c, col_values=(('external_ids', {'iface-id': 'bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:c9:5e', 'vm-uuid': 'cc1d9141-e8bc-42a4-9690-50bc53d25998'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.368 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.370 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:37:53 np0005593233 NetworkManager[48871]: <info>  [1769161073.3720] manager: (tapbc45bc9a-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.377 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.378 222021 INFO os_vif [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:c9:5e,bridge_name='br-int',has_traffic_filtering=True,id=bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc45bc9a-0c')#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.530 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.531 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.532 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No VIF found with MAC fa:16:3e:24:c9:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.534 222021 INFO nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Using config drive#033[00m
Jan 23 04:37:53 np0005593233 nova_compute[222017]: 2026-01-23 09:37:53.562 222021 DEBUG nova.storage.rbd_utils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image cc1d9141-e8bc-42a4-9690-50bc53d25998_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.058 222021 INFO nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Creating config drive at /var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998/disk.config#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.066 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj1_am94n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.092 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.186 222021 DEBUG nova.network.neutron [req-f32b61f1-ef63-4611-93b5-5b3b8bdf827b req-80190171-a248-420a-a29a-86f829c1e0ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Updated VIF entry in instance network info cache for port bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.187 222021 DEBUG nova.network.neutron [req-f32b61f1-ef63-4611-93b5-5b3b8bdf827b req-80190171-a248-420a-a29a-86f829c1e0ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Updating instance_info_cache with network_info: [{"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.201 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj1_am94n" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.233 222021 DEBUG nova.storage.rbd_utils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image cc1d9141-e8bc-42a4-9690-50bc53d25998_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.239 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998/disk.config cc1d9141-e8bc-42a4-9690-50bc53d25998_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.279 222021 DEBUG oslo_concurrency.lockutils [req-f32b61f1-ef63-4611-93b5-5b3b8bdf827b req-80190171-a248-420a-a29a-86f829c1e0ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-cc1d9141-e8bc-42a4-9690-50bc53d25998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.456 222021 DEBUG oslo_concurrency.processutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998/disk.config cc1d9141-e8bc-42a4-9690-50bc53d25998_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.217s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.457 222021 INFO nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Deleting local config drive /var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998/disk.config because it was imported into RBD.#033[00m
Jan 23 04:37:54 np0005593233 kernel: tapbc45bc9a-0c: entered promiscuous mode
Jan 23 04:37:54 np0005593233 NetworkManager[48871]: <info>  [1769161074.5261] manager: (tapbc45bc9a-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Jan 23 04:37:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:54Z|00087|binding|INFO|Claiming lport bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e for this chassis.
Jan 23 04:37:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:54Z|00088|binding|INFO|bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e: Claiming fa:16:3e:24:c9:5e 10.100.0.12
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.529 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.537 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:c9:5e 10.100.0.12'], port_security=['fa:16:3e:24:c9:5e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'cc1d9141-e8bc-42a4-9690-50bc53d25998', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.539 140224 INFO neutron.agent.ovn.metadata.agent [-] Port bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c bound to our chassis#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.543 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c#033[00m
Jan 23 04:37:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:54Z|00089|binding|INFO|Setting lport bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e ovn-installed in OVS
Jan 23 04:37:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:54Z|00090|binding|INFO|Setting lport bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e up in Southbound
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.552 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.566 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dde0f571-841d-478d-942b-913c84b0f524]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:54 np0005593233 systemd-udevd[236247]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:37:54 np0005593233 systemd-machined[190954]: New machine qemu-19-instance-00000021.
Jan 23 04:37:54 np0005593233 systemd[1]: Started Virtual Machine qemu-19-instance-00000021.
Jan 23 04:37:54 np0005593233 NetworkManager[48871]: <info>  [1769161074.6030] device (tapbc45bc9a-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:37:54 np0005593233 NetworkManager[48871]: <info>  [1769161074.6040] device (tapbc45bc9a-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.604 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[25a0b76f-a4b5-4135-a919-7f6a30beec0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.608 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[82307506-d64b-4c7a-9b83-193e91a60c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.637 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d8f0409a-0dfe-4081-a667-a7057260021e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.655 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc5706f-fd76-46ea-9638-bfc4e34a07a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496114, 'reachable_time': 19661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236256, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.669 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[81b54de4-e6a7-40c0-861e-c1441ebaee00]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496128, 'tstamp': 496128}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236260, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496132, 'tstamp': 496132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236260, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.672 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.674 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:54 np0005593233 nova_compute[222017]: 2026-01-23 09:37:54.675 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.678 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f2b13ad-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.679 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.679 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f2b13ad-70, col_values=(('external_ids', {'iface-id': '5880c863-f7b0-4399-b221-f31849823320'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:54.680 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:54.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:37:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:54.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.135 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161075.134517, cc1d9141-e8bc-42a4-9690-50bc53d25998 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.135 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] VM Started (Lifecycle Event)#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.170 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.175 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161075.1347024, cc1d9141-e8bc-42a4-9690-50bc53d25998 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.176 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.198 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.203 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.234 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.287 222021 DEBUG nova.compute.manager [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Received event network-vif-plugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.288 222021 DEBUG oslo_concurrency.lockutils [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.289 222021 DEBUG oslo_concurrency.lockutils [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.289 222021 DEBUG oslo_concurrency.lockutils [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.290 222021 DEBUG nova.compute.manager [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Processing event network-vif-plugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.291 222021 DEBUG nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.295 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161075.2952883, cc1d9141-e8bc-42a4-9690-50bc53d25998 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.295 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.297 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.301 222021 INFO nova.virt.libvirt.driver [-] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Instance spawned successfully.#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.301 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.354 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.361 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.365 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.365 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.366 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.366 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.367 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.367 222021 DEBUG nova.virt.libvirt.driver [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:37:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:37:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:37:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:37:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.425 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.491 222021 INFO nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Took 11.31 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.492 222021 DEBUG nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.690 222021 INFO nova.compute.manager [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Took 12.63 seconds to build instance.#033[00m
Jan 23 04:37:55 np0005593233 nova_compute[222017]: 2026-01-23 09:37:55.740 222021 DEBUG oslo_concurrency.lockutils [None req-422ed295-2f0b-4b58-8f9c-0c47378cf854 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.780s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:56.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:56.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.184 222021 DEBUG oslo_concurrency.lockutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.186 222021 DEBUG oslo_concurrency.lockutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.188 222021 DEBUG oslo_concurrency.lockutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.188 222021 DEBUG oslo_concurrency.lockutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.189 222021 DEBUG oslo_concurrency.lockutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.192 222021 INFO nova.compute.manager [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Terminating instance#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.194 222021 DEBUG nova.compute.manager [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:37:57 np0005593233 kernel: tap257f01a3-c4 (unregistering): left promiscuous mode
Jan 23 04:37:57 np0005593233 NetworkManager[48871]: <info>  [1769161077.3272] device (tap257f01a3-c4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:37:57 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:57Z|00091|binding|INFO|Releasing lport 257f01a3-c4a8-4e7f-a76b-ad302970586d from this chassis (sb_readonly=0)
Jan 23 04:37:57 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:57Z|00092|binding|INFO|Setting lport 257f01a3-c4a8-4e7f-a76b-ad302970586d down in Southbound
Jan 23 04:37:57 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:57Z|00093|binding|INFO|Removing iface tap257f01a3-c4 ovn-installed in OVS
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.359 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.371 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.371 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:b1:af 10.100.0.6'], port_security=['fa:16:3e:e1:b1:af 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f68f8c2203944c9a6e44a6756c8b4b9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cace19e3-4515-4611-845a-54d5fb8f6a17', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.248'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2448bc-0bf3-4fe3-aeb3-04d125f323ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=257f01a3-c4a8-4e7f-a76b-ad302970586d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.376 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 257f01a3-c4a8-4e7f-a76b-ad302970586d in datapath ef05741c-2d3e-419c-adbb-a2a3bca97f59 unbound from our chassis#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.378 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef05741c-2d3e-419c-adbb-a2a3bca97f59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.379 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[475c4dd0-80e5-4c19-bd41-46900ffebf9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.380 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 namespace which is not needed anymore#033[00m
Jan 23 04:37:57 np0005593233 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 23 04:37:57 np0005593233 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001c.scope: Consumed 17.068s CPU time.
Jan 23 04:37:57 np0005593233 systemd-machined[190954]: Machine qemu-17-instance-0000001c terminated.
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.439 222021 INFO nova.virt.libvirt.driver [-] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Instance destroyed successfully.#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.439 222021 DEBUG nova.objects.instance [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lazy-loading 'resources' on Instance uuid f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.469 222021 DEBUG nova.virt.libvirt.vif [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:36:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1382474509',display_name='tempest-VolumesAdminNegativeTest-server-1382474509',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-1382474509',id=28,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBELMwIbefZ+NS3vAuDYfwJRSTH1W1yIexIU189B5pmxU68Lx1yAkpZGNPjvBY4+9UGdkpB7hvqIyqFzykx4Z32orQ2CI/JRl2wMEWokCsnS+gbhMgRg9ToE9ME9CgCqyig==',key_name='tempest-keypair-863383197',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:36:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f68f8c2203944c9a6e44a6756c8b4b9',ramdisk_id='',reservation_id='r-m3to0xya',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-VolumesAdminNegativeTest-905168495',owner_user_name='tempest-VolumesAdminNegativeTest-905168495-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:36:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='726bd44b7ec443a0a4b8b632b06c622e',uuid=f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.470 222021 DEBUG nova.network.os_vif_util [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converting VIF {"id": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "address": "fa:16:3e:e1:b1:af", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap257f01a3-c4", "ovs_interfaceid": "257f01a3-c4a8-4e7f-a76b-ad302970586d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.471 222021 DEBUG nova.network.os_vif_util [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e1:b1:af,bridge_name='br-int',has_traffic_filtering=True,id=257f01a3-c4a8-4e7f-a76b-ad302970586d,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap257f01a3-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.471 222021 DEBUG os_vif [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:b1:af,bridge_name='br-int',has_traffic_filtering=True,id=257f01a3-c4a8-4e7f-a76b-ad302970586d,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap257f01a3-c4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.473 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.474 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap257f01a3-c4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.476 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.480 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.482 222021 INFO os_vif [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:b1:af,bridge_name='br-int',has_traffic_filtering=True,id=257f01a3-c4a8-4e7f-a76b-ad302970586d,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap257f01a3-c4')#033[00m
Jan 23 04:37:57 np0005593233 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[235146]: [NOTICE]   (235150) : haproxy version is 2.8.14-c23fe91
Jan 23 04:37:57 np0005593233 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[235146]: [NOTICE]   (235150) : path to executable is /usr/sbin/haproxy
Jan 23 04:37:57 np0005593233 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[235146]: [WARNING]  (235150) : Exiting Master process...
Jan 23 04:37:57 np0005593233 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[235146]: [WARNING]  (235150) : Exiting Master process...
Jan 23 04:37:57 np0005593233 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[235146]: [ALERT]    (235150) : Current worker (235152) exited with code 143 (Terminated)
Jan 23 04:37:57 np0005593233 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[235146]: [WARNING]  (235150) : All workers exited. Exiting... (0)
Jan 23 04:37:57 np0005593233 systemd[1]: libpod-d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70.scope: Deactivated successfully.
Jan 23 04:37:57 np0005593233 podman[236335]: 2026-01-23 09:37:57.563688675 +0000 UTC m=+0.057834144 container died d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 04:37:57 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70-userdata-shm.mount: Deactivated successfully.
Jan 23 04:37:57 np0005593233 systemd[1]: var-lib-containers-storage-overlay-c8039dbf9572d51a7410ec776136cc29f907bad6c0072fdaf910ea7acc90d896-merged.mount: Deactivated successfully.
Jan 23 04:37:57 np0005593233 podman[236335]: 2026-01-23 09:37:57.626928021 +0000 UTC m=+0.121073490 container cleanup d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.640 222021 DEBUG nova.compute.manager [req-28427f2f-ebad-4f20-add1-145a76c34653 req-0bdd7226-5673-4445-a00f-617b97a47b98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Received event network-vif-plugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.640 222021 DEBUG oslo_concurrency.lockutils [req-28427f2f-ebad-4f20-add1-145a76c34653 req-0bdd7226-5673-4445-a00f-617b97a47b98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.640 222021 DEBUG oslo_concurrency.lockutils [req-28427f2f-ebad-4f20-add1-145a76c34653 req-0bdd7226-5673-4445-a00f-617b97a47b98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.640 222021 DEBUG oslo_concurrency.lockutils [req-28427f2f-ebad-4f20-add1-145a76c34653 req-0bdd7226-5673-4445-a00f-617b97a47b98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.640 222021 DEBUG nova.compute.manager [req-28427f2f-ebad-4f20-add1-145a76c34653 req-0bdd7226-5673-4445-a00f-617b97a47b98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] No waiting events found dispatching network-vif-plugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.641 222021 WARNING nova.compute.manager [req-28427f2f-ebad-4f20-add1-145a76c34653 req-0bdd7226-5673-4445-a00f-617b97a47b98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Received unexpected event network-vif-plugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e for instance with vm_state active and task_state None.#033[00m
Jan 23 04:37:57 np0005593233 systemd[1]: libpod-conmon-d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70.scope: Deactivated successfully.
Jan 23 04:37:57 np0005593233 podman[236383]: 2026-01-23 09:37:57.702700664 +0000 UTC m=+0.050007412 container remove d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.714 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c95fda-4621-489e-bddc-86be7838043d]: (4, ('Fri Jan 23 09:37:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 (d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70)\nd9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70\nFri Jan 23 09:37:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 (d9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70)\nd9f4fbe99539a5870bc527ec69edfa20624ae5ea3c5e4e03a62e639262895a70\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.716 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd91d1e-82b5-4555-939b-3bd5e93073dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.717 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef05741c-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.719 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:57 np0005593233 kernel: tapef05741c-20: left promiscuous mode
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.740 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.746 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[df128469-305b-4139-aa2f-e49d794d1bf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.772 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6201655f-21bf-4824-b1e5-4778ed7735a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.774 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb1b0e9-6c36-4a48-be26-f1d2937a0584]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.791 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[78fb5cd1-ed80-49b9-b1f4-850c42cf6f54]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492896, 'reachable_time': 31339, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236397, 'error': None, 'target': 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593233 systemd[1]: run-netns-ovnmeta\x2def05741c\x2d2d3e\x2d419c\x2dadbb\x2da2a3bca97f59.mount: Deactivated successfully.
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.796 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:37:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:57.796 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[b06eb14e-4e92-4317-9f8b-81bb8921d70a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.885 222021 DEBUG nova.compute.manager [req-95367401-cd0f-404d-bfe7-a48684a71d0c req-fc1168a4-6cf8-422f-b028-06c850a3fe55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received event network-vif-unplugged-257f01a3-c4a8-4e7f-a76b-ad302970586d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.885 222021 DEBUG oslo_concurrency.lockutils [req-95367401-cd0f-404d-bfe7-a48684a71d0c req-fc1168a4-6cf8-422f-b028-06c850a3fe55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.885 222021 DEBUG oslo_concurrency.lockutils [req-95367401-cd0f-404d-bfe7-a48684a71d0c req-fc1168a4-6cf8-422f-b028-06c850a3fe55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.886 222021 DEBUG oslo_concurrency.lockutils [req-95367401-cd0f-404d-bfe7-a48684a71d0c req-fc1168a4-6cf8-422f-b028-06c850a3fe55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.886 222021 DEBUG nova.compute.manager [req-95367401-cd0f-404d-bfe7-a48684a71d0c req-fc1168a4-6cf8-422f-b028-06c850a3fe55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] No waiting events found dispatching network-vif-unplugged-257f01a3-c4a8-4e7f-a76b-ad302970586d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.886 222021 DEBUG nova.compute.manager [req-95367401-cd0f-404d-bfe7-a48684a71d0c req-fc1168a4-6cf8-422f-b028-06c850a3fe55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received event network-vif-unplugged-257f01a3-c4a8-4e7f-a76b-ad302970586d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:37:57 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:57Z|00094|memory|INFO|peak resident set size grew 51% in last 1450.8 seconds, from 16256 kB to 24564 kB
Jan 23 04:37:57 np0005593233 ovn_controller[130653]: 2026-01-23T09:37:57Z|00095|memory|INFO|idl-cells-OVN_Southbound:10903 idl-cells-Open_vSwitch:927 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:381 lflow-cache-entries-cache-matches:294 lflow-cache-size-KB:1603 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:657 ofctrl_installed_flow_usage-KB:480 ofctrl_sb_flow_ref_usage-KB:246
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.990 222021 INFO nova.virt.libvirt.driver [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Deleting instance files /var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_del#033[00m
Jan 23 04:37:57 np0005593233 nova_compute[222017]: 2026-01-23 09:37:57.991 222021 INFO nova.virt.libvirt.driver [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Deletion of /var/lib/nova/instances/f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc_del complete#033[00m
Jan 23 04:37:58 np0005593233 nova_compute[222017]: 2026-01-23 09:37:58.080 222021 INFO nova.compute.manager [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Took 0.89 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:37:58 np0005593233 nova_compute[222017]: 2026-01-23 09:37:58.080 222021 DEBUG oslo.service.loopingcall [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:37:58 np0005593233 nova_compute[222017]: 2026-01-23 09:37:58.080 222021 DEBUG nova.compute.manager [-] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:37:58 np0005593233 nova_compute[222017]: 2026-01-23 09:37:58.081 222021 DEBUG nova.network.neutron [-] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:37:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:58.082 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:58.084 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:37:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:37:58.085 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:58 np0005593233 nova_compute[222017]: 2026-01-23 09:37:58.085 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3734079513' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:58.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:37:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:37:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:58.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:37:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:59 np0005593233 nova_compute[222017]: 2026-01-23 09:37:59.089 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.651 222021 DEBUG nova.compute.manager [req-aa74cf79-3877-4bc4-a163-88de3fc7c79a req-632fad99-f7bd-4f22-b64a-f8fba84442c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received event network-vif-plugged-257f01a3-c4a8-4e7f-a76b-ad302970586d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.651 222021 DEBUG oslo_concurrency.lockutils [req-aa74cf79-3877-4bc4-a163-88de3fc7c79a req-632fad99-f7bd-4f22-b64a-f8fba84442c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.653 222021 DEBUG oslo_concurrency.lockutils [req-aa74cf79-3877-4bc4-a163-88de3fc7c79a req-632fad99-f7bd-4f22-b64a-f8fba84442c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.654 222021 DEBUG oslo_concurrency.lockutils [req-aa74cf79-3877-4bc4-a163-88de3fc7c79a req-632fad99-f7bd-4f22-b64a-f8fba84442c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.654 222021 DEBUG nova.compute.manager [req-aa74cf79-3877-4bc4-a163-88de3fc7c79a req-632fad99-f7bd-4f22-b64a-f8fba84442c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] No waiting events found dispatching network-vif-plugged-257f01a3-c4a8-4e7f-a76b-ad302970586d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.655 222021 WARNING nova.compute.manager [req-aa74cf79-3877-4bc4-a163-88de3fc7c79a req-632fad99-f7bd-4f22-b64a-f8fba84442c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received unexpected event network-vif-plugged-257f01a3-c4a8-4e7f-a76b-ad302970586d for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:38:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:00.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.779 222021 DEBUG nova.network.neutron [-] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.805 222021 INFO nova.compute.manager [-] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Took 2.72 seconds to deallocate network for instance.#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.884 222021 DEBUG oslo_concurrency.lockutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.885 222021 DEBUG oslo_concurrency.lockutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:00 np0005593233 nova_compute[222017]: 2026-01-23 09:38:00.928 222021 DEBUG nova.compute.manager [req-cdbb9e6f-d946-4e43-9694-a1a04e0c2356 req-363bd705-3edc-46c4-8ce3-29e768e96372 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Received event network-vif-deleted-257f01a3-c4a8-4e7f-a76b-ad302970586d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:00.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:01 np0005593233 nova_compute[222017]: 2026-01-23 09:38:01.108 222021 DEBUG oslo_concurrency.processutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:01 np0005593233 nova_compute[222017]: 2026-01-23 09:38:01.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:38:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2117449316' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:38:01 np0005593233 nova_compute[222017]: 2026-01-23 09:38:01.599 222021 DEBUG oslo_concurrency.processutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:01 np0005593233 nova_compute[222017]: 2026-01-23 09:38:01.605 222021 DEBUG nova.compute.provider_tree [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:38:01 np0005593233 nova_compute[222017]: 2026-01-23 09:38:01.624 222021 DEBUG nova.scheduler.client.report [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:38:01 np0005593233 nova_compute[222017]: 2026-01-23 09:38:01.665 222021 DEBUG oslo_concurrency.lockutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.780s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:01 np0005593233 nova_compute[222017]: 2026-01-23 09:38:01.709 222021 INFO nova.scheduler.client.report [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Deleted allocations for instance f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc#033[00m
Jan 23 04:38:01 np0005593233 nova_compute[222017]: 2026-01-23 09:38:01.817 222021 DEBUG oslo_concurrency.lockutils [None req-3edafdbb-692d-443f-8c74-517f5c8d18fd 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:38:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:38:02 np0005593233 nova_compute[222017]: 2026-01-23 09:38:02.513 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:02.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:02.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:03 np0005593233 podman[236476]: 2026-01-23 09:38:03.104535883 +0000 UTC m=+0.110097689 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller)
Jan 23 04:38:03 np0005593233 nova_compute[222017]: 2026-01-23 09:38:03.420 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:04 np0005593233 nova_compute[222017]: 2026-01-23 09:38:04.090 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:04.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:04.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:05 np0005593233 nova_compute[222017]: 2026-01-23 09:38:05.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:06 np0005593233 nova_compute[222017]: 2026-01-23 09:38:06.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:06.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:06.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:07 np0005593233 nova_compute[222017]: 2026-01-23 09:38:07.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:07 np0005593233 nova_compute[222017]: 2026-01-23 09:38:07.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:07 np0005593233 nova_compute[222017]: 2026-01-23 09:38:07.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:07 np0005593233 nova_compute[222017]: 2026-01-23 09:38:07.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:07 np0005593233 nova_compute[222017]: 2026-01-23 09:38:07.415 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:38:07 np0005593233 nova_compute[222017]: 2026-01-23 09:38:07.416 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:07 np0005593233 nova_compute[222017]: 2026-01-23 09:38:07.515 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:38:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2656382896' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:38:07 np0005593233 nova_compute[222017]: 2026-01-23 09:38:07.950 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.067 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000021 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.068 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000021 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.072 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000001e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.072 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000001e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.268 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.269 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4251MB free_disk=20.79494857788086GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.270 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.270 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:08.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.865 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 45a72d2f-6d73-4d45-873b-96eff48e3d22 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.865 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance cc1d9141-e8bc-42a4-9690-50bc53d25998 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.865 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:38:08 np0005593233 nova_compute[222017]: 2026-01-23 09:38:08.866 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:38:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:08.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:09 np0005593233 nova_compute[222017]: 2026-01-23 09:38:09.054 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:09 np0005593233 nova_compute[222017]: 2026-01-23 09:38:09.095 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:38:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/818591784' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:38:09 np0005593233 nova_compute[222017]: 2026-01-23 09:38:09.553 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:09 np0005593233 nova_compute[222017]: 2026-01-23 09:38:09.562 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:38:09 np0005593233 nova_compute[222017]: 2026-01-23 09:38:09.582 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:38:09 np0005593233 nova_compute[222017]: 2026-01-23 09:38:09.622 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:38:09 np0005593233 nova_compute[222017]: 2026-01-23 09:38:09.623 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.353s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:10 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:10Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:c9:5e 10.100.0.12
Jan 23 04:38:10 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:10Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:c9:5e 10.100.0.12
Jan 23 04:38:10 np0005593233 nova_compute[222017]: 2026-01-23 09:38:10.625 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:10 np0005593233 nova_compute[222017]: 2026-01-23 09:38:10.625 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:38:10 np0005593233 nova_compute[222017]: 2026-01-23 09:38:10.625 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:38:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:10.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:10 np0005593233 nova_compute[222017]: 2026-01-23 09:38:10.861 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-45a72d2f-6d73-4d45-873b-96eff48e3d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:38:10 np0005593233 nova_compute[222017]: 2026-01-23 09:38:10.862 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-45a72d2f-6d73-4d45-873b-96eff48e3d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:38:10 np0005593233 nova_compute[222017]: 2026-01-23 09:38:10.862 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:38:10 np0005593233 nova_compute[222017]: 2026-01-23 09:38:10.863 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:10.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:12 np0005593233 nova_compute[222017]: 2026-01-23 09:38:12.436 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161077.434449, f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:12 np0005593233 nova_compute[222017]: 2026-01-23 09:38:12.436 222021 INFO nova.compute.manager [-] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:38:12 np0005593233 nova_compute[222017]: 2026-01-23 09:38:12.465 222021 DEBUG nova.compute.manager [None req-0a071736-b5a4-4caf-904c-769878ef4f07 - - - - - -] [instance: f8b984c8-aa93-48e5-b6c5-15c6b22ca3cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:12 np0005593233 nova_compute[222017]: 2026-01-23 09:38:12.523 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:12.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:12.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:14 np0005593233 nova_compute[222017]: 2026-01-23 09:38:14.095 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:14.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:14.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:16 np0005593233 podman[236548]: 2026-01-23 09:38:16.056418315 +0000 UTC m=+0.067815167 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:38:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:16.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:16 np0005593233 nova_compute[222017]: 2026-01-23 09:38:16.981 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Updating instance_info_cache with network_info: [{"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:16.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.142 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-45a72d2f-6d73-4d45-873b-96eff48e3d22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.143 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.144 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.145 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.145 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.146 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.146 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.147 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.428 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.429 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.429 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.519 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:38:17 np0005593233 nova_compute[222017]: 2026-01-23 09:38:17.525 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:18Z|00096|binding|INFO|Releasing lport 5880c863-f7b0-4399-b221-f31849823320 from this chassis (sb_readonly=0)
Jan 23 04:38:18 np0005593233 nova_compute[222017]: 2026-01-23 09:38:18.051 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:18Z|00097|binding|INFO|Releasing lport 5880c863-f7b0-4399-b221-f31849823320 from this chassis (sb_readonly=0)
Jan 23 04:38:18 np0005593233 nova_compute[222017]: 2026-01-23 09:38:18.403 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:18.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:19.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:19 np0005593233 nova_compute[222017]: 2026-01-23 09:38:19.097 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:20.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:21.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:22 np0005593233 nova_compute[222017]: 2026-01-23 09:38:22.529 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:22.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:23.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:24 np0005593233 nova_compute[222017]: 2026-01-23 09:38:24.100 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:24.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:25.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:38:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3488761959' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:38:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:38:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3488761959' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:38:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:26.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:27.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:27 np0005593233 nova_compute[222017]: 2026-01-23 09:38:27.532 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:28.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:29.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:29 np0005593233 nova_compute[222017]: 2026-01-23 09:38:29.103 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:30 np0005593233 nova_compute[222017]: 2026-01-23 09:38:30.570 222021 INFO nova.compute.manager [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Rebuilding instance#033[00m
Jan 23 04:38:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:30.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:31.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:31 np0005593233 nova_compute[222017]: 2026-01-23 09:38:31.557 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:31 np0005593233 nova_compute[222017]: 2026-01-23 09:38:31.597 222021 DEBUG nova.compute.manager [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:31 np0005593233 nova_compute[222017]: 2026-01-23 09:38:31.738 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'pci_requests' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:31 np0005593233 nova_compute[222017]: 2026-01-23 09:38:31.772 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'pci_devices' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:31 np0005593233 nova_compute[222017]: 2026-01-23 09:38:31.799 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'resources' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:31 np0005593233 nova_compute[222017]: 2026-01-23 09:38:31.813 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'migration_context' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:31 np0005593233 nova_compute[222017]: 2026-01-23 09:38:31.841 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:38:31 np0005593233 nova_compute[222017]: 2026-01-23 09:38:31.846 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:38:32 np0005593233 nova_compute[222017]: 2026-01-23 09:38:32.534 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:32.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:33.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.104 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593233 podman[236569]: 2026-01-23 09:38:34.126457006 +0000 UTC m=+0.121087190 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 04:38:34 np0005593233 kernel: tap541053d6-d3 (unregistering): left promiscuous mode
Jan 23 04:38:34 np0005593233 NetworkManager[48871]: <info>  [1769161114.1649] device (tap541053d6-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.176 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:34Z|00098|binding|INFO|Releasing lport 541053d6-d3d0-4da0-9b9c-630177f53234 from this chassis (sb_readonly=0)
Jan 23 04:38:34 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:34Z|00099|binding|INFO|Setting lport 541053d6-d3d0-4da0-9b9c-630177f53234 down in Southbound
Jan 23 04:38:34 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:34Z|00100|binding|INFO|Removing iface tap541053d6-d3 ovn-installed in OVS
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.185 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.196 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593233 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 23 04:38:34 np0005593233 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001e.scope: Consumed 17.852s CPU time.
Jan 23 04:38:34 np0005593233 systemd-machined[190954]: Machine qemu-18-instance-0000001e terminated.
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.264 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:99:47 10.100.0.9'], port_security=['fa:16:3e:ab:99:47 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45a72d2f-6d73-4d45-873b-96eff48e3d22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=541053d6-d3d0-4da0-9b9c-630177f53234) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.265 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 541053d6-d3d0-4da0-9b9c-630177f53234 in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c unbound from our chassis#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.267 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.293 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[37e0b8a3-2903-402d-9d8c-113fa87893e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.337 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[94f2add4-c08c-49c1-b562-05b9e95719c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.341 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b5c4a982-78ae-4941-9828-83fe4ceec932]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.386 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8da28b-69ec-42b9-9e02-7a6a04f55036]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.411 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b95db702-38dc-411f-8a3f-5e90ff2cbcd1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496114, 'reachable_time': 19661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236608, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.432 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[79ea2960-efc8-4ba1-95e1-8718668551ce]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496128, 'tstamp': 496128}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236615, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496132, 'tstamp': 496132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236615, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.434 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.436 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.441 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.441 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f2b13ad-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.441 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.442 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f2b13ad-70, col_values=(('external_ids', {'iface-id': '5880c863-f7b0-4399-b221-f31849823320'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:34.442 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:34.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.868 222021 INFO nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.875 222021 INFO nova.virt.libvirt.driver [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance destroyed successfully.#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.881 222021 INFO nova.virt.libvirt.driver [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance destroyed successfully.#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.882 222021 DEBUG nova.virt.libvirt.vif [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1151494829',display_name='tempest-ServersAdminTestJSON-server-1151494829',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1151494829',id=30,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-0lc5vfd4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:38:29Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=45a72d2f-6d73-4d45-873b-96eff48e3d22,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.882 222021 DEBUG nova.network.os_vif_util [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.883 222021 DEBUG nova.network.os_vif_util [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.884 222021 DEBUG os_vif [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.885 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.885 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap541053d6-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.887 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.889 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.890 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.892 222021 INFO os_vif [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3')#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.944 222021 DEBUG nova.compute.manager [req-95b5f309-b938-4856-9f48-a57444303985 req-754551ac-c6d7-4981-b389-6583f8e39e1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-unplugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.945 222021 DEBUG oslo_concurrency.lockutils [req-95b5f309-b938-4856-9f48-a57444303985 req-754551ac-c6d7-4981-b389-6583f8e39e1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.945 222021 DEBUG oslo_concurrency.lockutils [req-95b5f309-b938-4856-9f48-a57444303985 req-754551ac-c6d7-4981-b389-6583f8e39e1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.946 222021 DEBUG oslo_concurrency.lockutils [req-95b5f309-b938-4856-9f48-a57444303985 req-754551ac-c6d7-4981-b389-6583f8e39e1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.946 222021 DEBUG nova.compute.manager [req-95b5f309-b938-4856-9f48-a57444303985 req-754551ac-c6d7-4981-b389-6583f8e39e1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-unplugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:38:34 np0005593233 nova_compute[222017]: 2026-01-23 09:38:34.946 222021 WARNING nova.compute.manager [req-95b5f309-b938-4856-9f48-a57444303985 req-754551ac-c6d7-4981-b389-6583f8e39e1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-unplugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state error and task_state rebuilding.#033[00m
Jan 23 04:38:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:35.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:35 np0005593233 nova_compute[222017]: 2026-01-23 09:38:35.542 222021 INFO nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deleting instance files /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22_del#033[00m
Jan 23 04:38:35 np0005593233 nova_compute[222017]: 2026-01-23 09:38:35.543 222021 INFO nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deletion of /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22_del complete#033[00m
Jan 23 04:38:35 np0005593233 nova_compute[222017]: 2026-01-23 09:38:35.802 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:38:35 np0005593233 nova_compute[222017]: 2026-01-23 09:38:35.803 222021 INFO nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Creating image(s)#033[00m
Jan 23 04:38:35 np0005593233 nova_compute[222017]: 2026-01-23 09:38:35.830 222021 DEBUG nova.storage.rbd_utils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:35 np0005593233 nova_compute[222017]: 2026-01-23 09:38:35.857 222021 DEBUG nova.storage.rbd_utils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:35 np0005593233 nova_compute[222017]: 2026-01-23 09:38:35.886 222021 DEBUG nova.storage.rbd_utils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:35 np0005593233 nova_compute[222017]: 2026-01-23 09:38:35.891 222021 DEBUG oslo_concurrency.lockutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:35 np0005593233 nova_compute[222017]: 2026-01-23 09:38:35.892 222021 DEBUG oslo_concurrency.lockutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:36.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:37.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:37 np0005593233 nova_compute[222017]: 2026-01-23 09:38:37.342 222021 DEBUG nova.compute.manager [req-d5118fa9-2efb-4b8d-901a-76645aa3d6d1 req-a1e65bac-fb6a-4cf8-899b-a6e9479fcf5b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:37 np0005593233 nova_compute[222017]: 2026-01-23 09:38:37.343 222021 DEBUG oslo_concurrency.lockutils [req-d5118fa9-2efb-4b8d-901a-76645aa3d6d1 req-a1e65bac-fb6a-4cf8-899b-a6e9479fcf5b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:37 np0005593233 nova_compute[222017]: 2026-01-23 09:38:37.344 222021 DEBUG oslo_concurrency.lockutils [req-d5118fa9-2efb-4b8d-901a-76645aa3d6d1 req-a1e65bac-fb6a-4cf8-899b-a6e9479fcf5b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:37 np0005593233 nova_compute[222017]: 2026-01-23 09:38:37.344 222021 DEBUG oslo_concurrency.lockutils [req-d5118fa9-2efb-4b8d-901a-76645aa3d6d1 req-a1e65bac-fb6a-4cf8-899b-a6e9479fcf5b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:37 np0005593233 nova_compute[222017]: 2026-01-23 09:38:37.344 222021 DEBUG nova.compute.manager [req-d5118fa9-2efb-4b8d-901a-76645aa3d6d1 req-a1e65bac-fb6a-4cf8-899b-a6e9479fcf5b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:38:37 np0005593233 nova_compute[222017]: 2026-01-23 09:38:37.344 222021 WARNING nova.compute.manager [req-d5118fa9-2efb-4b8d-901a-76645aa3d6d1 req-a1e65bac-fb6a-4cf8-899b-a6e9479fcf5b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state error and task_state rebuild_spawning.#033[00m
Jan 23 04:38:37 np0005593233 nova_compute[222017]: 2026-01-23 09:38:37.475 222021 DEBUG nova.virt.libvirt.imagebackend [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/ae1f9e37-418c-462f-81d1-3599a6d89de9/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/ae1f9e37-418c-462f-81d1-3599a6d89de9/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 04:38:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:38.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:39.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:39 np0005593233 nova_compute[222017]: 2026-01-23 09:38:39.107 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:39 np0005593233 nova_compute[222017]: 2026-01-23 09:38:39.888 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:40 np0005593233 nova_compute[222017]: 2026-01-23 09:38:40.576 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:40 np0005593233 nova_compute[222017]: 2026-01-23 09:38:40.649 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.part --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:40 np0005593233 nova_compute[222017]: 2026-01-23 09:38:40.650 222021 DEBUG nova.virt.images [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] ae1f9e37-418c-462f-81d1-3599a6d89de9 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 23 04:38:40 np0005593233 nova_compute[222017]: 2026-01-23 09:38:40.651 222021 DEBUG nova.privsep.utils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 23 04:38:40 np0005593233 nova_compute[222017]: 2026-01-23 09:38:40.652 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.part /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:40.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:40 np0005593233 nova_compute[222017]: 2026-01-23 09:38:40.897 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.part /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.converted" returned: 0 in 0.245s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:40 np0005593233 nova_compute[222017]: 2026-01-23 09:38:40.903 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:40 np0005593233 nova_compute[222017]: 2026-01-23 09:38:40.977 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.converted --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:40 np0005593233 nova_compute[222017]: 2026-01-23 09:38:40.978 222021 DEBUG oslo_concurrency.lockutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 5.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.006 222021 DEBUG nova.storage.rbd_utils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.011 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:41.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.366 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.355s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.444 222021 DEBUG nova.storage.rbd_utils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] resizing rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.559 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.560 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Ensure instance console log exists: /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.560 222021 DEBUG oslo_concurrency.lockutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.561 222021 DEBUG oslo_concurrency.lockutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.561 222021 DEBUG oslo_concurrency.lockutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.563 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Start _get_guest_xml network_info=[{"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.568 222021 WARNING nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.580 222021 DEBUG nova.virt.libvirt.host [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.581 222021 DEBUG nova.virt.libvirt.host [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.593 222021 DEBUG nova.virt.libvirt.host [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.594 222021 DEBUG nova.virt.libvirt.host [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.596 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.596 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.596 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.597 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.597 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.597 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.597 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.597 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.598 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.598 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.598 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.598 222021 DEBUG nova.virt.hardware [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.599 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:41 np0005593233 nova_compute[222017]: 2026-01-23 09:38:41.681 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:38:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1892494573' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.166 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.207 222021 DEBUG nova.storage.rbd_utils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.213 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:42.640 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:42.640 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:42.641 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:38:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3419663960' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.742 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.746 222021 DEBUG nova.virt.libvirt.vif [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1151494829',display_name='tempest-ServersAdminTestJSON-server-1151494829',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1151494829',id=30,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-0lc5vfd4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:38:35Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=45a72d2f-6d73-4d45-873b-96eff48e3d22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.747 222021 DEBUG nova.network.os_vif_util [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.748 222021 DEBUG nova.network.os_vif_util [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.753 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <uuid>45a72d2f-6d73-4d45-873b-96eff48e3d22</uuid>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <name>instance-0000001e</name>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersAdminTestJSON-server-1151494829</nova:name>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:38:41</nova:creationTime>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <nova:user uuid="191a72cfd0a841e9806246e07eb62fa6">tempest-ServersAdminTestJSON-1167530593-project-member</nova:user>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <nova:project uuid="1a5f46b255cd4387bd3e4c0acaa39466">tempest-ServersAdminTestJSON-1167530593</nova:project>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="ae1f9e37-418c-462f-81d1-3599a6d89de9"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <nova:port uuid="541053d6-d3d0-4da0-9b9c-630177f53234">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <entry name="serial">45a72d2f-6d73-4d45-873b-96eff48e3d22</entry>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <entry name="uuid">45a72d2f-6d73-4d45-873b-96eff48e3d22</entry>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/45a72d2f-6d73-4d45-873b-96eff48e3d22_disk">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:ab:99:47"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <target dev="tap541053d6-d3"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/console.log" append="off"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:38:42 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:38:42 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:38:42 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:38:42 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.754 222021 DEBUG nova.virt.libvirt.vif [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1151494829',display_name='tempest-ServersAdminTestJSON-server-1151494829',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1151494829',id=30,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-0lc5vfd4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:38:35Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=45a72d2f-6d73-4d45-873b-96eff48e3d22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.754 222021 DEBUG nova.network.os_vif_util [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.755 222021 DEBUG nova.network.os_vif_util [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.755 222021 DEBUG os_vif [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.756 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.756 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.757 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.760 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.760 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap541053d6-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.761 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap541053d6-d3, col_values=(('external_ids', {'iface-id': '541053d6-d3d0-4da0-9b9c-630177f53234', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:99:47', 'vm-uuid': '45a72d2f-6d73-4d45-873b-96eff48e3d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:42 np0005593233 NetworkManager[48871]: <info>  [1769161122.7642] manager: (tap541053d6-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.768 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:42.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:42 np0005593233 nova_compute[222017]: 2026-01-23 09:38:42.769 222021 INFO os_vif [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3')#033[00m
Jan 23 04:38:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:43.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:43 np0005593233 nova_compute[222017]: 2026-01-23 09:38:43.083 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:38:43 np0005593233 nova_compute[222017]: 2026-01-23 09:38:43.084 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:38:43 np0005593233 nova_compute[222017]: 2026-01-23 09:38:43.084 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No VIF found with MAC fa:16:3e:ab:99:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:38:43 np0005593233 nova_compute[222017]: 2026-01-23 09:38:43.085 222021 INFO nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Using config drive#033[00m
Jan 23 04:38:43 np0005593233 nova_compute[222017]: 2026-01-23 09:38:43.123 222021 DEBUG nova.storage.rbd_utils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:43 np0005593233 nova_compute[222017]: 2026-01-23 09:38:43.295 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:43 np0005593233 nova_compute[222017]: 2026-01-23 09:38:43.440 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'keypairs' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:44 np0005593233 nova_compute[222017]: 2026-01-23 09:38:44.110 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:44.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:45.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:46 np0005593233 nova_compute[222017]: 2026-01-23 09:38:46.671 222021 INFO nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Creating config drive at /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config#033[00m
Jan 23 04:38:46 np0005593233 nova_compute[222017]: 2026-01-23 09:38:46.678 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx7pfris_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:46.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:46 np0005593233 nova_compute[222017]: 2026-01-23 09:38:46.816 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx7pfris_" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:46 np0005593233 nova_compute[222017]: 2026-01-23 09:38:46.852 222021 DEBUG nova.storage.rbd_utils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:46 np0005593233 nova_compute[222017]: 2026-01-23 09:38:46.857 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:47.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:47 np0005593233 podman[236938]: 2026-01-23 09:38:47.048433408 +0000 UTC m=+0.056534986 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 04:38:47 np0005593233 nova_compute[222017]: 2026-01-23 09:38:47.764 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:47 np0005593233 nova_compute[222017]: 2026-01-23 09:38:47.790 222021 DEBUG oslo_concurrency.processutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.933s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:47 np0005593233 nova_compute[222017]: 2026-01-23 09:38:47.790 222021 INFO nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deleting local config drive /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config because it was imported into RBD.#033[00m
Jan 23 04:38:47 np0005593233 kernel: tap541053d6-d3: entered promiscuous mode
Jan 23 04:38:47 np0005593233 NetworkManager[48871]: <info>  [1769161127.8612] manager: (tap541053d6-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Jan 23 04:38:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:47Z|00101|binding|INFO|Claiming lport 541053d6-d3d0-4da0-9b9c-630177f53234 for this chassis.
Jan 23 04:38:47 np0005593233 nova_compute[222017]: 2026-01-23 09:38:47.864 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:47Z|00102|binding|INFO|541053d6-d3d0-4da0-9b9c-630177f53234: Claiming fa:16:3e:ab:99:47 10.100.0.9
Jan 23 04:38:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:47Z|00103|binding|INFO|Setting lport 541053d6-d3d0-4da0-9b9c-630177f53234 ovn-installed in OVS
Jan 23 04:38:47 np0005593233 nova_compute[222017]: 2026-01-23 09:38:47.882 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:47 np0005593233 nova_compute[222017]: 2026-01-23 09:38:47.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:47 np0005593233 systemd-udevd[236973]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:38:47 np0005593233 NetworkManager[48871]: <info>  [1769161127.9049] device (tap541053d6-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:38:47 np0005593233 NetworkManager[48871]: <info>  [1769161127.9056] device (tap541053d6-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:38:47 np0005593233 systemd-machined[190954]: New machine qemu-20-instance-0000001e.
Jan 23 04:38:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:38:47Z|00104|binding|INFO|Setting lport 541053d6-d3d0-4da0-9b9c-630177f53234 up in Southbound
Jan 23 04:38:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:47.952 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:99:47 10.100.0.9'], port_security=['fa:16:3e:ab:99:47 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45a72d2f-6d73-4d45-873b-96eff48e3d22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=541053d6-d3d0-4da0-9b9c-630177f53234) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:38:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:47.954 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 541053d6-d3d0-4da0-9b9c-630177f53234 in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c bound to our chassis#033[00m
Jan 23 04:38:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:47.957 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c#033[00m
Jan 23 04:38:47 np0005593233 systemd[1]: Started Virtual Machine qemu-20-instance-0000001e.
Jan 23 04:38:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:47.980 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7ba79969-b627-4685-ac18-080eabb30823]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.016 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[560b4d69-d258-442e-a994-29274767f954]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.019 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[12dea5f7-ffc2-4800-a92c-2e741e829d43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.056 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d656c30b-cb58-466b-9a22-82aee93565fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.078 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2a46711a-1b6d-4d83-b903-55be64c2253b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496114, 'reachable_time': 42994, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236990, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.099 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5d9faa68-ec9e-4ab7-994c-5c7821fbd5e6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496128, 'tstamp': 496128}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236991, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496132, 'tstamp': 496132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236991, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.101 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.103 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.104 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.106 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f2b13ad-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.107 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.107 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f2b13ad-70, col_values=(('external_ids', {'iface-id': '5880c863-f7b0-4399-b221-f31849823320'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:48.107 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.379 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for 45a72d2f-6d73-4d45-873b-96eff48e3d22 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.380 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161128.3793435, 45a72d2f-6d73-4d45-873b-96eff48e3d22 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.380 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.383 222021 DEBUG nova.compute.manager [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.383 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.387 222021 INFO nova.virt.libvirt.driver [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance spawned successfully.#033[00m
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.387 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.417 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.421 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.578 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.579 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161128.380258, 45a72d2f-6d73-4d45-873b-96eff48e3d22 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.579 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] VM Started (Lifecycle Event)
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.586 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.586 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.587 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.588 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.588 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:38:48 np0005593233 nova_compute[222017]: 2026-01-23 09:38:48.589 222021 DEBUG nova.virt.libvirt.driver [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:38:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:48.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:49.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:49 np0005593233 nova_compute[222017]: 2026-01-23 09:38:49.112 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:38:49 np0005593233 nova_compute[222017]: 2026-01-23 09:38:49.332 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:38:49 np0005593233 nova_compute[222017]: 2026-01-23 09:38:49.335 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:38:49 np0005593233 nova_compute[222017]: 2026-01-23 09:38:49.484 222021 DEBUG nova.compute.manager [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:38:49 np0005593233 nova_compute[222017]: 2026-01-23 09:38:49.491 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 04:38:50 np0005593233 nova_compute[222017]: 2026-01-23 09:38:50.281 222021 DEBUG oslo_concurrency.lockutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:38:50 np0005593233 nova_compute[222017]: 2026-01-23 09:38:50.283 222021 DEBUG oslo_concurrency.lockutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:38:50 np0005593233 nova_compute[222017]: 2026-01-23 09:38:50.283 222021 DEBUG nova.objects.instance [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 04:38:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:50.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:51.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:52 np0005593233 nova_compute[222017]: 2026-01-23 09:38:52.001 222021 DEBUG oslo_concurrency.lockutils [None req-f49b3b67-608e-4993-bb5d-a7912aa3971a 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 1.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:38:52 np0005593233 nova_compute[222017]: 2026-01-23 09:38:52.767 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:38:52 np0005593233 nova_compute[222017]: 2026-01-23 09:38:52.776 222021 DEBUG nova.compute.manager [req-01121919-5e6e-488f-a223-8d53206b2aaf req-36654b7f-1cd0-4e52-8622-a19e530e87c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:38:52 np0005593233 nova_compute[222017]: 2026-01-23 09:38:52.776 222021 DEBUG oslo_concurrency.lockutils [req-01121919-5e6e-488f-a223-8d53206b2aaf req-36654b7f-1cd0-4e52-8622-a19e530e87c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:38:52 np0005593233 nova_compute[222017]: 2026-01-23 09:38:52.776 222021 DEBUG oslo_concurrency.lockutils [req-01121919-5e6e-488f-a223-8d53206b2aaf req-36654b7f-1cd0-4e52-8622-a19e530e87c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:38:52 np0005593233 nova_compute[222017]: 2026-01-23 09:38:52.777 222021 DEBUG oslo_concurrency.lockutils [req-01121919-5e6e-488f-a223-8d53206b2aaf req-36654b7f-1cd0-4e52-8622-a19e530e87c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:38:52 np0005593233 nova_compute[222017]: 2026-01-23 09:38:52.777 222021 DEBUG nova.compute.manager [req-01121919-5e6e-488f-a223-8d53206b2aaf req-36654b7f-1cd0-4e52-8622-a19e530e87c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:38:52 np0005593233 nova_compute[222017]: 2026-01-23 09:38:52.777 222021 WARNING nova.compute.manager [req-01121919-5e6e-488f-a223-8d53206b2aaf req-36654b7f-1cd0-4e52-8622-a19e530e87c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state active and task_state None.
Jan 23 04:38:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:38:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:52.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:38:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:53.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:53.251 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:38:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:38:53.253 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 04:38:53 np0005593233 nova_compute[222017]: 2026-01-23 09:38:53.253 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:38:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:54 np0005593233 nova_compute[222017]: 2026-01-23 09:38:54.114 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:38:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:54.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:38:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:55.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:38:55 np0005593233 nova_compute[222017]: 2026-01-23 09:38:55.108 222021 DEBUG nova.compute.manager [req-413c2fb0-1771-4616-a667-8057f878da19 req-83b03f34-7180-4804-8c02-c74acc0b0bc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:38:55 np0005593233 nova_compute[222017]: 2026-01-23 09:38:55.109 222021 DEBUG oslo_concurrency.lockutils [req-413c2fb0-1771-4616-a667-8057f878da19 req-83b03f34-7180-4804-8c02-c74acc0b0bc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:38:55 np0005593233 nova_compute[222017]: 2026-01-23 09:38:55.109 222021 DEBUG oslo_concurrency.lockutils [req-413c2fb0-1771-4616-a667-8057f878da19 req-83b03f34-7180-4804-8c02-c74acc0b0bc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:38:55 np0005593233 nova_compute[222017]: 2026-01-23 09:38:55.110 222021 DEBUG oslo_concurrency.lockutils [req-413c2fb0-1771-4616-a667-8057f878da19 req-83b03f34-7180-4804-8c02-c74acc0b0bc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:38:55 np0005593233 nova_compute[222017]: 2026-01-23 09:38:55.110 222021 DEBUG nova.compute.manager [req-413c2fb0-1771-4616-a667-8057f878da19 req-83b03f34-7180-4804-8c02-c74acc0b0bc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:38:55 np0005593233 nova_compute[222017]: 2026-01-23 09:38:55.111 222021 WARNING nova.compute.manager [req-413c2fb0-1771-4616-a667-8057f878da19 req-83b03f34-7180-4804-8c02-c74acc0b0bc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state active and task_state None.
Jan 23 04:38:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:56.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:57.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:57 np0005593233 nova_compute[222017]: 2026-01-23 09:38:57.771 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:38:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:58.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:38:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:59.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:59 np0005593233 nova_compute[222017]: 2026-01-23 09:38:59.118 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:38:59 np0005593233 nova_compute[222017]: 2026-01-23 09:38:59.464 222021 INFO nova.compute.manager [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Rebuilding instance
Jan 23 04:39:00 np0005593233 nova_compute[222017]: 2026-01-23 09:39:00.673 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:39:00 np0005593233 nova_compute[222017]: 2026-01-23 09:39:00.706 222021 DEBUG nova.compute.manager [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:39:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:00.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:00 np0005593233 nova_compute[222017]: 2026-01-23 09:39:00.845 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'pci_requests' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:39:00 np0005593233 nova_compute[222017]: 2026-01-23 09:39:00.971 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'pci_devices' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:39:01 np0005593233 nova_compute[222017]: 2026-01-23 09:39:01.024 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'resources' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:39:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:01.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:01 np0005593233 nova_compute[222017]: 2026-01-23 09:39:01.075 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'migration_context' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:39:01 np0005593233 nova_compute[222017]: 2026-01-23 09:39:01.102 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 04:39:01 np0005593233 nova_compute[222017]: 2026-01-23 09:39:01.106 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 04:39:02 np0005593233 nova_compute[222017]: 2026-01-23 09:39:02.774 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:02.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:03.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:39:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:39:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:39:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:03.256 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:39:03 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:03Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:99:47 10.100.0.9
Jan 23 04:39:03 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:03Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:99:47 10.100.0.9
Jan 23 04:39:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:04 np0005593233 nova_compute[222017]: 2026-01-23 09:39:04.120 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:04.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:39:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:05.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:39:05 np0005593233 podman[237166]: 2026-01-23 09:39:05.112963448 +0000 UTC m=+0.113564896 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 04:39:05 np0005593233 nova_compute[222017]: 2026-01-23 09:39:05.476 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:39:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:06.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:07.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:07 np0005593233 nova_compute[222017]: 2026-01-23 09:39:07.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:39:07 np0005593233 nova_compute[222017]: 2026-01-23 09:39:07.791 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:08 np0005593233 nova_compute[222017]: 2026-01-23 09:39:08.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:39:08 np0005593233 nova_compute[222017]: 2026-01-23 09:39:08.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:39:08 np0005593233 nova_compute[222017]: 2026-01-23 09:39:08.458 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:39:08 np0005593233 nova_compute[222017]: 2026-01-23 09:39:08.460 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:39:08 np0005593233 nova_compute[222017]: 2026-01-23 09:39:08.461 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:39:08 np0005593233 nova_compute[222017]: 2026-01-23 09:39:08.461 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 04:39:08 np0005593233 nova_compute[222017]: 2026-01-23 09:39:08.461 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:39:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:39:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:08.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:39:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:39:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1416143834' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:39:08 np0005593233 nova_compute[222017]: 2026-01-23 09:39:08.956 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:09.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.119 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000021 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.120 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000021 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.123 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.127 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000001e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.127 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000001e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.347 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.348 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4408MB free_disk=20.80654525756836GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.348 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.349 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.531 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 45a72d2f-6d73-4d45-873b-96eff48e3d22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.532 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance cc1d9141-e8bc-42a4-9690-50bc53d25998 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.532 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.532 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.555 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.600 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.601 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.631 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.661 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:39:09 np0005593233 nova_compute[222017]: 2026-01-23 09:39:09.772 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:39:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1205011298' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:39:10 np0005593233 nova_compute[222017]: 2026-01-23 09:39:10.268 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:10 np0005593233 nova_compute[222017]: 2026-01-23 09:39:10.276 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:39:10 np0005593233 nova_compute[222017]: 2026-01-23 09:39:10.388 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:39:10 np0005593233 nova_compute[222017]: 2026-01-23 09:39:10.445 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:39:10 np0005593233 nova_compute[222017]: 2026-01-23 09:39:10.446 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:39:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:10.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:39:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:39:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:39:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:11.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:11 np0005593233 nova_compute[222017]: 2026-01-23 09:39:11.161 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:39:11 np0005593233 nova_compute[222017]: 2026-01-23 09:39:11.447 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:11 np0005593233 nova_compute[222017]: 2026-01-23 09:39:11.448 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:39:12 np0005593233 nova_compute[222017]: 2026-01-23 09:39:12.142 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-cc1d9141-e8bc-42a4-9690-50bc53d25998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:39:12 np0005593233 nova_compute[222017]: 2026-01-23 09:39:12.142 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-cc1d9141-e8bc-42a4-9690-50bc53d25998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:39:12 np0005593233 nova_compute[222017]: 2026-01-23 09:39:12.142 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:39:12 np0005593233 nova_compute[222017]: 2026-01-23 09:39:12.795 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:12.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:13.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:14 np0005593233 nova_compute[222017]: 2026-01-23 09:39:14.125 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:39:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:14.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:39:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:15.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.333 222021 INFO nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance shutdown successfully after 14 seconds.#033[00m
Jan 23 04:39:15 np0005593233 kernel: tap541053d6-d3 (unregistering): left promiscuous mode
Jan 23 04:39:15 np0005593233 NetworkManager[48871]: <info>  [1769161155.5810] device (tap541053d6-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:39:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:15Z|00105|binding|INFO|Releasing lport 541053d6-d3d0-4da0-9b9c-630177f53234 from this chassis (sb_readonly=0)
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.591 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:15Z|00106|binding|INFO|Setting lport 541053d6-d3d0-4da0-9b9c-630177f53234 down in Southbound
Jan 23 04:39:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:15Z|00107|binding|INFO|Removing iface tap541053d6-d3 ovn-installed in OVS
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.594 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.598 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:99:47 10.100.0.9'], port_security=['fa:16:3e:ab:99:47 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45a72d2f-6d73-4d45-873b-96eff48e3d22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=541053d6-d3d0-4da0-9b9c-630177f53234) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.600 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 541053d6-d3d0-4da0-9b9c-630177f53234 in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c unbound from our chassis#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.601 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.606 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.628 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5cdfce08-d0a0-40bd-9376-630d09294648]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:15 np0005593233 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 23 04:39:15 np0005593233 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001e.scope: Consumed 15.266s CPU time.
Jan 23 04:39:15 np0005593233 systemd-machined[190954]: Machine qemu-20-instance-0000001e terminated.
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.672 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[da86edb3-8204-4023-9f6a-e64a95666c45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.675 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[96b974b9-ef60-4ff2-a461-a0069a3ddee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.708 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[84c25786-4402-49d1-baec-fff3f288bb12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.727 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[432bb636-1c96-4d24-b5b0-93074527b68f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496114, 'reachable_time': 42994, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237301, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.752 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce5d9ade-e9f3-4ea2-86d8-f7e6f635bd42]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496128, 'tstamp': 496128}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237302, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496132, 'tstamp': 496132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237302, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.754 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.788 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.795 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.796 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f2b13ad-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.797 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.797 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f2b13ad-70, col_values=(('external_ids', {'iface-id': '5880c863-f7b0-4399-b221-f31849823320'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:15.798 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.813 222021 INFO nova.virt.libvirt.driver [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance destroyed successfully.#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.821 222021 INFO nova.virt.libvirt.driver [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance destroyed successfully.#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.822 222021 DEBUG nova.virt.libvirt.vif [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1151494829',display_name='tempest-ServersAdminTestJSON-server-1151494829',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1151494829',id=30,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:38:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-0lc5vfd4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-
member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:38:58Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=45a72d2f-6d73-4d45-873b-96eff48e3d22,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.823 222021 DEBUG nova.network.os_vif_util [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.825 222021 DEBUG nova.network.os_vif_util [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.826 222021 DEBUG os_vif [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.831 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.832 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap541053d6-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.835 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.837 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:39:15 np0005593233 nova_compute[222017]: 2026-01-23 09:39:15.840 222021 INFO os_vif [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3')#033[00m
Jan 23 04:39:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:16.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.029 222021 DEBUG nova.compute.manager [req-bfce9be3-2b4e-4029-9042-71fffbb9f04d req-bcb3ca88-21b1-48e1-bc72-f48bad219933 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-unplugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.029 222021 DEBUG oslo_concurrency.lockutils [req-bfce9be3-2b4e-4029-9042-71fffbb9f04d req-bcb3ca88-21b1-48e1-bc72-f48bad219933 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.030 222021 DEBUG oslo_concurrency.lockutils [req-bfce9be3-2b4e-4029-9042-71fffbb9f04d req-bcb3ca88-21b1-48e1-bc72-f48bad219933 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.030 222021 DEBUG oslo_concurrency.lockutils [req-bfce9be3-2b4e-4029-9042-71fffbb9f04d req-bcb3ca88-21b1-48e1-bc72-f48bad219933 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.030 222021 DEBUG nova.compute.manager [req-bfce9be3-2b4e-4029-9042-71fffbb9f04d req-bcb3ca88-21b1-48e1-bc72-f48bad219933 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-unplugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.030 222021 WARNING nova.compute.manager [req-bfce9be3-2b4e-4029-9042-71fffbb9f04d req-bcb3ca88-21b1-48e1-bc72-f48bad219933 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-unplugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.064 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Updating instance_info_cache with network_info: [{"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:39:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:17.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.136 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-cc1d9141-e8bc-42a4-9690-50bc53d25998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.136 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.137 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.137 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.137 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:17 np0005593233 nova_compute[222017]: 2026-01-23 09:39:17.138 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:39:18 np0005593233 podman[237334]: 2026-01-23 09:39:18.091292954 +0000 UTC m=+0.088526065 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.296 222021 INFO nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deleting instance files /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22_del#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.298 222021 INFO nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deletion of /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22_del complete#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.616 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.616 222021 INFO nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Creating image(s)#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.643 222021 DEBUG nova.storage.rbd_utils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.675 222021 DEBUG nova.storage.rbd_utils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.708 222021 DEBUG nova.storage.rbd_utils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.713 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.786 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.787 222021 DEBUG oslo_concurrency.lockutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.788 222021 DEBUG oslo_concurrency.lockutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.788 222021 DEBUG oslo_concurrency.lockutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.817 222021 DEBUG nova.storage.rbd_utils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:39:18 np0005593233 nova_compute[222017]: 2026-01-23 09:39:18.821 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:18.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:19.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.127 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.198 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.376s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.278 222021 DEBUG nova.storage.rbd_utils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] resizing rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.383 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.383 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Ensure instance console log exists: /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.384 222021 DEBUG oslo_concurrency.lockutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.384 222021 DEBUG oslo_concurrency.lockutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.385 222021 DEBUG oslo_concurrency.lockutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.388 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Start _get_guest_xml network_info=[{"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.392 222021 WARNING nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.398 222021 DEBUG nova.virt.libvirt.host [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.399 222021 DEBUG nova.virt.libvirt.host [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.403 222021 DEBUG nova.virt.libvirt.host [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.404 222021 DEBUG nova.virt.libvirt.host [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.406 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.406 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.407 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.408 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.408 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.408 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.408 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.409 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.409 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.410 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.410 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.410 222021 DEBUG nova.virt.hardware [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.411 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.592 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.668 222021 DEBUG nova.compute.manager [req-503ae606-ea02-4821-bc1f-e9a51dfc2d2e req-3dfbd12b-3fbf-4533-b05b-1e190d6640cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.670 222021 DEBUG oslo_concurrency.lockutils [req-503ae606-ea02-4821-bc1f-e9a51dfc2d2e req-3dfbd12b-3fbf-4533-b05b-1e190d6640cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.671 222021 DEBUG oslo_concurrency.lockutils [req-503ae606-ea02-4821-bc1f-e9a51dfc2d2e req-3dfbd12b-3fbf-4533-b05b-1e190d6640cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.671 222021 DEBUG oslo_concurrency.lockutils [req-503ae606-ea02-4821-bc1f-e9a51dfc2d2e req-3dfbd12b-3fbf-4533-b05b-1e190d6640cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.672 222021 DEBUG nova.compute.manager [req-503ae606-ea02-4821-bc1f-e9a51dfc2d2e req-3dfbd12b-3fbf-4533-b05b-1e190d6640cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:39:19 np0005593233 nova_compute[222017]: 2026-01-23 09:39:19.673 222021 WARNING nova.compute.manager [req-503ae606-ea02-4821-bc1f-e9a51dfc2d2e req-3dfbd12b-3fbf-4533-b05b-1e190d6640cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 23 04:39:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:39:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3680882403' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.042 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.077 222021 DEBUG nova.storage.rbd_utils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.084 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.109 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.111 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:39:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3897092376' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.523 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.525 222021 DEBUG nova.virt.libvirt.vif [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1151494829',display_name='tempest-ServersAdminTestJSON-server-1151494829',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1151494829',id=30,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:38:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-0lc5vfd4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdmin
TestJSON-1167530593-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:39:18Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=45a72d2f-6d73-4d45-873b-96eff48e3d22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.526 222021 DEBUG nova.network.os_vif_util [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.527 222021 DEBUG nova.network.os_vif_util [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.530 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <uuid>45a72d2f-6d73-4d45-873b-96eff48e3d22</uuid>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <name>instance-0000001e</name>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersAdminTestJSON-server-1151494829</nova:name>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:39:19</nova:creationTime>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <nova:user uuid="191a72cfd0a841e9806246e07eb62fa6">tempest-ServersAdminTestJSON-1167530593-project-member</nova:user>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <nova:project uuid="1a5f46b255cd4387bd3e4c0acaa39466">tempest-ServersAdminTestJSON-1167530593</nova:project>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <nova:port uuid="541053d6-d3d0-4da0-9b9c-630177f53234">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <entry name="serial">45a72d2f-6d73-4d45-873b-96eff48e3d22</entry>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <entry name="uuid">45a72d2f-6d73-4d45-873b-96eff48e3d22</entry>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/45a72d2f-6d73-4d45-873b-96eff48e3d22_disk">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:ab:99:47"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <target dev="tap541053d6-d3"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/console.log" append="off"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:39:20 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:39:20 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:39:20 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:39:20 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.532 222021 DEBUG nova.virt.libvirt.vif [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1151494829',display_name='tempest-ServersAdminTestJSON-server-1151494829',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1151494829',id=30,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:38:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-0lc5vfd4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdmin
TestJSON-1167530593-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:39:18Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=45a72d2f-6d73-4d45-873b-96eff48e3d22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.533 222021 DEBUG nova.network.os_vif_util [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.533 222021 DEBUG nova.network.os_vif_util [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.534 222021 DEBUG os_vif [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.534 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.535 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.535 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.537 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.537 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap541053d6-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.538 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap541053d6-d3, col_values=(('external_ids', {'iface-id': '541053d6-d3d0-4da0-9b9c-630177f53234', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:99:47', 'vm-uuid': '45a72d2f-6d73-4d45-873b-96eff48e3d22'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.539 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:20 np0005593233 NetworkManager[48871]: <info>  [1769161160.5406] manager: (tap541053d6-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.542 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.545 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.545 222021 INFO os_vif [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3')#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.680 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.680 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.680 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No VIF found with MAC fa:16:3e:ab:99:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.681 222021 INFO nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Using config drive#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.724 222021 DEBUG nova.storage.rbd_utils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.757 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:39:20 np0005593233 nova_compute[222017]: 2026-01-23 09:39:20.796 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'keypairs' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:39:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:20.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:21.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:21 np0005593233 nova_compute[222017]: 2026-01-23 09:39:21.733 222021 INFO nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Creating config drive at /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config#033[00m
Jan 23 04:39:21 np0005593233 nova_compute[222017]: 2026-01-23 09:39:21.742 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdvy6agft execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:21 np0005593233 nova_compute[222017]: 2026-01-23 09:39:21.880 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdvy6agft" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:21 np0005593233 nova_compute[222017]: 2026-01-23 09:39:21.911 222021 DEBUG nova.storage.rbd_utils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:39:21 np0005593233 nova_compute[222017]: 2026-01-23 09:39:21.914 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.071 222021 DEBUG oslo_concurrency.processutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config 45a72d2f-6d73-4d45-873b-96eff48e3d22_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.157s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.072 222021 INFO nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deleting local config drive /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22/disk.config because it was imported into RBD.#033[00m
Jan 23 04:39:22 np0005593233 kernel: tap541053d6-d3: entered promiscuous mode
Jan 23 04:39:22 np0005593233 NetworkManager[48871]: <info>  [1769161162.1254] manager: (tap541053d6-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.128 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:22Z|00108|binding|INFO|Claiming lport 541053d6-d3d0-4da0-9b9c-630177f53234 for this chassis.
Jan 23 04:39:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:22Z|00109|binding|INFO|541053d6-d3d0-4da0-9b9c-630177f53234: Claiming fa:16:3e:ab:99:47 10.100.0.9
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.138 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:99:47 10.100.0.9'], port_security=['fa:16:3e:ab:99:47 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45a72d2f-6d73-4d45-873b-96eff48e3d22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '7', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=541053d6-d3d0-4da0-9b9c-630177f53234) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.139 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 541053d6-d3d0-4da0-9b9c-630177f53234 in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c bound to our chassis#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.140 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c#033[00m
Jan 23 04:39:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:22Z|00110|binding|INFO|Setting lport 541053d6-d3d0-4da0-9b9c-630177f53234 ovn-installed in OVS
Jan 23 04:39:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:22Z|00111|binding|INFO|Setting lport 541053d6-d3d0-4da0-9b9c-630177f53234 up in Southbound
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.144 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.146 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.155 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4369c309-5590-447a-94a5-4f724952b528]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:22 np0005593233 systemd-udevd[237655]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:39:22 np0005593233 systemd-machined[190954]: New machine qemu-21-instance-0000001e.
Jan 23 04:39:22 np0005593233 NetworkManager[48871]: <info>  [1769161162.1765] device (tap541053d6-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:39:22 np0005593233 NetworkManager[48871]: <info>  [1769161162.1776] device (tap541053d6-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:39:22 np0005593233 systemd[1]: Started Virtual Machine qemu-21-instance-0000001e.
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.199 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a0452914-b7a4-4f75-b362-5b64d025c7fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.205 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5747cb38-6bfd-4d8e-82e8-4138039d5be1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.233 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5ebec219-34d7-49df-8eab-1c60a3c0801d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.258 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0b29dcbc-afa2-400f-af26-0d079c5ae33d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496114, 'reachable_time': 42994, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237668, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.279 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c3579006-d2c4-4773-92c7-fc15f2d6fd49]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496128, 'tstamp': 496128}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237670, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496132, 'tstamp': 496132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237670, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.281 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.282 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.283 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.286 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f2b13ad-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.286 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.286 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f2b13ad-70, col_values=(('external_ids', {'iface-id': '5880c863-f7b0-4399-b221-f31849823320'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:22.287 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.693 222021 DEBUG nova.compute.manager [req-a7793363-9484-4dbe-a923-44e921f52dd4 req-55422bfd-d5b5-47e6-8418-e8b690c14607 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.694 222021 DEBUG oslo_concurrency.lockutils [req-a7793363-9484-4dbe-a923-44e921f52dd4 req-55422bfd-d5b5-47e6-8418-e8b690c14607 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.694 222021 DEBUG oslo_concurrency.lockutils [req-a7793363-9484-4dbe-a923-44e921f52dd4 req-55422bfd-d5b5-47e6-8418-e8b690c14607 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.694 222021 DEBUG oslo_concurrency.lockutils [req-a7793363-9484-4dbe-a923-44e921f52dd4 req-55422bfd-d5b5-47e6-8418-e8b690c14607 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.695 222021 DEBUG nova.compute.manager [req-a7793363-9484-4dbe-a923-44e921f52dd4 req-55422bfd-d5b5-47e6-8418-e8b690c14607 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:39:22 np0005593233 nova_compute[222017]: 2026-01-23 09:39:22.695 222021 WARNING nova.compute.manager [req-a7793363-9484-4dbe-a923-44e921f52dd4 req-55422bfd-d5b5-47e6-8418-e8b690c14607 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 23 04:39:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:22.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.077 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for 45a72d2f-6d73-4d45-873b-96eff48e3d22 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.077 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161163.076887, 45a72d2f-6d73-4d45-873b-96eff48e3d22 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.078 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.080 222021 DEBUG nova.compute.manager [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.081 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.084 222021 INFO nova.virt.libvirt.driver [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance spawned successfully.#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.085 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:39:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:23.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.148 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.157 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.161 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.161 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.161 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.162 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.162 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.162 222021 DEBUG nova.virt.libvirt.driver [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.215 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.216 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161163.0800416, 45a72d2f-6d73-4d45-873b-96eff48e3d22 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.216 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] VM Started (Lifecycle Event)#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.318 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.324 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.386 222021 DEBUG nova.compute.manager [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.531 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.662 222021 DEBUG oslo_concurrency.lockutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.663 222021 DEBUG oslo_concurrency.lockutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.663 222021 DEBUG nova.objects.instance [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:39:23 np0005593233 nova_compute[222017]: 2026-01-23 09:39:23.778 222021 DEBUG oslo_concurrency.lockutils [None req-3cb38673-b0ab-47c2-bb9c-ba7e9bd037a1 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:24 np0005593233 nova_compute[222017]: 2026-01-23 09:39:24.130 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:24.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:25 np0005593233 nova_compute[222017]: 2026-01-23 09:39:25.079 222021 DEBUG nova.compute.manager [req-a71da845-d970-44af-bafc-4c97d55a1fe5 req-025ef7dd-e8df-40a0-a33b-0e1fcde31c40 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:25 np0005593233 nova_compute[222017]: 2026-01-23 09:39:25.080 222021 DEBUG oslo_concurrency.lockutils [req-a71da845-d970-44af-bafc-4c97d55a1fe5 req-025ef7dd-e8df-40a0-a33b-0e1fcde31c40 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:25 np0005593233 nova_compute[222017]: 2026-01-23 09:39:25.080 222021 DEBUG oslo_concurrency.lockutils [req-a71da845-d970-44af-bafc-4c97d55a1fe5 req-025ef7dd-e8df-40a0-a33b-0e1fcde31c40 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:25 np0005593233 nova_compute[222017]: 2026-01-23 09:39:25.081 222021 DEBUG oslo_concurrency.lockutils [req-a71da845-d970-44af-bafc-4c97d55a1fe5 req-025ef7dd-e8df-40a0-a33b-0e1fcde31c40 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:25 np0005593233 nova_compute[222017]: 2026-01-23 09:39:25.081 222021 DEBUG nova.compute.manager [req-a71da845-d970-44af-bafc-4c97d55a1fe5 req-025ef7dd-e8df-40a0-a33b-0e1fcde31c40 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:39:25 np0005593233 nova_compute[222017]: 2026-01-23 09:39:25.082 222021 WARNING nova.compute.manager [req-a71da845-d970-44af-bafc-4c97d55a1fe5 req-025ef7dd-e8df-40a0-a33b-0e1fcde31c40 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:39:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:25.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:25 np0005593233 nova_compute[222017]: 2026-01-23 09:39:25.541 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:26.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:27.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:39:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:28.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:39:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:29.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:29 np0005593233 nova_compute[222017]: 2026-01-23 09:39:29.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:30 np0005593233 nova_compute[222017]: 2026-01-23 09:39:30.543 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:30.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:31.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:32.519 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:39:32 np0005593233 nova_compute[222017]: 2026-01-23 09:39:32.519 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:32.521 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:39:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:32.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:33.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:34 np0005593233 nova_compute[222017]: 2026-01-23 09:39:34.136 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.004000114s ======
Jan 23 04:39:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:34.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000114s
Jan 23 04:39:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:35.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:35 np0005593233 nova_compute[222017]: 2026-01-23 09:39:35.547 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:36 np0005593233 podman[237713]: 2026-01-23 09:39:36.114366168 +0000 UTC m=+0.118010783 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 23 04:39:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:36.523 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:36.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:37 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Jan 23 04:39:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:39:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:37.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:39:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:37Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:99:47 10.100.0.9
Jan 23 04:39:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:37Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:99:47 10.100.0.9
Jan 23 04:39:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:38.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:39.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:39 np0005593233 nova_compute[222017]: 2026-01-23 09:39:39.138 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:40 np0005593233 nova_compute[222017]: 2026-01-23 09:39:40.550 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:40.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:41.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:41 np0005593233 nova_compute[222017]: 2026-01-23 09:39:41.986 222021 DEBUG oslo_concurrency.lockutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "cc1d9141-e8bc-42a4-9690-50bc53d25998" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:41 np0005593233 nova_compute[222017]: 2026-01-23 09:39:41.986 222021 DEBUG oslo_concurrency.lockutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:41 np0005593233 nova_compute[222017]: 2026-01-23 09:39:41.986 222021 DEBUG oslo_concurrency.lockutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:41 np0005593233 nova_compute[222017]: 2026-01-23 09:39:41.987 222021 DEBUG oslo_concurrency.lockutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:41 np0005593233 nova_compute[222017]: 2026-01-23 09:39:41.987 222021 DEBUG oslo_concurrency.lockutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:41 np0005593233 nova_compute[222017]: 2026-01-23 09:39:41.988 222021 INFO nova.compute.manager [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Terminating instance#033[00m
Jan 23 04:39:41 np0005593233 nova_compute[222017]: 2026-01-23 09:39:41.989 222021 DEBUG nova.compute.manager [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:39:42 np0005593233 kernel: tapbc45bc9a-0c (unregistering): left promiscuous mode
Jan 23 04:39:42 np0005593233 NetworkManager[48871]: <info>  [1769161182.1229] device (tapbc45bc9a-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:42Z|00112|binding|INFO|Releasing lport bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e from this chassis (sb_readonly=0)
Jan 23 04:39:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:42Z|00113|binding|INFO|Setting lport bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e down in Southbound
Jan 23 04:39:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:42Z|00114|binding|INFO|Removing iface tapbc45bc9a-0c ovn-installed in OVS
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.136 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.154 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:c9:5e 10.100.0.12'], port_security=['fa:16:3e:24:c9:5e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'cc1d9141-e8bc-42a4-9690-50bc53d25998', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.155 140224 INFO neutron.agent.ovn.metadata.agent [-] Port bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c unbound from our chassis#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.157 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.158 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.176 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c44811ea-e579-4d72-be17-201e9f8ab7ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:42 np0005593233 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000021.scope: Deactivated successfully.
Jan 23 04:39:42 np0005593233 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000021.scope: Consumed 19.452s CPU time.
Jan 23 04:39:42 np0005593233 systemd-machined[190954]: Machine qemu-19-instance-00000021 terminated.
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.216 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ba11018f-4674-4e1c-8f33-33a795534341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.220 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[181b5787-d77b-4a34-acbf-8d1d1fd202ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.228 222021 INFO nova.virt.libvirt.driver [-] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Instance destroyed successfully.#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.229 222021 DEBUG nova.objects.instance [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'resources' on Instance uuid cc1d9141-e8bc-42a4-9690-50bc53d25998 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.252 222021 DEBUG nova.virt.libvirt.vif [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:37:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-2129961106',display_name='tempest-ServersAdminTestJSON-server-2129961106',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-2129961106',id=33,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-ob9oijh4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:37:55Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=cc1d9141-e8bc-42a4-9690-50bc53d25998,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.253 222021 DEBUG nova.network.os_vif_util [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "address": "fa:16:3e:24:c9:5e", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc45bc9a-0c", "ovs_interfaceid": "bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.254 222021 DEBUG nova.network.os_vif_util [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:c9:5e,bridge_name='br-int',has_traffic_filtering=True,id=bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc45bc9a-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.255 222021 DEBUG os_vif [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:c9:5e,bridge_name='br-int',has_traffic_filtering=True,id=bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc45bc9a-0c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.256 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.256 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc45bc9a-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.257 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[489e2b3d-5fc5-4b9c-85be-81cef7ba4016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.258 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.260 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.262 222021 INFO os_vif [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:c9:5e,bridge_name='br-int',has_traffic_filtering=True,id=bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc45bc9a-0c')#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.273 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3699e83c-fb26-4f16-a06d-6a928e62cfb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 15, 'rx_bytes': 826, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 15, 'rx_bytes': 826, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496114, 'reachable_time': 42994, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237762, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.287 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd4fe4b-45ad-43f6-802f-5229f5623a31]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496128, 'tstamp': 496128}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237777, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1f2b13ad-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496132, 'tstamp': 496132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237777, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.289 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.290 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.292 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f2b13ad-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.292 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.292 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f2b13ad-70, col_values=(('external_ids', {'iface-id': '5880c863-f7b0-4399-b221-f31849823320'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.292 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.640 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.641 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:42.642 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.873 222021 DEBUG nova.compute.manager [req-57ebdad7-6f82-4caf-b61c-f925aa2ca58e req-6fdd0f2d-c001-464e-b30c-8729fe2ca548 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Received event network-vif-unplugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.874 222021 DEBUG oslo_concurrency.lockutils [req-57ebdad7-6f82-4caf-b61c-f925aa2ca58e req-6fdd0f2d-c001-464e-b30c-8729fe2ca548 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.874 222021 DEBUG oslo_concurrency.lockutils [req-57ebdad7-6f82-4caf-b61c-f925aa2ca58e req-6fdd0f2d-c001-464e-b30c-8729fe2ca548 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.875 222021 DEBUG oslo_concurrency.lockutils [req-57ebdad7-6f82-4caf-b61c-f925aa2ca58e req-6fdd0f2d-c001-464e-b30c-8729fe2ca548 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.875 222021 DEBUG nova.compute.manager [req-57ebdad7-6f82-4caf-b61c-f925aa2ca58e req-6fdd0f2d-c001-464e-b30c-8729fe2ca548 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] No waiting events found dispatching network-vif-unplugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:39:42 np0005593233 nova_compute[222017]: 2026-01-23 09:39:42.875 222021 DEBUG nova.compute.manager [req-57ebdad7-6f82-4caf-b61c-f925aa2ca58e req-6fdd0f2d-c001-464e-b30c-8729fe2ca548 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Received event network-vif-unplugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:39:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:39:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:42.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:39:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:43.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:43 np0005593233 nova_compute[222017]: 2026-01-23 09:39:43.540 222021 INFO nova.virt.libvirt.driver [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Deleting instance files /var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998_del#033[00m
Jan 23 04:39:43 np0005593233 nova_compute[222017]: 2026-01-23 09:39:43.541 222021 INFO nova.virt.libvirt.driver [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Deletion of /var/lib/nova/instances/cc1d9141-e8bc-42a4-9690-50bc53d25998_del complete#033[00m
Jan 23 04:39:43 np0005593233 nova_compute[222017]: 2026-01-23 09:39:43.639 222021 INFO nova.compute.manager [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Took 1.65 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:39:43 np0005593233 nova_compute[222017]: 2026-01-23 09:39:43.639 222021 DEBUG oslo.service.loopingcall [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:39:43 np0005593233 nova_compute[222017]: 2026-01-23 09:39:43.640 222021 DEBUG nova.compute.manager [-] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:39:43 np0005593233 nova_compute[222017]: 2026-01-23 09:39:43.640 222021 DEBUG nova.network.neutron [-] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:39:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:44 np0005593233 nova_compute[222017]: 2026-01-23 09:39:44.141 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:44.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.019 222021 DEBUG nova.compute.manager [req-3b7677a4-6696-4c6b-bd8e-628d3384a303 req-f64e9163-ae00-4dee-aacb-b68806e33734 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Received event network-vif-plugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.019 222021 DEBUG oslo_concurrency.lockutils [req-3b7677a4-6696-4c6b-bd8e-628d3384a303 req-f64e9163-ae00-4dee-aacb-b68806e33734 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.020 222021 DEBUG oslo_concurrency.lockutils [req-3b7677a4-6696-4c6b-bd8e-628d3384a303 req-f64e9163-ae00-4dee-aacb-b68806e33734 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.020 222021 DEBUG oslo_concurrency.lockutils [req-3b7677a4-6696-4c6b-bd8e-628d3384a303 req-f64e9163-ae00-4dee-aacb-b68806e33734 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.020 222021 DEBUG nova.compute.manager [req-3b7677a4-6696-4c6b-bd8e-628d3384a303 req-f64e9163-ae00-4dee-aacb-b68806e33734 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] No waiting events found dispatching network-vif-plugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.022 222021 WARNING nova.compute.manager [req-3b7677a4-6696-4c6b-bd8e-628d3384a303 req-f64e9163-ae00-4dee-aacb-b68806e33734 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Received unexpected event network-vif-plugged-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e for instance with vm_state active and task_state deleting.
Jan 23 04:39:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:39:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:45.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.310 222021 DEBUG nova.network.neutron [-] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.350 222021 INFO nova.compute.manager [-] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Took 1.71 seconds to deallocate network for instance.
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.441 222021 DEBUG oslo_concurrency.lockutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.442 222021 DEBUG oslo_concurrency.lockutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:39:45 np0005593233 nova_compute[222017]: 2026-01-23 09:39:45.589 222021 DEBUG oslo_concurrency.processutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:39:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:39:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1902703732' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:39:46 np0005593233 nova_compute[222017]: 2026-01-23 09:39:46.038 222021 DEBUG oslo_concurrency.processutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:39:46 np0005593233 nova_compute[222017]: 2026-01-23 09:39:46.046 222021 DEBUG nova.compute.provider_tree [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:39:46 np0005593233 nova_compute[222017]: 2026-01-23 09:39:46.081 222021 DEBUG nova.scheduler.client.report [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:39:46 np0005593233 nova_compute[222017]: 2026-01-23 09:39:46.139 222021 DEBUG oslo_concurrency.lockutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:39:46 np0005593233 nova_compute[222017]: 2026-01-23 09:39:46.177 222021 INFO nova.scheduler.client.report [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Deleted allocations for instance cc1d9141-e8bc-42a4-9690-50bc53d25998
Jan 23 04:39:46 np0005593233 nova_compute[222017]: 2026-01-23 09:39:46.288 222021 DEBUG oslo_concurrency.lockutils [None req-223b8dd1-6128-4528-bfc7-d6a1bc13c433 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "cc1d9141-e8bc-42a4-9690-50bc53d25998" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:39:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:46.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:47.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:47 np0005593233 nova_compute[222017]: 2026-01-23 09:39:47.218 222021 DEBUG nova.compute.manager [req-be3e50d6-2420-4e22-990b-5ada481892b5 req-c097c44b-5d28-40bd-993e-5ca1e4bbc74e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Received event network-vif-deleted-bc45bc9a-0cc1-4e3b-832e-f39f0f6f454e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:39:47 np0005593233 nova_compute[222017]: 2026-01-23 09:39:47.258 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:48.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:49 np0005593233 podman[237805]: 2026-01-23 09:39:49.063857753 +0000 UTC m=+0.067087416 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:39:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:49 np0005593233 nova_compute[222017]: 2026-01-23 09:39:49.143 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:39:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:49.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:39:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:50.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:51.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:52 np0005593233 nova_compute[222017]: 2026-01-23 09:39:52.261 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:52.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.069 222021 DEBUG oslo_concurrency.lockutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.070 222021 DEBUG oslo_concurrency.lockutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.070 222021 DEBUG oslo_concurrency.lockutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.070 222021 DEBUG oslo_concurrency.lockutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.071 222021 DEBUG oslo_concurrency.lockutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.072 222021 INFO nova.compute.manager [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Terminating instance
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.074 222021 DEBUG nova.compute.manager [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 04:39:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:53.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:53 np0005593233 kernel: tap541053d6-d3 (unregistering): left promiscuous mode
Jan 23 04:39:53 np0005593233 NetworkManager[48871]: <info>  [1769161193.2197] device (tap541053d6-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.233 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:53 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:53Z|00115|binding|INFO|Releasing lport 541053d6-d3d0-4da0-9b9c-630177f53234 from this chassis (sb_readonly=0)
Jan 23 04:39:53 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:53Z|00116|binding|INFO|Setting lport 541053d6-d3d0-4da0-9b9c-630177f53234 down in Southbound
Jan 23 04:39:53 np0005593233 ovn_controller[130653]: 2026-01-23T09:39:53Z|00117|binding|INFO|Removing iface tap541053d6-d3 ovn-installed in OVS
Jan 23 04:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:53.239 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:99:47 10.100.0.9'], port_security=['fa:16:3e:ab:99:47 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '45a72d2f-6d73-4d45-873b-96eff48e3d22', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=541053d6-d3d0-4da0-9b9c-630177f53234) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:53.240 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 541053d6-d3d0-4da0-9b9c-630177f53234 in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c unbound from our chassis
Jan 23 04:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:53.241 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 04:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:53.242 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b39643dd-7b9d-41c3-ad58-c1cb97dd7475]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:53.243 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c namespace which is not needed anymore
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.259 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:53 np0005593233 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 23 04:39:53 np0005593233 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001e.scope: Consumed 15.413s CPU time.
Jan 23 04:39:53 np0005593233 systemd-machined[190954]: Machine qemu-21-instance-0000001e terminated.
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.523 222021 INFO nova.virt.libvirt.driver [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Instance destroyed successfully.
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.524 222021 DEBUG nova.objects.instance [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'resources' on Instance uuid 45a72d2f-6d73-4d45-873b-96eff48e3d22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:39:53 np0005593233 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[235688]: [NOTICE]   (235692) : haproxy version is 2.8.14-c23fe91
Jan 23 04:39:53 np0005593233 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[235688]: [NOTICE]   (235692) : path to executable is /usr/sbin/haproxy
Jan 23 04:39:53 np0005593233 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[235688]: [WARNING]  (235692) : Exiting Master process...
Jan 23 04:39:53 np0005593233 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[235688]: [ALERT]    (235692) : Current worker (235694) exited with code 143 (Terminated)
Jan 23 04:39:53 np0005593233 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[235688]: [WARNING]  (235692) : All workers exited. Exiting... (0)
Jan 23 04:39:53 np0005593233 systemd[1]: libpod-e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b.scope: Deactivated successfully.
Jan 23 04:39:53 np0005593233 podman[237848]: 2026-01-23 09:39:53.54543593 +0000 UTC m=+0.176982718 container died e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.555 222021 DEBUG nova.virt.libvirt.vif [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1151494829',display_name='tempest-ServersAdminTestJSON-server-1151494829',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1151494829',id=30,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:39:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-0lc5vfd4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:39:29Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=45a72d2f-6d73-4d45-873b-96eff48e3d22,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.556 222021 DEBUG nova.network.os_vif_util [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "541053d6-d3d0-4da0-9b9c-630177f53234", "address": "fa:16:3e:ab:99:47", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541053d6-d3", "ovs_interfaceid": "541053d6-d3d0-4da0-9b9c-630177f53234", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.558 222021 DEBUG nova.network.os_vif_util [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.559 222021 DEBUG os_vif [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.611 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.612 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap541053d6-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.615 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:39:53 np0005593233 nova_compute[222017]: 2026-01-23 09:39:53.619 222021 INFO os_vif [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:99:47,bridge_name='br-int',has_traffic_filtering=True,id=541053d6-d3d0-4da0-9b9c-630177f53234,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541053d6-d3')
Jan 23 04:39:53 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b-userdata-shm.mount: Deactivated successfully.
Jan 23 04:39:53 np0005593233 systemd[1]: var-lib-containers-storage-overlay-109f64ecd9a400035d256309eb8b4e240eb18a41fdf2e1d28dbebed98f2b5ec3-merged.mount: Deactivated successfully.
Jan 23 04:39:53 np0005593233 podman[237848]: 2026-01-23 09:39:53.639008667 +0000 UTC m=+0.270555455 container cleanup e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:39:53 np0005593233 systemd[1]: libpod-conmon-e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b.scope: Deactivated successfully.
Jan 23 04:39:54 np0005593233 nova_compute[222017]: 2026-01-23 09:39:54.030 222021 DEBUG nova.compute.manager [req-ca4efe1f-f7c9-48b5-bddc-585a36f594c2 req-cba6128f-4666-4222-9932-35f783053dea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-unplugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:54 np0005593233 nova_compute[222017]: 2026-01-23 09:39:54.030 222021 DEBUG oslo_concurrency.lockutils [req-ca4efe1f-f7c9-48b5-bddc-585a36f594c2 req-cba6128f-4666-4222-9932-35f783053dea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:54 np0005593233 nova_compute[222017]: 2026-01-23 09:39:54.031 222021 DEBUG oslo_concurrency.lockutils [req-ca4efe1f-f7c9-48b5-bddc-585a36f594c2 req-cba6128f-4666-4222-9932-35f783053dea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:54 np0005593233 nova_compute[222017]: 2026-01-23 09:39:54.031 222021 DEBUG oslo_concurrency.lockutils [req-ca4efe1f-f7c9-48b5-bddc-585a36f594c2 req-cba6128f-4666-4222-9932-35f783053dea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:54 np0005593233 nova_compute[222017]: 2026-01-23 09:39:54.031 222021 DEBUG nova.compute.manager [req-ca4efe1f-f7c9-48b5-bddc-585a36f594c2 req-cba6128f-4666-4222-9932-35f783053dea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-unplugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:39:54 np0005593233 nova_compute[222017]: 2026-01-23 09:39:54.031 222021 DEBUG nova.compute.manager [req-ca4efe1f-f7c9-48b5-bddc-585a36f594c2 req-cba6128f-4666-4222-9932-35f783053dea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-unplugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:39:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:54 np0005593233 nova_compute[222017]: 2026-01-23 09:39:54.145 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:54 np0005593233 podman[237904]: 2026-01-23 09:39:54.781153554 +0000 UTC m=+1.112791114 container remove e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:39:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:54.789 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ac79b7ad-268b-4f61-809e-742f69f8f5d9]: (4, ('Fri Jan 23 09:39:53 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c (e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b)\ne22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b\nFri Jan 23 09:39:53 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c (e22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b)\ne22e35d5cf79d55333375d9fd15df63a37c1d8affdf90af6183d406614cf373b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:54.791 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[441a89ed-d212-4126-8695-20175ed91613]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:54 np0005593233 kernel: tap1f2b13ad-70: left promiscuous mode
Jan 23 04:39:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:54.792 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:54 np0005593233 nova_compute[222017]: 2026-01-23 09:39:54.794 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:54 np0005593233 nova_compute[222017]: 2026-01-23 09:39:54.808 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:54.815 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bf29d836-4caf-4475-ab15-aa23fad9893d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:54.841 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8889d2-aebe-4fc5-89cb-36017407c4b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:54.843 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b8bfaa34-9e94-47e2-9149-8954c636f5d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:54.860 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7d831faa-3881-4120-8e09-de46555bf08f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496106, 'reachable_time': 16787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237923, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:54.863 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:39:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:39:54.864 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5ae7fc-3204-46e4-94f0-c6c0359b2fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:54 np0005593233 systemd[1]: run-netns-ovnmeta\x2d1f2b13ad\x2d7b25\x2d4a2b\x2db4d5\x2d7432a67ce12c.mount: Deactivated successfully.
Jan 23 04:39:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:54.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:55.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:55 np0005593233 nova_compute[222017]: 2026-01-23 09:39:55.938 222021 INFO nova.virt.libvirt.driver [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deleting instance files /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22_del#033[00m
Jan 23 04:39:55 np0005593233 nova_compute[222017]: 2026-01-23 09:39:55.938 222021 INFO nova.virt.libvirt.driver [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deletion of /var/lib/nova/instances/45a72d2f-6d73-4d45-873b-96eff48e3d22_del complete#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.374 222021 DEBUG nova.compute.manager [req-946a83c4-ba73-4ac2-87e7-b05540303812 req-fda2f506-5101-4e4f-b0d2-e82be3d8b521 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.375 222021 DEBUG oslo_concurrency.lockutils [req-946a83c4-ba73-4ac2-87e7-b05540303812 req-fda2f506-5101-4e4f-b0d2-e82be3d8b521 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.375 222021 DEBUG oslo_concurrency.lockutils [req-946a83c4-ba73-4ac2-87e7-b05540303812 req-fda2f506-5101-4e4f-b0d2-e82be3d8b521 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.375 222021 DEBUG oslo_concurrency.lockutils [req-946a83c4-ba73-4ac2-87e7-b05540303812 req-fda2f506-5101-4e4f-b0d2-e82be3d8b521 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.375 222021 DEBUG nova.compute.manager [req-946a83c4-ba73-4ac2-87e7-b05540303812 req-fda2f506-5101-4e4f-b0d2-e82be3d8b521 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] No waiting events found dispatching network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.376 222021 WARNING nova.compute.manager [req-946a83c4-ba73-4ac2-87e7-b05540303812 req-fda2f506-5101-4e4f-b0d2-e82be3d8b521 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received unexpected event network-vif-plugged-541053d6-d3d0-4da0-9b9c-630177f53234 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.453 222021 INFO nova.compute.manager [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Took 3.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.453 222021 DEBUG oslo.service.loopingcall [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.453 222021 DEBUG nova.compute.manager [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:39:56 np0005593233 nova_compute[222017]: 2026-01-23 09:39:56.454 222021 DEBUG nova.network.neutron [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:39:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:56.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:57.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:57 np0005593233 nova_compute[222017]: 2026-01-23 09:39:57.226 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161182.2255569, cc1d9141-e8bc-42a4-9690-50bc53d25998 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:39:57 np0005593233 nova_compute[222017]: 2026-01-23 09:39:57.227 222021 INFO nova.compute.manager [-] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:39:57 np0005593233 nova_compute[222017]: 2026-01-23 09:39:57.287 222021 DEBUG nova.compute.manager [None req-cdb841d0-9c45-491b-8d01-b58d458ecac2 - - - - - -] [instance: cc1d9141-e8bc-42a4-9690-50bc53d25998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:39:58 np0005593233 nova_compute[222017]: 2026-01-23 09:39:58.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:58.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.147 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:39:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:39:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:59.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.268 222021 DEBUG nova.network.neutron [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.312 222021 INFO nova.compute.manager [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Took 2.86 seconds to deallocate network for instance.#033[00m
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.375 222021 DEBUG oslo_concurrency.lockutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.376 222021 DEBUG oslo_concurrency.lockutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.440 222021 DEBUG nova.compute.manager [req-260b872a-4370-48d5-bf79-a21cffa708bf req-d1fdca3b-d4d3-4c2f-863c-f403e76aa8ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Received event network-vif-deleted-541053d6-d3d0-4da0-9b9c-630177f53234 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.499 222021 DEBUG oslo_concurrency.processutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:39:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3071361035' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.928 222021 DEBUG oslo_concurrency.processutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.936 222021 DEBUG nova.compute.provider_tree [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:39:59 np0005593233 nova_compute[222017]: 2026-01-23 09:39:59.963 222021 DEBUG nova.scheduler.client.report [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:40:00 np0005593233 nova_compute[222017]: 2026-01-23 09:40:00.009 222021 DEBUG oslo_concurrency.lockutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:00 np0005593233 nova_compute[222017]: 2026-01-23 09:40:00.087 222021 INFO nova.scheduler.client.report [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Deleted allocations for instance 45a72d2f-6d73-4d45-873b-96eff48e3d22#033[00m
Jan 23 04:40:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 04:40:00 np0005593233 nova_compute[222017]: 2026-01-23 09:40:00.350 222021 DEBUG oslo_concurrency.lockutils [None req-ee100c6b-a8eb-4f7b-adc8-d1ce793d69f9 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "45a72d2f-6d73-4d45-873b-96eff48e3d22" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:00.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:01.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:02.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:03.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:03 np0005593233 nova_compute[222017]: 2026-01-23 09:40:03.620 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:04 np0005593233 nova_compute[222017]: 2026-01-23 09:40:04.150 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:04.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:05.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:05 np0005593233 nova_compute[222017]: 2026-01-23 09:40:05.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:06.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:07 np0005593233 nova_compute[222017]: 2026-01-23 09:40:07.101 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:07 np0005593233 podman[237948]: 2026-01-23 09:40:07.103945272 +0000 UTC m=+0.117771666 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:40:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:07.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:07 np0005593233 nova_compute[222017]: 2026-01-23 09:40:07.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:08 np0005593233 nova_compute[222017]: 2026-01-23 09:40:08.522 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161193.5207434, 45a72d2f-6d73-4d45-873b-96eff48e3d22 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:08 np0005593233 nova_compute[222017]: 2026-01-23 09:40:08.522 222021 INFO nova.compute.manager [-] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:40:08 np0005593233 nova_compute[222017]: 2026-01-23 09:40:08.555 222021 DEBUG nova.compute.manager [None req-458a878f-20bf-4e8b-a74c-d3a723902756 - - - - - -] [instance: 45a72d2f-6d73-4d45-873b-96eff48e3d22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:08 np0005593233 nova_compute[222017]: 2026-01-23 09:40:08.650 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:08.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:09 np0005593233 nova_compute[222017]: 2026-01-23 09:40:09.153 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:09.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:09 np0005593233 nova_compute[222017]: 2026-01-23 09:40:09.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:09 np0005593233 nova_compute[222017]: 2026-01-23 09:40:09.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.410 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.411 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.412 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.454 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.454 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.455 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.455 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.455 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3287341892' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:10 np0005593233 nova_compute[222017]: 2026-01-23 09:40:10.919 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:10.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:11 np0005593233 nova_compute[222017]: 2026-01-23 09:40:11.097 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:40:11 np0005593233 nova_compute[222017]: 2026-01-23 09:40:11.098 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4804MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:40:11 np0005593233 nova_compute[222017]: 2026-01-23 09:40:11.098 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:11 np0005593233 nova_compute[222017]: 2026-01-23 09:40:11.098 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:11.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:11 np0005593233 nova_compute[222017]: 2026-01-23 09:40:11.693 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:40:11 np0005593233 nova_compute[222017]: 2026-01-23 09:40:11.694 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:40:11 np0005593233 nova_compute[222017]: 2026-01-23 09:40:11.956 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:12 np0005593233 nova_compute[222017]: 2026-01-23 09:40:12.449 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:12 np0005593233 nova_compute[222017]: 2026-01-23 09:40:12.456 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:40:12 np0005593233 nova_compute[222017]: 2026-01-23 09:40:12.496 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:40:12 np0005593233 nova_compute[222017]: 2026-01-23 09:40:12.542 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:40:12 np0005593233 nova_compute[222017]: 2026-01-23 09:40:12.542 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:12.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:40:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:40:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:13.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:13 np0005593233 nova_compute[222017]: 2026-01-23 09:40:13.516 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:13 np0005593233 nova_compute[222017]: 2026-01-23 09:40:13.653 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:14 np0005593233 nova_compute[222017]: 2026-01-23 09:40:14.188 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:14 np0005593233 nova_compute[222017]: 2026-01-23 09:40:14.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:14.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:15.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3675569127' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.417000) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216417130, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2621, "num_deletes": 508, "total_data_size": 5434437, "memory_usage": 5511056, "flush_reason": "Manual Compaction"}
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216447675, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3563880, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30567, "largest_seqno": 33183, "table_properties": {"data_size": 3553785, "index_size": 5885, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 24606, "raw_average_key_size": 19, "raw_value_size": 3531218, "raw_average_value_size": 2827, "num_data_blocks": 256, "num_entries": 1249, "num_filter_entries": 1249, "num_deletions": 508, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161017, "oldest_key_time": 1769161017, "file_creation_time": 1769161216, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 30798 microseconds, and 9024 cpu microseconds.
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.447789) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3563880 bytes OK
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.447844) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.450318) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.450344) EVENT_LOG_v1 {"time_micros": 1769161216450335, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.450382) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 5421774, prev total WAL file size 5421774, number of live WAL files 2.
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.453466) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3480KB)], [60(8662KB)]
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216453577, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 12434389, "oldest_snapshot_seqno": -1}
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 5577 keys, 10381832 bytes, temperature: kUnknown
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216565799, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 10381832, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10342653, "index_size": 24157, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13957, "raw_key_size": 143313, "raw_average_key_size": 25, "raw_value_size": 10240377, "raw_average_value_size": 1836, "num_data_blocks": 974, "num_entries": 5577, "num_filter_entries": 5577, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161216, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.566316) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 10381832 bytes
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.568598) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 110.7 rd, 92.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 8.5 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(6.4) write-amplify(2.9) OK, records in: 6613, records dropped: 1036 output_compression: NoCompression
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.568616) EVENT_LOG_v1 {"time_micros": 1769161216568607, "job": 36, "event": "compaction_finished", "compaction_time_micros": 112290, "compaction_time_cpu_micros": 25147, "output_level": 6, "num_output_files": 1, "total_output_size": 10381832, "num_input_records": 6613, "num_output_records": 5577, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216569428, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216571085, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.453227) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.571164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.571172) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.571174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.571175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:40:16.571177) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:16.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:17.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:17 np0005593233 nova_compute[222017]: 2026-01-23 09:40:17.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:18 np0005593233 nova_compute[222017]: 2026-01-23 09:40:18.656 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:18.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:19 np0005593233 nova_compute[222017]: 2026-01-23 09:40:19.190 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:19.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:20 np0005593233 podman[238151]: 2026-01-23 09:40:20.065924321 +0000 UTC m=+0.070658188 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:40:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:20.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:21.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:22.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:23.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:23 np0005593233 nova_compute[222017]: 2026-01-23 09:40:23.659 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:24 np0005593233 nova_compute[222017]: 2026-01-23 09:40:24.192 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:24.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:25.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:26 np0005593233 nova_compute[222017]: 2026-01-23 09:40:26.222 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "27b8c2df-6812-4d79-ba38-79585e06bfdc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:26 np0005593233 nova_compute[222017]: 2026-01-23 09:40:26.222 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "27b8c2df-6812-4d79-ba38-79585e06bfdc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:26 np0005593233 nova_compute[222017]: 2026-01-23 09:40:26.260 222021 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:40:26 np0005593233 nova_compute[222017]: 2026-01-23 09:40:26.356 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:26 np0005593233 nova_compute[222017]: 2026-01-23 09:40:26.357 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:26 np0005593233 nova_compute[222017]: 2026-01-23 09:40:26.366 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:40:26 np0005593233 nova_compute[222017]: 2026-01-23 09:40:26.367 222021 INFO nova.compute.claims [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:40:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:26.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.066 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:27.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/301035771' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.532 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.540 222021 DEBUG nova.compute.provider_tree [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.563 222021 DEBUG nova.scheduler.client.report [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.596 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.661 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "59d13db1-5c8c-4a9f-bd88-e66f1a7b46ce" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.662 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "59d13db1-5c8c-4a9f-bd88-e66f1a7b46ce" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.693 222021 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] No node specified, defaulting to compute-1.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.752 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "59d13db1-5c8c-4a9f-bd88-e66f1a7b46ce" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.753 222021 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.826 222021 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.826 222021 DEBUG nova.network.neutron [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.870 222021 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:40:27 np0005593233 nova_compute[222017]: 2026-01-23 09:40:27.894 222021 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.029 222021 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.031 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.032 222021 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Creating image(s)#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.067 222021 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.097 222021 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.126 222021 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.133 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.210 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.212 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.213 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.213 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.245 222021 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.251 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.480 222021 DEBUG nova.network.neutron [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.481 222021 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.663 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:28 np0005593233 nova_compute[222017]: 2026-01-23 09:40:28.947 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.696s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:28.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.030 222021 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] resizing rbd image 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:40:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.146 222021 DEBUG nova.objects.instance [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lazy-loading 'migration_context' on Instance uuid 27b8c2df-6812-4d79-ba38-79585e06bfdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.164 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.164 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Ensure instance console log exists: /var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.165 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.165 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.165 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.167 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.172 222021 WARNING nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.181 222021 DEBUG nova.virt.libvirt.host [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.181 222021 DEBUG nova.virt.libvirt.host [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.186 222021 DEBUG nova.virt.libvirt.host [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.187 222021 DEBUG nova.virt.libvirt.host [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.188 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.188 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.188 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.189 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.189 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.189 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.189 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.189 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.190 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.190 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.190 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.190 222021 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.193 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:40:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:29.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.228 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:40:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:40:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2166621106' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.906 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.713s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.944 222021 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:40:29 np0005593233 nova_compute[222017]: 2026-01-23 09:40:29.948 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:40:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:40:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2032120283' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:40:30 np0005593233 nova_compute[222017]: 2026-01-23 09:40:30.400 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:40:30 np0005593233 nova_compute[222017]: 2026-01-23 09:40:30.403 222021 DEBUG nova.objects.instance [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lazy-loading 'pci_devices' on Instance uuid 27b8c2df-6812-4d79-ba38-79585e06bfdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:40:30 np0005593233 nova_compute[222017]: 2026-01-23 09:40:30.428 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <uuid>27b8c2df-6812-4d79-ba38-79585e06bfdc</uuid>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <name>instance-00000027</name>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersOnMultiNodesTest-server-673627176-1</nova:name>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:40:29</nova:creationTime>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <nova:user uuid="a3b5a7f627074988a8a05a20558595fe">tempest-ServersOnMultiNodesTest-288318576-project-member</nova:user>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <nova:project uuid="e8778f3a187440f3879f9d9533d45855">tempest-ServersOnMultiNodesTest-288318576</nova:project>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <entry name="serial">27b8c2df-6812-4d79-ba38-79585e06bfdc</entry>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <entry name="uuid">27b8c2df-6812-4d79-ba38-79585e06bfdc</entry>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/27b8c2df-6812-4d79-ba38-79585e06bfdc_disk">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/27b8c2df-6812-4d79-ba38-79585e06bfdc_disk.config">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc/console.log" append="off"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:40:30 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:40:30 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:40:30 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:40:30 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:40:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:30.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:31.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:31 np0005593233 nova_compute[222017]: 2026-01-23 09:40:31.481 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:40:31 np0005593233 nova_compute[222017]: 2026-01-23 09:40:31.482 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:40:31 np0005593233 nova_compute[222017]: 2026-01-23 09:40:31.483 222021 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Using config drive
Jan 23 04:40:32 np0005593233 nova_compute[222017]: 2026-01-23 09:40:32.100 222021 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:40:32 np0005593233 nova_compute[222017]: 2026-01-23 09:40:32.680 222021 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Creating config drive at /var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc/disk.config
Jan 23 04:40:32 np0005593233 nova_compute[222017]: 2026-01-23 09:40:32.685 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprfuzw8ow execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:40:32 np0005593233 nova_compute[222017]: 2026-01-23 09:40:32.816 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprfuzw8ow" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:40:32 np0005593233 nova_compute[222017]: 2026-01-23 09:40:32.901 222021 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:40:32 np0005593233 nova_compute[222017]: 2026-01-23 09:40:32.905 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc/disk.config 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:40:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:32.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:33.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:33 np0005593233 nova_compute[222017]: 2026-01-23 09:40:33.297 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:40:33 np0005593233 nova_compute[222017]: 2026-01-23 09:40:33.298 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:40:33 np0005593233 nova_compute[222017]: 2026-01-23 09:40:33.345 222021 DEBUG nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:40:33 np0005593233 nova_compute[222017]: 2026-01-23 09:40:33.467 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:40:33 np0005593233 nova_compute[222017]: 2026-01-23 09:40:33.467 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:40:33 np0005593233 nova_compute[222017]: 2026-01-23 09:40:33.475 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:40:33 np0005593233 nova_compute[222017]: 2026-01-23 09:40:33.475 222021 INFO nova.compute.claims [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Claim successful on node compute-1.ctlplane.example.com
Jan 23 04:40:33 np0005593233 nova_compute[222017]: 2026-01-23 09:40:33.622 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:40:33 np0005593233 nova_compute[222017]: 2026-01-23 09:40:33.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:40:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/14202212' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.049 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.055 222021 DEBUG nova.compute.provider_tree [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:40:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.190 222021 DEBUG nova.scheduler.client.report [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.197 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.227 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.228 222021 DEBUG nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.364 222021 DEBUG nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.365 222021 DEBUG nova.network.neutron [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.399 222021 INFO nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.425 222021 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc/disk.config 27b8c2df-6812-4d79-ba38-79585e06bfdc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.425 222021 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Deleting local config drive /var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc/disk.config because it was imported into RBD.#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.437 222021 DEBUG nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:40:34 np0005593233 systemd-machined[190954]: New machine qemu-22-instance-00000027.
Jan 23 04:40:34 np0005593233 systemd[1]: Started Virtual Machine qemu-22-instance-00000027.
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.565 222021 DEBUG nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.566 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.567 222021 INFO nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Creating image(s)#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.599 222021 DEBUG nova.storage.rbd_utils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.631 222021 DEBUG nova.storage.rbd_utils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.661 222021 DEBUG nova.storage.rbd_utils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.665 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:34.726 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:40:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:34.727 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.727 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.730 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.731 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.732 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.732 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.764 222021 DEBUG nova.storage.rbd_utils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.768 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:34 np0005593233 nova_compute[222017]: 2026-01-23 09:40:34.798 222021 DEBUG nova.policy [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56da68482e3a4fb582dcccad45f8f71b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05bc71a77710455e8b34ead7fec81a31', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:40:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:34.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:35.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.443 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161235.4432075, 27b8c2df-6812-4d79-ba38-79585e06bfdc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.444 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.448 222021 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.448 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.454 222021 INFO nova.virt.libvirt.driver [-] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Instance spawned successfully.#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.455 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.481 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.488 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.491 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.491 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.491 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.492 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.492 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.492 222021 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.527 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.527 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161235.4474046, 27b8c2df-6812-4d79-ba38-79585e06bfdc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.528 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] VM Started (Lifecycle Event)#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.549 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.552 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.558 222021 INFO nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Took 7.53 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.558 222021 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.570 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.614 222021 INFO nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Took 9.29 seconds to build instance.#033[00m
Jan 23 04:40:35 np0005593233 nova_compute[222017]: 2026-01-23 09:40:35.641 222021 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "27b8c2df-6812-4d79-ba38-79585e06bfdc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:36 np0005593233 nova_compute[222017]: 2026-01-23 09:40:36.096 222021 DEBUG nova.network.neutron [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Successfully created port: dda6704d-0b42-490a-9a2d-82800c64a748 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:40:36 np0005593233 nova_compute[222017]: 2026-01-23 09:40:36.280 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:36 np0005593233 nova_compute[222017]: 2026-01-23 09:40:36.365 222021 DEBUG nova.storage.rbd_utils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] resizing rbd image 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:40:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:36.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:40:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:37.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:40:38 np0005593233 podman[238756]: 2026-01-23 09:40:38.099490432 +0000 UTC m=+0.107774892 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.390 222021 DEBUG nova.network.neutron [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Successfully updated port: dda6704d-0b42-490a-9a2d-82800c64a748 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.420 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "refresh_cache-2bf89ff4-cf8b-4ecb-ac65-d948b5039556" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.420 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquired lock "refresh_cache-2bf89ff4-cf8b-4ecb-ac65-d948b5039556" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.420 222021 DEBUG nova.network.neutron [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.627 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "27b8c2df-6812-4d79-ba38-79585e06bfdc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.628 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "27b8c2df-6812-4d79-ba38-79585e06bfdc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.628 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "27b8c2df-6812-4d79-ba38-79585e06bfdc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.629 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "27b8c2df-6812-4d79-ba38-79585e06bfdc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.629 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "27b8c2df-6812-4d79-ba38-79585e06bfdc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.631 222021 INFO nova.compute.manager [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Terminating instance#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.632 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "refresh_cache-27b8c2df-6812-4d79-ba38-79585e06bfdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.632 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquired lock "refresh_cache-27b8c2df-6812-4d79-ba38-79585e06bfdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.633 222021 DEBUG nova.network.neutron [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.673 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.693 222021 DEBUG nova.network.neutron [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.715 222021 DEBUG nova.compute.manager [req-9921a6ad-d953-45dd-8952-c570d1b29ee6 req-01c8c690-28d0-4372-8d3e-0ef4f003d255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Received event network-changed-dda6704d-0b42-490a-9a2d-82800c64a748 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.716 222021 DEBUG nova.compute.manager [req-9921a6ad-d953-45dd-8952-c570d1b29ee6 req-01c8c690-28d0-4372-8d3e-0ef4f003d255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Refreshing instance network info cache due to event network-changed-dda6704d-0b42-490a-9a2d-82800c64a748. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.716 222021 DEBUG oslo_concurrency.lockutils [req-9921a6ad-d953-45dd-8952-c570d1b29ee6 req-01c8c690-28d0-4372-8d3e-0ef4f003d255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-2bf89ff4-cf8b-4ecb-ac65-d948b5039556" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:40:38 np0005593233 nova_compute[222017]: 2026-01-23 09:40:38.828 222021 DEBUG nova.network.neutron [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:40:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:38.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.198 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:39.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.253 222021 DEBUG nova.objects.instance [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'migration_context' on Instance uuid 2bf89ff4-cf8b-4ecb-ac65-d948b5039556 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.298 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.299 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Ensure instance console log exists: /var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.299 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.300 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.300 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.454 222021 DEBUG nova.network.neutron [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.483 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Releasing lock "refresh_cache-27b8c2df-6812-4d79-ba38-79585e06bfdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.484 222021 DEBUG nova.compute.manager [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:40:39 np0005593233 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 23 04:40:39 np0005593233 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000027.scope: Consumed 4.586s CPU time.
Jan 23 04:40:39 np0005593233 systemd-machined[190954]: Machine qemu-22-instance-00000027 terminated.
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.708 222021 INFO nova.virt.libvirt.driver [-] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Instance destroyed successfully.#033[00m
Jan 23 04:40:39 np0005593233 nova_compute[222017]: 2026-01-23 09:40:39.708 222021 DEBUG nova.objects.instance [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lazy-loading 'resources' on Instance uuid 27b8c2df-6812-4d79-ba38-79585e06bfdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.436 222021 INFO nova.virt.libvirt.driver [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Deleting instance files /var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc_del#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.437 222021 INFO nova.virt.libvirt.driver [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Deletion of /var/lib/nova/instances/27b8c2df-6812-4d79-ba38-79585e06bfdc_del complete#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.620 222021 INFO nova.compute.manager [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Took 1.14 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.620 222021 DEBUG oslo.service.loopingcall [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.621 222021 DEBUG nova.compute.manager [-] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.621 222021 DEBUG nova.network.neutron [-] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.647 222021 DEBUG nova.network.neutron [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Updating instance_info_cache with network_info: [{"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.671 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Releasing lock "refresh_cache-2bf89ff4-cf8b-4ecb-ac65-d948b5039556" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.672 222021 DEBUG nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Instance network_info: |[{"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.672 222021 DEBUG oslo_concurrency.lockutils [req-9921a6ad-d953-45dd-8952-c570d1b29ee6 req-01c8c690-28d0-4372-8d3e-0ef4f003d255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-2bf89ff4-cf8b-4ecb-ac65-d948b5039556" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.673 222021 DEBUG nova.network.neutron [req-9921a6ad-d953-45dd-8952-c570d1b29ee6 req-01c8c690-28d0-4372-8d3e-0ef4f003d255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Refreshing network info cache for port dda6704d-0b42-490a-9a2d-82800c64a748 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.676 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Start _get_guest_xml network_info=[{"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.683 222021 WARNING nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.688 222021 DEBUG nova.virt.libvirt.host [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.690 222021 DEBUG nova.virt.libvirt.host [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.693 222021 DEBUG nova.virt.libvirt.host [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.694 222021 DEBUG nova.virt.libvirt.host [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.695 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.695 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.696 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.696 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.697 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.697 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.697 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.698 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.698 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.698 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.699 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.699 222021 DEBUG nova.virt.hardware [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.703 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.865 222021 DEBUG nova.network.neutron [-] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.893 222021 DEBUG nova.network.neutron [-] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:40:40 np0005593233 nova_compute[222017]: 2026-01-23 09:40:40.916 222021 INFO nova.compute.manager [-] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Took 0.29 seconds to deallocate network for instance.#033[00m
Jan 23 04:40:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:40.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.008 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.009 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:40:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3560029513' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.185 222021 DEBUG oslo_concurrency.processutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.207 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.235 222021 DEBUG nova.storage.rbd_utils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:41.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.240 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1079586888' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.630 222021 DEBUG oslo_concurrency.processutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.638 222021 DEBUG nova.compute.provider_tree [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:40:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:40:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1416739897' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.709 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.712 222021 DEBUG nova.virt.libvirt.vif [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:40:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1795212687',display_name='tempest-ImagesTestJSON-server-1795212687',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1795212687',id=41,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-nrqhn47t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:40:34Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=2bf89ff4-cf8b-4ecb-ac65-d948b5039556,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.712 222021 DEBUG nova.network.os_vif_util [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.714 222021 DEBUG nova.network.os_vif_util [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:9f:d5,bridge_name='br-int',has_traffic_filtering=True,id=dda6704d-0b42-490a-9a2d-82800c64a748,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdda6704d-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.717 222021 DEBUG nova.objects.instance [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2bf89ff4-cf8b-4ecb-ac65-d948b5039556 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.739 222021 DEBUG nova.scheduler.client.report [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.746 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <uuid>2bf89ff4-cf8b-4ecb-ac65-d948b5039556</uuid>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <name>instance-00000029</name>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <nova:name>tempest-ImagesTestJSON-server-1795212687</nova:name>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:40:40</nova:creationTime>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <nova:user uuid="56da68482e3a4fb582dcccad45f8f71b">tempest-ImagesTestJSON-1507872051-project-member</nova:user>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <nova:project uuid="05bc71a77710455e8b34ead7fec81a31">tempest-ImagesTestJSON-1507872051</nova:project>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <nova:port uuid="dda6704d-0b42-490a-9a2d-82800c64a748">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <entry name="serial">2bf89ff4-cf8b-4ecb-ac65-d948b5039556</entry>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <entry name="uuid">2bf89ff4-cf8b-4ecb-ac65-d948b5039556</entry>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk.config">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:31:9f:d5"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <target dev="tapdda6704d-0b"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556/console.log" append="off"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:40:41 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:40:41 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:40:41 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:40:41 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.748 222021 DEBUG nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Preparing to wait for external event network-vif-plugged-dda6704d-0b42-490a-9a2d-82800c64a748 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.749 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.749 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.750 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.750 222021 DEBUG nova.virt.libvirt.vif [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:40:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1795212687',display_name='tempest-ImagesTestJSON-server-1795212687',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1795212687',id=41,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-nrqhn47t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-mem
ber'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:40:34Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=2bf89ff4-cf8b-4ecb-ac65-d948b5039556,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.751 222021 DEBUG nova.network.os_vif_util [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.751 222021 DEBUG nova.network.os_vif_util [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:9f:d5,bridge_name='br-int',has_traffic_filtering=True,id=dda6704d-0b42-490a-9a2d-82800c64a748,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdda6704d-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.752 222021 DEBUG os_vif [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9f:d5,bridge_name='br-int',has_traffic_filtering=True,id=dda6704d-0b42-490a-9a2d-82800c64a748,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdda6704d-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.753 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.753 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.754 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.758 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.758 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdda6704d-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.759 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdda6704d-0b, col_values=(('external_ids', {'iface-id': 'dda6704d-0b42-490a-9a2d-82800c64a748', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:9f:d5', 'vm-uuid': '2bf89ff4-cf8b-4ecb-ac65-d948b5039556'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.761 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.762 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:40:41 np0005593233 NetworkManager[48871]: <info>  [1769161241.7623] manager: (tapdda6704d-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.769 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.770 222021 INFO os_vif [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9f:d5,bridge_name='br-int',has_traffic_filtering=True,id=dda6704d-0b42-490a-9a2d-82800c64a748,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdda6704d-0b')#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.830 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.821s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.859 222021 INFO nova.scheduler.client.report [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Deleted allocations for instance 27b8c2df-6812-4d79-ba38-79585e06bfdc#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.892 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.893 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.893 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No VIF found with MAC fa:16:3e:31:9f:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.894 222021 INFO nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Using config drive
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.930 222021 DEBUG nova.storage.rbd_utils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:40:41 np0005593233 nova_compute[222017]: 2026-01-23 09:40:41.940 222021 DEBUG oslo_concurrency.lockutils [None req-b4967cac-4bc0-4286-93eb-d3e428458a89 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "27b8c2df-6812-4d79-ba38-79585e06bfdc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.436 222021 INFO nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Creating config drive at /var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556/disk.config
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.441 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe2fd25l_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.575 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe2fd25l_" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.628 222021 DEBUG nova.storage.rbd_utils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.634 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556/disk.config 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:40:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:42.641 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:40:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:42.641 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:40:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:42.642 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.878 222021 DEBUG oslo_concurrency.processutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556/disk.config 2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.243s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.879 222021 INFO nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Deleting local config drive /var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556/disk.config because it was imported into RBD.
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.889 222021 DEBUG nova.network.neutron [req-9921a6ad-d953-45dd-8952-c570d1b29ee6 req-01c8c690-28d0-4372-8d3e-0ef4f003d255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Updated VIF entry in instance network info cache for port dda6704d-0b42-490a-9a2d-82800c64a748. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.890 222021 DEBUG nova.network.neutron [req-9921a6ad-d953-45dd-8952-c570d1b29ee6 req-01c8c690-28d0-4372-8d3e-0ef4f003d255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Updating instance_info_cache with network_info: [{"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.909 222021 DEBUG oslo_concurrency.lockutils [req-9921a6ad-d953-45dd-8952-c570d1b29ee6 req-01c8c690-28d0-4372-8d3e-0ef4f003d255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-2bf89ff4-cf8b-4ecb-ac65-d948b5039556" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:40:42 np0005593233 kernel: tapdda6704d-0b: entered promiscuous mode
Jan 23 04:40:42 np0005593233 NetworkManager[48871]: <info>  [1769161242.9515] manager: (tapdda6704d-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Jan 23 04:40:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:40:42Z|00118|binding|INFO|Claiming lport dda6704d-0b42-490a-9a2d-82800c64a748 for this chassis.
Jan 23 04:40:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:40:42Z|00119|binding|INFO|dda6704d-0b42-490a-9a2d-82800c64a748: Claiming fa:16:3e:31:9f:d5 10.100.0.12
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.955 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:40:42 np0005593233 nova_compute[222017]: 2026-01-23 09:40:42.969 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:40:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:42.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:42.979 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:9f:d5 10.100.0.12'], port_security=['fa:16:3e:31:9f:d5 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2bf89ff4-cf8b-4ecb-ac65-d948b5039556', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=dda6704d-0b42-490a-9a2d-82800c64a748) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:40:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:42.981 140224 INFO neutron.agent.ovn.metadata.agent [-] Port dda6704d-0b42-490a-9a2d-82800c64a748 in datapath c2696fd4-5fd7-4934-88ac-40162fad555d bound to our chassis
Jan 23 04:40:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:42.982 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:40:42 np0005593233 systemd-udevd[238979]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.002 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d6048f6a-17fa-4008-9da5-e7cf02cf3596]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.003 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc2696fd4-51 in ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 04:40:43 np0005593233 systemd-machined[190954]: New machine qemu-23-instance-00000029.
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.006 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc2696fd4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.006 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[21da73bf-9bd9-49ee-a177-cc25256c9fd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.007 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7199dc-b044-4087-be70-418b202727c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 NetworkManager[48871]: <info>  [1769161243.0102] device (tapdda6704d-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:40:43 np0005593233 NetworkManager[48871]: <info>  [1769161243.0109] device (tapdda6704d-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.020 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[17fe3c4f-ddd4-48f9-8b15-3b713e36d102]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 systemd[1]: Started Virtual Machine qemu-23-instance-00000029.
Jan 23 04:40:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:40:43Z|00120|binding|INFO|Setting lport dda6704d-0b42-490a-9a2d-82800c64a748 ovn-installed in OVS
Jan 23 04:40:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:40:43Z|00121|binding|INFO|Setting lport dda6704d-0b42-490a-9a2d-82800c64a748 up in Southbound
Jan 23 04:40:43 np0005593233 nova_compute[222017]: 2026-01-23 09:40:43.047 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.048 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2a6293-d546-418c-8b37-883faa0f89f3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.078 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9df5f549-e76d-4d95-bf6d-5b5a4250972c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.084 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[84b54589-95a2-44db-9c90-3d90fd2e0eae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 NetworkManager[48871]: <info>  [1769161243.0858] manager: (tapc2696fd4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Jan 23 04:40:43 np0005593233 systemd-udevd[238984]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.164 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[87348729-c37a-4a1d-9c8f-3ed1cbe15958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.169 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[dc5ffd52-d877-43d0-b857-dc87b2c14d2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 NetworkManager[48871]: <info>  [1769161243.1896] device (tapc2696fd4-50): carrier: link connected
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.198 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f3b425-c117-49e4-b9d5-a2db6d2de1d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.217 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7d58fac2-20c0-4a6e-9565-12e55f812dd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515582, 'reachable_time': 24868, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239014, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.236 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d9676d4f-db43-4d61-89dd-a889eb12ad2c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:20d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515582, 'tstamp': 515582}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239015, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:43.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.256 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a77a3a68-86fb-48ed-aa61-33141d5b92db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515582, 'reachable_time': 24868, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239016, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.299 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2f58fd-c2eb-445b-a295-e89451c411e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.375 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6e6baa59-825c-4e63-919d-c2cb4d74432a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.377 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.378 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.379 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2696fd4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:40:43 np0005593233 NetworkManager[48871]: <info>  [1769161243.3816] manager: (tapc2696fd4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 23 04:40:43 np0005593233 kernel: tapc2696fd4-50: entered promiscuous mode
Jan 23 04:40:43 np0005593233 nova_compute[222017]: 2026-01-23 09:40:43.381 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.387 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc2696fd4-50, col_values=(('external_ids', {'iface-id': '38b24332-af6b-47d2-95fe-400f5feeadcb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:40:43 np0005593233 nova_compute[222017]: 2026-01-23 09:40:43.389 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:40:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:40:43Z|00122|binding|INFO|Releasing lport 38b24332-af6b-47d2-95fe-400f5feeadcb from this chassis (sb_readonly=0)
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.391 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.392 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5302b56e-36f1-44c9-a8f7-bc254371b1bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.393 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 04:40:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:43.394 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'env', 'PROCESS_TAG=haproxy-c2696fd4-5fd7-4934-88ac-40162fad555d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c2696fd4-5fd7-4934-88ac-40162fad555d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 04:40:43 np0005593233 nova_compute[222017]: 2026-01-23 09:40:43.404 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:40:43 np0005593233 nova_compute[222017]: 2026-01-23 09:40:43.652 222021 DEBUG nova.compute.manager [req-a1a53a25-20da-4c62-b521-591546f13258 req-ea002983-2b9f-40bc-93db-274cf1aea012 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Received event network-vif-plugged-dda6704d-0b42-490a-9a2d-82800c64a748 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:40:43 np0005593233 nova_compute[222017]: 2026-01-23 09:40:43.652 222021 DEBUG oslo_concurrency.lockutils [req-a1a53a25-20da-4c62-b521-591546f13258 req-ea002983-2b9f-40bc-93db-274cf1aea012 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:40:43 np0005593233 nova_compute[222017]: 2026-01-23 09:40:43.653 222021 DEBUG oslo_concurrency.lockutils [req-a1a53a25-20da-4c62-b521-591546f13258 req-ea002983-2b9f-40bc-93db-274cf1aea012 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:40:43 np0005593233 nova_compute[222017]: 2026-01-23 09:40:43.653 222021 DEBUG oslo_concurrency.lockutils [req-a1a53a25-20da-4c62-b521-591546f13258 req-ea002983-2b9f-40bc-93db-274cf1aea012 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:40:43 np0005593233 nova_compute[222017]: 2026-01-23 09:40:43.653 222021 DEBUG nova.compute.manager [req-a1a53a25-20da-4c62-b521-591546f13258 req-ea002983-2b9f-40bc-93db-274cf1aea012 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Processing event network-vif-plugged-dda6704d-0b42-490a-9a2d-82800c64a748 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:40:43 np0005593233 podman[239048]: 2026-01-23 09:40:43.82510624 +0000 UTC m=+0.056043913 container create f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:40:43 np0005593233 systemd[1]: Started libpod-conmon-f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308.scope.
Jan 23 04:40:43 np0005593233 podman[239048]: 2026-01-23 09:40:43.795978903 +0000 UTC m=+0.026916586 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:40:43 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:40:43 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f314c064b92d2d13d3a4fed4896e05073037743767131502f05a469b4294c4f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:40:43 np0005593233 podman[239048]: 2026-01-23 09:40:43.933384325 +0000 UTC m=+0.164321998 container init f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:40:43 np0005593233 podman[239048]: 2026-01-23 09:40:43.940087465 +0000 UTC m=+0.171025128 container start f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:40:43 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[239063]: [NOTICE]   (239067) : New worker (239076) forked
Jan 23 04:40:43 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[239063]: [NOTICE]   (239067) : Loading success.
Jan 23 04:40:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.200 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.465 222021 DEBUG nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.467 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161244.464198, 2bf89ff4-cf8b-4ecb-ac65-d948b5039556 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.468 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] VM Started (Lifecycle Event)#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.473 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.478 222021 INFO nova.virt.libvirt.driver [-] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Instance spawned successfully.#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.479 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.511 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.518 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.523 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.524 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.524 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.525 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.525 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.526 222021 DEBUG nova.virt.libvirt.driver [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.555 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.556 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161244.466976, 2bf89ff4-cf8b-4ecb-ac65-d948b5039556 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.556 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.590 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.595 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161244.4717767, 2bf89ff4-cf8b-4ecb-ac65-d948b5039556 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.595 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.602 222021 INFO nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Took 10.04 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.602 222021 DEBUG nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.620 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.625 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.661 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.696 222021 INFO nova.compute.manager [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Took 11.29 seconds to build instance.#033[00m
Jan 23 04:40:44 np0005593233 nova_compute[222017]: 2026-01-23 09:40:44.716 222021 DEBUG oslo_concurrency.lockutils [None req-6ad5a480-1624-4809-b6ed-23e90f2e31d7 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:40:44.729 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:40:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:44.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:45.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:46 np0005593233 nova_compute[222017]: 2026-01-23 09:40:46.763 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:46.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:47.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:48 np0005593233 nova_compute[222017]: 2026-01-23 09:40:48.379 222021 DEBUG nova.compute.manager [req-90b3744b-1f2a-4550-b296-ab1cf5e26e74 req-d7976b21-dc40-44dc-8519-2c4851159886 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Received event network-vif-plugged-dda6704d-0b42-490a-9a2d-82800c64a748 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:40:48 np0005593233 nova_compute[222017]: 2026-01-23 09:40:48.380 222021 DEBUG oslo_concurrency.lockutils [req-90b3744b-1f2a-4550-b296-ab1cf5e26e74 req-d7976b21-dc40-44dc-8519-2c4851159886 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:48 np0005593233 nova_compute[222017]: 2026-01-23 09:40:48.380 222021 DEBUG oslo_concurrency.lockutils [req-90b3744b-1f2a-4550-b296-ab1cf5e26e74 req-d7976b21-dc40-44dc-8519-2c4851159886 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:48 np0005593233 nova_compute[222017]: 2026-01-23 09:40:48.380 222021 DEBUG oslo_concurrency.lockutils [req-90b3744b-1f2a-4550-b296-ab1cf5e26e74 req-d7976b21-dc40-44dc-8519-2c4851159886 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:48 np0005593233 nova_compute[222017]: 2026-01-23 09:40:48.380 222021 DEBUG nova.compute.manager [req-90b3744b-1f2a-4550-b296-ab1cf5e26e74 req-d7976b21-dc40-44dc-8519-2c4851159886 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] No waiting events found dispatching network-vif-plugged-dda6704d-0b42-490a-9a2d-82800c64a748 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:40:48 np0005593233 nova_compute[222017]: 2026-01-23 09:40:48.380 222021 WARNING nova.compute.manager [req-90b3744b-1f2a-4550-b296-ab1cf5e26e74 req-d7976b21-dc40-44dc-8519-2c4851159886 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Received unexpected event network-vif-plugged-dda6704d-0b42-490a-9a2d-82800c64a748 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:40:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:48.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:49 np0005593233 nova_compute[222017]: 2026-01-23 09:40:49.203 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:49.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:49 np0005593233 nova_compute[222017]: 2026-01-23 09:40:49.639 222021 INFO nova.compute.manager [None req-543f5514-aca3-4051-99fe-b0a35c240f52 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Pausing#033[00m
Jan 23 04:40:49 np0005593233 nova_compute[222017]: 2026-01-23 09:40:49.641 222021 DEBUG nova.objects.instance [None req-543f5514-aca3-4051-99fe-b0a35c240f52 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'flavor' on Instance uuid 2bf89ff4-cf8b-4ecb-ac65-d948b5039556 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:49 np0005593233 nova_compute[222017]: 2026-01-23 09:40:49.679 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161249.6789703, 2bf89ff4-cf8b-4ecb-ac65-d948b5039556 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:49 np0005593233 nova_compute[222017]: 2026-01-23 09:40:49.679 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:40:49 np0005593233 nova_compute[222017]: 2026-01-23 09:40:49.682 222021 DEBUG nova.compute.manager [None req-543f5514-aca3-4051-99fe-b0a35c240f52 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:49 np0005593233 nova_compute[222017]: 2026-01-23 09:40:49.711 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:49 np0005593233 nova_compute[222017]: 2026-01-23 09:40:49.715 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:40:49 np0005593233 nova_compute[222017]: 2026-01-23 09:40:49.747 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 23 04:40:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:50.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:51 np0005593233 podman[239120]: 2026-01-23 09:40:51.062018858 +0000 UTC m=+0.066317885 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 04:40:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:51.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:51 np0005593233 nova_compute[222017]: 2026-01-23 09:40:51.768 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:52 np0005593233 nova_compute[222017]: 2026-01-23 09:40:52.802 222021 DEBUG nova.compute.manager [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:52 np0005593233 nova_compute[222017]: 2026-01-23 09:40:52.849 222021 INFO nova.compute.manager [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] instance snapshotting#033[00m
Jan 23 04:40:52 np0005593233 nova_compute[222017]: 2026-01-23 09:40:52.849 222021 WARNING nova.compute.manager [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] trying to snapshot a non-running instance: (state: 3 expected: 1)#033[00m
Jan 23 04:40:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:52.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:53 np0005593233 nova_compute[222017]: 2026-01-23 09:40:53.240 222021 INFO nova.virt.libvirt.driver [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Beginning live snapshot process#033[00m
Jan 23 04:40:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:53.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:40:53 np0005593233 nova_compute[222017]: 2026-01-23 09:40:53.421 222021 DEBUG nova.virt.libvirt.imagebackend [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 04:40:53 np0005593233 nova_compute[222017]: 2026-01-23 09:40:53.664 222021 DEBUG nova.storage.rbd_utils [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] creating snapshot(460b6747b4dc4759967c065a5a3da01b) on rbd image(2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:40:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:54 np0005593233 nova_compute[222017]: 2026-01-23 09:40:54.249 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:54 np0005593233 nova_compute[222017]: 2026-01-23 09:40:54.707 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161239.7057047, 27b8c2df-6812-4d79-ba38-79585e06bfdc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:54 np0005593233 nova_compute[222017]: 2026-01-23 09:40:54.708 222021 INFO nova.compute.manager [-] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:40:54 np0005593233 nova_compute[222017]: 2026-01-23 09:40:54.771 222021 DEBUG nova.compute.manager [None req-86a78642-7a74-4336-9cf5-bdbfb14001de - - - - - -] [instance: 27b8c2df-6812-4d79-ba38-79585e06bfdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:54.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e176 e176: 3 total, 3 up, 3 in
Jan 23 04:40:55 np0005593233 nova_compute[222017]: 2026-01-23 09:40:55.201 222021 DEBUG nova.storage.rbd_utils [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] cloning vms/2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk@460b6747b4dc4759967c065a5a3da01b to images/cb2aa1fa-520b-4509-8a21-f3e422b84e79 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:40:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:55.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:55 np0005593233 nova_compute[222017]: 2026-01-23 09:40:55.337 222021 DEBUG nova.storage.rbd_utils [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] flattening images/cb2aa1fa-520b-4509-8a21-f3e422b84e79 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 04:40:56 np0005593233 nova_compute[222017]: 2026-01-23 09:40:56.284 222021 DEBUG nova.storage.rbd_utils [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] removing snapshot(460b6747b4dc4759967c065a5a3da01b) on rbd image(2bf89ff4-cf8b-4ecb-ac65-d948b5039556_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:40:56 np0005593233 nova_compute[222017]: 2026-01-23 09:40:56.771 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:40:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:56.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:40:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e177 e177: 3 total, 3 up, 3 in
Jan 23 04:40:57 np0005593233 nova_compute[222017]: 2026-01-23 09:40:57.235 222021 DEBUG nova.storage.rbd_utils [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] creating snapshot(snap) on rbd image(cb2aa1fa-520b-4509-8a21-f3e422b84e79) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:40:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:57.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e178 e178: 3 total, 3 up, 3 in
Jan 23 04:40:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:58.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:59 np0005593233 nova_compute[222017]: 2026-01-23 09:40:59.251 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:40:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:40:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:59.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:00.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:01.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:01 np0005593233 nova_compute[222017]: 2026-01-23 09:41:01.776 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:01 np0005593233 nova_compute[222017]: 2026-01-23 09:41:01.800 222021 INFO nova.virt.libvirt.driver [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Snapshot image upload complete#033[00m
Jan 23 04:41:01 np0005593233 nova_compute[222017]: 2026-01-23 09:41:01.801 222021 INFO nova.compute.manager [None req-4d0fbaea-efb3-4c63-bd41-c093c482fea5 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Took 8.95 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 04:41:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:02.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:41:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:03.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:41:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e179 e179: 3 total, 3 up, 3 in
Jan 23 04:41:04 np0005593233 nova_compute[222017]: 2026-01-23 09:41:04.288 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:05.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:05.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.771 222021 DEBUG oslo_concurrency.lockutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.772 222021 DEBUG oslo_concurrency.lockutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.772 222021 DEBUG oslo_concurrency.lockutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.773 222021 DEBUG oslo_concurrency.lockutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.773 222021 DEBUG oslo_concurrency.lockutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.774 222021 INFO nova.compute.manager [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Terminating instance#033[00m
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.775 222021 DEBUG nova.compute.manager [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:41:05 np0005593233 kernel: tapdda6704d-0b (unregistering): left promiscuous mode
Jan 23 04:41:05 np0005593233 NetworkManager[48871]: <info>  [1769161265.8177] device (tapdda6704d-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:41:05 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:05Z|00123|binding|INFO|Releasing lport dda6704d-0b42-490a-9a2d-82800c64a748 from this chassis (sb_readonly=0)
Jan 23 04:41:05 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:05Z|00124|binding|INFO|Setting lport dda6704d-0b42-490a-9a2d-82800c64a748 down in Southbound
Jan 23 04:41:05 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:05Z|00125|binding|INFO|Removing iface tapdda6704d-0b ovn-installed in OVS
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.826 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.829 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:05.833 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:9f:d5 10.100.0.12'], port_security=['fa:16:3e:31:9f:d5 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2bf89ff4-cf8b-4ecb-ac65-d948b5039556', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=dda6704d-0b42-490a-9a2d-82800c64a748) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:41:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:05.835 140224 INFO neutron.agent.ovn.metadata.agent [-] Port dda6704d-0b42-490a-9a2d-82800c64a748 in datapath c2696fd4-5fd7-4934-88ac-40162fad555d unbound from our chassis#033[00m
Jan 23 04:41:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:05.836 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c2696fd4-5fd7-4934-88ac-40162fad555d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:41:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:05.837 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b8691b-2a73-4b08-bfe9-d1755cd42c5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:05.837 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace which is not needed anymore#033[00m
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.848 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:05 np0005593233 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000029.scope: Deactivated successfully.
Jan 23 04:41:05 np0005593233 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000029.scope: Consumed 6.455s CPU time.
Jan 23 04:41:05 np0005593233 systemd-machined[190954]: Machine qemu-23-instance-00000029 terminated.
Jan 23 04:41:05 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[239063]: [NOTICE]   (239067) : haproxy version is 2.8.14-c23fe91
Jan 23 04:41:05 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[239063]: [NOTICE]   (239067) : path to executable is /usr/sbin/haproxy
Jan 23 04:41:05 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[239063]: [WARNING]  (239067) : Exiting Master process...
Jan 23 04:41:05 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[239063]: [ALERT]    (239067) : Current worker (239076) exited with code 143 (Terminated)
Jan 23 04:41:05 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[239063]: [WARNING]  (239067) : All workers exited. Exiting... (0)
Jan 23 04:41:05 np0005593233 systemd[1]: libpod-f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308.scope: Deactivated successfully.
Jan 23 04:41:05 np0005593233 podman[239308]: 2026-01-23 09:41:05.984477099 +0000 UTC m=+0.049000343 container died f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 04:41:05 np0005593233 nova_compute[222017]: 2026-01-23 09:41:05.998 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.015 222021 INFO nova.virt.libvirt.driver [-] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Instance destroyed successfully.#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.016 222021 DEBUG nova.objects.instance [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'resources' on Instance uuid 2bf89ff4-cf8b-4ecb-ac65-d948b5039556 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:41:06 np0005593233 systemd[1]: var-lib-containers-storage-overlay-8f314c064b92d2d13d3a4fed4896e05073037743767131502f05a469b4294c4f-merged.mount: Deactivated successfully.
Jan 23 04:41:06 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308-userdata-shm.mount: Deactivated successfully.
Jan 23 04:41:06 np0005593233 podman[239308]: 2026-01-23 09:41:06.034941962 +0000 UTC m=+0.099465186 container cleanup f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:41:06 np0005593233 systemd[1]: libpod-conmon-f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308.scope: Deactivated successfully.
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.061 222021 DEBUG nova.virt.libvirt.vif [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:40:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1795212687',display_name='tempest-ImagesTestJSON-server-1795212687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1795212687',id=41,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:40:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-nrqhn47t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:41:01Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=2bf89ff4-cf8b-4ecb-ac65-d948b5039556,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.063 222021 DEBUG nova.network.os_vif_util [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "dda6704d-0b42-490a-9a2d-82800c64a748", "address": "fa:16:3e:31:9f:d5", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdda6704d-0b", "ovs_interfaceid": "dda6704d-0b42-490a-9a2d-82800c64a748", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.064 222021 DEBUG nova.network.os_vif_util [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:9f:d5,bridge_name='br-int',has_traffic_filtering=True,id=dda6704d-0b42-490a-9a2d-82800c64a748,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdda6704d-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.064 222021 DEBUG os_vif [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9f:d5,bridge_name='br-int',has_traffic_filtering=True,id=dda6704d-0b42-490a-9a2d-82800c64a748,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdda6704d-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.066 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.067 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdda6704d-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.068 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.071 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.074 222021 INFO os_vif [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:9f:d5,bridge_name='br-int',has_traffic_filtering=True,id=dda6704d-0b42-490a-9a2d-82800c64a748,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdda6704d-0b')#033[00m
Jan 23 04:41:06 np0005593233 podman[239349]: 2026-01-23 09:41:06.118838354 +0000 UTC m=+0.052432740 container remove f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:41:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:06.129 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f035d141-1ac2-4904-a4e1-e21d40821b82]: (4, ('Fri Jan 23 09:41:05 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308)\nf3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308\nFri Jan 23 09:41:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (f3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308)\nf3c3ee807a267491ec0894c16263d2cbbd1bc829c72dd0a36518c7777d308308\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:06.131 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6796a42c-b740-473e-af5b-25c8f9e006a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:06.131 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:06 np0005593233 kernel: tapc2696fd4-50: left promiscuous mode
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:06.138 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fec3c9f8-baef-4d2c-8ade-8b824a3ce9ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.150 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:06.156 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eb8a7d89-7e92-4760-af4b-cd97bc0cf59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:06.158 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dd695833-faf3-46f5-a353-886dd7cce142]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:06.173 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[437d454c-26f2-44c8-a4a9-44ff1188c7bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515570, 'reachable_time': 33717, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239378, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:06 np0005593233 systemd[1]: run-netns-ovnmeta\x2dc2696fd4\x2d5fd7\x2d4934\x2d88ac\x2d40162fad555d.mount: Deactivated successfully.
Jan 23 04:41:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:06.178 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:41:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:06.178 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b14d7f-bad9-437a-a2ee-f312ba6b86d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.229 222021 DEBUG nova.compute.manager [req-1480a68e-3131-4662-85fb-8ed95ebf474b req-40362ef8-2a77-43d5-9eac-d8acdd8fdb6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Received event network-vif-unplugged-dda6704d-0b42-490a-9a2d-82800c64a748 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.230 222021 DEBUG oslo_concurrency.lockutils [req-1480a68e-3131-4662-85fb-8ed95ebf474b req-40362ef8-2a77-43d5-9eac-d8acdd8fdb6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.230 222021 DEBUG oslo_concurrency.lockutils [req-1480a68e-3131-4662-85fb-8ed95ebf474b req-40362ef8-2a77-43d5-9eac-d8acdd8fdb6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.231 222021 DEBUG oslo_concurrency.lockutils [req-1480a68e-3131-4662-85fb-8ed95ebf474b req-40362ef8-2a77-43d5-9eac-d8acdd8fdb6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.231 222021 DEBUG nova.compute.manager [req-1480a68e-3131-4662-85fb-8ed95ebf474b req-40362ef8-2a77-43d5-9eac-d8acdd8fdb6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] No waiting events found dispatching network-vif-unplugged-dda6704d-0b42-490a-9a2d-82800c64a748 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:06 np0005593233 nova_compute[222017]: 2026-01-23 09:41:06.231 222021 DEBUG nova.compute.manager [req-1480a68e-3131-4662-85fb-8ed95ebf474b req-40362ef8-2a77-43d5-9eac-d8acdd8fdb6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Received event network-vif-unplugged-dda6704d-0b42-490a-9a2d-82800c64a748 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:41:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:07.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:07 np0005593233 nova_compute[222017]: 2026-01-23 09:41:07.085 222021 INFO nova.virt.libvirt.driver [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Deleting instance files /var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556_del#033[00m
Jan 23 04:41:07 np0005593233 nova_compute[222017]: 2026-01-23 09:41:07.086 222021 INFO nova.virt.libvirt.driver [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Deletion of /var/lib/nova/instances/2bf89ff4-cf8b-4ecb-ac65-d948b5039556_del complete#033[00m
Jan 23 04:41:07 np0005593233 nova_compute[222017]: 2026-01-23 09:41:07.179 222021 INFO nova.compute.manager [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Took 1.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:41:07 np0005593233 nova_compute[222017]: 2026-01-23 09:41:07.180 222021 DEBUG oslo.service.loopingcall [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:41:07 np0005593233 nova_compute[222017]: 2026-01-23 09:41:07.181 222021 DEBUG nova.compute.manager [-] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:41:07 np0005593233 nova_compute[222017]: 2026-01-23 09:41:07.182 222021 DEBUG nova.network.neutron [-] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:41:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:07.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.314 222021 DEBUG nova.network.neutron [-] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.353 222021 INFO nova.compute.manager [-] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Took 1.17 seconds to deallocate network for instance.#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.436 222021 DEBUG nova.compute.manager [req-b43e4e7b-7b48-4812-bdbc-9c627436ec35 req-01fa4efd-49c2-4842-be01-b2b6d70d878a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Received event network-vif-plugged-dda6704d-0b42-490a-9a2d-82800c64a748 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.436 222021 DEBUG oslo_concurrency.lockutils [req-b43e4e7b-7b48-4812-bdbc-9c627436ec35 req-01fa4efd-49c2-4842-be01-b2b6d70d878a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.437 222021 DEBUG oslo_concurrency.lockutils [req-b43e4e7b-7b48-4812-bdbc-9c627436ec35 req-01fa4efd-49c2-4842-be01-b2b6d70d878a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.437 222021 DEBUG oslo_concurrency.lockutils [req-b43e4e7b-7b48-4812-bdbc-9c627436ec35 req-01fa4efd-49c2-4842-be01-b2b6d70d878a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.437 222021 DEBUG nova.compute.manager [req-b43e4e7b-7b48-4812-bdbc-9c627436ec35 req-01fa4efd-49c2-4842-be01-b2b6d70d878a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] No waiting events found dispatching network-vif-plugged-dda6704d-0b42-490a-9a2d-82800c64a748 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.437 222021 WARNING nova.compute.manager [req-b43e4e7b-7b48-4812-bdbc-9c627436ec35 req-01fa4efd-49c2-4842-be01-b2b6d70d878a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Received unexpected event network-vif-plugged-dda6704d-0b42-490a-9a2d-82800c64a748 for instance with vm_state paused and task_state deleting.#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.447 222021 DEBUG oslo_concurrency.lockutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.447 222021 DEBUG oslo_concurrency.lockutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.532 222021 DEBUG oslo_concurrency.processutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:08 np0005593233 nova_compute[222017]: 2026-01-23 09:41:08.952 222021 DEBUG nova.compute.manager [req-d709d045-9243-47b5-8636-6012e358aa35 req-6d2b1bd7-1344-447f-bc5f-e6482d1c9c2c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Received event network-vif-deleted-dda6704d-0b42-490a-9a2d-82800c64a748 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:09 np0005593233 nova_compute[222017]: 2026-01-23 09:41:09.002 222021 DEBUG oslo_concurrency.processutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:09.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:09 np0005593233 nova_compute[222017]: 2026-01-23 09:41:09.011 222021 DEBUG nova.compute.provider_tree [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:41:09 np0005593233 nova_compute[222017]: 2026-01-23 09:41:09.041 222021 DEBUG nova.scheduler.client.report [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:41:09 np0005593233 nova_compute[222017]: 2026-01-23 09:41:09.072 222021 DEBUG oslo_concurrency.lockutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:09 np0005593233 nova_compute[222017]: 2026-01-23 09:41:09.099 222021 INFO nova.scheduler.client.report [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Deleted allocations for instance 2bf89ff4-cf8b-4ecb-ac65-d948b5039556#033[00m
Jan 23 04:41:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:09 np0005593233 podman[239400]: 2026-01-23 09:41:09.120441481 +0000 UTC m=+0.111979831 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:41:09 np0005593233 nova_compute[222017]: 2026-01-23 09:41:09.212 222021 DEBUG oslo_concurrency.lockutils [None req-4bcf2aec-5248-4607-87d4-8d19ba914a32 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "2bf89ff4-cf8b-4ecb-ac65-d948b5039556" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:41:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:09.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:41:09 np0005593233 nova_compute[222017]: 2026-01-23 09:41:09.290 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:10 np0005593233 nova_compute[222017]: 2026-01-23 09:41:10.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:10 np0005593233 nova_compute[222017]: 2026-01-23 09:41:10.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:41:10 np0005593233 nova_compute[222017]: 2026-01-23 09:41:10.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:41:10 np0005593233 nova_compute[222017]: 2026-01-23 09:41:10.413 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:41:10 np0005593233 nova_compute[222017]: 2026-01-23 09:41:10.413 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:10 np0005593233 nova_compute[222017]: 2026-01-23 09:41:10.414 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:41:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:41:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:11.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:41:11 np0005593233 nova_compute[222017]: 2026-01-23 09:41:11.072 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:11.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:11 np0005593233 nova_compute[222017]: 2026-01-23 09:41:11.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:11 np0005593233 nova_compute[222017]: 2026-01-23 09:41:11.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:11 np0005593233 nova_compute[222017]: 2026-01-23 09:41:11.430 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:11 np0005593233 nova_compute[222017]: 2026-01-23 09:41:11.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:11 np0005593233 nova_compute[222017]: 2026-01-23 09:41:11.432 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:11 np0005593233 nova_compute[222017]: 2026-01-23 09:41:11.432 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:41:11 np0005593233 nova_compute[222017]: 2026-01-23 09:41:11.432 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:41:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1801710745' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:41:11 np0005593233 nova_compute[222017]: 2026-01-23 09:41:11.941 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.163 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.165 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4771MB free_disk=20.978435516357422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.165 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.166 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.270 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.270 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.308 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:41:12 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1357301256' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.797 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.805 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.832 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.867 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:41:12 np0005593233 nova_compute[222017]: 2026-01-23 09:41:12.868 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:13.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:13.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:13 np0005593233 nova_compute[222017]: 2026-01-23 09:41:13.869 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e180 e180: 3 total, 3 up, 3 in
Jan 23 04:41:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:14 np0005593233 nova_compute[222017]: 2026-01-23 09:41:14.295 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:14 np0005593233 nova_compute[222017]: 2026-01-23 09:41:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:15.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:15.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:16 np0005593233 nova_compute[222017]: 2026-01-23 09:41:16.076 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:17.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:17.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:41:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.8 total, 600.0 interval#012Cumulative writes: 17K writes, 68K keys, 17K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.03 MB/s#012Cumulative WAL: 17K writes, 5458 syncs, 3.14 writes per sync, written: 0.06 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8394 writes, 33K keys, 8394 commit groups, 1.0 writes per commit group, ingest: 32.06 MB, 0.05 MB/s#012Interval WAL: 8394 writes, 3424 syncs, 2.45 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.368 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.368 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.397 222021 DEBUG nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.516 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.517 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.525 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.526 222021 INFO nova.compute.claims [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:41:18 np0005593233 nova_compute[222017]: 2026-01-23 09:41:18.674 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:19.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:19.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.298 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:41:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3159608216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.618 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.945s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.624 222021 DEBUG nova.compute.provider_tree [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.643 222021 DEBUG nova.scheduler.client.report [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.679 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.680 222021 DEBUG nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:41:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.754 222021 DEBUG nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.755 222021 DEBUG nova.network.neutron [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.787 222021 INFO nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.817 222021 DEBUG nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.924 222021 DEBUG nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.926 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:41:19 np0005593233 nova_compute[222017]: 2026-01-23 09:41:19.926 222021 INFO nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Creating image(s)#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.022 222021 DEBUG nova.storage.rbd_utils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] rbd image b61fe33d-386a-4578-b98d-01a8ec801768_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.050 222021 DEBUG nova.storage.rbd_utils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] rbd image b61fe33d-386a-4578-b98d-01a8ec801768_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.080 222021 DEBUG nova.storage.rbd_utils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] rbd image b61fe33d-386a-4578-b98d-01a8ec801768_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.084 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.115 222021 DEBUG nova.policy [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70b82be77f6f46caba34213f79897362', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4039861df5dd4fc0ab6daf192b8e1f33', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.151 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.152 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.153 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.153 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.185 222021 DEBUG nova.storage.rbd_utils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] rbd image b61fe33d-386a-4578-b98d-01a8ec801768_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:20 np0005593233 nova_compute[222017]: 2026-01-23 09:41:20.189 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b61fe33d-386a-4578-b98d-01a8ec801768_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:21 np0005593233 nova_compute[222017]: 2026-01-23 09:41:21.013 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161266.0114112, 2bf89ff4-cf8b-4ecb-ac65-d948b5039556 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:41:21 np0005593233 nova_compute[222017]: 2026-01-23 09:41:21.016 222021 INFO nova.compute.manager [-] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:41:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:21.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:21 np0005593233 nova_compute[222017]: 2026-01-23 09:41:21.051 222021 DEBUG nova.compute.manager [None req-f9c30842-15c3-440c-840f-952912767a0c - - - - - -] [instance: 2bf89ff4-cf8b-4ecb-ac65-d948b5039556] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:21 np0005593233 nova_compute[222017]: 2026-01-23 09:41:21.078 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:21 np0005593233 nova_compute[222017]: 2026-01-23 09:41:21.095 222021 DEBUG nova.network.neutron [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Successfully created port: 1c98935c-d4a8-4f99-8dee-37240aecb6af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:41:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:21.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:21 np0005593233 nova_compute[222017]: 2026-01-23 09:41:21.652 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b61fe33d-386a-4578-b98d-01a8ec801768_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:21 np0005593233 nova_compute[222017]: 2026-01-23 09:41:21.749 222021 DEBUG nova.storage.rbd_utils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] resizing rbd image b61fe33d-386a-4578-b98d-01a8ec801768_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:41:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e181 e181: 3 total, 3 up, 3 in
Jan 23 04:41:22 np0005593233 podman[239764]: 2026-01-23 09:41:22.07110035 +0000 UTC m=+0.082583446 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:41:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.508 222021 DEBUG nova.network.neutron [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Successfully updated port: 1c98935c-d4a8-4f99-8dee-37240aecb6af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.512 222021 DEBUG nova.compute.manager [req-f2b4a9f9-ba6f-4cb6-90f2-91c460dedb4a req-c34b3fdb-d801-4dff-94c7-bb4423867596 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-changed-1c98935c-d4a8-4f99-8dee-37240aecb6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.513 222021 DEBUG nova.compute.manager [req-f2b4a9f9-ba6f-4cb6-90f2-91c460dedb4a req-c34b3fdb-d801-4dff-94c7-bb4423867596 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Refreshing instance network info cache due to event network-changed-1c98935c-d4a8-4f99-8dee-37240aecb6af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.513 222021 DEBUG oslo_concurrency.lockutils [req-f2b4a9f9-ba6f-4cb6-90f2-91c460dedb4a req-c34b3fdb-d801-4dff-94c7-bb4423867596 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.514 222021 DEBUG oslo_concurrency.lockutils [req-f2b4a9f9-ba6f-4cb6-90f2-91c460dedb4a req-c34b3fdb-d801-4dff-94c7-bb4423867596 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.514 222021 DEBUG nova.network.neutron [req-f2b4a9f9-ba6f-4cb6-90f2-91c460dedb4a req-c34b3fdb-d801-4dff-94c7-bb4423867596 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Refreshing network info cache for port 1c98935c-d4a8-4f99-8dee-37240aecb6af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.583 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.593 222021 DEBUG nova.objects.instance [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lazy-loading 'migration_context' on Instance uuid b61fe33d-386a-4578-b98d-01a8ec801768 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.615 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.615 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Ensure instance console log exists: /var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.616 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.616 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.616 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:22 np0005593233 nova_compute[222017]: 2026-01-23 09:41:22.869 222021 DEBUG nova.network.neutron [req-f2b4a9f9-ba6f-4cb6-90f2-91c460dedb4a req-c34b3fdb-d801-4dff-94c7-bb4423867596 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:41:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:23.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:23.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:23 np0005593233 nova_compute[222017]: 2026-01-23 09:41:23.652 222021 DEBUG nova.network.neutron [req-f2b4a9f9-ba6f-4cb6-90f2-91c460dedb4a req-c34b3fdb-d801-4dff-94c7-bb4423867596 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:23 np0005593233 nova_compute[222017]: 2026-01-23 09:41:23.836 222021 DEBUG oslo_concurrency.lockutils [req-f2b4a9f9-ba6f-4cb6-90f2-91c460dedb4a req-c34b3fdb-d801-4dff-94c7-bb4423867596 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:41:23 np0005593233 nova_compute[222017]: 2026-01-23 09:41:23.837 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquired lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:41:23 np0005593233 nova_compute[222017]: 2026-01-23 09:41:23.837 222021 DEBUG nova.network.neutron [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:41:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:41:24 np0005593233 nova_compute[222017]: 2026-01-23 09:41:24.298 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:25.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:25.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:26 np0005593233 nova_compute[222017]: 2026-01-23 09:41:26.152 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:27.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:41:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:27.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:41:28 np0005593233 nova_compute[222017]: 2026-01-23 09:41:28.471 222021 DEBUG nova.network.neutron [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:41:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:29.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.300 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:29.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.783 222021 DEBUG nova.network.neutron [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Updating instance_info_cache with network_info: [{"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.810 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Releasing lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.810 222021 DEBUG nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Instance network_info: |[{"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.813 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Start _get_guest_xml network_info=[{"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.820 222021 WARNING nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.826 222021 DEBUG nova.virt.libvirt.host [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.827 222021 DEBUG nova.virt.libvirt.host [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.832 222021 DEBUG nova.virt.libvirt.host [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.833 222021 DEBUG nova.virt.libvirt.host [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.835 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.835 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.836 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.836 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.836 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.837 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.837 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.837 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.838 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.838 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.838 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.838 222021 DEBUG nova.virt.hardware [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:41:29 np0005593233 nova_compute[222017]: 2026-01-23 09:41:29.843 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:30 np0005593233 nova_compute[222017]: 2026-01-23 09:41:30.307 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:30 np0005593233 nova_compute[222017]: 2026-01-23 09:41:30.343 222021 DEBUG nova.storage.rbd_utils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] rbd image b61fe33d-386a-4578-b98d-01a8ec801768_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:30 np0005593233 nova_compute[222017]: 2026-01-23 09:41:30.350 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:41:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/478302913' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:41:30 np0005593233 nova_compute[222017]: 2026-01-23 09:41:30.875 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:30 np0005593233 nova_compute[222017]: 2026-01-23 09:41:30.885 222021 DEBUG nova.virt.libvirt.vif [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:41:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1595740695',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1595740695',id=43,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4039861df5dd4fc0ab6daf192b8e1f33',ramdisk_id='',reservation_id='r-n59ftdj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1438631927',owner_user_name='tempest-AttachInterfacesV270Test-1438631927-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:41:19Z,user_data=None,user_id='70b82be77f6f46caba34213f79897362',uuid=b61fe33d-386a-4578-b98d-01a8ec801768,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:41:30 np0005593233 nova_compute[222017]: 2026-01-23 09:41:30.886 222021 DEBUG nova.network.os_vif_util [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converting VIF {"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:41:30 np0005593233 nova_compute[222017]: 2026-01-23 09:41:30.889 222021 DEBUG nova.network.os_vif_util [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:f5:b0,bridge_name='br-int',has_traffic_filtering=True,id=1c98935c-d4a8-4f99-8dee-37240aecb6af,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c98935c-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:41:30 np0005593233 nova_compute[222017]: 2026-01-23 09:41:30.892 222021 DEBUG nova.objects.instance [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lazy-loading 'pci_devices' on Instance uuid b61fe33d-386a-4578-b98d-01a8ec801768 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:41:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:41:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:31.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.058 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <uuid>b61fe33d-386a-4578-b98d-01a8ec801768</uuid>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <name>instance-0000002b</name>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <nova:name>tempest-AttachInterfacesV270Test-server-1595740695</nova:name>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:41:29</nova:creationTime>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <nova:user uuid="70b82be77f6f46caba34213f79897362">tempest-AttachInterfacesV270Test-1438631927-project-member</nova:user>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <nova:project uuid="4039861df5dd4fc0ab6daf192b8e1f33">tempest-AttachInterfacesV270Test-1438631927</nova:project>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <nova:port uuid="1c98935c-d4a8-4f99-8dee-37240aecb6af">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <entry name="serial">b61fe33d-386a-4578-b98d-01a8ec801768</entry>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <entry name="uuid">b61fe33d-386a-4578-b98d-01a8ec801768</entry>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/b61fe33d-386a-4578-b98d-01a8ec801768_disk">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/b61fe33d-386a-4578-b98d-01a8ec801768_disk.config">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:4b:f5:b0"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <target dev="tap1c98935c-d4"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768/console.log" append="off"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:41:31 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:41:31 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:41:31 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:41:31 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.059 222021 DEBUG nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Preparing to wait for external event network-vif-plugged-1c98935c-d4a8-4f99-8dee-37240aecb6af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.060 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.060 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.061 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.061 222021 DEBUG nova.virt.libvirt.vif [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:41:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1595740695',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1595740695',id=43,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4039861df5dd4fc0ab6daf192b8e1f33',ramdisk_id='',reservation_id='r-n59ftdj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1438631927',owner_user_name='tempest-AttachInterfacesV270Test-1438631927-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:41:19Z,user_data=None,user_id='70b82be77f6f46caba34213f79897362',uuid=b61fe33d-386a-4578-b98d-01a8ec801768,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.062 222021 DEBUG nova.network.os_vif_util [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converting VIF {"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.062 222021 DEBUG nova.network.os_vif_util [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:f5:b0,bridge_name='br-int',has_traffic_filtering=True,id=1c98935c-d4a8-4f99-8dee-37240aecb6af,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c98935c-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.063 222021 DEBUG os_vif [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:f5:b0,bridge_name='br-int',has_traffic_filtering=True,id=1c98935c-d4a8-4f99-8dee-37240aecb6af,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c98935c-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.063 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.064 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.064 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.068 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.068 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c98935c-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.069 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1c98935c-d4, col_values=(('external_ids', {'iface-id': '1c98935c-d4a8-4f99-8dee-37240aecb6af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4b:f5:b0', 'vm-uuid': 'b61fe33d-386a-4578-b98d-01a8ec801768'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.070 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:31 np0005593233 NetworkManager[48871]: <info>  [1769161291.0726] manager: (tap1c98935c-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.075 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.079 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.080 222021 INFO os_vif [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:f5:b0,bridge_name='br-int',has_traffic_filtering=True,id=1c98935c-d4a8-4f99-8dee-37240aecb6af,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c98935c-d4')#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.162 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.162 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.162 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] No VIF found with MAC fa:16:3e:4b:f5:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.163 222021 INFO nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Using config drive#033[00m
Jan 23 04:41:31 np0005593233 nova_compute[222017]: 2026-01-23 09:41:31.190 222021 DEBUG nova.storage.rbd_utils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] rbd image b61fe33d-386a-4578-b98d-01a8ec801768_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:31.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.000 222021 INFO nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Creating config drive at /var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768/disk.config#033[00m
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.006 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcbf6aqdd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.147 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcbf6aqdd" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.178 222021 DEBUG nova.storage.rbd_utils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] rbd image b61fe33d-386a-4578-b98d-01a8ec801768_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.182 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768/disk.config b61fe33d-386a-4578-b98d-01a8ec801768_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.486 222021 DEBUG oslo_concurrency.processutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768/disk.config b61fe33d-386a-4578-b98d-01a8ec801768_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.304s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.487 222021 INFO nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Deleting local config drive /var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768/disk.config because it was imported into RBD.#033[00m
Jan 23 04:41:32 np0005593233 kernel: tap1c98935c-d4: entered promiscuous mode
Jan 23 04:41:32 np0005593233 NetworkManager[48871]: <info>  [1769161292.5620] manager: (tap1c98935c-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Jan 23 04:41:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:32Z|00126|binding|INFO|Claiming lport 1c98935c-d4a8-4f99-8dee-37240aecb6af for this chassis.
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.563 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:32Z|00127|binding|INFO|1c98935c-d4a8-4f99-8dee-37240aecb6af: Claiming fa:16:3e:4b:f5:b0 10.100.0.14
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.587 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:f5:b0 10.100.0.14'], port_security=['fa:16:3e:4b:f5:b0 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b61fe33d-386a-4578-b98d-01a8ec801768', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4039861df5dd4fc0ab6daf192b8e1f33', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a8dd25b8-9fa8-4d72-9627-5b1762d90421', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3678c959-7208-4e49-9ab9-1265420cc147, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1c98935c-d4a8-4f99-8dee-37240aecb6af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.589 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1c98935c-d4a8-4f99-8dee-37240aecb6af in datapath 176448f5-ee8b-4ce0-9332-a7b0d58a78db bound to our chassis#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.591 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 176448f5-ee8b-4ce0-9332-a7b0d58a78db#033[00m
Jan 23 04:41:32 np0005593233 systemd-udevd[239999]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.608 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5957b858-c233-446e-9992-3d5ca09bf808]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.609 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap176448f5-e1 in ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:41:32 np0005593233 systemd-machined[190954]: New machine qemu-24-instance-0000002b.
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.612 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap176448f5-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.613 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[72f41378-538b-4726-a8c3-e2eebb9eba89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.614 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7412a5c0-7c47-43e7-836c-5fa0c69370b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 NetworkManager[48871]: <info>  [1769161292.6258] device (tap1c98935c-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:41:32 np0005593233 NetworkManager[48871]: <info>  [1769161292.6270] device (tap1c98935c-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.631 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[9bffcdef-02dc-499e-9e6d-5fd6555fff36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 systemd[1]: Started Virtual Machine qemu-24-instance-0000002b.
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.665 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.665 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7446c8-b2b0-4e2e-9480-ce7225909784]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:32Z|00128|binding|INFO|Setting lport 1c98935c-d4a8-4f99-8dee-37240aecb6af ovn-installed in OVS
Jan 23 04:41:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:32Z|00129|binding|INFO|Setting lport 1c98935c-d4a8-4f99-8dee-37240aecb6af up in Southbound
Jan 23 04:41:32 np0005593233 nova_compute[222017]: 2026-01-23 09:41:32.669 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.707 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9f60281b-0cc2-4c34-ac1a-4fbafd4b2f3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.713 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[af6fd751-3def-4237-92aa-e2e9a3a55a81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 NetworkManager[48871]: <info>  [1769161292.7158] manager: (tap176448f5-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.753 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[20a2df24-6c17-4f45-8777-12b1255f870c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.758 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a6499bb6-aaf7-4752-98a5-68fab9a16e5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 NetworkManager[48871]: <info>  [1769161292.7908] device (tap176448f5-e0): carrier: link connected
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.798 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[66ba777f-1d76-4ad0-bceb-e2cbbd8fabb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.821 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[23024dd0-4ccc-4f95-86cd-dde70839ba39]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap176448f5-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:d8:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520542, 'reachable_time': 39112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240031, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.844 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3b8143-013c-457c-98ba-68e74b344fa6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:d890'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520542, 'tstamp': 520542}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240032, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.865 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb19eae-b6e5-4496-b8af-9b881ac7eeec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap176448f5-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:d8:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520542, 'reachable_time': 39112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240033, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.907 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f87123fd-f9a0-4db5-ac99-e15f005899f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.997 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[155861f4-0c29-459e-a53c-50b719f18b5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:32.999 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap176448f5-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:33.000 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:33.001 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap176448f5-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:33 np0005593233 NetworkManager[48871]: <info>  [1769161293.0052] manager: (tap176448f5-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Jan 23 04:41:33 np0005593233 kernel: tap176448f5-e0: entered promiscuous mode
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:33.010 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap176448f5-e0, col_values=(('external_ids', {'iface-id': '329448f5-7a2f-4438-90ef-fc02134f70af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.012 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:33 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:33Z|00130|binding|INFO|Releasing lport 329448f5-7a2f-4438-90ef-fc02134f70af from this chassis (sb_readonly=0)
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:33.015 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/176448f5-ee8b-4ce0-9332-a7b0d58a78db.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/176448f5-ee8b-4ce0-9332-a7b0d58a78db.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:33.017 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5839be3e-92e9-41bd-9684-953bde00e1e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:33.018 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-176448f5-ee8b-4ce0-9332-a7b0d58a78db
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/176448f5-ee8b-4ce0-9332-a7b0d58a78db.pid.haproxy
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 176448f5-ee8b-4ce0-9332-a7b0d58a78db
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:41:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:33.020 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'env', 'PROCESS_TAG=haproxy-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/176448f5-ee8b-4ce0-9332-a7b0d58a78db.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.027 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:33.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:33.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.326 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161293.3249502, b61fe33d-386a-4578-b98d-01a8ec801768 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.327 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] VM Started (Lifecycle Event)#033[00m
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.400 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.406 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161293.325218, b61fe33d-386a-4578-b98d-01a8ec801768 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.407 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.435 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.441 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:41:33 np0005593233 podman[240106]: 2026-01-23 09:41:33.457330118 +0000 UTC m=+0.069955558 container create f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:41:33 np0005593233 nova_compute[222017]: 2026-01-23 09:41:33.467 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:41:33 np0005593233 systemd[1]: Started libpod-conmon-f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242.scope.
Jan 23 04:41:33 np0005593233 podman[240106]: 2026-01-23 09:41:33.418864875 +0000 UTC m=+0.031490335 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:41:33 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:41:33 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26d8884bc2e75d9d6fcd1dadc0f7d71913e3100347c6077746ba87b92bb1fc92/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:41:33 np0005593233 podman[240106]: 2026-01-23 09:41:33.558171032 +0000 UTC m=+0.170796492 container init f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:41:33 np0005593233 podman[240106]: 2026-01-23 09:41:33.563777711 +0000 UTC m=+0.176403161 container start f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:41:33 np0005593233 neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db[240122]: [NOTICE]   (240126) : New worker (240128) forked
Jan 23 04:41:33 np0005593233 neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db[240122]: [NOTICE]   (240126) : Loading success.
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.190 222021 DEBUG nova.compute.manager [req-a79a643f-7e3c-4eb8-a7fe-4b4dfc96c2b4 req-3a4c7213-35c0-4882-9d97-fb014028565b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-plugged-1c98935c-d4a8-4f99-8dee-37240aecb6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.191 222021 DEBUG oslo_concurrency.lockutils [req-a79a643f-7e3c-4eb8-a7fe-4b4dfc96c2b4 req-3a4c7213-35c0-4882-9d97-fb014028565b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.191 222021 DEBUG oslo_concurrency.lockutils [req-a79a643f-7e3c-4eb8-a7fe-4b4dfc96c2b4 req-3a4c7213-35c0-4882-9d97-fb014028565b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.191 222021 DEBUG oslo_concurrency.lockutils [req-a79a643f-7e3c-4eb8-a7fe-4b4dfc96c2b4 req-3a4c7213-35c0-4882-9d97-fb014028565b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.192 222021 DEBUG nova.compute.manager [req-a79a643f-7e3c-4eb8-a7fe-4b4dfc96c2b4 req-3a4c7213-35c0-4882-9d97-fb014028565b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Processing event network-vif-plugged-1c98935c-d4a8-4f99-8dee-37240aecb6af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.193 222021 DEBUG nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.198 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161294.1981583, b61fe33d-386a-4578-b98d-01a8ec801768 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.199 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.201 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.205 222021 INFO nova.virt.libvirt.driver [-] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Instance spawned successfully.#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.205 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.240 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.246 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.246 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.247 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.248 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.248 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.249 222021 DEBUG nova.virt.libvirt.driver [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.254 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.291 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.303 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.337 222021 INFO nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Took 14.41 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.338 222021 DEBUG nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.433 222021 INFO nova.compute.manager [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Took 15.95 seconds to build instance.#033[00m
Jan 23 04:41:34 np0005593233 nova_compute[222017]: 2026-01-23 09:41:34.470 222021 DEBUG oslo_concurrency.lockutils [None req-c69926a7-cf78-4fba-be98-af045f902f3b 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:41:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:35.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:41:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:41:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:35.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.073 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:36.161 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:41:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:36.163 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.198 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.361 222021 DEBUG oslo_concurrency.lockutils [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "interface-b61fe33d-386a-4578-b98d-01a8ec801768-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.363 222021 DEBUG oslo_concurrency.lockutils [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "interface-b61fe33d-386a-4578-b98d-01a8ec801768-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.364 222021 DEBUG nova.objects.instance [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lazy-loading 'flavor' on Instance uuid b61fe33d-386a-4578-b98d-01a8ec801768 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.403 222021 DEBUG nova.objects.instance [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lazy-loading 'pci_requests' on Instance uuid b61fe33d-386a-4578-b98d-01a8ec801768 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.433 222021 DEBUG nova.network.neutron [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.588 222021 DEBUG nova.compute.manager [req-5105c62a-e59e-45e7-93a0-ca027c6a6d9a req-0004ba3a-3750-4d4b-b261-f99324e619ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-plugged-1c98935c-d4a8-4f99-8dee-37240aecb6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.589 222021 DEBUG oslo_concurrency.lockutils [req-5105c62a-e59e-45e7-93a0-ca027c6a6d9a req-0004ba3a-3750-4d4b-b261-f99324e619ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.590 222021 DEBUG oslo_concurrency.lockutils [req-5105c62a-e59e-45e7-93a0-ca027c6a6d9a req-0004ba3a-3750-4d4b-b261-f99324e619ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.591 222021 DEBUG oslo_concurrency.lockutils [req-5105c62a-e59e-45e7-93a0-ca027c6a6d9a req-0004ba3a-3750-4d4b-b261-f99324e619ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.592 222021 DEBUG nova.compute.manager [req-5105c62a-e59e-45e7-93a0-ca027c6a6d9a req-0004ba3a-3750-4d4b-b261-f99324e619ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] No waiting events found dispatching network-vif-plugged-1c98935c-d4a8-4f99-8dee-37240aecb6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:36 np0005593233 nova_compute[222017]: 2026-01-23 09:41:36.592 222021 WARNING nova.compute.manager [req-5105c62a-e59e-45e7-93a0-ca027c6a6d9a req-0004ba3a-3750-4d4b-b261-f99324e619ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received unexpected event network-vif-plugged-1c98935c-d4a8-4f99-8dee-37240aecb6af for instance with vm_state active and task_state None.#033[00m
Jan 23 04:41:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:41:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:37.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:41:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:41:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:37.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:41:37 np0005593233 nova_compute[222017]: 2026-01-23 09:41:37.733 222021 DEBUG nova.policy [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70b82be77f6f46caba34213f79897362', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4039861df5dd4fc0ab6daf192b8e1f33', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:41:38 np0005593233 nova_compute[222017]: 2026-01-23 09:41:38.949 222021 DEBUG nova.network.neutron [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Successfully created port: 1988217a-ee6e-4363-8df6-77ef96edbc22 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:41:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:39.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:39 np0005593233 nova_compute[222017]: 2026-01-23 09:41:39.306 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:39.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:40 np0005593233 podman[240137]: 2026-01-23 09:41:40.147661993 +0000 UTC m=+0.135403027 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:41:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:41:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:41.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:41:41 np0005593233 nova_compute[222017]: 2026-01-23 09:41:41.078 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:41.165 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:41.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:42 np0005593233 nova_compute[222017]: 2026-01-23 09:41:42.130 222021 DEBUG nova.network.neutron [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Successfully updated port: 1988217a-ee6e-4363-8df6-77ef96edbc22 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:41:42 np0005593233 nova_compute[222017]: 2026-01-23 09:41:42.183 222021 DEBUG oslo_concurrency.lockutils [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:41:42 np0005593233 nova_compute[222017]: 2026-01-23 09:41:42.184 222021 DEBUG oslo_concurrency.lockutils [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquired lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:41:42 np0005593233 nova_compute[222017]: 2026-01-23 09:41:42.185 222021 DEBUG nova.network.neutron [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:41:42 np0005593233 nova_compute[222017]: 2026-01-23 09:41:42.494 222021 WARNING nova.network.neutron [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] 176448f5-ee8b-4ce0-9332-a7b0d58a78db already exists in list: networks containing: ['176448f5-ee8b-4ce0-9332-a7b0d58a78db']. ignoring it#033[00m
Jan 23 04:41:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:42.642 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:42.643 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:42.643 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:43.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:43.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:44 np0005593233 nova_compute[222017]: 2026-01-23 09:41:44.169 222021 DEBUG nova.compute.manager [req-edc0615c-7e96-4cec-9f77-a9606055c70b req-7575949a-d7ca-415d-acb4-e4b7055bb2f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-changed-1988217a-ee6e-4363-8df6-77ef96edbc22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:44 np0005593233 nova_compute[222017]: 2026-01-23 09:41:44.170 222021 DEBUG nova.compute.manager [req-edc0615c-7e96-4cec-9f77-a9606055c70b req-7575949a-d7ca-415d-acb4-e4b7055bb2f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Refreshing instance network info cache due to event network-changed-1988217a-ee6e-4363-8df6-77ef96edbc22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:41:44 np0005593233 nova_compute[222017]: 2026-01-23 09:41:44.170 222021 DEBUG oslo_concurrency.lockutils [req-edc0615c-7e96-4cec-9f77-a9606055c70b req-7575949a-d7ca-415d-acb4-e4b7055bb2f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:41:44 np0005593233 nova_compute[222017]: 2026-01-23 09:41:44.308 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:41:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2908204606' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:41:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:41:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2908204606' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:41:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:45.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:45.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.515 222021 DEBUG nova.network.neutron [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Updating instance_info_cache with network_info: [{"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1988217a-ee6e-4363-8df6-77ef96edbc22", "address": "fa:16:3e:df:a3:40", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1988217a-ee", "ovs_interfaceid": "1988217a-ee6e-4363-8df6-77ef96edbc22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.543 222021 DEBUG oslo_concurrency.lockutils [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Releasing lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.546 222021 DEBUG oslo_concurrency.lockutils [req-edc0615c-7e96-4cec-9f77-a9606055c70b req-7575949a-d7ca-415d-acb4-e4b7055bb2f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.546 222021 DEBUG nova.network.neutron [req-edc0615c-7e96-4cec-9f77-a9606055c70b req-7575949a-d7ca-415d-acb4-e4b7055bb2f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Refreshing network info cache for port 1988217a-ee6e-4363-8df6-77ef96edbc22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.553 222021 DEBUG nova.virt.libvirt.vif [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:41:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1595740695',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1595740695',id=43,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:41:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4039861df5dd4fc0ab6daf192b8e1f33',ramdisk_id='',reservation_id='r-n59ftdj3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ra
m='0',owner_project_name='tempest-AttachInterfacesV270Test-1438631927',owner_user_name='tempest-AttachInterfacesV270Test-1438631927-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:41:34Z,user_data=None,user_id='70b82be77f6f46caba34213f79897362',uuid=b61fe33d-386a-4578-b98d-01a8ec801768,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1988217a-ee6e-4363-8df6-77ef96edbc22", "address": "fa:16:3e:df:a3:40", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1988217a-ee", "ovs_interfaceid": "1988217a-ee6e-4363-8df6-77ef96edbc22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.554 222021 DEBUG nova.network.os_vif_util [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converting VIF {"id": "1988217a-ee6e-4363-8df6-77ef96edbc22", "address": "fa:16:3e:df:a3:40", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1988217a-ee", "ovs_interfaceid": "1988217a-ee6e-4363-8df6-77ef96edbc22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.555 222021 DEBUG nova.network.os_vif_util [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:a3:40,bridge_name='br-int',has_traffic_filtering=True,id=1988217a-ee6e-4363-8df6-77ef96edbc22,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1988217a-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.556 222021 DEBUG os_vif [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:a3:40,bridge_name='br-int',has_traffic_filtering=True,id=1988217a-ee6e-4363-8df6-77ef96edbc22,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1988217a-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.557 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.558 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.559 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.563 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.564 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1988217a-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.565 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1988217a-ee, col_values=(('external_ids', {'iface-id': '1988217a-ee6e-4363-8df6-77ef96edbc22', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:a3:40', 'vm-uuid': 'b61fe33d-386a-4578-b98d-01a8ec801768'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.568 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:45 np0005593233 NetworkManager[48871]: <info>  [1769161305.5700] manager: (tap1988217a-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.577 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.581 222021 INFO os_vif [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:a3:40,bridge_name='br-int',has_traffic_filtering=True,id=1988217a-ee6e-4363-8df6-77ef96edbc22,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1988217a-ee')#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.582 222021 DEBUG nova.virt.libvirt.vif [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:41:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1595740695',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1595740695',id=43,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:41:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4039861df5dd4fc0ab6daf192b8e1f33',ramdisk_id='',reservation_id='r-n59ftdj3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ra
m='0',owner_project_name='tempest-AttachInterfacesV270Test-1438631927',owner_user_name='tempest-AttachInterfacesV270Test-1438631927-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:41:34Z,user_data=None,user_id='70b82be77f6f46caba34213f79897362',uuid=b61fe33d-386a-4578-b98d-01a8ec801768,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1988217a-ee6e-4363-8df6-77ef96edbc22", "address": "fa:16:3e:df:a3:40", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1988217a-ee", "ovs_interfaceid": "1988217a-ee6e-4363-8df6-77ef96edbc22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.583 222021 DEBUG nova.network.os_vif_util [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converting VIF {"id": "1988217a-ee6e-4363-8df6-77ef96edbc22", "address": "fa:16:3e:df:a3:40", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1988217a-ee", "ovs_interfaceid": "1988217a-ee6e-4363-8df6-77ef96edbc22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.584 222021 DEBUG nova.network.os_vif_util [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:a3:40,bridge_name='br-int',has_traffic_filtering=True,id=1988217a-ee6e-4363-8df6-77ef96edbc22,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1988217a-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.588 222021 DEBUG nova.virt.libvirt.guest [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] attach device xml: <interface type="ethernet">
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <mac address="fa:16:3e:df:a3:40"/>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <model type="virtio"/>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <mtu size="1442"/>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <target dev="tap1988217a-ee"/>
Jan 23 04:41:45 np0005593233 nova_compute[222017]: </interface>
Jan 23 04:41:45 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:41:45 np0005593233 kernel: tap1988217a-ee: entered promiscuous mode
Jan 23 04:41:45 np0005593233 NetworkManager[48871]: <info>  [1769161305.6098] manager: (tap1988217a-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Jan 23 04:41:45 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:45Z|00131|binding|INFO|Claiming lport 1988217a-ee6e-4363-8df6-77ef96edbc22 for this chassis.
Jan 23 04:41:45 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:45Z|00132|binding|INFO|1988217a-ee6e-4363-8df6-77ef96edbc22: Claiming fa:16:3e:df:a3:40 10.100.0.5
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.617 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.622 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:a3:40 10.100.0.5'], port_security=['fa:16:3e:df:a3:40 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b61fe33d-386a-4578-b98d-01a8ec801768', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4039861df5dd4fc0ab6daf192b8e1f33', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a8dd25b8-9fa8-4d72-9627-5b1762d90421', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3678c959-7208-4e49-9ab9-1265420cc147, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1988217a-ee6e-4363-8df6-77ef96edbc22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.623 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1988217a-ee6e-4363-8df6-77ef96edbc22 in datapath 176448f5-ee8b-4ce0-9332-a7b0d58a78db bound to our chassis#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.624 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 176448f5-ee8b-4ce0-9332-a7b0d58a78db#033[00m
Jan 23 04:41:45 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:45Z|00133|binding|INFO|Setting lport 1988217a-ee6e-4363-8df6-77ef96edbc22 ovn-installed in OVS
Jan 23 04:41:45 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:45Z|00134|binding|INFO|Setting lport 1988217a-ee6e-4363-8df6-77ef96edbc22 up in Southbound
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.645 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.649 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.661 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7240e736-d052-4aed-a7f3-f5aa98f64e73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:45 np0005593233 systemd-udevd[240169]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:41:45 np0005593233 NetworkManager[48871]: <info>  [1769161305.6783] device (tap1988217a-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:41:45 np0005593233 NetworkManager[48871]: <info>  [1769161305.6790] device (tap1988217a-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.697 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f084fd-a3f3-472c-94d6-ff834e562f3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.702 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[eae14c62-f914-4374-a0eb-c857a0bc553c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.746 222021 DEBUG nova.virt.libvirt.driver [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.747 222021 DEBUG nova.virt.libvirt.driver [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.747 222021 DEBUG nova.virt.libvirt.driver [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] No VIF found with MAC fa:16:3e:4b:f5:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.747 222021 DEBUG nova.virt.libvirt.driver [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] No VIF found with MAC fa:16:3e:df:a3:40, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.751 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e7aa0562-c8c6-43f6-8bf7-8328336d8918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.774 222021 DEBUG nova.virt.libvirt.guest [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <nova:name>tempest-AttachInterfacesV270Test-server-1595740695</nova:name>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <nova:creationTime>2026-01-23 09:41:45</nova:creationTime>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <nova:flavor name="m1.nano">
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    <nova:memory>128</nova:memory>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    <nova:disk>1</nova:disk>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    <nova:swap>0</nova:swap>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  </nova:flavor>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <nova:owner>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    <nova:user uuid="70b82be77f6f46caba34213f79897362">tempest-AttachInterfacesV270Test-1438631927-project-member</nova:user>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    <nova:project uuid="4039861df5dd4fc0ab6daf192b8e1f33">tempest-AttachInterfacesV270Test-1438631927</nova:project>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  </nova:owner>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  <nova:ports>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    <nova:port uuid="1c98935c-d4a8-4f99-8dee-37240aecb6af">
Jan 23 04:41:45 np0005593233 nova_compute[222017]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    </nova:port>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    <nova:port uuid="1988217a-ee6e-4363-8df6-77ef96edbc22">
Jan 23 04:41:45 np0005593233 nova_compute[222017]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:    </nova:port>
Jan 23 04:41:45 np0005593233 nova_compute[222017]:  </nova:ports>
Jan 23 04:41:45 np0005593233 nova_compute[222017]: </nova:instance>
Jan 23 04:41:45 np0005593233 nova_compute[222017]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.775 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[23429d87-1978-423f-925e-2c45b38a74bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap176448f5-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:d8:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 6, 'rx_bytes': 532, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520542, 'reachable_time': 39112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240177, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.799 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[df3bc921-87d1-4edb-8a60-43747b11fc49]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap176448f5-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520557, 'tstamp': 520557}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240178, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap176448f5-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520561, 'tstamp': 520561}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240178, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.802 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap176448f5-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.804 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.805 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap176448f5-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.806 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.807 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap176448f5-e0, col_values=(('external_ids', {'iface-id': '329448f5-7a2f-4438-90ef-fc02134f70af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:45.807 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:41:45 np0005593233 nova_compute[222017]: 2026-01-23 09:41:45.817 222021 DEBUG oslo_concurrency.lockutils [None req-4c9f6425-240b-4d8f-9239-31c01a101af6 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "interface-b61fe33d-386a-4578-b98d-01a8ec801768-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:46 np0005593233 nova_compute[222017]: 2026-01-23 09:41:46.271 222021 DEBUG nova.compute.manager [req-1e0661fb-9c5f-4e7c-941f-3cc6ea0f1904 req-1f593229-b76e-4558-a42a-78bfcb07f90c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-plugged-1988217a-ee6e-4363-8df6-77ef96edbc22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:46 np0005593233 nova_compute[222017]: 2026-01-23 09:41:46.272 222021 DEBUG oslo_concurrency.lockutils [req-1e0661fb-9c5f-4e7c-941f-3cc6ea0f1904 req-1f593229-b76e-4558-a42a-78bfcb07f90c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:46 np0005593233 nova_compute[222017]: 2026-01-23 09:41:46.273 222021 DEBUG oslo_concurrency.lockutils [req-1e0661fb-9c5f-4e7c-941f-3cc6ea0f1904 req-1f593229-b76e-4558-a42a-78bfcb07f90c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:46 np0005593233 nova_compute[222017]: 2026-01-23 09:41:46.273 222021 DEBUG oslo_concurrency.lockutils [req-1e0661fb-9c5f-4e7c-941f-3cc6ea0f1904 req-1f593229-b76e-4558-a42a-78bfcb07f90c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:46 np0005593233 nova_compute[222017]: 2026-01-23 09:41:46.274 222021 DEBUG nova.compute.manager [req-1e0661fb-9c5f-4e7c-941f-3cc6ea0f1904 req-1f593229-b76e-4558-a42a-78bfcb07f90c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] No waiting events found dispatching network-vif-plugged-1988217a-ee6e-4363-8df6-77ef96edbc22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:46 np0005593233 nova_compute[222017]: 2026-01-23 09:41:46.275 222021 WARNING nova.compute.manager [req-1e0661fb-9c5f-4e7c-941f-3cc6ea0f1904 req-1f593229-b76e-4558-a42a-78bfcb07f90c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received unexpected event network-vif-plugged-1988217a-ee6e-4363-8df6-77ef96edbc22 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:41:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:47.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:47.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:48 np0005593233 nova_compute[222017]: 2026-01-23 09:41:48.554 222021 DEBUG nova.compute.manager [req-e3ebb193-d80c-4af4-919b-3923b03a8edc req-9f6ff317-72d7-4b5c-8324-118ca131cd0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-plugged-1988217a-ee6e-4363-8df6-77ef96edbc22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:48 np0005593233 nova_compute[222017]: 2026-01-23 09:41:48.555 222021 DEBUG oslo_concurrency.lockutils [req-e3ebb193-d80c-4af4-919b-3923b03a8edc req-9f6ff317-72d7-4b5c-8324-118ca131cd0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:48 np0005593233 nova_compute[222017]: 2026-01-23 09:41:48.555 222021 DEBUG oslo_concurrency.lockutils [req-e3ebb193-d80c-4af4-919b-3923b03a8edc req-9f6ff317-72d7-4b5c-8324-118ca131cd0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:48 np0005593233 nova_compute[222017]: 2026-01-23 09:41:48.555 222021 DEBUG oslo_concurrency.lockutils [req-e3ebb193-d80c-4af4-919b-3923b03a8edc req-9f6ff317-72d7-4b5c-8324-118ca131cd0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:48 np0005593233 nova_compute[222017]: 2026-01-23 09:41:48.555 222021 DEBUG nova.compute.manager [req-e3ebb193-d80c-4af4-919b-3923b03a8edc req-9f6ff317-72d7-4b5c-8324-118ca131cd0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] No waiting events found dispatching network-vif-plugged-1988217a-ee6e-4363-8df6-77ef96edbc22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:48 np0005593233 nova_compute[222017]: 2026-01-23 09:41:48.556 222021 WARNING nova.compute.manager [req-e3ebb193-d80c-4af4-919b-3923b03a8edc req-9f6ff317-72d7-4b5c-8324-118ca131cd0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received unexpected event network-vif-plugged-1988217a-ee6e-4363-8df6-77ef96edbc22 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:41:48 np0005593233 nova_compute[222017]: 2026-01-23 09:41:48.621 222021 DEBUG nova.network.neutron [req-edc0615c-7e96-4cec-9f77-a9606055c70b req-7575949a-d7ca-415d-acb4-e4b7055bb2f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Updated VIF entry in instance network info cache for port 1988217a-ee6e-4363-8df6-77ef96edbc22. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:41:48 np0005593233 nova_compute[222017]: 2026-01-23 09:41:48.622 222021 DEBUG nova.network.neutron [req-edc0615c-7e96-4cec-9f77-a9606055c70b req-7575949a-d7ca-415d-acb4-e4b7055bb2f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Updating instance_info_cache with network_info: [{"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1988217a-ee6e-4363-8df6-77ef96edbc22", "address": "fa:16:3e:df:a3:40", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1988217a-ee", "ovs_interfaceid": "1988217a-ee6e-4363-8df6-77ef96edbc22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:48 np0005593233 nova_compute[222017]: 2026-01-23 09:41:48.657 222021 DEBUG oslo_concurrency.lockutils [req-edc0615c-7e96-4cec-9f77-a9606055c70b req-7575949a-d7ca-415d-acb4-e4b7055bb2f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b61fe33d-386a-4578-b98d-01a8ec801768" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.020 222021 DEBUG oslo_concurrency.lockutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.022 222021 DEBUG oslo_concurrency.lockutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.022 222021 DEBUG oslo_concurrency.lockutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.023 222021 DEBUG oslo_concurrency.lockutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.023 222021 DEBUG oslo_concurrency.lockutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.025 222021 INFO nova.compute.manager [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Terminating instance#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.026 222021 DEBUG nova.compute.manager [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:41:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:49.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.310 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:49.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:49 np0005593233 kernel: tap1c98935c-d4 (unregistering): left promiscuous mode
Jan 23 04:41:49 np0005593233 NetworkManager[48871]: <info>  [1769161309.5148] device (tap1c98935c-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:41:49 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:49Z|00135|binding|INFO|Releasing lport 1c98935c-d4a8-4f99-8dee-37240aecb6af from this chassis (sb_readonly=0)
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.526 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:49Z|00136|binding|INFO|Setting lport 1c98935c-d4a8-4f99-8dee-37240aecb6af down in Southbound
Jan 23 04:41:49 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:49Z|00137|binding|INFO|Removing iface tap1c98935c-d4 ovn-installed in OVS
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.528 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.534 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:f5:b0 10.100.0.14'], port_security=['fa:16:3e:4b:f5:b0 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b61fe33d-386a-4578-b98d-01a8ec801768', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4039861df5dd4fc0ab6daf192b8e1f33', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a8dd25b8-9fa8-4d72-9627-5b1762d90421', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3678c959-7208-4e49-9ab9-1265420cc147, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1c98935c-d4a8-4f99-8dee-37240aecb6af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.536 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1c98935c-d4a8-4f99-8dee-37240aecb6af in datapath 176448f5-ee8b-4ce0-9332-a7b0d58a78db unbound from our chassis#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.537 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 176448f5-ee8b-4ce0-9332-a7b0d58a78db#033[00m
Jan 23 04:41:49 np0005593233 kernel: tap1988217a-ee (unregistering): left promiscuous mode
Jan 23 04:41:49 np0005593233 NetworkManager[48871]: <info>  [1769161309.5646] device (tap1988217a-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.567 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.571 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0ddf2e89-4c1c-48a1-bc3c-92b3667cb6eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:49 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:49Z|00138|binding|INFO|Releasing lport 1988217a-ee6e-4363-8df6-77ef96edbc22 from this chassis (sb_readonly=0)
Jan 23 04:41:49 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:49Z|00139|binding|INFO|Setting lport 1988217a-ee6e-4363-8df6-77ef96edbc22 down in Southbound
Jan 23 04:41:49 np0005593233 ovn_controller[130653]: 2026-01-23T09:41:49Z|00140|binding|INFO|Removing iface tap1988217a-ee ovn-installed in OVS
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.580 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.582 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.587 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:a3:40 10.100.0.5'], port_security=['fa:16:3e:df:a3:40 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b61fe33d-386a-4578-b98d-01a8ec801768', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4039861df5dd4fc0ab6daf192b8e1f33', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a8dd25b8-9fa8-4d72-9627-5b1762d90421', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3678c959-7208-4e49-9ab9-1265420cc147, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1988217a-ee6e-4363-8df6-77ef96edbc22) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.617 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.620 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b80983-b0e6-4d0a-ad37-e21b9be3b75a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.624 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a2444601-55b0-4985-aac0-44a5f8b5758d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:49 np0005593233 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Jan 23 04:41:49 np0005593233 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002b.scope: Consumed 14.147s CPU time.
Jan 23 04:41:49 np0005593233 systemd-machined[190954]: Machine qemu-24-instance-0000002b terminated.
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.657 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc2abe4-17ac-4281-8f42-487af69ae54c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.678 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[39813562-3815-4ead-9925-49e302ceca47]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap176448f5-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:d8:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 532, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 532, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520542, 'reachable_time': 39112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240197, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.701 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[851e7e95-78a4-4db8-b35e-9e8b4c318c9b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap176448f5-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520557, 'tstamp': 520557}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240198, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap176448f5-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520561, 'tstamp': 520561}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240198, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.704 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap176448f5-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.706 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.715 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.716 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap176448f5-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.717 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.717 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap176448f5-e0, col_values=(('external_ids', {'iface-id': '329448f5-7a2f-4438-90ef-fc02134f70af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.718 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.719 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1988217a-ee6e-4363-8df6-77ef96edbc22 in datapath 176448f5-ee8b-4ce0-9332-a7b0d58a78db unbound from our chassis#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.721 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 176448f5-ee8b-4ce0-9332-a7b0d58a78db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.723 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c7cca755-4d14-4ce6-8872-ec1ef44f1642]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:49.724 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db namespace which is not needed anymore#033[00m
Jan 23 04:41:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:49 np0005593233 NetworkManager[48871]: <info>  [1769161309.8552] manager: (tap1c98935c-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Jan 23 04:41:49 np0005593233 NetworkManager[48871]: <info>  [1769161309.8713] manager: (tap1988217a-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.886 222021 INFO nova.virt.libvirt.driver [-] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Instance destroyed successfully.#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.886 222021 DEBUG nova.objects.instance [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lazy-loading 'resources' on Instance uuid b61fe33d-386a-4578-b98d-01a8ec801768 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:41:49 np0005593233 neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db[240122]: [NOTICE]   (240126) : haproxy version is 2.8.14-c23fe91
Jan 23 04:41:49 np0005593233 neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db[240122]: [NOTICE]   (240126) : path to executable is /usr/sbin/haproxy
Jan 23 04:41:49 np0005593233 neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db[240122]: [WARNING]  (240126) : Exiting Master process...
Jan 23 04:41:49 np0005593233 neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db[240122]: [WARNING]  (240126) : Exiting Master process...
Jan 23 04:41:49 np0005593233 neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db[240122]: [ALERT]    (240126) : Current worker (240128) exited with code 143 (Terminated)
Jan 23 04:41:49 np0005593233 neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db[240122]: [WARNING]  (240126) : All workers exited. Exiting... (0)
Jan 23 04:41:49 np0005593233 systemd[1]: libpod-f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242.scope: Deactivated successfully.
Jan 23 04:41:49 np0005593233 podman[240223]: 2026-01-23 09:41:49.950323167 +0000 UTC m=+0.074999051 container died f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.980 222021 DEBUG nova.virt.libvirt.vif [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:41:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1595740695',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1595740695',id=43,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:41:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4039861df5dd4fc0ab6daf192b8e1f33',ramdisk_id='',reservation_id='r-n59ftdj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pr
oject_name='tempest-AttachInterfacesV270Test-1438631927',owner_user_name='tempest-AttachInterfacesV270Test-1438631927-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:41:34Z,user_data=None,user_id='70b82be77f6f46caba34213f79897362',uuid=b61fe33d-386a-4578-b98d-01a8ec801768,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.981 222021 DEBUG nova.network.os_vif_util [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converting VIF {"id": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "address": "fa:16:3e:4b:f5:b0", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c98935c-d4", "ovs_interfaceid": "1c98935c-d4a8-4f99-8dee-37240aecb6af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.982 222021 DEBUG nova.network.os_vif_util [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4b:f5:b0,bridge_name='br-int',has_traffic_filtering=True,id=1c98935c-d4a8-4f99-8dee-37240aecb6af,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c98935c-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.982 222021 DEBUG os_vif [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:f5:b0,bridge_name='br-int',has_traffic_filtering=True,id=1c98935c-d4a8-4f99-8dee-37240aecb6af,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c98935c-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:41:49 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242-userdata-shm.mount: Deactivated successfully.
Jan 23 04:41:49 np0005593233 systemd[1]: var-lib-containers-storage-overlay-26d8884bc2e75d9d6fcd1dadc0f7d71913e3100347c6077746ba87b92bb1fc92-merged.mount: Deactivated successfully.
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.990 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.991 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c98935c-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.993 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.996 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:41:49 np0005593233 nova_compute[222017]: 2026-01-23 09:41:49.998 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:50 np0005593233 podman[240223]: 2026-01-23 09:41:50.001376877 +0000 UTC m=+0.126052761 container cleanup f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.003 222021 INFO os_vif [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:f5:b0,bridge_name='br-int',has_traffic_filtering=True,id=1c98935c-d4a8-4f99-8dee-37240aecb6af,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c98935c-d4')#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.005 222021 DEBUG nova.virt.libvirt.vif [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:41:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1595740695',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1595740695',id=43,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:41:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4039861df5dd4fc0ab6daf192b8e1f33',ramdisk_id='',reservation_id='r-n59ftdj3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pr
oject_name='tempest-AttachInterfacesV270Test-1438631927',owner_user_name='tempest-AttachInterfacesV270Test-1438631927-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:41:34Z,user_data=None,user_id='70b82be77f6f46caba34213f79897362',uuid=b61fe33d-386a-4578-b98d-01a8ec801768,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1988217a-ee6e-4363-8df6-77ef96edbc22", "address": "fa:16:3e:df:a3:40", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1988217a-ee", "ovs_interfaceid": "1988217a-ee6e-4363-8df6-77ef96edbc22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.006 222021 DEBUG nova.network.os_vif_util [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converting VIF {"id": "1988217a-ee6e-4363-8df6-77ef96edbc22", "address": "fa:16:3e:df:a3:40", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1988217a-ee", "ovs_interfaceid": "1988217a-ee6e-4363-8df6-77ef96edbc22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.007 222021 DEBUG nova.network.os_vif_util [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:a3:40,bridge_name='br-int',has_traffic_filtering=True,id=1988217a-ee6e-4363-8df6-77ef96edbc22,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1988217a-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.007 222021 DEBUG os_vif [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:a3:40,bridge_name='br-int',has_traffic_filtering=True,id=1988217a-ee6e-4363-8df6-77ef96edbc22,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1988217a-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.010 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.010 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1988217a-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.016 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.018 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.021 222021 INFO os_vif [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:a3:40,bridge_name='br-int',has_traffic_filtering=True,id=1988217a-ee6e-4363-8df6-77ef96edbc22,network=Network(176448f5-ee8b-4ce0-9332-a7b0d58a78db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1988217a-ee')#033[00m
Jan 23 04:41:50 np0005593233 systemd[1]: libpod-conmon-f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242.scope: Deactivated successfully.
Jan 23 04:41:50 np0005593233 podman[240270]: 2026-01-23 09:41:50.092147145 +0000 UTC m=+0.056374712 container remove f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:41:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:50.099 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d5052970-e65c-44db-b48a-1529263e8bb8]: (4, ('Fri Jan 23 09:41:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db (f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242)\nf5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242\nFri Jan 23 09:41:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db (f5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242)\nf5b6704f94d42b33a7cbfa84c30294334653e406dd115ce04117afd0fc2e1242\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:50.101 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[03d4876e-7168-45e7-9f4c-a23463b8d054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:50.102 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap176448f5-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.105 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:50 np0005593233 kernel: tap176448f5-e0: left promiscuous mode
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.126 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:50.129 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[98ebb7e3-71ba-4129-9336-0f19397616ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:50.143 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[757e02cd-2fef-4862-93af-d7b50b863a48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:50.145 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[28546e50-a33b-4f95-a773-3eea8e11e848]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:50.169 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e3696a-0f0b-487f-b090-14e97114c553]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520533, 'reachable_time': 17063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240303, 'error': None, 'target': 'ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:50 np0005593233 systemd[1]: run-netns-ovnmeta\x2d176448f5\x2dee8b\x2d4ce0\x2d9332\x2da7b0d58a78db.mount: Deactivated successfully.
Jan 23 04:41:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:50.175 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-176448f5-ee8b-4ce0-9332-a7b0d58a78db deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:41:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:41:50.176 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[10fe9fe8-30e0-4488-accb-0a5614ca7a15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.195 222021 DEBUG nova.compute.manager [req-245e326a-455f-451c-8a85-202f162ee7b2 req-d2ddd14e-15f4-4f46-9151-b2e6400af881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-unplugged-1c98935c-d4a8-4f99-8dee-37240aecb6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.196 222021 DEBUG oslo_concurrency.lockutils [req-245e326a-455f-451c-8a85-202f162ee7b2 req-d2ddd14e-15f4-4f46-9151-b2e6400af881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.196 222021 DEBUG oslo_concurrency.lockutils [req-245e326a-455f-451c-8a85-202f162ee7b2 req-d2ddd14e-15f4-4f46-9151-b2e6400af881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.196 222021 DEBUG oslo_concurrency.lockutils [req-245e326a-455f-451c-8a85-202f162ee7b2 req-d2ddd14e-15f4-4f46-9151-b2e6400af881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.197 222021 DEBUG nova.compute.manager [req-245e326a-455f-451c-8a85-202f162ee7b2 req-d2ddd14e-15f4-4f46-9151-b2e6400af881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] No waiting events found dispatching network-vif-unplugged-1c98935c-d4a8-4f99-8dee-37240aecb6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.197 222021 DEBUG nova.compute.manager [req-245e326a-455f-451c-8a85-202f162ee7b2 req-d2ddd14e-15f4-4f46-9151-b2e6400af881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-unplugged-1c98935c-d4a8-4f99-8dee-37240aecb6af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.676 222021 DEBUG nova.compute.manager [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-unplugged-1988217a-ee6e-4363-8df6-77ef96edbc22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.677 222021 DEBUG oslo_concurrency.lockutils [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.677 222021 DEBUG oslo_concurrency.lockutils [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.677 222021 DEBUG oslo_concurrency.lockutils [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.677 222021 DEBUG nova.compute.manager [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] No waiting events found dispatching network-vif-unplugged-1988217a-ee6e-4363-8df6-77ef96edbc22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.677 222021 DEBUG nova.compute.manager [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-unplugged-1988217a-ee6e-4363-8df6-77ef96edbc22 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.678 222021 DEBUG nova.compute.manager [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-plugged-1988217a-ee6e-4363-8df6-77ef96edbc22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.678 222021 DEBUG oslo_concurrency.lockutils [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.678 222021 DEBUG oslo_concurrency.lockutils [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.678 222021 DEBUG oslo_concurrency.lockutils [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.678 222021 DEBUG nova.compute.manager [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] No waiting events found dispatching network-vif-plugged-1988217a-ee6e-4363-8df6-77ef96edbc22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:50 np0005593233 nova_compute[222017]: 2026-01-23 09:41:50.679 222021 WARNING nova.compute.manager [req-b74e3878-f889-48d1-84b0-5c36da1ad048 req-77394849-204e-4a3c-ad9a-7af803f60284 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received unexpected event network-vif-plugged-1988217a-ee6e-4363-8df6-77ef96edbc22 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:41:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:51.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:51.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:52 np0005593233 nova_compute[222017]: 2026-01-23 09:41:52.333 222021 DEBUG nova.compute.manager [req-759a6e19-4c63-4a3b-80f1-22ea8a9ae9c3 req-5e7db5d1-7579-4932-bc87-cd1f50b6b112 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-plugged-1c98935c-d4a8-4f99-8dee-37240aecb6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:52 np0005593233 nova_compute[222017]: 2026-01-23 09:41:52.333 222021 DEBUG oslo_concurrency.lockutils [req-759a6e19-4c63-4a3b-80f1-22ea8a9ae9c3 req-5e7db5d1-7579-4932-bc87-cd1f50b6b112 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:52 np0005593233 nova_compute[222017]: 2026-01-23 09:41:52.333 222021 DEBUG oslo_concurrency.lockutils [req-759a6e19-4c63-4a3b-80f1-22ea8a9ae9c3 req-5e7db5d1-7579-4932-bc87-cd1f50b6b112 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:52 np0005593233 nova_compute[222017]: 2026-01-23 09:41:52.334 222021 DEBUG oslo_concurrency.lockutils [req-759a6e19-4c63-4a3b-80f1-22ea8a9ae9c3 req-5e7db5d1-7579-4932-bc87-cd1f50b6b112 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:52 np0005593233 nova_compute[222017]: 2026-01-23 09:41:52.334 222021 DEBUG nova.compute.manager [req-759a6e19-4c63-4a3b-80f1-22ea8a9ae9c3 req-5e7db5d1-7579-4932-bc87-cd1f50b6b112 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] No waiting events found dispatching network-vif-plugged-1c98935c-d4a8-4f99-8dee-37240aecb6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:52 np0005593233 nova_compute[222017]: 2026-01-23 09:41:52.334 222021 WARNING nova.compute.manager [req-759a6e19-4c63-4a3b-80f1-22ea8a9ae9c3 req-5e7db5d1-7579-4932-bc87-cd1f50b6b112 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received unexpected event network-vif-plugged-1c98935c-d4a8-4f99-8dee-37240aecb6af for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:41:53 np0005593233 podman[240305]: 2026-01-23 09:41:53.070903312 +0000 UTC m=+0.085641893 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:41:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:53.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:53 np0005593233 nova_compute[222017]: 2026-01-23 09:41:53.166 222021 INFO nova.virt.libvirt.driver [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Deleting instance files /var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768_del#033[00m
Jan 23 04:41:53 np0005593233 nova_compute[222017]: 2026-01-23 09:41:53.167 222021 INFO nova.virt.libvirt.driver [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Deletion of /var/lib/nova/instances/b61fe33d-386a-4578-b98d-01a8ec801768_del complete#033[00m
Jan 23 04:41:53 np0005593233 nova_compute[222017]: 2026-01-23 09:41:53.250 222021 INFO nova.compute.manager [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Took 4.22 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:41:53 np0005593233 nova_compute[222017]: 2026-01-23 09:41:53.251 222021 DEBUG oslo.service.loopingcall [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:41:53 np0005593233 nova_compute[222017]: 2026-01-23 09:41:53.251 222021 DEBUG nova.compute.manager [-] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:41:53 np0005593233 nova_compute[222017]: 2026-01-23 09:41:53.251 222021 DEBUG nova.network.neutron [-] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:41:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:53.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:54 np0005593233 nova_compute[222017]: 2026-01-23 09:41:54.312 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.013 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.066 222021 DEBUG nova.compute.manager [req-056cf0d1-458e-4655-a7c0-339b52fa1256 req-8aca4ba6-0485-4b7c-87a0-80f3a4f50d80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-deleted-1c98935c-d4a8-4f99-8dee-37240aecb6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.067 222021 INFO nova.compute.manager [req-056cf0d1-458e-4655-a7c0-339b52fa1256 req-8aca4ba6-0485-4b7c-87a0-80f3a4f50d80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Neutron deleted interface 1c98935c-d4a8-4f99-8dee-37240aecb6af; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.067 222021 DEBUG nova.network.neutron [req-056cf0d1-458e-4655-a7c0-339b52fa1256 req-8aca4ba6-0485-4b7c-87a0-80f3a4f50d80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Updating instance_info_cache with network_info: [{"id": "1988217a-ee6e-4363-8df6-77ef96edbc22", "address": "fa:16:3e:df:a3:40", "network": {"id": "176448f5-ee8b-4ce0-9332-a7b0d58a78db", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-2060115566-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4039861df5dd4fc0ab6daf192b8e1f33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1988217a-ee", "ovs_interfaceid": "1988217a-ee6e-4363-8df6-77ef96edbc22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:55.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.123 222021 DEBUG nova.compute.manager [req-056cf0d1-458e-4655-a7c0-339b52fa1256 req-8aca4ba6-0485-4b7c-87a0-80f3a4f50d80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Detach interface failed, port_id=1c98935c-d4a8-4f99-8dee-37240aecb6af, reason: Instance b61fe33d-386a-4578-b98d-01a8ec801768 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:41:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:55.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.773 222021 DEBUG nova.network.neutron [-] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.821 222021 INFO nova.compute.manager [-] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Took 2.57 seconds to deallocate network for instance.#033[00m
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.864 222021 DEBUG nova.compute.manager [req-60a0dc8d-5622-4dce-9f8a-bde5aa5160e8 req-e9e5ec20-8e5d-4ccc-9e3e-8abaa1401446 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Received event network-vif-deleted-1988217a-ee6e-4363-8df6-77ef96edbc22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.886 222021 DEBUG oslo_concurrency.lockutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:55 np0005593233 nova_compute[222017]: 2026-01-23 09:41:55.887 222021 DEBUG oslo_concurrency.lockutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:56 np0005593233 nova_compute[222017]: 2026-01-23 09:41:56.038 222021 DEBUG oslo_concurrency.processutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:41:56 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4232801607' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:41:56 np0005593233 nova_compute[222017]: 2026-01-23 09:41:56.714 222021 DEBUG oslo_concurrency.processutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.676s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:56 np0005593233 nova_compute[222017]: 2026-01-23 09:41:56.724 222021 DEBUG nova.compute.provider_tree [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:41:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:41:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6496 writes, 34K keys, 6496 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.03 MB/s#012Cumulative WAL: 6496 writes, 6496 syncs, 1.00 writes per sync, written: 0.07 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1614 writes, 8191 keys, 1614 commit groups, 1.0 writes per commit group, ingest: 16.45 MB, 0.03 MB/s#012Interval WAL: 1614 writes, 1614 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     63.3      0.66              0.19        18    0.037       0      0       0.0       0.0#012  L6      1/0    9.90 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   3.6     96.8     79.7      1.86              0.49        17    0.109     85K   9974       0.0       0.0#012 Sum      1/0    9.90 MB   0.0      0.2     0.0      0.1       0.2      0.1       0.0   4.6     71.4     75.4      2.51              0.68        35    0.072     85K   9974       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.8     82.8     84.4      0.59              0.16         8    0.074     24K   3114       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0     96.8     79.7      1.86              0.49        17    0.109     85K   9974       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     63.5      0.66              0.19        17    0.039       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.041, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.19 GB write, 0.08 MB/s write, 0.18 GB read, 0.07 MB/s read, 2.5 seconds#012Interval compaction: 0.05 GB write, 0.08 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 19.83 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000163 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1148,19.15 MB,6.29803%) FilterBlock(35,248.48 KB,0.0798225%) IndexBlock(35,456.06 KB,0.146504%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:41:56 np0005593233 nova_compute[222017]: 2026-01-23 09:41:56.907 222021 DEBUG nova.scheduler.client.report [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:41:56 np0005593233 nova_compute[222017]: 2026-01-23 09:41:56.943 222021 DEBUG oslo_concurrency.lockutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:57 np0005593233 nova_compute[222017]: 2026-01-23 09:41:57.000 222021 INFO nova.scheduler.client.report [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Deleted allocations for instance b61fe33d-386a-4578-b98d-01a8ec801768#033[00m
Jan 23 04:41:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:57.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:57 np0005593233 nova_compute[222017]: 2026-01-23 09:41:57.107 222021 DEBUG oslo_concurrency.lockutils [None req-a7e6ade8-030f-4aad-bba9-6be85b3aba89 70b82be77f6f46caba34213f79897362 4039861df5dd4fc0ab6daf192b8e1f33 - - default default] Lock "b61fe33d-386a-4578-b98d-01a8ec801768" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:57.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:41:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:59.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:41:59 np0005593233 nova_compute[222017]: 2026-01-23 09:41:59.316 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:41:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:41:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:59.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:41:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:00 np0005593233 nova_compute[222017]: 2026-01-23 09:42:00.016 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:01.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:01.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:03.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:03 np0005593233 nova_compute[222017]: 2026-01-23 09:42:03.335 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:03.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e182 e182: 3 total, 3 up, 3 in
Jan 23 04:42:04 np0005593233 nova_compute[222017]: 2026-01-23 09:42:04.318 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e182 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:04 np0005593233 nova_compute[222017]: 2026-01-23 09:42:04.884 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161309.8830233, b61fe33d-386a-4578-b98d-01a8ec801768 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:04 np0005593233 nova_compute[222017]: 2026-01-23 09:42:04.885 222021 INFO nova.compute.manager [-] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:42:04 np0005593233 nova_compute[222017]: 2026-01-23 09:42:04.917 222021 DEBUG nova.compute.manager [None req-67058b5e-4035-44b6-8658-55eb61c21f43 - - - - - -] [instance: b61fe33d-386a-4578-b98d-01a8ec801768] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:05 np0005593233 nova_compute[222017]: 2026-01-23 09:42:05.019 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:42:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:05.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:42:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e183 e183: 3 total, 3 up, 3 in
Jan 23 04:42:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:05.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e184 e184: 3 total, 3 up, 3 in
Jan 23 04:42:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:07.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:07.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:07 np0005593233 nova_compute[222017]: 2026-01-23 09:42:07.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:09.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:09 np0005593233 nova_compute[222017]: 2026-01-23 09:42:09.319 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:42:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:09.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:42:09 np0005593233 nova_compute[222017]: 2026-01-23 09:42:09.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:10 np0005593233 nova_compute[222017]: 2026-01-23 09:42:10.022 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:10 np0005593233 nova_compute[222017]: 2026-01-23 09:42:10.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:10 np0005593233 nova_compute[222017]: 2026-01-23 09:42:10.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:42:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:11.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:11 np0005593233 podman[240347]: 2026-01-23 09:42:11.145066737 +0000 UTC m=+0.150331340 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Jan 23 04:42:11 np0005593233 nova_compute[222017]: 2026-01-23 09:42:11.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:11 np0005593233 nova_compute[222017]: 2026-01-23 09:42:11.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:42:11 np0005593233 nova_compute[222017]: 2026-01-23 09:42:11.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:42:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:11.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:11 np0005593233 nova_compute[222017]: 2026-01-23 09:42:11.404 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:42:11 np0005593233 nova_compute[222017]: 2026-01-23 09:42:11.404 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:12 np0005593233 nova_compute[222017]: 2026-01-23 09:42:12.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:12 np0005593233 nova_compute[222017]: 2026-01-23 09:42:12.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:12 np0005593233 nova_compute[222017]: 2026-01-23 09:42:12.455 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:12 np0005593233 nova_compute[222017]: 2026-01-23 09:42:12.456 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:12 np0005593233 nova_compute[222017]: 2026-01-23 09:42:12.456 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:12 np0005593233 nova_compute[222017]: 2026-01-23 09:42:12.457 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:42:12 np0005593233 nova_compute[222017]: 2026-01-23 09:42:12.457 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e185 e185: 3 total, 3 up, 3 in
Jan 23 04:42:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:42:12 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2559483049' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:42:12 np0005593233 nova_compute[222017]: 2026-01-23 09:42:12.943 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:13.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.129 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.131 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4760MB free_disk=20.876399993896484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.131 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.131 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.217 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.217 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.237 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:13.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:42:13 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3017086253' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.726 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.735 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.773 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.798 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:42:13 np0005593233 nova_compute[222017]: 2026-01-23 09:42:13.798 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e186 e186: 3 total, 3 up, 3 in
Jan 23 04:42:14 np0005593233 nova_compute[222017]: 2026-01-23 09:42:14.321 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:15 np0005593233 nova_compute[222017]: 2026-01-23 09:42:15.040 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:15.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:15.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:16 np0005593233 nova_compute[222017]: 2026-01-23 09:42:16.800 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:42:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:17.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:42:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:17.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:18 np0005593233 nova_compute[222017]: 2026-01-23 09:42:18.939 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "9ec28845-a80d-489b-84e9-5d38da983cdc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:18 np0005593233 nova_compute[222017]: 2026-01-23 09:42:18.940 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:18 np0005593233 nova_compute[222017]: 2026-01-23 09:42:18.966 222021 DEBUG nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.094 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.095 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.100581) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339100768, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1621, "num_deletes": 253, "total_data_size": 3549962, "memory_usage": 3595048, "flush_reason": "Manual Compaction"}
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.104 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.105 222021 INFO nova.compute.claims [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:42:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:42:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:19.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339119852, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 1478971, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33188, "largest_seqno": 34804, "table_properties": {"data_size": 1473559, "index_size": 2616, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14502, "raw_average_key_size": 21, "raw_value_size": 1461617, "raw_average_value_size": 2146, "num_data_blocks": 116, "num_entries": 681, "num_filter_entries": 681, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161217, "oldest_key_time": 1769161217, "file_creation_time": 1769161339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 19359 microseconds, and 8472 cpu microseconds.
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.119958) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 1478971 bytes OK
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.119985) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.122190) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.122206) EVENT_LOG_v1 {"time_micros": 1769161339122201, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.122232) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 3542334, prev total WAL file size 3542334, number of live WAL files 2.
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.123832) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303035' seq:72057594037927935, type:22 .. '6D6772737461740031323536' seq:0, type:0; will stop at (end)
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(1444KB)], [63(10138KB)]
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339124009, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 11860803, "oldest_snapshot_seqno": -1}
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.210 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 5791 keys, 8860452 bytes, temperature: kUnknown
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339235437, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 8860452, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8822704, "index_size": 22150, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14533, "raw_key_size": 148144, "raw_average_key_size": 25, "raw_value_size": 8719599, "raw_average_value_size": 1505, "num_data_blocks": 894, "num_entries": 5791, "num_filter_entries": 5791, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.236220) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 8860452 bytes
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.238389) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.2 rd, 79.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 9.9 +0.0 blob) out(8.4 +0.0 blob), read-write-amplify(14.0) write-amplify(6.0) OK, records in: 6258, records dropped: 467 output_compression: NoCompression
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.238426) EVENT_LOG_v1 {"time_micros": 1769161339238409, "job": 38, "event": "compaction_finished", "compaction_time_micros": 111633, "compaction_time_cpu_micros": 27780, "output_level": 6, "num_output_files": 1, "total_output_size": 8860452, "num_input_records": 6258, "num_output_records": 5791, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339239449, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339244145, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.123580) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.244303) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.244313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.244317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.244321) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:42:19.244325) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.324 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:19.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/214696084' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.704 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.712 222021 DEBUG nova.compute.provider_tree [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.731 222021 DEBUG nova.scheduler.client.report [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:42:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.772 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.773 222021 DEBUG nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.847 222021 DEBUG nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.848 222021 DEBUG nova.network.neutron [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.873 222021 INFO nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:42:19 np0005593233 nova_compute[222017]: 2026-01-23 09:42:19.895 222021 DEBUG nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.005 222021 DEBUG nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.006 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.007 222021 INFO nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Creating image(s)#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.043 222021 DEBUG nova.storage.rbd_utils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 9ec28845-a80d-489b-84e9-5d38da983cdc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.073 222021 DEBUG nova.storage.rbd_utils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 9ec28845-a80d-489b-84e9-5d38da983cdc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.103 222021 DEBUG nova.storage.rbd_utils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 9ec28845-a80d-489b-84e9-5d38da983cdc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.107 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.140 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.207 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.208 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.209 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.209 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.237 222021 DEBUG nova.storage.rbd_utils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 9ec28845-a80d-489b-84e9-5d38da983cdc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.240 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 9ec28845-a80d-489b-84e9-5d38da983cdc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:20 np0005593233 nova_compute[222017]: 2026-01-23 09:42:20.818 222021 DEBUG nova.policy [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56da68482e3a4fb582dcccad45f8f71b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05bc71a77710455e8b34ead7fec81a31', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:42:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:21.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:21.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:22 np0005593233 nova_compute[222017]: 2026-01-23 09:42:22.071 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 9ec28845-a80d-489b-84e9-5d38da983cdc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.830s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:22 np0005593233 nova_compute[222017]: 2026-01-23 09:42:22.159 222021 DEBUG nova.storage.rbd_utils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] resizing rbd image 9ec28845-a80d-489b-84e9-5d38da983cdc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:42:22 np0005593233 nova_compute[222017]: 2026-01-23 09:42:22.511 222021 DEBUG nova.objects.instance [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'migration_context' on Instance uuid 9ec28845-a80d-489b-84e9-5d38da983cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:22 np0005593233 nova_compute[222017]: 2026-01-23 09:42:22.534 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:42:22 np0005593233 nova_compute[222017]: 2026-01-23 09:42:22.535 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Ensure instance console log exists: /var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:42:22 np0005593233 nova_compute[222017]: 2026-01-23 09:42:22.536 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:22 np0005593233 nova_compute[222017]: 2026-01-23 09:42:22.537 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:22 np0005593233 nova_compute[222017]: 2026-01-23 09:42:22.537 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:22 np0005593233 nova_compute[222017]: 2026-01-23 09:42:22.564 222021 DEBUG nova.network.neutron [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Successfully created port: 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:42:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:23.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:23.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:23 np0005593233 nova_compute[222017]: 2026-01-23 09:42:23.673 222021 DEBUG nova.network.neutron [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Successfully updated port: 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:42:23 np0005593233 nova_compute[222017]: 2026-01-23 09:42:23.689 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "refresh_cache-9ec28845-a80d-489b-84e9-5d38da983cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:42:23 np0005593233 nova_compute[222017]: 2026-01-23 09:42:23.689 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquired lock "refresh_cache-9ec28845-a80d-489b-84e9-5d38da983cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:42:23 np0005593233 nova_compute[222017]: 2026-01-23 09:42:23.690 222021 DEBUG nova.network.neutron [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:42:23 np0005593233 nova_compute[222017]: 2026-01-23 09:42:23.815 222021 DEBUG nova.compute.manager [req-1092dec2-dfef-4963-8119-9a045a295e2e req-f68144c2-8f96-444f-819c-dbe923db3575 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Received event network-changed-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:23 np0005593233 nova_compute[222017]: 2026-01-23 09:42:23.815 222021 DEBUG nova.compute.manager [req-1092dec2-dfef-4963-8119-9a045a295e2e req-f68144c2-8f96-444f-819c-dbe923db3575 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Refreshing instance network info cache due to event network-changed-405ba4a0-902c-4d74-a4d1-bf0eb43b2622. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:42:23 np0005593233 nova_compute[222017]: 2026-01-23 09:42:23.815 222021 DEBUG oslo_concurrency.lockutils [req-1092dec2-dfef-4963-8119-9a045a295e2e req-f68144c2-8f96-444f-819c-dbe923db3575 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-9ec28845-a80d-489b-84e9-5d38da983cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:42:23 np0005593233 nova_compute[222017]: 2026-01-23 09:42:23.892 222021 DEBUG nova.network.neutron [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:42:24 np0005593233 podman[240606]: 2026-01-23 09:42:24.053045823 +0000 UTC m=+0.063467213 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 23 04:42:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e187 e187: 3 total, 3 up, 3 in
Jan 23 04:42:24 np0005593233 nova_compute[222017]: 2026-01-23 09:42:24.325 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:24 np0005593233 nova_compute[222017]: 2026-01-23 09:42:24.972 222021 DEBUG nova.network.neutron [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Updating instance_info_cache with network_info: [{"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.009 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Releasing lock "refresh_cache-9ec28845-a80d-489b-84e9-5d38da983cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.009 222021 DEBUG nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Instance network_info: |[{"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.010 222021 DEBUG oslo_concurrency.lockutils [req-1092dec2-dfef-4963-8119-9a045a295e2e req-f68144c2-8f96-444f-819c-dbe923db3575 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-9ec28845-a80d-489b-84e9-5d38da983cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.010 222021 DEBUG nova.network.neutron [req-1092dec2-dfef-4963-8119-9a045a295e2e req-f68144c2-8f96-444f-819c-dbe923db3575 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Refreshing network info cache for port 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.013 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Start _get_guest_xml network_info=[{"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.018 222021 WARNING nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.024 222021 DEBUG nova.virt.libvirt.host [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.024 222021 DEBUG nova.virt.libvirt.host [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.027 222021 DEBUG nova.virt.libvirt.host [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.028 222021 DEBUG nova.virt.libvirt.host [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.029 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.029 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.030 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.030 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.030 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.030 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.030 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.031 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.031 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.031 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.031 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.032 222021 DEBUG nova.virt.hardware [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.034 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:42:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:25.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:25.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:42:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3585280852' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.520 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.557 222021 DEBUG nova.storage.rbd_utils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 9ec28845-a80d-489b-84e9-5d38da983cdc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:25 np0005593233 nova_compute[222017]: 2026-01-23 09:42:25.562 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:42:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2100370042' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.007 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.010 222021 DEBUG nova.virt.libvirt.vif [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-888724699',display_name='tempest-ImagesTestJSON-server-888724699',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-888724699',id=46,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-tylndpaz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:19Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=9ec28845-a80d-489b-84e9-5d38da983cdc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.010 222021 DEBUG nova.network.os_vif_util [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.011 222021 DEBUG nova.network.os_vif_util [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:dc:c4,bridge_name='br-int',has_traffic_filtering=True,id=405ba4a0-902c-4d74-a4d1-bf0eb43b2622,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap405ba4a0-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.012 222021 DEBUG nova.objects.instance [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9ec28845-a80d-489b-84e9-5d38da983cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.060 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <uuid>9ec28845-a80d-489b-84e9-5d38da983cdc</uuid>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <name>instance-0000002e</name>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <nova:name>tempest-ImagesTestJSON-server-888724699</nova:name>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:42:25</nova:creationTime>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <nova:user uuid="56da68482e3a4fb582dcccad45f8f71b">tempest-ImagesTestJSON-1507872051-project-member</nova:user>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <nova:project uuid="05bc71a77710455e8b34ead7fec81a31">tempest-ImagesTestJSON-1507872051</nova:project>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <nova:port uuid="405ba4a0-902c-4d74-a4d1-bf0eb43b2622">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <entry name="serial">9ec28845-a80d-489b-84e9-5d38da983cdc</entry>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <entry name="uuid">9ec28845-a80d-489b-84e9-5d38da983cdc</entry>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/9ec28845-a80d-489b-84e9-5d38da983cdc_disk">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/9ec28845-a80d-489b-84e9-5d38da983cdc_disk.config">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:6d:dc:c4"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <target dev="tap405ba4a0-90"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc/console.log" append="off"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:42:26 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:42:26 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:42:26 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:42:26 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.062 222021 DEBUG nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Preparing to wait for external event network-vif-plugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.063 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.063 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.063 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.064 222021 DEBUG nova.virt.libvirt.vif [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-888724699',display_name='tempest-ImagesTestJSON-server-888724699',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-888724699',id=46,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-tylndpaz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member
'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:19Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=9ec28845-a80d-489b-84e9-5d38da983cdc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.065 222021 DEBUG nova.network.os_vif_util [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.065 222021 DEBUG nova.network.os_vif_util [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:dc:c4,bridge_name='br-int',has_traffic_filtering=True,id=405ba4a0-902c-4d74-a4d1-bf0eb43b2622,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap405ba4a0-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.066 222021 DEBUG os_vif [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:dc:c4,bridge_name='br-int',has_traffic_filtering=True,id=405ba4a0-902c-4d74-a4d1-bf0eb43b2622,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap405ba4a0-90') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.067 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.067 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.068 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.072 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.073 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ba4a0-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.073 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap405ba4a0-90, col_values=(('external_ids', {'iface-id': '405ba4a0-902c-4d74-a4d1-bf0eb43b2622', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:dc:c4', 'vm-uuid': '9ec28845-a80d-489b-84e9-5d38da983cdc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.075 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:26 np0005593233 NetworkManager[48871]: <info>  [1769161346.0773] manager: (tap405ba4a0-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.079 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.083 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.085 222021 INFO os_vif [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:dc:c4,bridge_name='br-int',has_traffic_filtering=True,id=405ba4a0-902c-4d74-a4d1-bf0eb43b2622,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap405ba4a0-90')#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.173 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.174 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.175 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No VIF found with MAC fa:16:3e:6d:dc:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.176 222021 INFO nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Using config drive#033[00m
Jan 23 04:42:26 np0005593233 nova_compute[222017]: 2026-01-23 09:42:26.215 222021 DEBUG nova.storage.rbd_utils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 9ec28845-a80d-489b-84e9-5d38da983cdc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.084 222021 INFO nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Creating config drive at /var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc/disk.config#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.091 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpivoo979b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:27.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.240 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpivoo979b" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.276 222021 DEBUG nova.storage.rbd_utils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 9ec28845-a80d-489b-84e9-5d38da983cdc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.281 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc/disk.config 9ec28845-a80d-489b-84e9-5d38da983cdc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.364 222021 DEBUG nova.network.neutron [req-1092dec2-dfef-4963-8119-9a045a295e2e req-f68144c2-8f96-444f-819c-dbe923db3575 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Updated VIF entry in instance network info cache for port 405ba4a0-902c-4d74-a4d1-bf0eb43b2622. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.365 222021 DEBUG nova.network.neutron [req-1092dec2-dfef-4963-8119-9a045a295e2e req-f68144c2-8f96-444f-819c-dbe923db3575 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Updating instance_info_cache with network_info: [{"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.385 222021 DEBUG oslo_concurrency.lockutils [req-1092dec2-dfef-4963-8119-9a045a295e2e req-f68144c2-8f96-444f-819c-dbe923db3575 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-9ec28845-a80d-489b-84e9-5d38da983cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:42:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:27.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.502 222021 DEBUG oslo_concurrency.processutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc/disk.config 9ec28845-a80d-489b-84e9-5d38da983cdc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.503 222021 INFO nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Deleting local config drive /var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc/disk.config because it was imported into RBD.#033[00m
Jan 23 04:42:27 np0005593233 kernel: tap405ba4a0-90: entered promiscuous mode
Jan 23 04:42:27 np0005593233 NetworkManager[48871]: <info>  [1769161347.5868] manager: (tap405ba4a0-90): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.611 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:27 np0005593233 ovn_controller[130653]: 2026-01-23T09:42:27Z|00141|binding|INFO|Claiming lport 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 for this chassis.
Jan 23 04:42:27 np0005593233 ovn_controller[130653]: 2026-01-23T09:42:27Z|00142|binding|INFO|405ba4a0-902c-4d74-a4d1-bf0eb43b2622: Claiming fa:16:3e:6d:dc:c4 10.100.0.6
Jan 23 04:42:27 np0005593233 systemd-udevd[240757]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.621 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.627 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:dc:c4 10.100.0.6'], port_security=['fa:16:3e:6d:dc:c4 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9ec28845-a80d-489b-84e9-5d38da983cdc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=405ba4a0-902c-4d74-a4d1-bf0eb43b2622) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.628 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 in datapath c2696fd4-5fd7-4934-88ac-40162fad555d bound to our chassis#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.629 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c2696fd4-5fd7-4934-88ac-40162fad555d#033[00m
Jan 23 04:42:27 np0005593233 NetworkManager[48871]: <info>  [1769161347.6380] device (tap405ba4a0-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:42:27 np0005593233 NetworkManager[48871]: <info>  [1769161347.6388] device (tap405ba4a0-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:42:27 np0005593233 systemd-machined[190954]: New machine qemu-25-instance-0000002e.
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.646 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[32d7bc37-689e-4146-ae32-25c212100bd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.648 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc2696fd4-51 in ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.650 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc2696fd4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.650 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[660bb997-a784-40b1-af99-a382c5106e4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.651 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8f3ad5-3e4f-4139-92cb-c4fe54ff3b7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.664 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd87d06-ed62-422e-be91-f6ffd00da4d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.683 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f13326a0-8360-4005-a90a-06f0763a29c1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 ovn_controller[130653]: 2026-01-23T09:42:27Z|00143|binding|INFO|Setting lport 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 ovn-installed in OVS
Jan 23 04:42:27 np0005593233 ovn_controller[130653]: 2026-01-23T09:42:27Z|00144|binding|INFO|Setting lport 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 up in Southbound
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.687 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:27 np0005593233 systemd[1]: Started Virtual Machine qemu-25-instance-0000002e.
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.724 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[82c98a45-de33-41ca-912d-153d934c2e01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 NetworkManager[48871]: <info>  [1769161347.7342] manager: (tapc2696fd4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.732 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f0659324-ad9f-4924-a668-a91b225198b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 systemd-udevd[240763]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.785 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[51257022-79d3-4268-8734-e19a1e7c0137]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.789 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8e26154d-711b-4491-8053-6153ac4b1822]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 NetworkManager[48871]: <info>  [1769161347.8209] device (tapc2696fd4-50): carrier: link connected
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.829 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7a5ddb9f-ac36-426f-ad69-bc7bbd4b2716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.853 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1f68ade6-ef0e-4722-9053-6c55ef1a1031]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526045, 'reachable_time': 35713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240793, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.872 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d99989d8-c3df-4485-b778-3e8766ddd3dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:20d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526045, 'tstamp': 526045}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240794, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.895 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[323022de-a9df-4256-9198-a34e71861101]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526045, 'reachable_time': 35713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240795, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.931 222021 DEBUG nova.compute.manager [req-234e64a5-51b6-4475-9828-a59e0f16af99 req-6b12f51c-0433-4fae-ab2c-e7fabeb620c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Received event network-vif-plugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.931 222021 DEBUG oslo_concurrency.lockutils [req-234e64a5-51b6-4475-9828-a59e0f16af99 req-6b12f51c-0433-4fae-ab2c-e7fabeb620c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.932 222021 DEBUG oslo_concurrency.lockutils [req-234e64a5-51b6-4475-9828-a59e0f16af99 req-6b12f51c-0433-4fae-ab2c-e7fabeb620c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.932 222021 DEBUG oslo_concurrency.lockutils [req-234e64a5-51b6-4475-9828-a59e0f16af99 req-6b12f51c-0433-4fae-ab2c-e7fabeb620c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:27 np0005593233 nova_compute[222017]: 2026-01-23 09:42:27.933 222021 DEBUG nova.compute.manager [req-234e64a5-51b6-4475-9828-a59e0f16af99 req-6b12f51c-0433-4fae-ab2c-e7fabeb620c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Processing event network-vif-plugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:42:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:27.940 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[588afbd3-af67-446c-bea7-87d379ccf319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:28.020 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[33d5fc95-fcaf-4baa-8c5b-bbefc404d308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:28.022 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:28.022 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:28.023 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2696fd4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:28 np0005593233 NetworkManager[48871]: <info>  [1769161348.0255] manager: (tapc2696fd4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 23 04:42:28 np0005593233 kernel: tapc2696fd4-50: entered promiscuous mode
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.024 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.027 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:28.028 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc2696fd4-50, col_values=(('external_ids', {'iface-id': '38b24332-af6b-47d2-95fe-400f5feeadcb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.029 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:42:28Z|00145|binding|INFO|Releasing lport 38b24332-af6b-47d2-95fe-400f5feeadcb from this chassis (sb_readonly=0)
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.044 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:28.045 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:28.046 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c1502e84-79f2-4c16-862b-63caa1713a8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:28.047 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:42:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:28.048 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'env', 'PROCESS_TAG=haproxy-c2696fd4-5fd7-4934-88ac-40162fad555d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c2696fd4-5fd7-4934-88ac-40162fad555d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.215 222021 DEBUG nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.216 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161348.2163463, 9ec28845-a80d-489b-84e9-5d38da983cdc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.218 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] VM Started (Lifecycle Event)#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.225 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.229 222021 INFO nova.virt.libvirt.driver [-] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Instance spawned successfully.#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.230 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.243 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.246 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.276 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.277 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161348.217417, 9ec28845-a80d-489b-84e9-5d38da983cdc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.277 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.282 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.282 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.283 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.283 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.283 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.284 222021 DEBUG nova.virt.libvirt.driver [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:28 np0005593233 podman[240869]: 2026-01-23 09:42:28.421414605 +0000 UTC m=+0.064242456 container create f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.432 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.437 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161348.2246926, 9ec28845-a80d-489b-84e9-5d38da983cdc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.437 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:42:28 np0005593233 podman[240869]: 2026-01-23 09:42:28.392700779 +0000 UTC m=+0.035528650 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:42:28 np0005593233 systemd[1]: Started libpod-conmon-f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4.scope.
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.490 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.495 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.517 222021 INFO nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Took 8.51 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.518 222021 DEBUG nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.526 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:42:28 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:42:28 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ed51991f6b441df1df01266ff66899c57b7e871b9177f69e1d23824f8aa36ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:42:28 np0005593233 podman[240869]: 2026-01-23 09:42:28.554792723 +0000 UTC m=+0.197620604 container init f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:42:28 np0005593233 podman[240869]: 2026-01-23 09:42:28.571878978 +0000 UTC m=+0.214706829 container start f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.586 222021 INFO nova.compute.manager [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Took 9.53 seconds to build instance.#033[00m
Jan 23 04:42:28 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[240885]: [NOTICE]   (240889) : New worker (240891) forked
Jan 23 04:42:28 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[240885]: [NOTICE]   (240889) : Loading success.
Jan 23 04:42:28 np0005593233 nova_compute[222017]: 2026-01-23 09:42:28.616 222021 DEBUG oslo_concurrency.lockutils [None req-45f8344a-0f92-413a-a261-69061e4be7b0 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:42:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:29.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:42:29 np0005593233 nova_compute[222017]: 2026-01-23 09:42:29.327 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:29.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.084 222021 DEBUG nova.compute.manager [req-a422784e-aabb-4160-b628-8272d304922e req-7d028a90-67fd-4a02-8d37-ad4b351678c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Received event network-vif-plugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.084 222021 DEBUG oslo_concurrency.lockutils [req-a422784e-aabb-4160-b628-8272d304922e req-7d028a90-67fd-4a02-8d37-ad4b351678c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.085 222021 DEBUG oslo_concurrency.lockutils [req-a422784e-aabb-4160-b628-8272d304922e req-7d028a90-67fd-4a02-8d37-ad4b351678c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.085 222021 DEBUG oslo_concurrency.lockutils [req-a422784e-aabb-4160-b628-8272d304922e req-7d028a90-67fd-4a02-8d37-ad4b351678c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.085 222021 DEBUG nova.compute.manager [req-a422784e-aabb-4160-b628-8272d304922e req-7d028a90-67fd-4a02-8d37-ad4b351678c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] No waiting events found dispatching network-vif-plugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.085 222021 WARNING nova.compute.manager [req-a422784e-aabb-4160-b628-8272d304922e req-7d028a90-67fd-4a02-8d37-ad4b351678c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Received unexpected event network-vif-plugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.838 222021 DEBUG nova.objects.instance [None req-a22ef289-36b0-4976-bef3-ef731338b4e8 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9ec28845-a80d-489b-84e9-5d38da983cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.894 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161350.8945882, 9ec28845-a80d-489b-84e9-5d38da983cdc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.895 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.926 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.933 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:42:30 np0005593233 nova_compute[222017]: 2026-01-23 09:42:30.959 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 23 04:42:31 np0005593233 nova_compute[222017]: 2026-01-23 09:42:31.076 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:31.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:31.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:31 np0005593233 kernel: tap405ba4a0-90 (unregistering): left promiscuous mode
Jan 23 04:42:31 np0005593233 NetworkManager[48871]: <info>  [1769161351.6909] device (tap405ba4a0-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:42:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:42:31Z|00146|binding|INFO|Releasing lport 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 from this chassis (sb_readonly=0)
Jan 23 04:42:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:42:31Z|00147|binding|INFO|Setting lport 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 down in Southbound
Jan 23 04:42:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:42:31Z|00148|binding|INFO|Removing iface tap405ba4a0-90 ovn-installed in OVS
Jan 23 04:42:31 np0005593233 nova_compute[222017]: 2026-01-23 09:42:31.709 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:31 np0005593233 nova_compute[222017]: 2026-01-23 09:42:31.712 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:31.716 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:dc:c4 10.100.0.6'], port_security=['fa:16:3e:6d:dc:c4 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9ec28845-a80d-489b-84e9-5d38da983cdc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=405ba4a0-902c-4d74-a4d1-bf0eb43b2622) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:42:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:31.718 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 405ba4a0-902c-4d74-a4d1-bf0eb43b2622 in datapath c2696fd4-5fd7-4934-88ac-40162fad555d unbound from our chassis#033[00m
Jan 23 04:42:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:31.719 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c2696fd4-5fd7-4934-88ac-40162fad555d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:42:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:31.721 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1efab9b2-245f-4e43-9580-a3ae710da839]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:31.721 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace which is not needed anymore#033[00m
Jan 23 04:42:31 np0005593233 nova_compute[222017]: 2026-01-23 09:42:31.733 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:31 np0005593233 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Jan 23 04:42:31 np0005593233 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002e.scope: Consumed 3.336s CPU time.
Jan 23 04:42:31 np0005593233 systemd-machined[190954]: Machine qemu-25-instance-0000002e terminated.
Jan 23 04:42:31 np0005593233 nova_compute[222017]: 2026-01-23 09:42:31.877 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:31 np0005593233 nova_compute[222017]: 2026-01-23 09:42:31.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:31 np0005593233 nova_compute[222017]: 2026-01-23 09:42:31.892 222021 DEBUG nova.compute.manager [None req-a22ef289-36b0-4976-bef3-ef731338b4e8 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:31 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[240885]: [NOTICE]   (240889) : haproxy version is 2.8.14-c23fe91
Jan 23 04:42:31 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[240885]: [NOTICE]   (240889) : path to executable is /usr/sbin/haproxy
Jan 23 04:42:31 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[240885]: [WARNING]  (240889) : Exiting Master process...
Jan 23 04:42:31 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[240885]: [ALERT]    (240889) : Current worker (240891) exited with code 143 (Terminated)
Jan 23 04:42:31 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[240885]: [WARNING]  (240889) : All workers exited. Exiting... (0)
Jan 23 04:42:31 np0005593233 systemd[1]: libpod-f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4.scope: Deactivated successfully.
Jan 23 04:42:31 np0005593233 podman[241045]: 2026-01-23 09:42:31.916502945 +0000 UTC m=+0.067218430 container died f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:42:31 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4-userdata-shm.mount: Deactivated successfully.
Jan 23 04:42:31 np0005593233 systemd[1]: var-lib-containers-storage-overlay-7ed51991f6b441df1df01266ff66899c57b7e871b9177f69e1d23824f8aa36ef-merged.mount: Deactivated successfully.
Jan 23 04:42:31 np0005593233 podman[241045]: 2026-01-23 09:42:31.95718825 +0000 UTC m=+0.107903725 container cleanup f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:42:31 np0005593233 systemd[1]: libpod-conmon-f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4.scope: Deactivated successfully.
Jan 23 04:42:32 np0005593233 podman[241126]: 2026-01-23 09:42:32.029395491 +0000 UTC m=+0.047458879 container remove f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 04:42:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:32.036 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0318b6c4-5b19-438f-a1a3-5244de4eaeee]: (4, ('Fri Jan 23 09:42:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4)\nf06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4\nFri Jan 23 09:42:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (f06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4)\nf06c0ddc775412532a38c8164218e30c7f404bd2c8467c9605a3d78818e98bd4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:32.039 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[226f4af3-13b7-40c9-9bef-e6484343bb61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:32.041 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.044 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:32 np0005593233 kernel: tapc2696fd4-50: left promiscuous mode
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.071 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:32.076 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c8eb3794-a166-4621-a4ae-64bbeea662aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:32.094 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1671ef-c43a-4436-88f3-3748b0510d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:32.095 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b54ec290-c97f-423f-8fd2-0230486f8d0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:32.114 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[68608993-9f30-4da0-9a75-5c89c6e8ffae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526034, 'reachable_time': 44803, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241182, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:32.117 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:42:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:32.117 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[07ef6aec-7b09-4cc3-930a-4177e958c62f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:32 np0005593233 systemd[1]: run-netns-ovnmeta\x2dc2696fd4\x2d5fd7\x2d4934\x2d88ac\x2d40162fad555d.mount: Deactivated successfully.
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.251 222021 DEBUG nova.compute.manager [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Received event network-vif-unplugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.251 222021 DEBUG oslo_concurrency.lockutils [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.252 222021 DEBUG oslo_concurrency.lockutils [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.252 222021 DEBUG oslo_concurrency.lockutils [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.253 222021 DEBUG nova.compute.manager [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] No waiting events found dispatching network-vif-unplugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.253 222021 WARNING nova.compute.manager [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Received unexpected event network-vif-unplugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 for instance with vm_state suspended and task_state None.#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.253 222021 DEBUG nova.compute.manager [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Received event network-vif-plugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.254 222021 DEBUG oslo_concurrency.lockutils [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.254 222021 DEBUG oslo_concurrency.lockutils [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.255 222021 DEBUG oslo_concurrency.lockutils [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.255 222021 DEBUG nova.compute.manager [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] No waiting events found dispatching network-vif-plugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:42:32 np0005593233 nova_compute[222017]: 2026-01-23 09:42:32.255 222021 WARNING nova.compute.manager [req-f71017c5-5740-4e73-adea-2cb0a86832c8 req-63e27fa9-f236-4e10-8ebd-0d2d70388b66 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Received unexpected event network-vif-plugged-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 for instance with vm_state suspended and task_state None.#033[00m
Jan 23 04:42:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:42:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:42:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:33.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:33.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:34 np0005593233 nova_compute[222017]: 2026-01-23 09:42:34.294 222021 DEBUG nova.compute.manager [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:34 np0005593233 nova_compute[222017]: 2026-01-23 09:42:34.330 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593233 nova_compute[222017]: 2026-01-23 09:42:34.360 222021 INFO nova.compute.manager [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] instance snapshotting#033[00m
Jan 23 04:42:34 np0005593233 nova_compute[222017]: 2026-01-23 09:42:34.361 222021 WARNING nova.compute.manager [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] trying to snapshot a non-running instance: (state: 4 expected: 1)#033[00m
Jan 23 04:42:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:34 np0005593233 nova_compute[222017]: 2026-01-23 09:42:34.858 222021 INFO nova.virt.libvirt.driver [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Beginning cold snapshot process#033[00m
Jan 23 04:42:35 np0005593233 nova_compute[222017]: 2026-01-23 09:42:35.068 222021 DEBUG nova.virt.libvirt.imagebackend [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 04:42:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:35.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:35.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:35 np0005593233 nova_compute[222017]: 2026-01-23 09:42:35.452 222021 DEBUG nova.storage.rbd_utils [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] creating snapshot(929711d8362f40e5b52e9667b2fddf1a) on rbd image(9ec28845-a80d-489b-84e9-5d38da983cdc_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:42:36 np0005593233 nova_compute[222017]: 2026-01-23 09:42:36.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e188 e188: 3 total, 3 up, 3 in
Jan 23 04:42:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:37.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:37 np0005593233 nova_compute[222017]: 2026-01-23 09:42:37.388 222021 DEBUG nova.storage.rbd_utils [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] cloning vms/9ec28845-a80d-489b-84e9-5d38da983cdc_disk@929711d8362f40e5b52e9667b2fddf1a to images/565ba31d-24d7-44f1-ba74-58e1935b27b9 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:42:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:37.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:37 np0005593233 nova_compute[222017]: 2026-01-23 09:42:37.531 222021 DEBUG nova.storage.rbd_utils [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] flattening images/565ba31d-24d7-44f1-ba74-58e1935b27b9 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 04:42:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:37.640 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:42:37 np0005593233 nova_compute[222017]: 2026-01-23 09:42:37.641 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:37.642 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:42:38 np0005593233 nova_compute[222017]: 2026-01-23 09:42:38.012 222021 DEBUG nova.storage.rbd_utils [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] removing snapshot(929711d8362f40e5b52e9667b2fddf1a) on rbd image(9ec28845-a80d-489b-84e9-5d38da983cdc_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:42:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e189 e189: 3 total, 3 up, 3 in
Jan 23 04:42:38 np0005593233 nova_compute[222017]: 2026-01-23 09:42:38.392 222021 DEBUG nova.storage.rbd_utils [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] creating snapshot(snap) on rbd image(565ba31d-24d7-44f1-ba74-58e1935b27b9) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:42:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:39.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:39 np0005593233 nova_compute[222017]: 2026-01-23 09:42:39.331 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:39.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e190 e190: 3 total, 3 up, 3 in
Jan 23 04:42:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:41 np0005593233 nova_compute[222017]: 2026-01-23 09:42:41.083 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:41.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e191 e191: 3 total, 3 up, 3 in
Jan 23 04:42:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:41.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:41 np0005593233 nova_compute[222017]: 2026-01-23 09:42:41.776 222021 INFO nova.virt.libvirt.driver [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Snapshot image upload complete#033[00m
Jan 23 04:42:41 np0005593233 nova_compute[222017]: 2026-01-23 09:42:41.776 222021 INFO nova.compute.manager [None req-d8f2887a-f75b-44a2-901a-6aeaed1fffa2 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Took 7.41 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 04:42:42 np0005593233 podman[241424]: 2026-01-23 09:42:42.090183367 +0000 UTC m=+0.098368445 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:42:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:42.644 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:42.645 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:42.646 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:42:42.646 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:43.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:43.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:44 np0005593233 nova_compute[222017]: 2026-01-23 09:42:44.333 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:42:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2305197386' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:42:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:42:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2305197386' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:42:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:45.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:42:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3206530782' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:42:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:42:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3206530782' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:42:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:42:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:45.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:42:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e192 e192: 3 total, 3 up, 3 in
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.122 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.296 222021 DEBUG oslo_concurrency.lockutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "9ec28845-a80d-489b-84e9-5d38da983cdc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.296 222021 DEBUG oslo_concurrency.lockutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.297 222021 DEBUG oslo_concurrency.lockutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.297 222021 DEBUG oslo_concurrency.lockutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.298 222021 DEBUG oslo_concurrency.lockutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.300 222021 INFO nova.compute.manager [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Terminating instance#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.302 222021 DEBUG nova.compute.manager [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.313 222021 INFO nova.virt.libvirt.driver [-] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Instance destroyed successfully.#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.314 222021 DEBUG nova.objects.instance [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'resources' on Instance uuid 9ec28845-a80d-489b-84e9-5d38da983cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.334 222021 DEBUG nova.virt.libvirt.vif [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-888724699',display_name='tempest-ImagesTestJSON-server-888724699',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-888724699',id=46,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:42:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-tylndpaz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:42:41Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=9ec28845-a80d-489b-84e9-5d38da983cdc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.335 222021 DEBUG nova.network.os_vif_util [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "address": "fa:16:3e:6d:dc:c4", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap405ba4a0-90", "ovs_interfaceid": "405ba4a0-902c-4d74-a4d1-bf0eb43b2622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.337 222021 DEBUG nova.network.os_vif_util [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:dc:c4,bridge_name='br-int',has_traffic_filtering=True,id=405ba4a0-902c-4d74-a4d1-bf0eb43b2622,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap405ba4a0-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.338 222021 DEBUG os_vif [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:dc:c4,bridge_name='br-int',has_traffic_filtering=True,id=405ba4a0-902c-4d74-a4d1-bf0eb43b2622,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap405ba4a0-90') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.341 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.341 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ba4a0-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.345 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.347 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.350 222021 INFO os_vif [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:dc:c4,bridge_name='br-int',has_traffic_filtering=True,id=405ba4a0-902c-4d74-a4d1-bf0eb43b2622,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap405ba4a0-90')#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.893 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161351.8915806, 9ec28845-a80d-489b-84e9-5d38da983cdc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.894 222021 INFO nova.compute.manager [-] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.915 222021 INFO nova.virt.libvirt.driver [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Deleting instance files /var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc_del#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.916 222021 INFO nova.virt.libvirt.driver [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Deletion of /var/lib/nova/instances/9ec28845-a80d-489b-84e9-5d38da983cdc_del complete#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.922 222021 DEBUG nova.compute.manager [None req-09de6024-05da-4e72-adf9-2e085c6a0bf5 - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.927 222021 DEBUG nova.compute.manager [None req-09de6024-05da-4e72-adf9-2e085c6a0bf5 - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: suspended, current task_state: deleting, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:42:46 np0005593233 nova_compute[222017]: 2026-01-23 09:42:46.968 222021 INFO nova.compute.manager [None req-09de6024-05da-4e72-adf9-2e085c6a0bf5 - - - - - -] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 23 04:42:47 np0005593233 nova_compute[222017]: 2026-01-23 09:42:47.001 222021 INFO nova.compute.manager [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Took 0.70 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:42:47 np0005593233 nova_compute[222017]: 2026-01-23 09:42:47.002 222021 DEBUG oslo.service.loopingcall [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:42:47 np0005593233 nova_compute[222017]: 2026-01-23 09:42:47.003 222021 DEBUG nova.compute.manager [-] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:42:47 np0005593233 nova_compute[222017]: 2026-01-23 09:42:47.003 222021 DEBUG nova.network.neutron [-] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:42:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:47.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:47.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e193 e193: 3 total, 3 up, 3 in
Jan 23 04:42:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:49.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:49 np0005593233 nova_compute[222017]: 2026-01-23 09:42:49.335 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:49.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:50 np0005593233 nova_compute[222017]: 2026-01-23 09:42:50.905 222021 DEBUG nova.network.neutron [-] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:50 np0005593233 nova_compute[222017]: 2026-01-23 09:42:50.952 222021 INFO nova.compute.manager [-] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Took 3.95 seconds to deallocate network for instance.#033[00m
Jan 23 04:42:50 np0005593233 nova_compute[222017]: 2026-01-23 09:42:50.991 222021 DEBUG nova.compute.manager [req-dce68d94-2cf5-4344-adb3-379d3daafd25 req-0ed18901-83c5-44d7-ae6a-c5099286a6fe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 9ec28845-a80d-489b-84e9-5d38da983cdc] Received event network-vif-deleted-405ba4a0-902c-4d74-a4d1-bf0eb43b2622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:51 np0005593233 nova_compute[222017]: 2026-01-23 09:42:51.009 222021 DEBUG oslo_concurrency.lockutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:51 np0005593233 nova_compute[222017]: 2026-01-23 09:42:51.010 222021 DEBUG oslo_concurrency.lockutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:51 np0005593233 nova_compute[222017]: 2026-01-23 09:42:51.114 222021 DEBUG oslo_concurrency.processutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:51.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:51 np0005593233 nova_compute[222017]: 2026-01-23 09:42:51.346 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:51.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:42:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1490861852' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:42:51 np0005593233 nova_compute[222017]: 2026-01-23 09:42:51.571 222021 DEBUG oslo_concurrency.processutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:51 np0005593233 nova_compute[222017]: 2026-01-23 09:42:51.581 222021 DEBUG nova.compute.provider_tree [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:42:51 np0005593233 nova_compute[222017]: 2026-01-23 09:42:51.627 222021 DEBUG nova.scheduler.client.report [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:42:51 np0005593233 nova_compute[222017]: 2026-01-23 09:42:51.689 222021 DEBUG oslo_concurrency.lockutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:51 np0005593233 nova_compute[222017]: 2026-01-23 09:42:51.767 222021 INFO nova.scheduler.client.report [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Deleted allocations for instance 9ec28845-a80d-489b-84e9-5d38da983cdc#033[00m
Jan 23 04:42:52 np0005593233 nova_compute[222017]: 2026-01-23 09:42:52.350 222021 DEBUG oslo_concurrency.lockutils [None req-a92756ae-0bd3-475f-84fe-9c9bd2c2a607 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "9ec28845-a80d-489b-84e9-5d38da983cdc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:52 np0005593233 nova_compute[222017]: 2026-01-23 09:42:52.752 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:52 np0005593233 nova_compute[222017]: 2026-01-23 09:42:52.753 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:52 np0005593233 nova_compute[222017]: 2026-01-23 09:42:52.779 222021 DEBUG nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:42:52 np0005593233 nova_compute[222017]: 2026-01-23 09:42:52.867 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:52 np0005593233 nova_compute[222017]: 2026-01-23 09:42:52.867 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:52 np0005593233 nova_compute[222017]: 2026-01-23 09:42:52.877 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:42:52 np0005593233 nova_compute[222017]: 2026-01-23 09:42:52.877 222021 INFO nova.compute.claims [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:42:52 np0005593233 nova_compute[222017]: 2026-01-23 09:42:52.993 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:53.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:53.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:42:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2790004081' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.488 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.496 222021 DEBUG nova.compute.provider_tree [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.514 222021 DEBUG nova.scheduler.client.report [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.552 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.554 222021 DEBUG nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.621 222021 DEBUG nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.621 222021 DEBUG nova.network.neutron [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.659 222021 INFO nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.710 222021 DEBUG nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.857 222021 DEBUG nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.860 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.861 222021 INFO nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Creating image(s)#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.905 222021 DEBUG nova.storage.rbd_utils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.937 222021 DEBUG nova.storage.rbd_utils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.962 222021 DEBUG nova.storage.rbd_utils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.966 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:53 np0005593233 nova_compute[222017]: 2026-01-23 09:42:53.992 222021 DEBUG nova.policy [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56da68482e3a4fb582dcccad45f8f71b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05bc71a77710455e8b34ead7fec81a31', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:42:54 np0005593233 nova_compute[222017]: 2026-01-23 09:42:54.028 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:54 np0005593233 nova_compute[222017]: 2026-01-23 09:42:54.029 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:54 np0005593233 nova_compute[222017]: 2026-01-23 09:42:54.030 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:54 np0005593233 nova_compute[222017]: 2026-01-23 09:42:54.030 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:54 np0005593233 nova_compute[222017]: 2026-01-23 09:42:54.061 222021 DEBUG nova.storage.rbd_utils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:54 np0005593233 nova_compute[222017]: 2026-01-23 09:42:54.066 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:54 np0005593233 nova_compute[222017]: 2026-01-23 09:42:54.337 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e194 e194: 3 total, 3 up, 3 in
Jan 23 04:42:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:54 np0005593233 nova_compute[222017]: 2026-01-23 09:42:54.969 222021 DEBUG nova.network.neutron [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Successfully created port: 9bf1dba3-6743-4176-b024-221df7bb7634 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:42:55 np0005593233 podman[241609]: 2026-01-23 09:42:55.0808774 +0000 UTC m=+0.089573505 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:42:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:55.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:55 np0005593233 nova_compute[222017]: 2026-01-23 09:42:55.438 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.373s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:55.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:55 np0005593233 nova_compute[222017]: 2026-01-23 09:42:55.525 222021 DEBUG nova.storage.rbd_utils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] resizing rbd image 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:42:55 np0005593233 nova_compute[222017]: 2026-01-23 09:42:55.639 222021 DEBUG nova.objects.instance [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'migration_context' on Instance uuid 47eda3a7-c47a-48cc-8381-a702e2e27bfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:55 np0005593233 nova_compute[222017]: 2026-01-23 09:42:55.659 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:42:55 np0005593233 nova_compute[222017]: 2026-01-23 09:42:55.659 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Ensure instance console log exists: /var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:42:55 np0005593233 nova_compute[222017]: 2026-01-23 09:42:55.660 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:55 np0005593233 nova_compute[222017]: 2026-01-23 09:42:55.660 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:55 np0005593233 nova_compute[222017]: 2026-01-23 09:42:55.660 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:56 np0005593233 nova_compute[222017]: 2026-01-23 09:42:56.076 222021 DEBUG nova.network.neutron [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Successfully updated port: 9bf1dba3-6743-4176-b024-221df7bb7634 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:42:56 np0005593233 nova_compute[222017]: 2026-01-23 09:42:56.099 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "refresh_cache-47eda3a7-c47a-48cc-8381-a702e2e27bfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:42:56 np0005593233 nova_compute[222017]: 2026-01-23 09:42:56.099 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquired lock "refresh_cache-47eda3a7-c47a-48cc-8381-a702e2e27bfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:42:56 np0005593233 nova_compute[222017]: 2026-01-23 09:42:56.099 222021 DEBUG nova.network.neutron [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:42:56 np0005593233 nova_compute[222017]: 2026-01-23 09:42:56.230 222021 DEBUG nova.compute.manager [req-b00a7bdc-438b-4ce7-9859-a399bde35fe4 req-1171ad95-363a-4a6e-924a-8e4374fd3cdf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Received event network-changed-9bf1dba3-6743-4176-b024-221df7bb7634 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:56 np0005593233 nova_compute[222017]: 2026-01-23 09:42:56.230 222021 DEBUG nova.compute.manager [req-b00a7bdc-438b-4ce7-9859-a399bde35fe4 req-1171ad95-363a-4a6e-924a-8e4374fd3cdf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Refreshing instance network info cache due to event network-changed-9bf1dba3-6743-4176-b024-221df7bb7634. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:42:56 np0005593233 nova_compute[222017]: 2026-01-23 09:42:56.231 222021 DEBUG oslo_concurrency.lockutils [req-b00a7bdc-438b-4ce7-9859-a399bde35fe4 req-1171ad95-363a-4a6e-924a-8e4374fd3cdf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-47eda3a7-c47a-48cc-8381-a702e2e27bfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:42:56 np0005593233 nova_compute[222017]: 2026-01-23 09:42:56.317 222021 DEBUG nova.network.neutron [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:42:56 np0005593233 nova_compute[222017]: 2026-01-23 09:42:56.351 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:42:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:57.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:42:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:42:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:57.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.687 222021 DEBUG nova.network.neutron [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Updating instance_info_cache with network_info: [{"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.710 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Releasing lock "refresh_cache-47eda3a7-c47a-48cc-8381-a702e2e27bfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.711 222021 DEBUG nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Instance network_info: |[{"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.711 222021 DEBUG oslo_concurrency.lockutils [req-b00a7bdc-438b-4ce7-9859-a399bde35fe4 req-1171ad95-363a-4a6e-924a-8e4374fd3cdf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-47eda3a7-c47a-48cc-8381-a702e2e27bfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.712 222021 DEBUG nova.network.neutron [req-b00a7bdc-438b-4ce7-9859-a399bde35fe4 req-1171ad95-363a-4a6e-924a-8e4374fd3cdf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Refreshing network info cache for port 9bf1dba3-6743-4176-b024-221df7bb7634 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.714 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Start _get_guest_xml network_info=[{"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.721 222021 WARNING nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.734 222021 DEBUG nova.virt.libvirt.host [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.736 222021 DEBUG nova.virt.libvirt.host [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.742 222021 DEBUG nova.virt.libvirt.host [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.743 222021 DEBUG nova.virt.libvirt.host [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.744 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.744 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.745 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.745 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.745 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.746 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.746 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.746 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.746 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.746 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.747 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.747 222021 DEBUG nova.virt.hardware [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:42:57 np0005593233 nova_compute[222017]: 2026-01-23 09:42:57.750 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:42:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/828363270' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.236 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.264 222021 DEBUG nova.storage.rbd_utils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.269 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:42:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2281109612' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.723 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.725 222021 DEBUG nova.virt.libvirt.vif [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1340106533',display_name='tempest-ImagesTestJSON-server-1340106533',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1340106533',id=49,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-929q7r12',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:53Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=47eda3a7-c47a-48cc-8381-a702e2e27bfc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.725 222021 DEBUG nova.network.os_vif_util [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.726 222021 DEBUG nova.network.os_vif_util [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:e5:52,bridge_name='br-int',has_traffic_filtering=True,id=9bf1dba3-6743-4176-b024-221df7bb7634,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bf1dba3-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.727 222021 DEBUG nova.objects.instance [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'pci_devices' on Instance uuid 47eda3a7-c47a-48cc-8381-a702e2e27bfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.891 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <uuid>47eda3a7-c47a-48cc-8381-a702e2e27bfc</uuid>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <name>instance-00000031</name>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <nova:name>tempest-ImagesTestJSON-server-1340106533</nova:name>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:42:57</nova:creationTime>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <nova:user uuid="56da68482e3a4fb582dcccad45f8f71b">tempest-ImagesTestJSON-1507872051-project-member</nova:user>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <nova:project uuid="05bc71a77710455e8b34ead7fec81a31">tempest-ImagesTestJSON-1507872051</nova:project>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <nova:port uuid="9bf1dba3-6743-4176-b024-221df7bb7634">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <entry name="serial">47eda3a7-c47a-48cc-8381-a702e2e27bfc</entry>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <entry name="uuid">47eda3a7-c47a-48cc-8381-a702e2e27bfc</entry>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk.config">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:28:e5:52"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <target dev="tap9bf1dba3-67"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc/console.log" append="off"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:42:58 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:42:58 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:42:58 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:42:58 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.893 222021 DEBUG nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Preparing to wait for external event network-vif-plugged-9bf1dba3-6743-4176-b024-221df7bb7634 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.894 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.894 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.895 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.896 222021 DEBUG nova.virt.libvirt.vif [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1340106533',display_name='tempest-ImagesTestJSON-server-1340106533',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1340106533',id=49,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-929q7r12',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:53Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=47eda3a7-c47a-48cc-8381-a702e2e27bfc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.896 222021 DEBUG nova.network.os_vif_util [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.897 222021 DEBUG nova.network.os_vif_util [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:e5:52,bridge_name='br-int',has_traffic_filtering=True,id=9bf1dba3-6743-4176-b024-221df7bb7634,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bf1dba3-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.897 222021 DEBUG os_vif [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:e5:52,bridge_name='br-int',has_traffic_filtering=True,id=9bf1dba3-6743-4176-b024-221df7bb7634,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bf1dba3-67') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.898 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.899 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.899 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.902 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.902 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bf1dba3-67, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.903 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9bf1dba3-67, col_values=(('external_ids', {'iface-id': '9bf1dba3-6743-4176-b024-221df7bb7634', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:e5:52', 'vm-uuid': '47eda3a7-c47a-48cc-8381-a702e2e27bfc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.904 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:58 np0005593233 NetworkManager[48871]: <info>  [1769161378.9056] manager: (tap9bf1dba3-67): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.907 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.913 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.914 222021 INFO os_vif [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:e5:52,bridge_name='br-int',has_traffic_filtering=True,id=9bf1dba3-6743-4176-b024-221df7bb7634,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bf1dba3-67')#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.984 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.985 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.986 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No VIF found with MAC fa:16:3e:28:e5:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:42:58 np0005593233 nova_compute[222017]: 2026-01-23 09:42:58.987 222021 INFO nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Using config drive#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.026 222021 DEBUG nova.storage.rbd_utils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:59.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.227 222021 DEBUG nova.network.neutron [req-b00a7bdc-438b-4ce7-9859-a399bde35fe4 req-1171ad95-363a-4a6e-924a-8e4374fd3cdf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Updated VIF entry in instance network info cache for port 9bf1dba3-6743-4176-b024-221df7bb7634. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.228 222021 DEBUG nova.network.neutron [req-b00a7bdc-438b-4ce7-9859-a399bde35fe4 req-1171ad95-363a-4a6e-924a-8e4374fd3cdf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Updating instance_info_cache with network_info: [{"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.246 222021 DEBUG oslo_concurrency.lockutils [req-b00a7bdc-438b-4ce7-9859-a399bde35fe4 req-1171ad95-363a-4a6e-924a-8e4374fd3cdf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-47eda3a7-c47a-48cc-8381-a702e2e27bfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.339 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.390 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.439 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 47eda3a7-c47a-48cc-8381-a702e2e27bfc _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.440 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:42:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:42:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:59.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.516 222021 INFO nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Creating config drive at /var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc/disk.config#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.527 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbay3jymm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.673 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbay3jymm" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.723 222021 DEBUG nova.storage.rbd_utils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:59 np0005593233 nova_compute[222017]: 2026-01-23 09:42:59.728 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc/disk.config 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:01.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:01 np0005593233 nova_compute[222017]: 2026-01-23 09:43:01.279 222021 DEBUG oslo_concurrency.processutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc/disk.config 47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:01 np0005593233 nova_compute[222017]: 2026-01-23 09:43:01.280 222021 INFO nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Deleting local config drive /var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc/disk.config because it was imported into RBD.#033[00m
Jan 23 04:43:01 np0005593233 kernel: tap9bf1dba3-67: entered promiscuous mode
Jan 23 04:43:01 np0005593233 NetworkManager[48871]: <info>  [1769161381.3643] manager: (tap9bf1dba3-67): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Jan 23 04:43:01 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:01Z|00149|binding|INFO|Claiming lport 9bf1dba3-6743-4176-b024-221df7bb7634 for this chassis.
Jan 23 04:43:01 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:01Z|00150|binding|INFO|9bf1dba3-6743-4176-b024-221df7bb7634: Claiming fa:16:3e:28:e5:52 10.100.0.7
Jan 23 04:43:01 np0005593233 nova_compute[222017]: 2026-01-23 09:43:01.365 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.387 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e5:52 10.100.0.7'], port_security=['fa:16:3e:28:e5:52 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '47eda3a7-c47a-48cc-8381-a702e2e27bfc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=9bf1dba3-6743-4176-b024-221df7bb7634) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.388 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 9bf1dba3-6743-4176-b024-221df7bb7634 in datapath c2696fd4-5fd7-4934-88ac-40162fad555d bound to our chassis#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.389 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c2696fd4-5fd7-4934-88ac-40162fad555d#033[00m
Jan 23 04:43:01 np0005593233 systemd-udevd[241837]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.401 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[033ff190-1275-481a-b01f-f47c95d585be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.402 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc2696fd4-51 in ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.405 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc2696fd4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.405 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ac70a1-8e13-4058-8c12-5311f8ced6bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 NetworkManager[48871]: <info>  [1769161381.4067] device (tap9bf1dba3-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:43:01 np0005593233 NetworkManager[48871]: <info>  [1769161381.4075] device (tap9bf1dba3-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:43:01 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:01Z|00151|binding|INFO|Setting lport 9bf1dba3-6743-4176-b024-221df7bb7634 ovn-installed in OVS
Jan 23 04:43:01 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:01Z|00152|binding|INFO|Setting lport 9bf1dba3-6743-4176-b024-221df7bb7634 up in Southbound
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.407 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[71f58ed2-24df-48d9-bb8f-d11ebc6f26d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 nova_compute[222017]: 2026-01-23 09:43:01.409 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:01 np0005593233 systemd-machined[190954]: New machine qemu-26-instance-00000031.
Jan 23 04:43:01 np0005593233 nova_compute[222017]: 2026-01-23 09:43:01.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.422 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[3823044c-3271-4628-831e-3702a43a3160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 systemd[1]: Started Virtual Machine qemu-26-instance-00000031.
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.450 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb8014c-d1a7-4ad2-860d-2d712caf9149]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:01.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.516 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2649a193-eaa2-4ffe-97e4-71dba98cf268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.524 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5e37f1-5f6b-4bd5-a4bb-a060df9e8a3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 NetworkManager[48871]: <info>  [1769161381.5265] manager: (tapc2696fd4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.562 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[32ec5ca9-11b5-488a-b983-4671a279cd82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.565 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fed81ec2-2e39-4f60-b2a6-b476494620dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 NetworkManager[48871]: <info>  [1769161381.5865] device (tapc2696fd4-50): carrier: link connected
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.592 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac34320-fec3-4d0e-a9f6-54b8067efbb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.612 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ca90d119-910d-44b1-95a0-c6106043cc54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529421, 'reachable_time': 29481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241872, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.628 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[76115730-42f1-46d0-8468-342f6cf6c1c0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:20d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529421, 'tstamp': 529421}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241873, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.645 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cbeae085-f07e-40d1-91d6-11cc20aacacb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529421, 'reachable_time': 29481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241874, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.685 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5730c186-e018-4f34-87a0-394ea4d3609b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.757 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3dddde22-bba9-4a52-8382-525dd8fa08d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.759 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.759 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.760 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2696fd4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:01 np0005593233 nova_compute[222017]: 2026-01-23 09:43:01.762 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:01 np0005593233 NetworkManager[48871]: <info>  [1769161381.7634] manager: (tapc2696fd4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 23 04:43:01 np0005593233 kernel: tapc2696fd4-50: entered promiscuous mode
Jan 23 04:43:01 np0005593233 nova_compute[222017]: 2026-01-23 09:43:01.765 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.766 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc2696fd4-50, col_values=(('external_ids', {'iface-id': '38b24332-af6b-47d2-95fe-400f5feeadcb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:01 np0005593233 nova_compute[222017]: 2026-01-23 09:43:01.767 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:01 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:01Z|00153|binding|INFO|Releasing lport 38b24332-af6b-47d2-95fe-400f5feeadcb from this chassis (sb_readonly=0)
Jan 23 04:43:01 np0005593233 nova_compute[222017]: 2026-01-23 09:43:01.787 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.787 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.788 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c26efaa9-bbdd-49c3-979b-34d0dc3e3b60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.789 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 04:43:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:01.819 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'env', 'PROCESS_TAG=haproxy-c2696fd4-5fd7-4934-88ac-40162fad555d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c2696fd4-5fd7-4934-88ac-40162fad555d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.055 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161382.0546706, 47eda3a7-c47a-48cc-8381-a702e2e27bfc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.057 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] VM Started (Lifecycle Event)
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.092 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.099 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161382.055255, 47eda3a7-c47a-48cc-8381-a702e2e27bfc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.099 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] VM Paused (Lifecycle Event)
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.130 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.135 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.169 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:43:02 np0005593233 podman[241948]: 2026-01-23 09:43:02.268986812 +0000 UTC m=+0.062702091 container create f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 04:43:02 np0005593233 systemd[1]: Started libpod-conmon-f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380.scope.
Jan 23 04:43:02 np0005593233 podman[241948]: 2026-01-23 09:43:02.232084014 +0000 UTC m=+0.025799343 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:43:02 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:43:02 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93cb20647b20fd69904119de94bc1ba42fcfdd42e6e89837a0e8ded5583b8467/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:43:02 np0005593233 podman[241948]: 2026-01-23 09:43:02.389490845 +0000 UTC m=+0.183206154 container init f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.391 222021 DEBUG nova.compute.manager [req-d1df9d69-d079-454c-b490-adec72c5f2b8 req-ec125883-0704-4ce8-a7d5-62f332bae621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Received event network-vif-plugged-9bf1dba3-6743-4176-b024-221df7bb7634 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.392 222021 DEBUG oslo_concurrency.lockutils [req-d1df9d69-d079-454c-b490-adec72c5f2b8 req-ec125883-0704-4ce8-a7d5-62f332bae621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.393 222021 DEBUG oslo_concurrency.lockutils [req-d1df9d69-d079-454c-b490-adec72c5f2b8 req-ec125883-0704-4ce8-a7d5-62f332bae621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.394 222021 DEBUG oslo_concurrency.lockutils [req-d1df9d69-d079-454c-b490-adec72c5f2b8 req-ec125883-0704-4ce8-a7d5-62f332bae621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.394 222021 DEBUG nova.compute.manager [req-d1df9d69-d079-454c-b490-adec72c5f2b8 req-ec125883-0704-4ce8-a7d5-62f332bae621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Processing event network-vif-plugged-9bf1dba3-6743-4176-b024-221df7bb7634 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.395 222021 DEBUG nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:43:02 np0005593233 podman[241948]: 2026-01-23 09:43:02.400635681 +0000 UTC m=+0.194350950 container start f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.401 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161382.4012928, 47eda3a7-c47a-48cc-8381-a702e2e27bfc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.402 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] VM Resumed (Lifecycle Event)
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.406 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.412 222021 INFO nova.virt.libvirt.driver [-] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Instance spawned successfully.
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.413 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.432 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:43:02 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[241964]: [NOTICE]   (241968) : New worker (241970) forked
Jan 23 04:43:02 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[241964]: [NOTICE]   (241968) : Loading success.
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.444 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.455 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.456 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.457 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.457 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.458 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.459 222021 DEBUG nova.virt.libvirt.driver [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.475 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.543 222021 INFO nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Took 8.68 seconds to spawn the instance on the hypervisor.
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.544 222021 DEBUG nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.635 222021 INFO nova.compute.manager [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Took 9.79 seconds to build instance.
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.662 222021 DEBUG oslo_concurrency.lockutils [None req-4723eabd-853d-4618-bc12-a6cf4cff2a3e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.663 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 3.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.664 222021 INFO nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:43:02 np0005593233 nova_compute[222017]: 2026-01-23 09:43:02.664 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:43:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:03.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:43:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:03.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:43:03 np0005593233 nova_compute[222017]: 2026-01-23 09:43:03.907 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:43:04 np0005593233 nova_compute[222017]: 2026-01-23 09:43:04.345 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:43:04 np0005593233 nova_compute[222017]: 2026-01-23 09:43:04.599 222021 DEBUG nova.compute.manager [req-a22c4a0f-a99b-45b2-976e-a3ea682bf9f6 req-19a3cabb-73fa-4d82-bafb-309a6bb1b6ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Received event network-vif-plugged-9bf1dba3-6743-4176-b024-221df7bb7634 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:43:04 np0005593233 nova_compute[222017]: 2026-01-23 09:43:04.600 222021 DEBUG oslo_concurrency.lockutils [req-a22c4a0f-a99b-45b2-976e-a3ea682bf9f6 req-19a3cabb-73fa-4d82-bafb-309a6bb1b6ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:43:04 np0005593233 nova_compute[222017]: 2026-01-23 09:43:04.601 222021 DEBUG oslo_concurrency.lockutils [req-a22c4a0f-a99b-45b2-976e-a3ea682bf9f6 req-19a3cabb-73fa-4d82-bafb-309a6bb1b6ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:43:04 np0005593233 nova_compute[222017]: 2026-01-23 09:43:04.602 222021 DEBUG oslo_concurrency.lockutils [req-a22c4a0f-a99b-45b2-976e-a3ea682bf9f6 req-19a3cabb-73fa-4d82-bafb-309a6bb1b6ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:43:04 np0005593233 nova_compute[222017]: 2026-01-23 09:43:04.602 222021 DEBUG nova.compute.manager [req-a22c4a0f-a99b-45b2-976e-a3ea682bf9f6 req-19a3cabb-73fa-4d82-bafb-309a6bb1b6ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] No waiting events found dispatching network-vif-plugged-9bf1dba3-6743-4176-b024-221df7bb7634 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:43:04 np0005593233 nova_compute[222017]: 2026-01-23 09:43:04.603 222021 WARNING nova.compute.manager [req-a22c4a0f-a99b-45b2-976e-a3ea682bf9f6 req-19a3cabb-73fa-4d82-bafb-309a6bb1b6ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Received unexpected event network-vif-plugged-9bf1dba3-6743-4176-b024-221df7bb7634 for instance with vm_state active and task_state None.
Jan 23 04:43:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:05 np0005593233 nova_compute[222017]: 2026-01-23 09:43:05.014 222021 DEBUG nova.compute.manager [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:43:05 np0005593233 nova_compute[222017]: 2026-01-23 09:43:05.087 222021 INFO nova.compute.manager [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] instance snapshotting
Jan 23 04:43:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:05.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:05.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:05 np0005593233 nova_compute[222017]: 2026-01-23 09:43:05.606 222021 INFO nova.virt.libvirt.driver [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Beginning live snapshot process
Jan 23 04:43:05 np0005593233 nova_compute[222017]: 2026-01-23 09:43:05.763 222021 DEBUG nova.virt.libvirt.imagebackend [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Jan 23 04:43:06 np0005593233 nova_compute[222017]: 2026-01-23 09:43:06.040 222021 DEBUG nova.storage.rbd_utils [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] creating snapshot(50726b62901440798c23e1a2ad5a6037) on rbd image(47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 23 04:43:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e195 e195: 3 total, 3 up, 3 in
Jan 23 04:43:07 np0005593233 nova_compute[222017]: 2026-01-23 09:43:07.084 222021 DEBUG nova.storage.rbd_utils [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] cloning vms/47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk@50726b62901440798c23e1a2ad5a6037 to images/b5a73d98-c27b-4745-95e1-6675f24e35ae clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Jan 23 04:43:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:07.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:07.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:07 np0005593233 nova_compute[222017]: 2026-01-23 09:43:07.688 222021 DEBUG nova.storage.rbd_utils [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] flattening images/b5a73d98-c27b-4745-95e1-6675f24e35ae flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Jan 23 04:43:07 np0005593233 nova_compute[222017]: 2026-01-23 09:43:07.997 222021 DEBUG nova.storage.rbd_utils [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] removing snapshot(50726b62901440798c23e1a2ad5a6037) on rbd image(47eda3a7-c47a-48cc-8381-a702e2e27bfc_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 23 04:43:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e196 e196: 3 total, 3 up, 3 in
Jan 23 04:43:08 np0005593233 nova_compute[222017]: 2026-01-23 09:43:08.911 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:43:09 np0005593233 nova_compute[222017]: 2026-01-23 09:43:09.109 222021 DEBUG nova.storage.rbd_utils [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] creating snapshot(snap) on rbd image(b5a73d98-c27b-4745-95e1-6675f24e35ae) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 23 04:43:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:43:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:09.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:43:09 np0005593233 nova_compute[222017]: 2026-01-23 09:43:09.346 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:43:09 np0005593233 nova_compute[222017]: 2026-01-23 09:43:09.436 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:43:09 np0005593233 nova_compute[222017]: 2026-01-23 09:43:09.437 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:43:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:09.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e197 e197: 3 total, 3 up, 3 in
Jan 23 04:43:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:11.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:11 np0005593233 nova_compute[222017]: 2026-01-23 09:43:11.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:43:11 np0005593233 nova_compute[222017]: 2026-01-23 09:43:11.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 04:43:11 np0005593233 nova_compute[222017]: 2026-01-23 09:43:11.478 222021 INFO nova.virt.libvirt.driver [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Snapshot image upload complete
Jan 23 04:43:11 np0005593233 nova_compute[222017]: 2026-01-23 09:43:11.479 222021 INFO nova.compute.manager [None req-026d245a-6c78-4c79-8689-e3f621e17777 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Took 6.39 seconds to snapshot the instance on the hypervisor.
Jan 23 04:43:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:11.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:12 np0005593233 nova_compute[222017]: 2026-01-23 09:43:12.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:43:12 np0005593233 nova_compute[222017]: 2026-01-23 09:43:12.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:43:13 np0005593233 podman[242121]: 2026-01-23 09:43:13.087095429 +0000 UTC m=+0.098990950 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:43:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:13.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:13 np0005593233 nova_compute[222017]: 2026-01-23 09:43:13.400 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:13 np0005593233 nova_compute[222017]: 2026-01-23 09:43:13.400 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:43:13 np0005593233 nova_compute[222017]: 2026-01-23 09:43:13.400 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:43:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:13.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:13 np0005593233 nova_compute[222017]: 2026-01-23 09:43:13.915 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:13 np0005593233 nova_compute[222017]: 2026-01-23 09:43:13.968 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-47eda3a7-c47a-48cc-8381-a702e2e27bfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:43:13 np0005593233 nova_compute[222017]: 2026-01-23 09:43:13.968 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-47eda3a7-c47a-48cc-8381-a702e2e27bfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:43:13 np0005593233 nova_compute[222017]: 2026-01-23 09:43:13.969 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:43:13 np0005593233 nova_compute[222017]: 2026-01-23 09:43:13.969 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 47eda3a7-c47a-48cc-8381-a702e2e27bfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e198 e198: 3 total, 3 up, 3 in
Jan 23 04:43:14 np0005593233 nova_compute[222017]: 2026-01-23 09:43:14.347 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:43:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:15.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:43:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:15.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:15Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:28:e5:52 10.100.0.7
Jan 23 04:43:15 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:15Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:28:e5:52 10.100.0.7
Jan 23 04:43:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:17.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:17.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:17 np0005593233 nova_compute[222017]: 2026-01-23 09:43:17.920 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Updating instance_info_cache with network_info: [{"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:17 np0005593233 nova_compute[222017]: 2026-01-23 09:43:17.971 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-47eda3a7-c47a-48cc-8381-a702e2e27bfc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:43:17 np0005593233 nova_compute[222017]: 2026-01-23 09:43:17.972 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:43:17 np0005593233 nova_compute[222017]: 2026-01-23 09:43:17.972 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:17 np0005593233 nova_compute[222017]: 2026-01-23 09:43:17.973 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:17 np0005593233 nova_compute[222017]: 2026-01-23 09:43:17.973 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.015 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.016 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.017 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.017 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.017 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:43:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/224284696' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.510 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.635 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000031 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.636 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000031 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.862 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.864 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4530MB free_disk=20.86325454711914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.864 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.865 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:18 np0005593233 nova_compute[222017]: 2026-01-23 09:43:18.918 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.102 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 47eda3a7-c47a-48cc-8381-a702e2e27bfc actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.103 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.103 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:43:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:19.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.274 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.350 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:19.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:43:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3705575993' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.728 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.735 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.772 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:43:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.830 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:43:19 np0005593233 nova_compute[222017]: 2026-01-23 09:43:19.831 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.966s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:21.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:21.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.570383) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402570451, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1109, "num_deletes": 256, "total_data_size": 2058661, "memory_usage": 2080080, "flush_reason": "Manual Compaction"}
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402597558, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 1356601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34809, "largest_seqno": 35913, "table_properties": {"data_size": 1351584, "index_size": 2477, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11783, "raw_average_key_size": 20, "raw_value_size": 1341192, "raw_average_value_size": 2348, "num_data_blocks": 107, "num_entries": 571, "num_filter_entries": 571, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161340, "oldest_key_time": 1769161340, "file_creation_time": 1769161402, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 27275 microseconds, and 9519 cpu microseconds.
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.597656) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 1356601 bytes OK
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.597682) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.601442) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.601461) EVENT_LOG_v1 {"time_micros": 1769161402601455, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.601480) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 2053090, prev total WAL file size 2053090, number of live WAL files 2.
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.602872) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(1324KB)], [66(8652KB)]
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402602937, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 10217053, "oldest_snapshot_seqno": -1}
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 5830 keys, 8327103 bytes, temperature: kUnknown
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402699813, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 8327103, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8289600, "index_size": 21841, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 149800, "raw_average_key_size": 25, "raw_value_size": 8186160, "raw_average_value_size": 1404, "num_data_blocks": 874, "num_entries": 5830, "num_filter_entries": 5830, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161402, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.700182) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 8327103 bytes
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.703488) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.3 rd, 85.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 8.4 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(13.7) write-amplify(6.1) OK, records in: 6362, records dropped: 532 output_compression: NoCompression
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.703515) EVENT_LOG_v1 {"time_micros": 1769161402703503, "job": 40, "event": "compaction_finished", "compaction_time_micros": 97024, "compaction_time_cpu_micros": 24235, "output_level": 6, "num_output_files": 1, "total_output_size": 8327103, "num_input_records": 6362, "num_output_records": 5830, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402704034, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402706221, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.602651) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.706328) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.706335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.706338) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.706340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:43:22.706343) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:23.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:23.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:23 np0005593233 nova_compute[222017]: 2026-01-23 09:43:23.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.102 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquiring lock "a20beb60-edf4-4d74-b1fe-d7a806a43094" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.103 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.239 222021 DEBUG nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.335 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.336 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.347 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.348 222021 INFO nova.compute.claims [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.353 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.416 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.416 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.438 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:43:24 np0005593233 nova_compute[222017]: 2026-01-23 09:43:24.608 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:43:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2650706706' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.078 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.085 222021 DEBUG nova.compute.provider_tree [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.106 222021 DEBUG nova.scheduler.client.report [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.137 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.137 222021 DEBUG nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.197 222021 DEBUG nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.198 222021 DEBUG nova.network.neutron [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:43:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:25.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.226 222021 INFO nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.250 222021 DEBUG nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.385 222021 DEBUG nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.387 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.388 222021 INFO nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Creating image(s)#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.423 222021 DEBUG nova.storage.rbd_utils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] rbd image a20beb60-edf4-4d74-b1fe-d7a806a43094_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.461 222021 DEBUG nova.storage.rbd_utils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] rbd image a20beb60-edf4-4d74-b1fe-d7a806a43094_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.499 222021 DEBUG nova.storage.rbd_utils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] rbd image a20beb60-edf4-4d74-b1fe-d7a806a43094_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.503 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:25.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.587 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.589 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.590 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.590 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.626 222021 DEBUG nova.storage.rbd_utils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] rbd image a20beb60-edf4-4d74-b1fe-d7a806a43094_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.631 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a20beb60-edf4-4d74-b1fe-d7a806a43094_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:25 np0005593233 podman[242305]: 2026-01-23 09:43:25.829045905 +0000 UTC m=+0.079339548 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.867 222021 DEBUG nova.policy [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c24a9322e0b749479afb44a44db2c404', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1636cc2fefd24ee3803797b94a3a30a4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.913 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a20beb60-edf4-4d74-b1fe-d7a806a43094_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.281s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:25 np0005593233 nova_compute[222017]: 2026-01-23 09:43:25.988 222021 DEBUG nova.storage.rbd_utils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] resizing rbd image a20beb60-edf4-4d74-b1fe-d7a806a43094_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:43:26 np0005593233 nova_compute[222017]: 2026-01-23 09:43:26.100 222021 DEBUG nova.objects.instance [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lazy-loading 'migration_context' on Instance uuid a20beb60-edf4-4d74-b1fe-d7a806a43094 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:26 np0005593233 nova_compute[222017]: 2026-01-23 09:43:26.553 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:43:26 np0005593233 nova_compute[222017]: 2026-01-23 09:43:26.554 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Ensure instance console log exists: /var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:43:26 np0005593233 nova_compute[222017]: 2026-01-23 09:43:26.554 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:26 np0005593233 nova_compute[222017]: 2026-01-23 09:43:26.555 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:26 np0005593233 nova_compute[222017]: 2026-01-23 09:43:26.555 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:27.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:27 np0005593233 nova_compute[222017]: 2026-01-23 09:43:27.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:27 np0005593233 nova_compute[222017]: 2026-01-23 09:43:27.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:43:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:27.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:27 np0005593233 nova_compute[222017]: 2026-01-23 09:43:27.611 222021 DEBUG nova.network.neutron [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Successfully created port: 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:43:28 np0005593233 nova_compute[222017]: 2026-01-23 09:43:28.741 222021 DEBUG nova.network.neutron [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Successfully updated port: 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:43:28 np0005593233 nova_compute[222017]: 2026-01-23 09:43:28.768 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:28.769 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:43:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:28.771 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:43:28 np0005593233 nova_compute[222017]: 2026-01-23 09:43:28.771 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquiring lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:43:28 np0005593233 nova_compute[222017]: 2026-01-23 09:43:28.771 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquired lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:43:28 np0005593233 nova_compute[222017]: 2026-01-23 09:43:28.772 222021 DEBUG nova.network.neutron [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:43:28 np0005593233 nova_compute[222017]: 2026-01-23 09:43:28.899 222021 DEBUG nova.compute.manager [req-e7a28487-b8ec-43de-8ea0-79ed3fc87e8a req-ae9ea5c3-e67d-489a-b57e-ab402ac6a54b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received event network-changed-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:28 np0005593233 nova_compute[222017]: 2026-01-23 09:43:28.899 222021 DEBUG nova.compute.manager [req-e7a28487-b8ec-43de-8ea0-79ed3fc87e8a req-ae9ea5c3-e67d-489a-b57e-ab402ac6a54b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Refreshing instance network info cache due to event network-changed-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:43:28 np0005593233 nova_compute[222017]: 2026-01-23 09:43:28.900 222021 DEBUG oslo_concurrency.lockutils [req-e7a28487-b8ec-43de-8ea0-79ed3fc87e8a req-ae9ea5c3-e67d-489a-b57e-ab402ac6a54b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:43:28 np0005593233 nova_compute[222017]: 2026-01-23 09:43:28.961 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593233 nova_compute[222017]: 2026-01-23 09:43:29.045 222021 DEBUG nova.network.neutron [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:43:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:29.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:29 np0005593233 nova_compute[222017]: 2026-01-23 09:43:29.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:29.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.656 222021 DEBUG nova.network.neutron [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Updating instance_info_cache with network_info: [{"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.681 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Releasing lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.681 222021 DEBUG nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Instance network_info: |[{"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.682 222021 DEBUG oslo_concurrency.lockutils [req-e7a28487-b8ec-43de-8ea0-79ed3fc87e8a req-ae9ea5c3-e67d-489a-b57e-ab402ac6a54b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.682 222021 DEBUG nova.network.neutron [req-e7a28487-b8ec-43de-8ea0-79ed3fc87e8a req-ae9ea5c3-e67d-489a-b57e-ab402ac6a54b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Refreshing network info cache for port 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.685 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Start _get_guest_xml network_info=[{"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.691 222021 WARNING nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.696 222021 DEBUG nova.virt.libvirt.host [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.697 222021 DEBUG nova.virt.libvirt.host [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.707 222021 DEBUG nova.virt.libvirt.host [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.708 222021 DEBUG nova.virt.libvirt.host [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.709 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.710 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.710 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.710 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.710 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.711 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.711 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.711 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.711 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.712 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.712 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.712 222021 DEBUG nova.virt.hardware [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:43:30 np0005593233 nova_compute[222017]: 2026-01-23 09:43:30.715 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:43:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/257076242' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.199 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:31.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.252 222021 DEBUG nova.storage.rbd_utils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] rbd image a20beb60-edf4-4d74-b1fe-d7a806a43094_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.257 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:31.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:43:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1705089841' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.770 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.772 222021 DEBUG nova.virt.libvirt.vif [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-848442657',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-848442657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-848442657',id=51,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1636cc2fefd24ee3803797b94a3a30a4',ramdisk_id='',reservation_id='r-l5q4bx3c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegative
TestJSON-421410905',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-421410905-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:43:25Z,user_data=None,user_id='c24a9322e0b749479afb44a44db2c404',uuid=a20beb60-edf4-4d74-b1fe-d7a806a43094,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.773 222021 DEBUG nova.network.os_vif_util [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Converting VIF {"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.776 222021 DEBUG nova.network.os_vif_util [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:14:ec,bridge_name='br-int',has_traffic_filtering=True,id=07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab,network=Network(eccab5f7-6d9c-4e5d-aaca-8a125da0cddb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a3d7bb-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.777 222021 DEBUG nova.objects.instance [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lazy-loading 'pci_devices' on Instance uuid a20beb60-edf4-4d74-b1fe-d7a806a43094 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.808 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <uuid>a20beb60-edf4-4d74-b1fe-d7a806a43094</uuid>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <name>instance-00000033</name>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-848442657</nova:name>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:43:30</nova:creationTime>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <nova:user uuid="c24a9322e0b749479afb44a44db2c404">tempest-FloatingIPsAssociationNegativeTestJSON-421410905-project-member</nova:user>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <nova:project uuid="1636cc2fefd24ee3803797b94a3a30a4">tempest-FloatingIPsAssociationNegativeTestJSON-421410905</nova:project>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <nova:port uuid="07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <entry name="serial">a20beb60-edf4-4d74-b1fe-d7a806a43094</entry>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <entry name="uuid">a20beb60-edf4-4d74-b1fe-d7a806a43094</entry>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/a20beb60-edf4-4d74-b1fe-d7a806a43094_disk">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/a20beb60-edf4-4d74-b1fe-d7a806a43094_disk.config">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:7a:14:ec"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <target dev="tap07a3d7bb-3d"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094/console.log" append="off"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:43:31 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:43:31 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:43:31 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:43:31 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.809 222021 DEBUG nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Preparing to wait for external event network-vif-plugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.809 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquiring lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.810 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.810 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.811 222021 DEBUG nova.virt.libvirt.vif [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-848442657',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-848442657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-848442657',id=51,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1636cc2fefd24ee3803797b94a3a30a4',ramdisk_id='',reservation_id='r-l5q4bx3c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-421410905',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-421410905-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:43:25Z,user_data=None,user_id='c24a9322e0b749479afb44a44db2c404',uuid=a20beb60-edf4-4d74-b1fe-d7a806a43094,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.811 222021 DEBUG nova.network.os_vif_util [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Converting VIF {"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.812 222021 DEBUG nova.network.os_vif_util [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:14:ec,bridge_name='br-int',has_traffic_filtering=True,id=07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab,network=Network(eccab5f7-6d9c-4e5d-aaca-8a125da0cddb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a3d7bb-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.813 222021 DEBUG os_vif [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:14:ec,bridge_name='br-int',has_traffic_filtering=True,id=07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab,network=Network(eccab5f7-6d9c-4e5d-aaca-8a125da0cddb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a3d7bb-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.814 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.814 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.815 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.821 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.822 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap07a3d7bb-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.824 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap07a3d7bb-3d, col_values=(('external_ids', {'iface-id': '07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:14:ec', 'vm-uuid': 'a20beb60-edf4-4d74-b1fe-d7a806a43094'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.830 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:43:31 np0005593233 NetworkManager[48871]: <info>  [1769161411.8306] manager: (tap07a3d7bb-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.838 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.840 222021 INFO os_vif [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:14:ec,bridge_name='br-int',has_traffic_filtering=True,id=07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab,network=Network(eccab5f7-6d9c-4e5d-aaca-8a125da0cddb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a3d7bb-3d')#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.913 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.913 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.913 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] No VIF found with MAC fa:16:3e:7a:14:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.914 222021 INFO nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Using config drive#033[00m
Jan 23 04:43:31 np0005593233 nova_compute[222017]: 2026-01-23 09:43:31.946 222021 DEBUG nova.storage.rbd_utils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] rbd image a20beb60-edf4-4d74-b1fe-d7a806a43094_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.201 222021 INFO nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Creating config drive at /var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094/disk.config#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.213 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp8zfkgts execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:33.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.252 222021 DEBUG nova.network.neutron [req-e7a28487-b8ec-43de-8ea0-79ed3fc87e8a req-ae9ea5c3-e67d-489a-b57e-ab402ac6a54b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Updated VIF entry in instance network info cache for port 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.255 222021 DEBUG nova.network.neutron [req-e7a28487-b8ec-43de-8ea0-79ed3fc87e8a req-ae9ea5c3-e67d-489a-b57e-ab402ac6a54b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Updating instance_info_cache with network_info: [{"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.304 222021 DEBUG oslo_concurrency.lockutils [req-e7a28487-b8ec-43de-8ea0-79ed3fc87e8a req-ae9ea5c3-e67d-489a-b57e-ab402ac6a54b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.360 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp8zfkgts" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.402 222021 DEBUG nova.storage.rbd_utils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] rbd image a20beb60-edf4-4d74-b1fe-d7a806a43094_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.409 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094/disk.config a20beb60-edf4-4d74-b1fe-d7a806a43094_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:33.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.617 222021 DEBUG oslo_concurrency.processutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094/disk.config a20beb60-edf4-4d74-b1fe-d7a806a43094_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.618 222021 INFO nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Deleting local config drive /var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094/disk.config because it was imported into RBD.#033[00m
Jan 23 04:43:33 np0005593233 kernel: tap07a3d7bb-3d: entered promiscuous mode
Jan 23 04:43:33 np0005593233 NetworkManager[48871]: <info>  [1769161413.6851] manager: (tap07a3d7bb-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Jan 23 04:43:33 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:33Z|00154|binding|INFO|Claiming lport 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab for this chassis.
Jan 23 04:43:33 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:33Z|00155|binding|INFO|07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab: Claiming fa:16:3e:7a:14:ec 10.100.0.14
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.689 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.694 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.697 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.711 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:14:ec 10.100.0.14'], port_security=['fa:16:3e:7a:14:ec 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a20beb60-edf4-4d74-b1fe-d7a806a43094', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1636cc2fefd24ee3803797b94a3a30a4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2169c6e-10e7-450c-9de2-03f687338bd7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40d01e86-7571-4c5e-ac86-7993de7c4ca7, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.771 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab in datapath eccab5f7-6d9c-4e5d-aaca-8a125da0cddb bound to our chassis#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.773 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eccab5f7-6d9c-4e5d-aaca-8a125da0cddb#033[00m
Jan 23 04:43:33 np0005593233 systemd-udevd[242535]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:33 np0005593233 systemd-machined[190954]: New machine qemu-27-instance-00000033.
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.793 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc983f3-4a8a-4631-aca4-b29ceb6ffd87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.794 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeccab5f7-61 in ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.797 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeccab5f7-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.797 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[957fa2f9-8fb8-4d96-8d84-e824b280c070]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.798 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d708ba2d-2019-42b7-94b2-f5394a9a937a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:33 np0005593233 NetworkManager[48871]: <info>  [1769161413.8055] device (tap07a3d7bb-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:43:33 np0005593233 NetworkManager[48871]: <info>  [1769161413.8072] device (tap07a3d7bb-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.811 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[ba3da232-75e7-4200-8078-62f3013e3de2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:33 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:33Z|00156|binding|INFO|Setting lport 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab ovn-installed in OVS
Jan 23 04:43:33 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:33Z|00157|binding|INFO|Setting lport 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab up in Southbound
Jan 23 04:43:33 np0005593233 nova_compute[222017]: 2026-01-23 09:43:33.823 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:33 np0005593233 systemd[1]: Started Virtual Machine qemu-27-instance-00000033.
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.830 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2ac07b-a0e4-4e6b-809a-c4e5834d0fe6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.888 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[83c24d29-4472-4f36-b73f-a5192a97db16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:33 np0005593233 NetworkManager[48871]: <info>  [1769161413.8974] manager: (tapeccab5f7-60): new Veth device (/org/freedesktop/NetworkManager/Devices/78)
Jan 23 04:43:33 np0005593233 systemd-udevd[242539]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.899 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1012900b-6647-4f4b-bafe-f43095a785c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.938 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[bd059ea3-59e9-4188-bf36-5a0dbf217909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.943 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa1ea16-50f9-4349-aa6a-815706367cd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:33 np0005593233 NetworkManager[48871]: <info>  [1769161413.9758] device (tapeccab5f7-60): carrier: link connected
Jan 23 04:43:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:33.982 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8ddd2a60-de96-48c6-8060-fd9dc572c334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.002 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[acd498c2-6f65-4f66-8288-a696b6d4ba8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeccab5f7-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:54:fa:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532660, 'reachable_time': 30430, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242568, 'error': None, 'target': 'ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.026 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[633ffe08-2ec4-4673-b899-d20b9bd1eaa0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe54:fa3c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532660, 'tstamp': 532660}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242569, 'error': None, 'target': 'ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.051 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[58be23c2-d155-45cb-a4f7-a97abfc0aa16]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeccab5f7-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:54:fa:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532660, 'reachable_time': 30430, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242570, 'error': None, 'target': 'ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.087 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3946c20c-7371-4724-8de5-8f723f2e51fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.157 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b0cecfa7-d6c1-48f3-bc1d-2997126a6107]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.159 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeccab5f7-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.160 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.161 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeccab5f7-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:34 np0005593233 nova_compute[222017]: 2026-01-23 09:43:34.163 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:34 np0005593233 NetworkManager[48871]: <info>  [1769161414.1643] manager: (tapeccab5f7-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 23 04:43:34 np0005593233 kernel: tapeccab5f7-60: entered promiscuous mode
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.167 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeccab5f7-60, col_values=(('external_ids', {'iface-id': 'e631c964-f1be-4615-b89b-799f9502489b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:34 np0005593233 nova_compute[222017]: 2026-01-23 09:43:34.168 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:34 np0005593233 nova_compute[222017]: 2026-01-23 09:43:34.170 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:34 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:34Z|00158|binding|INFO|Releasing lport e631c964-f1be-4615-b89b-799f9502489b from this chassis (sb_readonly=0)
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.171 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eccab5f7-6d9c-4e5d-aaca-8a125da0cddb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eccab5f7-6d9c-4e5d-aaca-8a125da0cddb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.175 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6a1075ec-cc38-4f01-b4c4-ee006f02ca27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.176 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/eccab5f7-6d9c-4e5d-aaca-8a125da0cddb.pid.haproxy
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID eccab5f7-6d9c-4e5d-aaca-8a125da0cddb
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:43:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:34.177 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb', 'env', 'PROCESS_TAG=haproxy-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eccab5f7-6d9c-4e5d-aaca-8a125da0cddb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:43:34 np0005593233 nova_compute[222017]: 2026-01-23 09:43:34.196 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:34 np0005593233 nova_compute[222017]: 2026-01-23 09:43:34.413 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:34 np0005593233 podman[242603]: 2026-01-23 09:43:34.641225194 +0000 UTC m=+0.084824725 container create abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:43:34 np0005593233 podman[242603]: 2026-01-23 09:43:34.591233845 +0000 UTC m=+0.034833476 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:43:34 np0005593233 systemd[1]: Started libpod-conmon-abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb.scope.
Jan 23 04:43:34 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:43:34 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d31744423916363fd7156eb36f78316597ad732f4b297cc2181d0ee69670853c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:43:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:34 np0005593233 podman[242603]: 2026-01-23 09:43:34.786299029 +0000 UTC m=+0.229898610 container init abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:43:34 np0005593233 podman[242603]: 2026-01-23 09:43:34.792739543 +0000 UTC m=+0.236339084 container start abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:43:34 np0005593233 neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb[242626]: [NOTICE]   (242648) : New worker (242659) forked
Jan 23 04:43:34 np0005593233 neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb[242626]: [NOTICE]   (242648) : Loading success.
Jan 23 04:43:34 np0005593233 nova_compute[222017]: 2026-01-23 09:43:34.973 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161414.972826, a20beb60-edf4-4d74-b1fe-d7a806a43094 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:43:34 np0005593233 nova_compute[222017]: 2026-01-23 09:43:34.975 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] VM Started (Lifecycle Event)#033[00m
Jan 23 04:43:35 np0005593233 nova_compute[222017]: 2026-01-23 09:43:35.012 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:35 np0005593233 nova_compute[222017]: 2026-01-23 09:43:35.019 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161414.9748518, a20beb60-edf4-4d74-b1fe-d7a806a43094 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:43:35 np0005593233 nova_compute[222017]: 2026-01-23 09:43:35.020 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:43:35 np0005593233 nova_compute[222017]: 2026-01-23 09:43:35.084 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:35 np0005593233 nova_compute[222017]: 2026-01-23 09:43:35.090 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:43:35 np0005593233 nova_compute[222017]: 2026-01-23 09:43:35.129 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:43:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:43:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:35.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:43:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:35.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:35.773 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.119 222021 DEBUG nova.compute.manager [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received event network-vif-plugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.120 222021 DEBUG oslo_concurrency.lockutils [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.120 222021 DEBUG oslo_concurrency.lockutils [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.121 222021 DEBUG oslo_concurrency.lockutils [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.121 222021 DEBUG nova.compute.manager [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Processing event network-vif-plugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.121 222021 DEBUG nova.compute.manager [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received event network-vif-plugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.122 222021 DEBUG oslo_concurrency.lockutils [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.122 222021 DEBUG oslo_concurrency.lockutils [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.122 222021 DEBUG oslo_concurrency.lockutils [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.122 222021 DEBUG nova.compute.manager [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] No waiting events found dispatching network-vif-plugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.123 222021 WARNING nova.compute.manager [req-2e6bd281-93e2-44d2-b93e-8b7610a4f495 req-01521eb0-865e-4b64-b3ee-6ee749cf33cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received unexpected event network-vif-plugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.124 222021 DEBUG nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.129 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161416.128594, a20beb60-edf4-4d74-b1fe-d7a806a43094 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.129 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.132 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.136 222021 INFO nova.virt.libvirt.driver [-] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Instance spawned successfully.#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.137 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.156 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.160 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.172 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.173 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.174 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.174 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.175 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.175 222021 DEBUG nova.virt.libvirt.driver [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.182 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.294 222021 INFO nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Took 10.91 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.294 222021 DEBUG nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.467 222021 INFO nova.compute.manager [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Took 12.18 seconds to build instance.#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.495 222021 DEBUG oslo_concurrency.lockutils [None req-77303532-b050-4909-9844-6549d6f9f60c c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:36 np0005593233 nova_compute[222017]: 2026-01-23 09:43:36.830 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e199 e199: 3 total, 3 up, 3 in
Jan 23 04:43:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:37.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:37.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:43:38 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1581788817' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:43:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:43:38 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1581788817' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:43:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:39.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.417 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:39.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.799 222021 DEBUG oslo_concurrency.lockutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.800 222021 DEBUG oslo_concurrency.lockutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.800 222021 DEBUG oslo_concurrency.lockutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.801 222021 DEBUG oslo_concurrency.lockutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.801 222021 DEBUG oslo_concurrency.lockutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.803 222021 INFO nova.compute.manager [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Terminating instance#033[00m
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.804 222021 DEBUG nova.compute.manager [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:43:39 np0005593233 kernel: tap9bf1dba3-67 (unregistering): left promiscuous mode
Jan 23 04:43:39 np0005593233 NetworkManager[48871]: <info>  [1769161419.8679] device (tap9bf1dba3-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:43:39 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:39Z|00159|binding|INFO|Releasing lport 9bf1dba3-6743-4176-b024-221df7bb7634 from this chassis (sb_readonly=0)
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.885 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:39 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:39Z|00160|binding|INFO|Setting lport 9bf1dba3-6743-4176-b024-221df7bb7634 down in Southbound
Jan 23 04:43:39 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:39Z|00161|binding|INFO|Removing iface tap9bf1dba3-67 ovn-installed in OVS
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.890 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:39 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:39.899 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e5:52 10.100.0.7'], port_security=['fa:16:3e:28:e5:52 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '47eda3a7-c47a-48cc-8381-a702e2e27bfc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=9bf1dba3-6743-4176-b024-221df7bb7634) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:43:39 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:39.902 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 9bf1dba3-6743-4176-b024-221df7bb7634 in datapath c2696fd4-5fd7-4934-88ac-40162fad555d unbound from our chassis#033[00m
Jan 23 04:43:39 np0005593233 nova_compute[222017]: 2026-01-23 09:43:39.903 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:39 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:39.905 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c2696fd4-5fd7-4934-88ac-40162fad555d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:43:39 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:39.907 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4d1b98-4846-49d3-af75-b52f41513216]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:39 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:39.908 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace which is not needed anymore#033[00m
Jan 23 04:43:39 np0005593233 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000031.scope: Deactivated successfully.
Jan 23 04:43:39 np0005593233 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000031.scope: Consumed 16.045s CPU time.
Jan 23 04:43:39 np0005593233 systemd-machined[190954]: Machine qemu-26-instance-00000031 terminated.
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.035 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.041 222021 INFO nova.virt.libvirt.driver [-] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Instance destroyed successfully.#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.042 222021 DEBUG nova.objects.instance [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'resources' on Instance uuid 47eda3a7-c47a-48cc-8381-a702e2e27bfc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.077 222021 DEBUG nova.virt.libvirt.vif [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:42:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1340106533',display_name='tempest-ImagesTestJSON-server-1340106533',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1340106533',id=49,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:43:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-929q7r12',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:43:11Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=47eda3a7-c47a-48cc-8381-a702e2e27bfc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.077 222021 DEBUG nova.network.os_vif_util [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "9bf1dba3-6743-4176-b024-221df7bb7634", "address": "fa:16:3e:28:e5:52", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bf1dba3-67", "ovs_interfaceid": "9bf1dba3-6743-4176-b024-221df7bb7634", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.078 222021 DEBUG nova.network.os_vif_util [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:28:e5:52,bridge_name='br-int',has_traffic_filtering=True,id=9bf1dba3-6743-4176-b024-221df7bb7634,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bf1dba3-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.079 222021 DEBUG os_vif [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:e5:52,bridge_name='br-int',has_traffic_filtering=True,id=9bf1dba3-6743-4176-b024-221df7bb7634,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bf1dba3-67') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.081 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bf1dba3-67, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.082 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.085 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.087 222021 INFO os_vif [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:e5:52,bridge_name='br-int',has_traffic_filtering=True,id=9bf1dba3-6743-4176-b024-221df7bb7634,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bf1dba3-67')#033[00m
Jan 23 04:43:40 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[241964]: [NOTICE]   (241968) : haproxy version is 2.8.14-c23fe91
Jan 23 04:43:40 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[241964]: [NOTICE]   (241968) : path to executable is /usr/sbin/haproxy
Jan 23 04:43:40 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[241964]: [WARNING]  (241968) : Exiting Master process...
Jan 23 04:43:40 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[241964]: [ALERT]    (241968) : Current worker (241970) exited with code 143 (Terminated)
Jan 23 04:43:40 np0005593233 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[241964]: [WARNING]  (241968) : All workers exited. Exiting... (0)
Jan 23 04:43:40 np0005593233 systemd[1]: libpod-f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380.scope: Deactivated successfully.
Jan 23 04:43:40 np0005593233 podman[242703]: 2026-01-23 09:43:40.112045716 +0000 UTC m=+0.063654469 container died f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:43:40 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380-userdata-shm.mount: Deactivated successfully.
Jan 23 04:43:40 np0005593233 systemd[1]: var-lib-containers-storage-overlay-93cb20647b20fd69904119de94bc1ba42fcfdd42e6e89837a0e8ded5583b8467-merged.mount: Deactivated successfully.
Jan 23 04:43:40 np0005593233 podman[242703]: 2026-01-23 09:43:40.156934299 +0000 UTC m=+0.108543022 container cleanup f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 04:43:40 np0005593233 systemd[1]: libpod-conmon-f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380.scope: Deactivated successfully.
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.196 222021 DEBUG nova.compute.manager [req-dadda8c4-ca5d-4138-86d8-79f15bb34d63 req-ac00f315-55c3-41fc-b7e4-95197a8df20d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Received event network-vif-unplugged-9bf1dba3-6743-4176-b024-221df7bb7634 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.196 222021 DEBUG oslo_concurrency.lockutils [req-dadda8c4-ca5d-4138-86d8-79f15bb34d63 req-ac00f315-55c3-41fc-b7e4-95197a8df20d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.197 222021 DEBUG oslo_concurrency.lockutils [req-dadda8c4-ca5d-4138-86d8-79f15bb34d63 req-ac00f315-55c3-41fc-b7e4-95197a8df20d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.197 222021 DEBUG oslo_concurrency.lockutils [req-dadda8c4-ca5d-4138-86d8-79f15bb34d63 req-ac00f315-55c3-41fc-b7e4-95197a8df20d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.197 222021 DEBUG nova.compute.manager [req-dadda8c4-ca5d-4138-86d8-79f15bb34d63 req-ac00f315-55c3-41fc-b7e4-95197a8df20d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] No waiting events found dispatching network-vif-unplugged-9bf1dba3-6743-4176-b024-221df7bb7634 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.198 222021 DEBUG nova.compute.manager [req-dadda8c4-ca5d-4138-86d8-79f15bb34d63 req-ac00f315-55c3-41fc-b7e4-95197a8df20d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Received event network-vif-unplugged-9bf1dba3-6743-4176-b024-221df7bb7634 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:43:40 np0005593233 podman[242756]: 2026-01-23 09:43:40.316825688 +0000 UTC m=+0.139648321 container remove f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:43:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:40.324 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bd2e8bb5-6ecf-4a11-b121-b671b4f2e61c]: (4, ('Fri Jan 23 09:43:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380)\nf75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380\nFri Jan 23 09:43:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (f75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380)\nf75153f5dbfdbe6694cd282f31d4b4aff4f25531c386bdfb6016aad02c15d380\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:40.326 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca6eb48-3f91-4f83-857c-95c40d1d2f9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:40.327 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.328 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:40 np0005593233 kernel: tapc2696fd4-50: left promiscuous mode
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.343 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:40.346 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[af5f00b7-f7fe-439b-a9f2-809da68b0b88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:40.362 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5264d1cd-79e3-4de5-b900-6d176fc29bb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:40.364 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[79d77c47-6fde-4ce8-891e-a82c1be80485]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:40.381 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[594692b6-2ae9-4979-b850-b54801cb5504]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529414, 'reachable_time': 31626, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242768, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:40 np0005593233 systemd[1]: run-netns-ovnmeta\x2dc2696fd4\x2d5fd7\x2d4934\x2d88ac\x2d40162fad555d.mount: Deactivated successfully.
Jan 23 04:43:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:40.385 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:43:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:40.385 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[97ecc6b6-a79e-41b8-abb2-310cbcfc6cde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.773 222021 INFO nova.virt.libvirt.driver [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Deleting instance files /var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc_del#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.774 222021 INFO nova.virt.libvirt.driver [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Deletion of /var/lib/nova/instances/47eda3a7-c47a-48cc-8381-a702e2e27bfc_del complete#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.834 222021 INFO nova.compute.manager [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Took 1.03 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.834 222021 DEBUG oslo.service.loopingcall [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.834 222021 DEBUG nova.compute.manager [-] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:43:40 np0005593233 nova_compute[222017]: 2026-01-23 09:43:40.835 222021 DEBUG nova.network.neutron [-] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:43:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:41.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:41.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:43:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:43:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:43:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.171 222021 DEBUG nova.network.neutron [-] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.195 222021 INFO nova.compute.manager [-] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Took 1.36 seconds to deallocate network for instance.#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.253 222021 DEBUG oslo_concurrency.lockutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.254 222021 DEBUG oslo_concurrency.lockutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.272 222021 DEBUG nova.compute.manager [req-9f4c3ac6-9b62-4523-9a5b-994492ff8fc5 req-2436e3bb-de2c-4d83-b383-cc3b098169bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Received event network-vif-deleted-9bf1dba3-6743-4176-b024-221df7bb7634 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.352 222021 DEBUG oslo_concurrency.processutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.410 222021 DEBUG nova.compute.manager [req-d5310821-9836-48e2-98e0-aee6133e6287 req-5f64d1ab-56ac-4e01-92a3-7efa19b0e003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Received event network-vif-plugged-9bf1dba3-6743-4176-b024-221df7bb7634 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.411 222021 DEBUG oslo_concurrency.lockutils [req-d5310821-9836-48e2-98e0-aee6133e6287 req-5f64d1ab-56ac-4e01-92a3-7efa19b0e003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.411 222021 DEBUG oslo_concurrency.lockutils [req-d5310821-9836-48e2-98e0-aee6133e6287 req-5f64d1ab-56ac-4e01-92a3-7efa19b0e003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.412 222021 DEBUG oslo_concurrency.lockutils [req-d5310821-9836-48e2-98e0-aee6133e6287 req-5f64d1ab-56ac-4e01-92a3-7efa19b0e003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.412 222021 DEBUG nova.compute.manager [req-d5310821-9836-48e2-98e0-aee6133e6287 req-5f64d1ab-56ac-4e01-92a3-7efa19b0e003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] No waiting events found dispatching network-vif-plugged-9bf1dba3-6743-4176-b024-221df7bb7634 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.412 222021 WARNING nova.compute.manager [req-d5310821-9836-48e2-98e0-aee6133e6287 req-5f64d1ab-56ac-4e01-92a3-7efa19b0e003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Received unexpected event network-vif-plugged-9bf1dba3-6743-4176-b024-221df7bb7634 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:43:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:42.646 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:42.649 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:43:42.650 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.870 222021 DEBUG oslo_concurrency.processutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.877 222021 DEBUG nova.compute.provider_tree [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.902 222021 DEBUG nova.scheduler.client.report [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.931 222021 DEBUG oslo_concurrency.lockutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:42 np0005593233 nova_compute[222017]: 2026-01-23 09:43:42.963 222021 INFO nova.scheduler.client.report [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Deleted allocations for instance 47eda3a7-c47a-48cc-8381-a702e2e27bfc#033[00m
Jan 23 04:43:43 np0005593233 nova_compute[222017]: 2026-01-23 09:43:43.069 222021 DEBUG oslo_concurrency.lockutils [None req-afadbccb-1781-4619-a456-d91929632230 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "47eda3a7-c47a-48cc-8381-a702e2e27bfc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:43.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:43.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:44 np0005593233 podman[242921]: 2026-01-23 09:43:44.103103497 +0000 UTC m=+0.112536497 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:43:44 np0005593233 nova_compute[222017]: 2026-01-23 09:43:44.419 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:44 np0005593233 nova_compute[222017]: 2026-01-23 09:43:44.542 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:44 np0005593233 NetworkManager[48871]: <info>  [1769161424.5437] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 23 04:43:44 np0005593233 NetworkManager[48871]: <info>  [1769161424.5464] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 23 04:43:44 np0005593233 nova_compute[222017]: 2026-01-23 09:43:44.695 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:44Z|00162|binding|INFO|Releasing lport e631c964-f1be-4615-b89b-799f9502489b from this chassis (sb_readonly=0)
Jan 23 04:43:44 np0005593233 nova_compute[222017]: 2026-01-23 09:43:44.713 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:45 np0005593233 nova_compute[222017]: 2026-01-23 09:43:45.083 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:45.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:45.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:46 np0005593233 nova_compute[222017]: 2026-01-23 09:43:46.011 222021 DEBUG nova.compute.manager [req-97b3abcb-4290-4d00-be53-46233faef378 req-9247bd47-0440-4099-b50a-12f71c1cadd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received event network-changed-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:46 np0005593233 nova_compute[222017]: 2026-01-23 09:43:46.012 222021 DEBUG nova.compute.manager [req-97b3abcb-4290-4d00-be53-46233faef378 req-9247bd47-0440-4099-b50a-12f71c1cadd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Refreshing instance network info cache due to event network-changed-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:43:46 np0005593233 nova_compute[222017]: 2026-01-23 09:43:46.012 222021 DEBUG oslo_concurrency.lockutils [req-97b3abcb-4290-4d00-be53-46233faef378 req-9247bd47-0440-4099-b50a-12f71c1cadd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:43:46 np0005593233 nova_compute[222017]: 2026-01-23 09:43:46.012 222021 DEBUG oslo_concurrency.lockutils [req-97b3abcb-4290-4d00-be53-46233faef378 req-9247bd47-0440-4099-b50a-12f71c1cadd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:43:46 np0005593233 nova_compute[222017]: 2026-01-23 09:43:46.013 222021 DEBUG nova.network.neutron [req-97b3abcb-4290-4d00-be53-46233faef378 req-9247bd47-0440-4099-b50a-12f71c1cadd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Refreshing network info cache for port 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:43:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:47.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:47.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:48 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:48Z|00163|binding|INFO|Releasing lport e631c964-f1be-4615-b89b-799f9502489b from this chassis (sb_readonly=0)
Jan 23 04:43:48 np0005593233 nova_compute[222017]: 2026-01-23 09:43:48.470 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:49 np0005593233 nova_compute[222017]: 2026-01-23 09:43:49.073 222021 DEBUG nova.network.neutron [req-97b3abcb-4290-4d00-be53-46233faef378 req-9247bd47-0440-4099-b50a-12f71c1cadd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Updated VIF entry in instance network info cache for port 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:43:49 np0005593233 nova_compute[222017]: 2026-01-23 09:43:49.075 222021 DEBUG nova.network.neutron [req-97b3abcb-4290-4d00-be53-46233faef378 req-9247bd47-0440-4099-b50a-12f71c1cadd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Updating instance_info_cache with network_info: [{"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:49 np0005593233 nova_compute[222017]: 2026-01-23 09:43:49.120 222021 DEBUG oslo_concurrency.lockutils [req-97b3abcb-4290-4d00-be53-46233faef378 req-9247bd47-0440-4099-b50a-12f71c1cadd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:43:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e200 e200: 3 total, 3 up, 3 in
Jan 23 04:43:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:49.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:49 np0005593233 nova_compute[222017]: 2026-01-23 09:43:49.423 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:49.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:50 np0005593233 nova_compute[222017]: 2026-01-23 09:43:50.085 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:50Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7a:14:ec 10.100.0.14
Jan 23 04:43:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:43:50Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7a:14:ec 10.100.0.14
Jan 23 04:43:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:51.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:51.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:53.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:53.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:54 np0005593233 nova_compute[222017]: 2026-01-23 09:43:54.426 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:55 np0005593233 nova_compute[222017]: 2026-01-23 09:43:55.038 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161420.0369935, 47eda3a7-c47a-48cc-8381-a702e2e27bfc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:43:55 np0005593233 nova_compute[222017]: 2026-01-23 09:43:55.039 222021 INFO nova.compute.manager [-] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:43:55 np0005593233 nova_compute[222017]: 2026-01-23 09:43:55.087 222021 DEBUG nova.compute.manager [None req-b5002ab0-d2df-4f5f-acd3-201f99307e87 - - - - - -] [instance: 47eda3a7-c47a-48cc-8381-a702e2e27bfc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:55 np0005593233 nova_compute[222017]: 2026-01-23 09:43:55.131 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:55.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:55.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:56 np0005593233 podman[242998]: 2026-01-23 09:43:56.111882765 +0000 UTC m=+0.094181552 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:43:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:57.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:57.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:43:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:59.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:43:59 np0005593233 nova_compute[222017]: 2026-01-23 09:43:59.428 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:43:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:43:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:59.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:43:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:00 np0005593233 nova_compute[222017]: 2026-01-23 09:44:00.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:01.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:01 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:01Z|00164|binding|INFO|Releasing lport e631c964-f1be-4615-b89b-799f9502489b from this chassis (sb_readonly=0)
Jan 23 04:44:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:01.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:01 np0005593233 nova_compute[222017]: 2026-01-23 09:44:01.682 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:03.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e201 e201: 3 total, 3 up, 3 in
Jan 23 04:44:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:03.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:03 np0005593233 nova_compute[222017]: 2026-01-23 09:44:03.934 222021 DEBUG nova.compute.manager [req-e7b9ec1f-fffe-476f-9316-b08561aaf663 req-30ae6b6f-a5b2-4ea8-884c-15d4c484737f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received event network-changed-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:03 np0005593233 nova_compute[222017]: 2026-01-23 09:44:03.934 222021 DEBUG nova.compute.manager [req-e7b9ec1f-fffe-476f-9316-b08561aaf663 req-30ae6b6f-a5b2-4ea8-884c-15d4c484737f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Refreshing instance network info cache due to event network-changed-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:44:03 np0005593233 nova_compute[222017]: 2026-01-23 09:44:03.934 222021 DEBUG oslo_concurrency.lockutils [req-e7b9ec1f-fffe-476f-9316-b08561aaf663 req-30ae6b6f-a5b2-4ea8-884c-15d4c484737f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:44:03 np0005593233 nova_compute[222017]: 2026-01-23 09:44:03.935 222021 DEBUG oslo_concurrency.lockutils [req-e7b9ec1f-fffe-476f-9316-b08561aaf663 req-30ae6b6f-a5b2-4ea8-884c-15d4c484737f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:44:03 np0005593233 nova_compute[222017]: 2026-01-23 09:44:03.935 222021 DEBUG nova.network.neutron [req-e7b9ec1f-fffe-476f-9316-b08561aaf663 req-30ae6b6f-a5b2-4ea8-884c-15d4c484737f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Refreshing network info cache for port 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:44:04 np0005593233 nova_compute[222017]: 2026-01-23 09:44:04.431 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e202 e202: 3 total, 3 up, 3 in
Jan 23 04:44:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:05 np0005593233 nova_compute[222017]: 2026-01-23 09:44:05.137 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:05.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e203 e203: 3 total, 3 up, 3 in
Jan 23 04:44:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:05.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:06 np0005593233 nova_compute[222017]: 2026-01-23 09:44:06.106 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:06 np0005593233 nova_compute[222017]: 2026-01-23 09:44:06.251 222021 DEBUG nova.network.neutron [req-e7b9ec1f-fffe-476f-9316-b08561aaf663 req-30ae6b6f-a5b2-4ea8-884c-15d4c484737f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Updated VIF entry in instance network info cache for port 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:44:06 np0005593233 nova_compute[222017]: 2026-01-23 09:44:06.252 222021 DEBUG nova.network.neutron [req-e7b9ec1f-fffe-476f-9316-b08561aaf663 req-30ae6b6f-a5b2-4ea8-884c-15d4c484737f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Updating instance_info_cache with network_info: [{"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:44:06 np0005593233 nova_compute[222017]: 2026-01-23 09:44:06.286 222021 DEBUG oslo_concurrency.lockutils [req-e7b9ec1f-fffe-476f-9316-b08561aaf663 req-30ae6b6f-a5b2-4ea8-884c-15d4c484737f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a20beb60-edf4-4d74-b1fe-d7a806a43094" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:44:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e204 e204: 3 total, 3 up, 3 in
Jan 23 04:44:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:07Z|00165|binding|INFO|Releasing lport e631c964-f1be-4615-b89b-799f9502489b from this chassis (sb_readonly=0)
Jan 23 04:44:07 np0005593233 nova_compute[222017]: 2026-01-23 09:44:07.017 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:07Z|00166|binding|INFO|Releasing lport e631c964-f1be-4615-b89b-799f9502489b from this chassis (sb_readonly=0)
Jan 23 04:44:07 np0005593233 nova_compute[222017]: 2026-01-23 09:44:07.211 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:07.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:07.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:09.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:09 np0005593233 nova_compute[222017]: 2026-01-23 09:44:09.434 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:44:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:09.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:44:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:10 np0005593233 nova_compute[222017]: 2026-01-23 09:44:10.140 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:10 np0005593233 nova_compute[222017]: 2026-01-23 09:44:10.403 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:11.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:11 np0005593233 nova_compute[222017]: 2026-01-23 09:44:11.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:11.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:11 np0005593233 nova_compute[222017]: 2026-01-23 09:44:11.806 222021 DEBUG oslo_concurrency.lockutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquiring lock "a20beb60-edf4-4d74-b1fe-d7a806a43094" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:11 np0005593233 nova_compute[222017]: 2026-01-23 09:44:11.807 222021 DEBUG oslo_concurrency.lockutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:11 np0005593233 nova_compute[222017]: 2026-01-23 09:44:11.808 222021 DEBUG oslo_concurrency.lockutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquiring lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:11 np0005593233 nova_compute[222017]: 2026-01-23 09:44:11.808 222021 DEBUG oslo_concurrency.lockutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:11 np0005593233 nova_compute[222017]: 2026-01-23 09:44:11.808 222021 DEBUG oslo_concurrency.lockutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:11 np0005593233 nova_compute[222017]: 2026-01-23 09:44:11.810 222021 INFO nova.compute.manager [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Terminating instance#033[00m
Jan 23 04:44:11 np0005593233 nova_compute[222017]: 2026-01-23 09:44:11.811 222021 DEBUG nova.compute.manager [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:44:12 np0005593233 kernel: tap07a3d7bb-3d (unregistering): left promiscuous mode
Jan 23 04:44:12 np0005593233 NetworkManager[48871]: <info>  [1769161452.3420] device (tap07a3d7bb-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:44:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:12Z|00167|binding|INFO|Releasing lport 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab from this chassis (sb_readonly=0)
Jan 23 04:44:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:12Z|00168|binding|INFO|Setting lport 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab down in Southbound
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.354 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:12Z|00169|binding|INFO|Removing iface tap07a3d7bb-3d ovn-installed in OVS
Jan 23 04:44:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:12.411 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:14:ec 10.100.0.14'], port_security=['fa:16:3e:7a:14:ec 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a20beb60-edf4-4d74-b1fe-d7a806a43094', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1636cc2fefd24ee3803797b94a3a30a4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2169c6e-10e7-450c-9de2-03f687338bd7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40d01e86-7571-4c5e-ac86-7993de7c4ca7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:44:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:12.412 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab in datapath eccab5f7-6d9c-4e5d-aaca-8a125da0cddb unbound from our chassis#033[00m
Jan 23 04:44:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:12.414 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eccab5f7-6d9c-4e5d-aaca-8a125da0cddb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:44:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:12.415 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4917524f-0399-434b-93e0-76891d069f32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:12.416 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb namespace which is not needed anymore#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.424 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:12 np0005593233 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000033.scope: Deactivated successfully.
Jan 23 04:44:12 np0005593233 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000033.scope: Consumed 16.267s CPU time.
Jan 23 04:44:12 np0005593233 systemd-machined[190954]: Machine qemu-27-instance-00000033 terminated.
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.669 222021 INFO nova.virt.libvirt.driver [-] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Instance destroyed successfully.#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.669 222021 DEBUG nova.objects.instance [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lazy-loading 'resources' on Instance uuid a20beb60-edf4-4d74-b1fe-d7a806a43094 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.688 222021 DEBUG nova.virt.libvirt.vif [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:43:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-848442657',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-848442657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-848442657',id=51,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:43:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1636cc2fefd24ee3803797b94a3a30a4',ramdisk_id='',reservation_id='r-l5q4bx3c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_m
odel='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-421410905',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-421410905-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:43:36Z,user_data=None,user_id='c24a9322e0b749479afb44a44db2c404',uuid=a20beb60-edf4-4d74-b1fe-d7a806a43094,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.688 222021 DEBUG nova.network.os_vif_util [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Converting VIF {"id": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "address": "fa:16:3e:7a:14:ec", "network": {"id": "eccab5f7-6d9c-4e5d-aaca-8a125da0cddb", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1054088849-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1636cc2fefd24ee3803797b94a3a30a4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a3d7bb-3d", "ovs_interfaceid": "07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.689 222021 DEBUG nova.network.os_vif_util [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7a:14:ec,bridge_name='br-int',has_traffic_filtering=True,id=07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab,network=Network(eccab5f7-6d9c-4e5d-aaca-8a125da0cddb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a3d7bb-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.689 222021 DEBUG os_vif [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:14:ec,bridge_name='br-int',has_traffic_filtering=True,id=07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab,network=Network(eccab5f7-6d9c-4e5d-aaca-8a125da0cddb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a3d7bb-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.692 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.693 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07a3d7bb-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:12 np0005593233 neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb[242626]: [NOTICE]   (242648) : haproxy version is 2.8.14-c23fe91
Jan 23 04:44:12 np0005593233 neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb[242626]: [NOTICE]   (242648) : path to executable is /usr/sbin/haproxy
Jan 23 04:44:12 np0005593233 neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb[242626]: [WARNING]  (242648) : Exiting Master process...
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.696 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:12 np0005593233 neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb[242626]: [ALERT]    (242648) : Current worker (242659) exited with code 143 (Terminated)
Jan 23 04:44:12 np0005593233 neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb[242626]: [WARNING]  (242648) : All workers exited. Exiting... (0)
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.700 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:44:12 np0005593233 systemd[1]: libpod-abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb.scope: Deactivated successfully.
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.704 222021 INFO os_vif [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:14:ec,bridge_name='br-int',has_traffic_filtering=True,id=07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab,network=Network(eccab5f7-6d9c-4e5d-aaca-8a125da0cddb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a3d7bb-3d')#033[00m
Jan 23 04:44:12 np0005593233 podman[243043]: 2026-01-23 09:44:12.706047606 +0000 UTC m=+0.176062612 container died abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 04:44:12 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb-userdata-shm.mount: Deactivated successfully.
Jan 23 04:44:12 np0005593233 systemd[1]: var-lib-containers-storage-overlay-d31744423916363fd7156eb36f78316597ad732f4b297cc2181d0ee69670853c-merged.mount: Deactivated successfully.
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.812 222021 DEBUG nova.compute.manager [req-e7f06302-08b7-4bdf-b363-c9bacb1e0aa5 req-d8c5b403-9843-4c42-97a8-8832a24a1487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received event network-vif-unplugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.813 222021 DEBUG oslo_concurrency.lockutils [req-e7f06302-08b7-4bdf-b363-c9bacb1e0aa5 req-d8c5b403-9843-4c42-97a8-8832a24a1487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.814 222021 DEBUG oslo_concurrency.lockutils [req-e7f06302-08b7-4bdf-b363-c9bacb1e0aa5 req-d8c5b403-9843-4c42-97a8-8832a24a1487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.814 222021 DEBUG oslo_concurrency.lockutils [req-e7f06302-08b7-4bdf-b363-c9bacb1e0aa5 req-d8c5b403-9843-4c42-97a8-8832a24a1487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.814 222021 DEBUG nova.compute.manager [req-e7f06302-08b7-4bdf-b363-c9bacb1e0aa5 req-d8c5b403-9843-4c42-97a8-8832a24a1487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] No waiting events found dispatching network-vif-unplugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:44:12 np0005593233 nova_compute[222017]: 2026-01-23 09:44:12.815 222021 DEBUG nova.compute.manager [req-e7f06302-08b7-4bdf-b363-c9bacb1e0aa5 req-d8c5b403-9843-4c42-97a8-8832a24a1487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received event network-vif-unplugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:44:12 np0005593233 podman[243043]: 2026-01-23 09:44:12.919002221 +0000 UTC m=+0.389017227 container cleanup abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:44:12 np0005593233 systemd[1]: libpod-conmon-abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb.scope: Deactivated successfully.
Jan 23 04:44:12 np0005593233 podman[243100]: 2026-01-23 09:44:12.999026817 +0000 UTC m=+0.052806570 container remove abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:44:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:13.008 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[738c739a-1196-49d0-a15c-9c710e3268a3]: (4, ('Fri Jan 23 09:44:12 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb (abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb)\nabab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb\nFri Jan 23 09:44:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb (abab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb)\nabab25ef10fcc54df7e6f46ae38229e2145c647e896593d565df7d8d4a3dccdb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:13.011 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6f06ecd1-260a-4593-b0c0-1c112ad88718]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:13.012 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeccab5f7-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:13 np0005593233 nova_compute[222017]: 2026-01-23 09:44:13.014 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:13 np0005593233 kernel: tapeccab5f7-60: left promiscuous mode
Jan 23 04:44:13 np0005593233 nova_compute[222017]: 2026-01-23 09:44:13.029 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:13.035 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[92c00423-1130-4561-8c44-10e51b94dd6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:13.054 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a097d206-2994-437e-8da4-be8d29bffb14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:13.056 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1c0f3c-898b-49ba-b07a-6507f246f4e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:13.076 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[44bf67d2-fa73-4ff8-ba60-a459a46861da]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532651, 'reachable_time': 22004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243115, 'error': None, 'target': 'ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:13 np0005593233 systemd[1]: run-netns-ovnmeta\x2deccab5f7\x2d6d9c\x2d4e5d\x2daaca\x2d8a125da0cddb.mount: Deactivated successfully.
Jan 23 04:44:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:13.081 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eccab5f7-6d9c-4e5d-aaca-8a125da0cddb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:44:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:13.082 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[54d71a0b-f31e-4985-8dbb-dba6a62c0264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:13.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:13 np0005593233 nova_compute[222017]: 2026-01-23 09:44:13.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:13 np0005593233 nova_compute[222017]: 2026-01-23 09:44:13.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:44:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:13.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:13 np0005593233 nova_compute[222017]: 2026-01-23 09:44:13.974 222021 INFO nova.virt.libvirt.driver [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Deleting instance files /var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094_del#033[00m
Jan 23 04:44:13 np0005593233 nova_compute[222017]: 2026-01-23 09:44:13.975 222021 INFO nova.virt.libvirt.driver [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Deletion of /var/lib/nova/instances/a20beb60-edf4-4d74-b1fe-d7a806a43094_del complete#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.082 222021 INFO nova.compute.manager [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Took 2.27 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.084 222021 DEBUG oslo.service.loopingcall [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.085 222021 DEBUG nova.compute.manager [-] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.085 222021 DEBUG nova.network.neutron [-] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:44:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 e205: 3 total, 3 up, 3 in
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.435 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.614 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.614 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.615 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.952 222021 DEBUG nova.compute.manager [req-a7746e99-43a5-43ee-9ed7-7cf9e67ca9f3 req-89204a76-6870-42db-9636-61a891531396 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received event network-vif-plugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.953 222021 DEBUG oslo_concurrency.lockutils [req-a7746e99-43a5-43ee-9ed7-7cf9e67ca9f3 req-89204a76-6870-42db-9636-61a891531396 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.953 222021 DEBUG oslo_concurrency.lockutils [req-a7746e99-43a5-43ee-9ed7-7cf9e67ca9f3 req-89204a76-6870-42db-9636-61a891531396 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.954 222021 DEBUG oslo_concurrency.lockutils [req-a7746e99-43a5-43ee-9ed7-7cf9e67ca9f3 req-89204a76-6870-42db-9636-61a891531396 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.954 222021 DEBUG nova.compute.manager [req-a7746e99-43a5-43ee-9ed7-7cf9e67ca9f3 req-89204a76-6870-42db-9636-61a891531396 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] No waiting events found dispatching network-vif-plugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:44:14 np0005593233 nova_compute[222017]: 2026-01-23 09:44:14.954 222021 WARNING nova.compute.manager [req-a7746e99-43a5-43ee-9ed7-7cf9e67ca9f3 req-89204a76-6870-42db-9636-61a891531396 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received unexpected event network-vif-plugged-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:44:15 np0005593233 podman[243117]: 2026-01-23 09:44:15.100957958 +0000 UTC m=+0.107484403 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 23 04:44:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:15.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:15.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.703 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.704 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.704 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.704 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.704 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.892 222021 DEBUG nova.network.neutron [-] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.912 222021 INFO nova.compute.manager [-] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Took 1.83 seconds to deallocate network for instance.#033[00m
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.996 222021 DEBUG oslo_concurrency.lockutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:15 np0005593233 nova_compute[222017]: 2026-01-23 09:44:15.997 222021 DEBUG oslo_concurrency.lockutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.048 222021 DEBUG nova.scheduler.client.report [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.084 222021 DEBUG nova.scheduler.client.report [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.085 222021 DEBUG nova.compute.provider_tree [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.107 222021 DEBUG nova.scheduler.client.report [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.137 222021 DEBUG nova.scheduler.client.report [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.178 222021 DEBUG oslo_concurrency.processutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:44:16 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/990241739' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.224 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.331 222021 DEBUG nova.compute.manager [req-6235de18-4504-470a-a1f3-67213bedbe56 req-3e5a2483-f3c4-4c4d-9ad0-7c1de3e904ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Received event network-vif-deleted-07a3d7bb-3d19-4f60-a8e1-4cd821b7bbab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.435 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.437 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4742MB free_disk=20.942752838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.438 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:44:16 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2370023076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.705 222021 DEBUG oslo_concurrency.processutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.713 222021 DEBUG nova.compute.provider_tree [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.744 222021 DEBUG nova.scheduler.client.report [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.766 222021 DEBUG oslo_concurrency.lockutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.769 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.801 222021 INFO nova.scheduler.client.report [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Deleted allocations for instance a20beb60-edf4-4d74-b1fe-d7a806a43094#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.840 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.841 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.860 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:16 np0005593233 nova_compute[222017]: 2026-01-23 09:44:16.908 222021 DEBUG oslo_concurrency.lockutils [None req-2149902d-c2ab-4a57-8389-40ee1f2db1bc c24a9322e0b749479afb44a44db2c404 1636cc2fefd24ee3803797b94a3a30a4 - - default default] Lock "a20beb60-edf4-4d74-b1fe-d7a806a43094" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:17.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:44:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/996120962' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:44:17 np0005593233 nova_compute[222017]: 2026-01-23 09:44:17.368 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:17 np0005593233 nova_compute[222017]: 2026-01-23 09:44:17.374 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:44:17 np0005593233 nova_compute[222017]: 2026-01-23 09:44:17.404 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:44:17 np0005593233 nova_compute[222017]: 2026-01-23 09:44:17.436 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:44:17 np0005593233 nova_compute[222017]: 2026-01-23 09:44:17.436 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:17.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:17 np0005593233 nova_compute[222017]: 2026-01-23 09:44:17.698 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:19.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:19 np0005593233 nova_compute[222017]: 2026-01-23 09:44:19.436 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:19 np0005593233 nova_compute[222017]: 2026-01-23 09:44:19.466 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:19.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:21.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:21.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:22 np0005593233 nova_compute[222017]: 2026-01-23 09:44:22.000 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:22 np0005593233 nova_compute[222017]: 2026-01-23 09:44:22.700 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:23.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:23 np0005593233 nova_compute[222017]: 2026-01-23 09:44:23.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:23.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:24 np0005593233 nova_compute[222017]: 2026-01-23 09:44:24.468 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:25.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:25.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:27 np0005593233 podman[243210]: 2026-01-23 09:44:27.090696333 +0000 UTC m=+0.092305769 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 04:44:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:27.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:27.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:27 np0005593233 nova_compute[222017]: 2026-01-23 09:44:27.666 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161452.6635582, a20beb60-edf4-4d74-b1fe-d7a806a43094 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:44:27 np0005593233 nova_compute[222017]: 2026-01-23 09:44:27.666 222021 INFO nova.compute.manager [-] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:44:27 np0005593233 nova_compute[222017]: 2026-01-23 09:44:27.688 222021 DEBUG nova.compute.manager [None req-3673cb97-2519-4ca0-ae7f-b775d1c4ef21 - - - - - -] [instance: a20beb60-edf4-4d74-b1fe-d7a806a43094] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:44:27 np0005593233 nova_compute[222017]: 2026-01-23 09:44:27.702 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:29.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:29 np0005593233 nova_compute[222017]: 2026-01-23 09:44:29.471 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:29.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:31 np0005593233 nova_compute[222017]: 2026-01-23 09:44:31.147 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:31.146 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:44:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:31.149 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:44:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:31.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:31.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:32 np0005593233 nova_compute[222017]: 2026-01-23 09:44:32.704 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:33.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:33.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:34 np0005593233 nova_compute[222017]: 2026-01-23 09:44:34.473 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:35.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:35.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:37.151 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:37.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:37.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:37 np0005593233 nova_compute[222017]: 2026-01-23 09:44:37.707 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:38 np0005593233 nova_compute[222017]: 2026-01-23 09:44:38.531 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquiring lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:38 np0005593233 nova_compute[222017]: 2026-01-23 09:44:38.532 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:38 np0005593233 nova_compute[222017]: 2026-01-23 09:44:38.553 222021 DEBUG nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:44:38 np0005593233 nova_compute[222017]: 2026-01-23 09:44:38.641 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:38 np0005593233 nova_compute[222017]: 2026-01-23 09:44:38.641 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:38 np0005593233 nova_compute[222017]: 2026-01-23 09:44:38.649 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:44:38 np0005593233 nova_compute[222017]: 2026-01-23 09:44:38.650 222021 INFO nova.compute.claims [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:44:38 np0005593233 nova_compute[222017]: 2026-01-23 09:44:38.844 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:44:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/673792123' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.304 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.310 222021 DEBUG nova.compute.provider_tree [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:44:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:39.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.337 222021 DEBUG nova.scheduler.client.report [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.374 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.375 222021 DEBUG nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.437 222021 DEBUG nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.437 222021 DEBUG nova.network.neutron [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.461 222021 INFO nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.475 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.483 222021 DEBUG nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.640 222021 DEBUG nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.642 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.643 222021 INFO nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Creating image(s)#033[00m
Jan 23 04:44:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:39.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.682 222021 DEBUG nova.storage.rbd_utils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] rbd image ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.723 222021 DEBUG nova.storage.rbd_utils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] rbd image ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.752 222021 DEBUG nova.storage.rbd_utils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] rbd image ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.756 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.826 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.827 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.828 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.829 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.864 222021 DEBUG nova.storage.rbd_utils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] rbd image ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.870 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:39 np0005593233 nova_compute[222017]: 2026-01-23 09:44:39.914 222021 DEBUG nova.policy [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1270518e615c4c63a54865bfe906ce5d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c05913e2e5c046bf92e6c8c855833959', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:44:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:41.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:41 np0005593233 nova_compute[222017]: 2026-01-23 09:44:41.490 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.620s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:41 np0005593233 nova_compute[222017]: 2026-01-23 09:44:41.563 222021 DEBUG nova.network.neutron [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Successfully created port: 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:44:41 np0005593233 nova_compute[222017]: 2026-01-23 09:44:41.570 222021 DEBUG nova.storage.rbd_utils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] resizing rbd image ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:44:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:41.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:41 np0005593233 nova_compute[222017]: 2026-01-23 09:44:41.674 222021 DEBUG nova.objects.instance [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lazy-loading 'migration_context' on Instance uuid ab4209ce-8ecd-48d4-9826-fe7501e19da8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:44:41 np0005593233 nova_compute[222017]: 2026-01-23 09:44:41.702 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:44:41 np0005593233 nova_compute[222017]: 2026-01-23 09:44:41.703 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Ensure instance console log exists: /var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:44:41 np0005593233 nova_compute[222017]: 2026-01-23 09:44:41.703 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:41 np0005593233 nova_compute[222017]: 2026-01-23 09:44:41.704 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:41 np0005593233 nova_compute[222017]: 2026-01-23 09:44:41.704 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:42.647 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:42.647 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:42.647 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:42 np0005593233 nova_compute[222017]: 2026-01-23 09:44:42.709 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:42 np0005593233 nova_compute[222017]: 2026-01-23 09:44:42.938 222021 DEBUG nova.network.neutron [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Successfully updated port: 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:44:42 np0005593233 nova_compute[222017]: 2026-01-23 09:44:42.959 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquiring lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:44:42 np0005593233 nova_compute[222017]: 2026-01-23 09:44:42.960 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquired lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:44:42 np0005593233 nova_compute[222017]: 2026-01-23 09:44:42.960 222021 DEBUG nova.network.neutron [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:44:43 np0005593233 nova_compute[222017]: 2026-01-23 09:44:43.088 222021 DEBUG nova.compute.manager [req-86beba15-dea4-4067-a6ac-205c65569d8c req-6eae6674-3b43-484c-855c-c21429b1c45a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:43 np0005593233 nova_compute[222017]: 2026-01-23 09:44:43.088 222021 DEBUG nova.compute.manager [req-86beba15-dea4-4067-a6ac-205c65569d8c req-6eae6674-3b43-484c-855c-c21429b1c45a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing instance network info cache due to event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:44:43 np0005593233 nova_compute[222017]: 2026-01-23 09:44:43.089 222021 DEBUG oslo_concurrency.lockutils [req-86beba15-dea4-4067-a6ac-205c65569d8c req-6eae6674-3b43-484c-855c-c21429b1c45a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:44:43 np0005593233 nova_compute[222017]: 2026-01-23 09:44:43.198 222021 DEBUG nova.network.neutron [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:44:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:43.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:43.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.166 222021 DEBUG nova.network.neutron [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updating instance_info_cache with network_info: [{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.197 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Releasing lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.198 222021 DEBUG nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Instance network_info: |[{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.199 222021 DEBUG oslo_concurrency.lockutils [req-86beba15-dea4-4067-a6ac-205c65569d8c req-6eae6674-3b43-484c-855c-c21429b1c45a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.200 222021 DEBUG nova.network.neutron [req-86beba15-dea4-4067-a6ac-205c65569d8c req-6eae6674-3b43-484c-855c-c21429b1c45a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.205 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Start _get_guest_xml network_info=[{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.213 222021 WARNING nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.226 222021 DEBUG nova.virt.libvirt.host [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.227 222021 DEBUG nova.virt.libvirt.host [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.231 222021 DEBUG nova.virt.libvirt.host [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.232 222021 DEBUG nova.virt.libvirt.host [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.233 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.234 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.234 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.235 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.235 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.235 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.235 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.236 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.236 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.236 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.237 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.237 222021 DEBUG nova.virt.hardware [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.241 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:44:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1170622131' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:44:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:44:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1170622131' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.477 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:44:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2581339563' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.734 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.776 222021 DEBUG nova.storage.rbd_utils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] rbd image ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:44 np0005593233 nova_compute[222017]: 2026-01-23 09:44:44.780 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:44:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/115919361' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.206 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.208 222021 DEBUG nova.virt.libvirt.vif [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:44:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1741184475',display_name='tempest-FloatingIPsAssociationTestJSON-server-1741184475',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1741184475',id=54,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c05913e2e5c046bf92e6c8c855833959',ramdisk_id='',reservation_id='r-e8i000mq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1969273951',owner_us
er_name='tempest-FloatingIPsAssociationTestJSON-1969273951-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:44:39Z,user_data=None,user_id='1270518e615c4c63a54865bfe906ce5d',uuid=ab4209ce-8ecd-48d4-9826-fe7501e19da8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.208 222021 DEBUG nova.network.os_vif_util [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Converting VIF {"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.209 222021 DEBUG nova.network.os_vif_util [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:51:60,bridge_name='br-int',has_traffic_filtering=True,id=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa,network=Network(e98627d8-446e-4b60-8051-8e37123acd76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05c086a7-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.210 222021 DEBUG nova.objects.instance [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lazy-loading 'pci_devices' on Instance uuid ab4209ce-8ecd-48d4-9826-fe7501e19da8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:44:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:45.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.483 222021 DEBUG nova.network.neutron [req-86beba15-dea4-4067-a6ac-205c65569d8c req-6eae6674-3b43-484c-855c-c21429b1c45a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updated VIF entry in instance network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.483 222021 DEBUG nova.network.neutron [req-86beba15-dea4-4067-a6ac-205c65569d8c req-6eae6674-3b43-484c-855c-c21429b1c45a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updating instance_info_cache with network_info: [{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:44:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:45.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.677 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <uuid>ab4209ce-8ecd-48d4-9826-fe7501e19da8</uuid>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <name>instance-00000036</name>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1741184475</nova:name>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:44:44</nova:creationTime>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <nova:user uuid="1270518e615c4c63a54865bfe906ce5d">tempest-FloatingIPsAssociationTestJSON-1969273951-project-member</nova:user>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <nova:project uuid="c05913e2e5c046bf92e6c8c855833959">tempest-FloatingIPsAssociationTestJSON-1969273951</nova:project>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <nova:port uuid="05c086a7-19a8-4dfd-9eb7-dff8452eb4aa">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <entry name="serial">ab4209ce-8ecd-48d4-9826-fe7501e19da8</entry>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <entry name="uuid">ab4209ce-8ecd-48d4-9826-fe7501e19da8</entry>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk.config">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:f5:51:60"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <target dev="tap05c086a7-19"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8/console.log" append="off"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:44:45 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:44:45 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:44:45 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:44:45 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.678 222021 DEBUG nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Preparing to wait for external event network-vif-plugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.678 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquiring lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.679 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.679 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.679 222021 DEBUG nova.virt.libvirt.vif [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:44:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1741184475',display_name='tempest-FloatingIPsAssociationTestJSON-server-1741184475',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1741184475',id=54,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c05913e2e5c046bf92e6c8c855833959',ramdisk_id='',reservation_id='r-e8i000mq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1969273951',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1969273951-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:44:39Z,user_data=None,user_id='1270518e615c4c63a54865bfe906ce5d',uuid=ab4209ce-8ecd-48d4-9826-fe7501e19da8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.680 222021 DEBUG nova.network.os_vif_util [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Converting VIF {"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.680 222021 DEBUG nova.network.os_vif_util [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:51:60,bridge_name='br-int',has_traffic_filtering=True,id=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa,network=Network(e98627d8-446e-4b60-8051-8e37123acd76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05c086a7-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.681 222021 DEBUG os_vif [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:51:60,bridge_name='br-int',has_traffic_filtering=True,id=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa,network=Network(e98627d8-446e-4b60-8051-8e37123acd76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05c086a7-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.681 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.682 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.682 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.686 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05c086a7-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.687 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap05c086a7-19, col_values=(('external_ids', {'iface-id': '05c086a7-19a8-4dfd-9eb7-dff8452eb4aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f5:51:60', 'vm-uuid': 'ab4209ce-8ecd-48d4-9826-fe7501e19da8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.689 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:45 np0005593233 NetworkManager[48871]: <info>  [1769161485.6904] manager: (tap05c086a7-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.690 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.697 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.698 222021 INFO os_vif [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:51:60,bridge_name='br-int',has_traffic_filtering=True,id=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa,network=Network(e98627d8-446e-4b60-8051-8e37123acd76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05c086a7-19')#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.751 222021 DEBUG oslo_concurrency.lockutils [req-86beba15-dea4-4067-a6ac-205c65569d8c req-6eae6674-3b43-484c-855c-c21429b1c45a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.826 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.827 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.827 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] No VIF found with MAC fa:16:3e:f5:51:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.828 222021 INFO nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Using config drive#033[00m
Jan 23 04:44:45 np0005593233 nova_compute[222017]: 2026-01-23 09:44:45.859 222021 DEBUG nova.storage.rbd_utils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] rbd image ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:46 np0005593233 podman[243500]: 2026-01-23 09:44:46.135883361 +0000 UTC m=+0.131464888 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 04:44:46 np0005593233 nova_compute[222017]: 2026-01-23 09:44:46.447 222021 INFO nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Creating config drive at /var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8/disk.config#033[00m
Jan 23 04:44:46 np0005593233 nova_compute[222017]: 2026-01-23 09:44:46.454 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp03foksv6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:46 np0005593233 nova_compute[222017]: 2026-01-23 09:44:46.598 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp03foksv6" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:46 np0005593233 nova_compute[222017]: 2026-01-23 09:44:46.644 222021 DEBUG nova.storage.rbd_utils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] rbd image ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:46 np0005593233 nova_compute[222017]: 2026-01-23 09:44:46.650 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8/disk.config ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:46 np0005593233 nova_compute[222017]: 2026-01-23 09:44:46.878 222021 DEBUG oslo_concurrency.processutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8/disk.config ab4209ce-8ecd-48d4-9826-fe7501e19da8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:46 np0005593233 nova_compute[222017]: 2026-01-23 09:44:46.880 222021 INFO nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Deleting local config drive /var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8/disk.config because it was imported into RBD.#033[00m
Jan 23 04:44:46 np0005593233 kernel: tap05c086a7-19: entered promiscuous mode
Jan 23 04:44:46 np0005593233 NetworkManager[48871]: <info>  [1769161486.9589] manager: (tap05c086a7-19): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Jan 23 04:44:47 np0005593233 systemd-udevd[243576]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:44:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:47Z|00170|binding|INFO|Claiming lport 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa for this chassis.
Jan 23 04:44:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:47Z|00171|binding|INFO|05c086a7-19a8-4dfd-9eb7-dff8452eb4aa: Claiming fa:16:3e:f5:51:60 10.100.0.14
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.018 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:47 np0005593233 NetworkManager[48871]: <info>  [1769161487.0380] device (tap05c086a7-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:44:47 np0005593233 NetworkManager[48871]: <info>  [1769161487.0389] device (tap05c086a7-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.037 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:51:60 10.100.0.14'], port_security=['fa:16:3e:f5:51:60 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ab4209ce-8ecd-48d4-9826-fe7501e19da8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e98627d8-446e-4b60-8051-8e37123acd76', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c05913e2e5c046bf92e6c8c855833959', 'neutron:revision_number': '2', 'neutron:security_group_ids': '151c9987-f620-4133-b1a2-fc22782378bf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8711ac9b-6a17-48e3-a5cd-eace160ad350, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.039 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa in datapath e98627d8-446e-4b60-8051-8e37123acd76 bound to our chassis#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.042 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e98627d8-446e-4b60-8051-8e37123acd76#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.061 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d1fc65-e020-409e-a868-0f615dfac6d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.062 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape98627d8-41 in ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.066 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape98627d8-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.066 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[138ab629-b0cc-4a1d-bff9-2db4b236ca95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.067 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8a389747-8adb-48d7-9a2a-f62a182d5c16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 systemd-machined[190954]: New machine qemu-28-instance-00000036.
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.085 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[78436360-a9a4-4fc2-87be-9acd860d982f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:47Z|00172|binding|INFO|Setting lport 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa ovn-installed in OVS
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.095 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:47Z|00173|binding|INFO|Setting lport 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa up in Southbound
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.098 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:47 np0005593233 systemd[1]: Started Virtual Machine qemu-28-instance-00000036.
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.106 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb51022-af15-4b30-aebe-363e79ca0941]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.145 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6317502c-a304-4d5a-a230-8983d6e4a3eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 NetworkManager[48871]: <info>  [1769161487.1558] manager: (tape98627d8-40): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.155 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d13753-64c4-4f09-a88b-7ebba33c63bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.195 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8e1cb493-d732-471c-9e92-ae83a33c64b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.199 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3e7d1345-6668-4850-a3a1-47250c4c1f50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 NetworkManager[48871]: <info>  [1769161487.2253] device (tape98627d8-40): carrier: link connected
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.230 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[1efc1708-ecb3-4c78-85b7-abdf763b180b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.245 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8f798a8b-3cbd-4a59-857b-4f559d685eb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape98627d8-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:9a:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539985, 'reachable_time': 36822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243612, 'error': None, 'target': 'ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.265 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eadcb4d0-13b9-4c17-8b29-21a344745599]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:9a62'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 539985, 'tstamp': 539985}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243613, 'error': None, 'target': 'ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.294 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[02a9cbc5-cb79-4016-8a75-caf6bef24243]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape98627d8-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:9a:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539985, 'reachable_time': 36822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243614, 'error': None, 'target': 'ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:47.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.348 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb2c159-8fe9-4c28-907b-1a5fe7e3f016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.431 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4da0824e-9b20-478f-ad17-04eb66824179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.433 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape98627d8-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.433 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.434 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape98627d8-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:47 np0005593233 kernel: tape98627d8-40: entered promiscuous mode
Jan 23 04:44:47 np0005593233 NetworkManager[48871]: <info>  [1769161487.4368] manager: (tape98627d8-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.436 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.438 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.445 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape98627d8-40, col_values=(('external_ids', {'iface-id': 'cc7bf29e-8ca5-4432-828c-b26d34e969d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.446 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:44:47Z|00174|binding|INFO|Releasing lport cc7bf29e-8ca5-4432-828c-b26d34e969d3 from this chassis (sb_readonly=0)
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.447 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.450 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e98627d8-446e-4b60-8051-8e37123acd76.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e98627d8-446e-4b60-8051-8e37123acd76.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.451 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[35e037a8-6b2c-4642-b72c-a8d8faf37c23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.452 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-e98627d8-446e-4b60-8051-8e37123acd76
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/e98627d8-446e-4b60-8051-8e37123acd76.pid.haproxy
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID e98627d8-446e-4b60-8051-8e37123acd76
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:44:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:44:47.454 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76', 'env', 'PROCESS_TAG=haproxy-e98627d8-446e-4b60-8051-8e37123acd76', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e98627d8-446e-4b60-8051-8e37123acd76.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.460 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:47.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:47 np0005593233 podman[243686]: 2026-01-23 09:44:47.909476158 +0000 UTC m=+0.069287130 container create fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.953 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161487.9523396, ab4209ce-8ecd-48d4-9826-fe7501e19da8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:44:47 np0005593233 nova_compute[222017]: 2026-01-23 09:44:47.954 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] VM Started (Lifecycle Event)#033[00m
Jan 23 04:44:47 np0005593233 systemd[1]: Started libpod-conmon-fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989.scope.
Jan 23 04:44:47 np0005593233 podman[243686]: 2026-01-23 09:44:47.874756886 +0000 UTC m=+0.034567888 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:44:47 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:44:48 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c886c086d08c69367cc4c9218b52a2ad4c0ee901a27c2c0a2f268b7f4163b112/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:44:48 np0005593233 podman[243686]: 2026-01-23 09:44:48.01941463 +0000 UTC m=+0.179225602 container init fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:44:48 np0005593233 podman[243686]: 2026-01-23 09:44:48.025305018 +0000 UTC m=+0.185115990 container start fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:44:48 np0005593233 neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76[243703]: [NOTICE]   (243707) : New worker (243709) forked
Jan 23 04:44:48 np0005593233 neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76[243703]: [NOTICE]   (243707) : Loading success.
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.128 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.134 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161487.9540203, ab4209ce-8ecd-48d4-9826-fe7501e19da8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.135 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.201 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.206 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.241 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.622 222021 DEBUG nova.compute.manager [req-aa2cbdee-ff01-44a1-90e9-23be42ae383f req-e5437176-fc8b-4fb8-a87c-4f519610d99e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-vif-plugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.624 222021 DEBUG oslo_concurrency.lockutils [req-aa2cbdee-ff01-44a1-90e9-23be42ae383f req-e5437176-fc8b-4fb8-a87c-4f519610d99e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.625 222021 DEBUG oslo_concurrency.lockutils [req-aa2cbdee-ff01-44a1-90e9-23be42ae383f req-e5437176-fc8b-4fb8-a87c-4f519610d99e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.625 222021 DEBUG oslo_concurrency.lockutils [req-aa2cbdee-ff01-44a1-90e9-23be42ae383f req-e5437176-fc8b-4fb8-a87c-4f519610d99e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.626 222021 DEBUG nova.compute.manager [req-aa2cbdee-ff01-44a1-90e9-23be42ae383f req-e5437176-fc8b-4fb8-a87c-4f519610d99e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Processing event network-vif-plugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.628 222021 DEBUG nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.635 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161488.6342692, ab4209ce-8ecd-48d4-9826-fe7501e19da8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.636 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.638 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.642 222021 INFO nova.virt.libvirt.driver [-] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Instance spawned successfully.#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.642 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.666 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.672 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.673 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.674 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.674 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.674 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.675 222021 DEBUG nova.virt.libvirt.driver [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.679 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.975 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.977 222021 INFO nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Took 9.34 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:44:48 np0005593233 nova_compute[222017]: 2026-01-23 09:44:48.977 222021 DEBUG nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:44:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:44:49 np0005593233 nova_compute[222017]: 2026-01-23 09:44:49.083 222021 INFO nova.compute.manager [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Took 10.49 seconds to build instance.#033[00m
Jan 23 04:44:49 np0005593233 nova_compute[222017]: 2026-01-23 09:44:49.101 222021 DEBUG oslo_concurrency.lockutils [None req-be23915f-ede2-4609-b834-9f46b0a0165b 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:49.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:49 np0005593233 nova_compute[222017]: 2026-01-23 09:44:49.481 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:44:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:49.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:44:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:44:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:44:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:44:50 np0005593233 nova_compute[222017]: 2026-01-23 09:44:50.695 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:51 np0005593233 nova_compute[222017]: 2026-01-23 09:44:51.166 222021 DEBUG nova.compute.manager [req-3168419a-175d-47d9-807b-0c6c3e2c6c8a req-133fb275-77d8-4bcd-b5ec-3d7abef5b35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-vif-plugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:51 np0005593233 nova_compute[222017]: 2026-01-23 09:44:51.167 222021 DEBUG oslo_concurrency.lockutils [req-3168419a-175d-47d9-807b-0c6c3e2c6c8a req-133fb275-77d8-4bcd-b5ec-3d7abef5b35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:51 np0005593233 nova_compute[222017]: 2026-01-23 09:44:51.167 222021 DEBUG oslo_concurrency.lockutils [req-3168419a-175d-47d9-807b-0c6c3e2c6c8a req-133fb275-77d8-4bcd-b5ec-3d7abef5b35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:51 np0005593233 nova_compute[222017]: 2026-01-23 09:44:51.167 222021 DEBUG oslo_concurrency.lockutils [req-3168419a-175d-47d9-807b-0c6c3e2c6c8a req-133fb275-77d8-4bcd-b5ec-3d7abef5b35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:51 np0005593233 nova_compute[222017]: 2026-01-23 09:44:51.167 222021 DEBUG nova.compute.manager [req-3168419a-175d-47d9-807b-0c6c3e2c6c8a req-133fb275-77d8-4bcd-b5ec-3d7abef5b35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] No waiting events found dispatching network-vif-plugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:44:51 np0005593233 nova_compute[222017]: 2026-01-23 09:44:51.168 222021 WARNING nova.compute.manager [req-3168419a-175d-47d9-807b-0c6c3e2c6c8a req-133fb275-77d8-4bcd-b5ec-3d7abef5b35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received unexpected event network-vif-plugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa for instance with vm_state active and task_state None.#033[00m
Jan 23 04:44:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:51.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:44:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:51.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:44:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:53.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:53.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:54 np0005593233 nova_compute[222017]: 2026-01-23 09:44:54.483 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:55.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:55.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:55 np0005593233 nova_compute[222017]: 2026-01-23 09:44:55.702 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:44:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:44:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:57.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:57.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:58 np0005593233 podman[243898]: 2026-01-23 09:44:58.076320936 +0000 UTC m=+0.075709224 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 04:44:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:59.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:59 np0005593233 nova_compute[222017]: 2026-01-23 09:44:59.533 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:44:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:59.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:00 np0005593233 nova_compute[222017]: 2026-01-23 09:45:00.705 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:01.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:01.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:02 np0005593233 ovn_controller[130653]: 2026-01-23T09:45:02Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f5:51:60 10.100.0.14
Jan 23 04:45:02 np0005593233 ovn_controller[130653]: 2026-01-23T09:45:02Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f5:51:60 10.100.0.14
Jan 23 04:45:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:03.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:03.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:04 np0005593233 nova_compute[222017]: 2026-01-23 09:45:04.538 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:05.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:05.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:05 np0005593233 nova_compute[222017]: 2026-01-23 09:45:05.708 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:07.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:07.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:09.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:09 np0005593233 nova_compute[222017]: 2026-01-23 09:45:09.541 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:09.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:10 np0005593233 nova_compute[222017]: 2026-01-23 09:45:10.711 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:11.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:11 np0005593233 nova_compute[222017]: 2026-01-23 09:45:11.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e206 e206: 3 total, 3 up, 3 in
Jan 23 04:45:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:11.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:12 np0005593233 nova_compute[222017]: 2026-01-23 09:45:12.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e207 e207: 3 total, 3 up, 3 in
Jan 23 04:45:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:13.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e208 e208: 3 total, 3 up, 3 in
Jan 23 04:45:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:13.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:14 np0005593233 nova_compute[222017]: 2026-01-23 09:45:14.543 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:15.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.432 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.433 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.433 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.433 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.433 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.714 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:15.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:45:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2495700131' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:45:15 np0005593233 nova_compute[222017]: 2026-01-23 09:45:15.888 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.011 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000036 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.013 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000036 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.253 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.255 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4522MB free_disk=20.901145935058594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.255 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.255 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.399 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance ab4209ce-8ecd-48d4-9826-fe7501e19da8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.401 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.402 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.464 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.945 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.953 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:45:16 np0005593233 nova_compute[222017]: 2026-01-23 09:45:16.976 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:45:17 np0005593233 nova_compute[222017]: 2026-01-23 09:45:17.001 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:45:17 np0005593233 nova_compute[222017]: 2026-01-23 09:45:17.002 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.746s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:45:17 np0005593233 podman[243965]: 2026-01-23 09:45:17.120637418 +0000 UTC m=+0.118338253 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:45:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:17.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:17.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:18 np0005593233 nova_compute[222017]: 2026-01-23 09:45:18.002 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:18 np0005593233 nova_compute[222017]: 2026-01-23 09:45:18.003 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:45:18 np0005593233 nova_compute[222017]: 2026-01-23 09:45:18.004 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:45:18 np0005593233 nova_compute[222017]: 2026-01-23 09:45:18.460 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:45:18 np0005593233 nova_compute[222017]: 2026-01-23 09:45:18.461 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:45:18 np0005593233 nova_compute[222017]: 2026-01-23 09:45:18.461 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:45:18 np0005593233 nova_compute[222017]: 2026-01-23 09:45:18.462 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ab4209ce-8ecd-48d4-9826-fe7501e19da8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:45:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:19.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:19 np0005593233 nova_compute[222017]: 2026-01-23 09:45:19.546 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:19.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e209 e209: 3 total, 3 up, 3 in
Jan 23 04:45:20 np0005593233 nova_compute[222017]: 2026-01-23 09:45:20.718 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:21.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:21.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:45:21 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2918311962' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:45:21 np0005593233 nova_compute[222017]: 2026-01-23 09:45:21.920 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updating instance_info_cache with network_info: [{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:45:21 np0005593233 nova_compute[222017]: 2026-01-23 09:45:21.960 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:45:21 np0005593233 nova_compute[222017]: 2026-01-23 09:45:21.960 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:45:21 np0005593233 nova_compute[222017]: 2026-01-23 09:45:21.961 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:23.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:23.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e210 e210: 3 total, 3 up, 3 in
Jan 23 04:45:24 np0005593233 nova_compute[222017]: 2026-01-23 09:45:24.551 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:25 np0005593233 nova_compute[222017]: 2026-01-23 09:45:25.017 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:25 np0005593233 NetworkManager[48871]: <info>  [1769161525.0211] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 23 04:45:25 np0005593233 NetworkManager[48871]: <info>  [1769161525.0234] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 23 04:45:25 np0005593233 nova_compute[222017]: 2026-01-23 09:45:25.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:25 np0005593233 ovn_controller[130653]: 2026-01-23T09:45:25Z|00175|binding|INFO|Releasing lport cc7bf29e-8ca5-4432-828c-b26d34e969d3 from this chassis (sb_readonly=0)
Jan 23 04:45:25 np0005593233 nova_compute[222017]: 2026-01-23 09:45:25.175 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:25.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e211 e211: 3 total, 3 up, 3 in
Jan 23 04:45:25 np0005593233 nova_compute[222017]: 2026-01-23 09:45:25.720 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:25.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:25 np0005593233 nova_compute[222017]: 2026-01-23 09:45:25.778 222021 DEBUG nova.compute.manager [req-7be2ea0f-719b-485d-8764-70f17123095c req-feb72b2e-cf4e-4e3f-a5b7-37f6e644b81c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:45:25 np0005593233 nova_compute[222017]: 2026-01-23 09:45:25.779 222021 DEBUG nova.compute.manager [req-7be2ea0f-719b-485d-8764-70f17123095c req-feb72b2e-cf4e-4e3f-a5b7-37f6e644b81c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing instance network info cache due to event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:45:25 np0005593233 nova_compute[222017]: 2026-01-23 09:45:25.779 222021 DEBUG oslo_concurrency.lockutils [req-7be2ea0f-719b-485d-8764-70f17123095c req-feb72b2e-cf4e-4e3f-a5b7-37f6e644b81c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:45:25 np0005593233 nova_compute[222017]: 2026-01-23 09:45:25.779 222021 DEBUG oslo_concurrency.lockutils [req-7be2ea0f-719b-485d-8764-70f17123095c req-feb72b2e-cf4e-4e3f-a5b7-37f6e644b81c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:45:25 np0005593233 nova_compute[222017]: 2026-01-23 09:45:25.779 222021 DEBUG nova.network.neutron [req-7be2ea0f-719b-485d-8764-70f17123095c req-feb72b2e-cf4e-4e3f-a5b7-37f6e644b81c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:45:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e212 e212: 3 total, 3 up, 3 in
Jan 23 04:45:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:27.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:27.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:28 np0005593233 nova_compute[222017]: 2026-01-23 09:45:28.338 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:28 np0005593233 nova_compute[222017]: 2026-01-23 09:45:28.339 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:29 np0005593233 podman[243992]: 2026-01-23 09:45:29.063969014 +0000 UTC m=+0.068586201 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 04:45:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:29.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:29 np0005593233 nova_compute[222017]: 2026-01-23 09:45:29.553 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:29.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:30 np0005593233 nova_compute[222017]: 2026-01-23 09:45:30.046 222021 DEBUG nova.network.neutron [req-7be2ea0f-719b-485d-8764-70f17123095c req-feb72b2e-cf4e-4e3f-a5b7-37f6e644b81c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updated VIF entry in instance network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:45:30 np0005593233 nova_compute[222017]: 2026-01-23 09:45:30.046 222021 DEBUG nova.network.neutron [req-7be2ea0f-719b-485d-8764-70f17123095c req-feb72b2e-cf4e-4e3f-a5b7-37f6e644b81c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updating instance_info_cache with network_info: [{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:45:30 np0005593233 nova_compute[222017]: 2026-01-23 09:45:30.686 222021 DEBUG oslo_concurrency.lockutils [req-7be2ea0f-719b-485d-8764-70f17123095c req-feb72b2e-cf4e-4e3f-a5b7-37f6e644b81c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:45:30 np0005593233 nova_compute[222017]: 2026-01-23 09:45:30.723 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:30 np0005593233 nova_compute[222017]: 2026-01-23 09:45:30.835 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:30 np0005593233 ovn_controller[130653]: 2026-01-23T09:45:30Z|00176|binding|INFO|Releasing lport cc7bf29e-8ca5-4432-828c-b26d34e969d3 from this chassis (sb_readonly=0)
Jan 23 04:45:30 np0005593233 nova_compute[222017]: 2026-01-23 09:45:30.877 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:31.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:31.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:33.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:33.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e213 e213: 3 total, 3 up, 3 in
Jan 23 04:45:34 np0005593233 nova_compute[222017]: 2026-01-23 09:45:34.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:35.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:35.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:35 np0005593233 nova_compute[222017]: 2026-01-23 09:45:35.760 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:36 np0005593233 nova_compute[222017]: 2026-01-23 09:45:36.456 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:45:36.460 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:45:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:45:36.461 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:45:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e214 e214: 3 total, 3 up, 3 in
Jan 23 04:45:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:37.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:37.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:38 np0005593233 nova_compute[222017]: 2026-01-23 09:45:38.241 222021 DEBUG nova.compute.manager [req-99ebfc59-3f1f-4da4-afb9-4dabd299a9c7 req-7a6362c1-48db-42f5-a513-fff1700ff79e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:45:38 np0005593233 nova_compute[222017]: 2026-01-23 09:45:38.241 222021 DEBUG nova.compute.manager [req-99ebfc59-3f1f-4da4-afb9-4dabd299a9c7 req-7a6362c1-48db-42f5-a513-fff1700ff79e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing instance network info cache due to event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:45:38 np0005593233 nova_compute[222017]: 2026-01-23 09:45:38.241 222021 DEBUG oslo_concurrency.lockutils [req-99ebfc59-3f1f-4da4-afb9-4dabd299a9c7 req-7a6362c1-48db-42f5-a513-fff1700ff79e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:45:38 np0005593233 nova_compute[222017]: 2026-01-23 09:45:38.241 222021 DEBUG oslo_concurrency.lockutils [req-99ebfc59-3f1f-4da4-afb9-4dabd299a9c7 req-7a6362c1-48db-42f5-a513-fff1700ff79e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:45:38 np0005593233 nova_compute[222017]: 2026-01-23 09:45:38.242 222021 DEBUG nova.network.neutron [req-99ebfc59-3f1f-4da4-afb9-4dabd299a9c7 req-7a6362c1-48db-42f5-a513-fff1700ff79e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:45:38 np0005593233 nova_compute[222017]: 2026-01-23 09:45:38.998 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:39.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:39 np0005593233 nova_compute[222017]: 2026-01-23 09:45:39.559 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:39.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:40 np0005593233 nova_compute[222017]: 2026-01-23 09:45:40.763 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:41 np0005593233 nova_compute[222017]: 2026-01-23 09:45:41.094 222021 DEBUG nova.network.neutron [req-99ebfc59-3f1f-4da4-afb9-4dabd299a9c7 req-7a6362c1-48db-42f5-a513-fff1700ff79e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updated VIF entry in instance network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:45:41 np0005593233 nova_compute[222017]: 2026-01-23 09:45:41.095 222021 DEBUG nova.network.neutron [req-99ebfc59-3f1f-4da4-afb9-4dabd299a9c7 req-7a6362c1-48db-42f5-a513-fff1700ff79e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updating instance_info_cache with network_info: [{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:45:41 np0005593233 nova_compute[222017]: 2026-01-23 09:45:41.130 222021 DEBUG oslo_concurrency.lockutils [req-99ebfc59-3f1f-4da4-afb9-4dabd299a9c7 req-7a6362c1-48db-42f5-a513-fff1700ff79e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:45:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:41.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:45:41.463 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:45:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:41.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:45:42.648 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:45:42.649 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:45:42.650 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:45:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:43.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:43.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e215 e215: 3 total, 3 up, 3 in
Jan 23 04:45:44 np0005593233 nova_compute[222017]: 2026-01-23 09:45:44.562 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:45.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:45 np0005593233 nova_compute[222017]: 2026-01-23 09:45:45.764 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:45.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:47.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:47.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:48 np0005593233 podman[244011]: 2026-01-23 09:45:48.090269875 +0000 UTC m=+0.104287871 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 04:45:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:49.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:49 np0005593233 nova_compute[222017]: 2026-01-23 09:45:49.598 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e216 e216: 3 total, 3 up, 3 in
Jan 23 04:45:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:49.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:50 np0005593233 nova_compute[222017]: 2026-01-23 09:45:50.823 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:51.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:51 np0005593233 ovn_controller[130653]: 2026-01-23T09:45:51Z|00177|binding|INFO|Releasing lport cc7bf29e-8ca5-4432-828c-b26d34e969d3 from this chassis (sb_readonly=0)
Jan 23 04:45:51 np0005593233 nova_compute[222017]: 2026-01-23 09:45:51.500 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:51 np0005593233 ovn_controller[130653]: 2026-01-23T09:45:51Z|00178|binding|INFO|Releasing lport cc7bf29e-8ca5-4432-828c-b26d34e969d3 from this chassis (sb_readonly=0)
Jan 23 04:45:51 np0005593233 nova_compute[222017]: 2026-01-23 09:45:51.653 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:51.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:53.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e217 e217: 3 total, 3 up, 3 in
Jan 23 04:45:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:53.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:54 np0005593233 nova_compute[222017]: 2026-01-23 09:45:54.601 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:55.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:55.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:55 np0005593233 nova_compute[222017]: 2026-01-23 09:45:55.825 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:45:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:57.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:45:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:57.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:45:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:45:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:45:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e218 e218: 3 total, 3 up, 3 in
Jan 23 04:45:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:45:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:59.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:45:59 np0005593233 nova_compute[222017]: 2026-01-23 09:45:59.603 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:45:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:59.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:00 np0005593233 podman[244171]: 2026-01-23 09:46:00.071684997 +0000 UTC m=+0.068486978 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 04:46:00 np0005593233 nova_compute[222017]: 2026-01-23 09:46:00.827 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:01.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:01.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:02 np0005593233 nova_compute[222017]: 2026-01-23 09:46:02.708 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:02 np0005593233 NetworkManager[48871]: <info>  [1769161562.7108] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 23 04:46:02 np0005593233 NetworkManager[48871]: <info>  [1769161562.7125] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 23 04:46:02 np0005593233 nova_compute[222017]: 2026-01-23 09:46:02.771 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:02 np0005593233 ovn_controller[130653]: 2026-01-23T09:46:02Z|00179|binding|INFO|Releasing lport cc7bf29e-8ca5-4432-828c-b26d34e969d3 from this chassis (sb_readonly=0)
Jan 23 04:46:02 np0005593233 nova_compute[222017]: 2026-01-23 09:46:02.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:03.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:03.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:46:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:46:04 np0005593233 nova_compute[222017]: 2026-01-23 09:46:04.606 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:46:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:05.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:46:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:05.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:05 np0005593233 nova_compute[222017]: 2026-01-23 09:46:05.830 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:06 np0005593233 nova_compute[222017]: 2026-01-23 09:46:06.897 222021 DEBUG nova.compute.manager [req-4a090a7a-d79c-4818-892f-bf235828ad64 req-08980cd6-7edb-4ffc-95f1-cc588593fc3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:06 np0005593233 nova_compute[222017]: 2026-01-23 09:46:06.898 222021 DEBUG nova.compute.manager [req-4a090a7a-d79c-4818-892f-bf235828ad64 req-08980cd6-7edb-4ffc-95f1-cc588593fc3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing instance network info cache due to event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:46:06 np0005593233 nova_compute[222017]: 2026-01-23 09:46:06.898 222021 DEBUG oslo_concurrency.lockutils [req-4a090a7a-d79c-4818-892f-bf235828ad64 req-08980cd6-7edb-4ffc-95f1-cc588593fc3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:46:06 np0005593233 nova_compute[222017]: 2026-01-23 09:46:06.899 222021 DEBUG oslo_concurrency.lockutils [req-4a090a7a-d79c-4818-892f-bf235828ad64 req-08980cd6-7edb-4ffc-95f1-cc588593fc3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:46:06 np0005593233 nova_compute[222017]: 2026-01-23 09:46:06.899 222021 DEBUG nova.network.neutron [req-4a090a7a-d79c-4818-892f-bf235828ad64 req-08980cd6-7edb-4ffc-95f1-cc588593fc3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:46:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:07.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:07.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:09.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:09 np0005593233 nova_compute[222017]: 2026-01-23 09:46:09.609 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:09.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:10 np0005593233 nova_compute[222017]: 2026-01-23 09:46:10.832 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:11 np0005593233 nova_compute[222017]: 2026-01-23 09:46:11.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:11.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:11.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:13 np0005593233 nova_compute[222017]: 2026-01-23 09:46:13.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:13.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:46:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:13.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.364002) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574364142, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2263, "num_deletes": 263, "total_data_size": 5090863, "memory_usage": 5171152, "flush_reason": "Manual Compaction"}
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574471550, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3336685, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35918, "largest_seqno": 38176, "table_properties": {"data_size": 3327268, "index_size": 5911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19761, "raw_average_key_size": 20, "raw_value_size": 3308268, "raw_average_value_size": 3446, "num_data_blocks": 256, "num_entries": 960, "num_filter_entries": 960, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161403, "oldest_key_time": 1769161403, "file_creation_time": 1769161574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 107642 microseconds, and 14525 cpu microseconds.
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.471636) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3336685 bytes OK
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.471664) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.541224) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.541279) EVENT_LOG_v1 {"time_micros": 1769161574541268, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.541304) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 5080567, prev total WAL file size 5080567, number of live WAL files 2.
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.543035) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303033' seq:72057594037927935, type:22 .. '6C6F676D0031323535' seq:0, type:0; will stop at (end)
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(3258KB)], [69(8131KB)]
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574543178, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 11663788, "oldest_snapshot_seqno": -1}
Jan 23 04:46:14 np0005593233 nova_compute[222017]: 2026-01-23 09:46:14.612 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6250 keys, 11511414 bytes, temperature: kUnknown
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574778257, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 11511414, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11467527, "index_size": 27137, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15685, "raw_key_size": 159743, "raw_average_key_size": 25, "raw_value_size": 11353245, "raw_average_value_size": 1816, "num_data_blocks": 1096, "num_entries": 6250, "num_filter_entries": 6250, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.778579) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 11511414 bytes
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.829734) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 49.6 rd, 49.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 7.9 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(6.9) write-amplify(3.4) OK, records in: 6790, records dropped: 540 output_compression: NoCompression
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.829786) EVENT_LOG_v1 {"time_micros": 1769161574829768, "job": 42, "event": "compaction_finished", "compaction_time_micros": 235162, "compaction_time_cpu_micros": 47312, "output_level": 6, "num_output_files": 1, "total_output_size": 11511414, "num_input_records": 6790, "num_output_records": 6250, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574830585, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574832116, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.542776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.832202) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.832207) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.832209) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.832211) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:14.832213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:15.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:15.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:15 np0005593233 nova_compute[222017]: 2026-01-23 09:46:15.834 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:16 np0005593233 nova_compute[222017]: 2026-01-23 09:46:16.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:16 np0005593233 nova_compute[222017]: 2026-01-23 09:46:16.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:17.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.498 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.499 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.500 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.500 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.501 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.538 222021 DEBUG nova.network.neutron [req-4a090a7a-d79c-4818-892f-bf235828ad64 req-08980cd6-7edb-4ffc-95f1-cc588593fc3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updated VIF entry in instance network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.541 222021 DEBUG nova.network.neutron [req-4a090a7a-d79c-4818-892f-bf235828ad64 req-08980cd6-7edb-4ffc-95f1-cc588593fc3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updating instance_info_cache with network_info: [{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:17 np0005593233 nova_compute[222017]: 2026-01-23 09:46:17.600 222021 DEBUG oslo_concurrency.lockutils [req-4a090a7a-d79c-4818-892f-bf235828ad64 req-08980cd6-7edb-4ffc-95f1-cc588593fc3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:46:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:17.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:46:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/600250705' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:46:18 np0005593233 nova_compute[222017]: 2026-01-23 09:46:18.361 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.860s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:19 np0005593233 podman[244265]: 2026-01-23 09:46:19.116229915 +0000 UTC m=+0.117695664 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 23 04:46:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:19.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:19 np0005593233 nova_compute[222017]: 2026-01-23 09:46:19.613 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:46:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:19.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:46:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:20 np0005593233 nova_compute[222017]: 2026-01-23 09:46:20.838 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.073 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Acquiring lock "14f26d33-78bc-4b9c-9b73-f660998601ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.074 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "14f26d33-78bc-4b9c-9b73-f660998601ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.188 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000036 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.188 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000036 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.204 222021 DEBUG nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.392 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.393 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.401 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.401 222021 INFO nova.compute.claims [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.406 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.406 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4536MB free_disk=20.922027587890625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.407 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:21.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:21 np0005593233 nova_compute[222017]: 2026-01-23 09:46:21.694 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:21.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:46:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/807979017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:46:22 np0005593233 nova_compute[222017]: 2026-01-23 09:46:22.183 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:22 np0005593233 nova_compute[222017]: 2026-01-23 09:46:22.194 222021 DEBUG nova.compute.provider_tree [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:46:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:46:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:23.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:46:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:23.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.277 222021 DEBUG nova.scheduler.client.report [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.773 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.380s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.774 222021 DEBUG nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.778 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 3.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.902 222021 DEBUG nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.903 222021 DEBUG nova.network.neutron [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.930 222021 INFO nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.962 222021 DEBUG nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.971 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance ab4209ce-8ecd-48d4-9826-fe7501e19da8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.971 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 14f26d33-78bc-4b9c-9b73-f660998601ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.971 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:46:24 np0005593233 nova_compute[222017]: 2026-01-23 09:46:24.972 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:46:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.087 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.242 222021 DEBUG nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.245 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.246 222021 INFO nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Creating image(s)#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.302 222021 DEBUG nova.storage.rbd_utils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] rbd image 14f26d33-78bc-4b9c-9b73-f660998601ab_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.341 222021 DEBUG nova.storage.rbd_utils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] rbd image 14f26d33-78bc-4b9c-9b73-f660998601ab_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.375 222021 DEBUG nova.storage.rbd_utils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] rbd image 14f26d33-78bc-4b9c-9b73-f660998601ab_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.381 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.473 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.475 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.477 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.477 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:25.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.528 222021 DEBUG nova.storage.rbd_utils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] rbd image 14f26d33-78bc-4b9c-9b73-f660998601ab_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.533 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 14f26d33-78bc-4b9c-9b73-f660998601ab_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:46:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1331500184' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.637 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.550s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.644 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.687 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.689 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.690 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:25.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.842 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.900 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 14f26d33-78bc-4b9c-9b73-f660998601ab_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.367s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.982 222021 DEBUG nova.network.neutron [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.983 222021 DEBUG nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:46:25 np0005593233 nova_compute[222017]: 2026-01-23 09:46:25.988 222021 DEBUG nova.storage.rbd_utils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] resizing rbd image 14f26d33-78bc-4b9c-9b73-f660998601ab_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.106 222021 DEBUG nova.objects.instance [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lazy-loading 'migration_context' on Instance uuid 14f26d33-78bc-4b9c-9b73-f660998601ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.141 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.141 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Ensure instance console log exists: /var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.142 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.142 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.142 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.144 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.149 222021 WARNING nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.162 222021 DEBUG nova.virt.libvirt.host [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.163 222021 DEBUG nova.virt.libvirt.host [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.169 222021 DEBUG nova.virt.libvirt.host [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.169 222021 DEBUG nova.virt.libvirt.host [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.170 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.171 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.171 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.171 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.172 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.172 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.172 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.173 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.173 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.173 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.173 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.174 222021 DEBUG nova.virt.hardware [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.177 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:46:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3326229861' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.674 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.714 222021 DEBUG nova.storage.rbd_utils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] rbd image 14f26d33-78bc-4b9c-9b73-f660998601ab_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.721 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.752 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.754 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.755 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.755 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:46:26 np0005593233 nova_compute[222017]: 2026-01-23 09:46:26.941 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:46:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:46:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4137341398' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.194 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.197 222021 DEBUG nova.objects.instance [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lazy-loading 'pci_devices' on Instance uuid 14f26d33-78bc-4b9c-9b73-f660998601ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.235 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <uuid>14f26d33-78bc-4b9c-9b73-f660998601ab</uuid>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <name>instance-0000003b</name>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <nova:name>tempest-ListImageFiltersTestJSON-server-115831291</nova:name>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:46:26</nova:creationTime>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <nova:user uuid="8d1d7c58442749759ba7dc3a19799796">tempest-ListImageFiltersTestJSON-1689583115-project-member</nova:user>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <nova:project uuid="5d69aaa276f94de98e4011fa17428b40">tempest-ListImageFiltersTestJSON-1689583115</nova:project>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <entry name="serial">14f26d33-78bc-4b9c-9b73-f660998601ab</entry>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <entry name="uuid">14f26d33-78bc-4b9c-9b73-f660998601ab</entry>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/14f26d33-78bc-4b9c-9b73-f660998601ab_disk">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/14f26d33-78bc-4b9c-9b73-f660998601ab_disk.config">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab/console.log" append="off"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:46:27 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:46:27 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:46:27 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:46:27 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
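The `_get_guest_xml` dump above is a complete libvirt domain definition: KVM guest, Nehalem custom CPU model, two RBD-backed disks (`vda` root on virtio, `sda` config drive on SATA), serial console, VNC, and a q35 PCIe topology. When triaging such dumps, pulling the key fields out programmatically beats eyeballing 120 lines. A minimal sketch with the stdlib XML parser, run here against a trimmed copy of the domain above (the sample string and `summarize_domain` helper are mine, for illustration):

```python
import xml.etree.ElementTree as ET

# Trimmed-down copy of the domain XML logged above (illustrative only).
DOMAIN_XML = """<domain type="kvm">
  <uuid>14f26d33-78bc-4b9c-9b73-f660998601ab</uuid>
  <name>instance-0000003b</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <devices>
    <disk type="network" device="disk">
      <target dev="vda" bus="virtio"/>
    </disk>
    <disk type="network" device="cdrom">
      <target dev="sda" bus="sata"/>
    </disk>
  </devices>
</domain>"""

def summarize_domain(xml_text):
    """Extract the fields most often needed when reviewing a guest definition."""
    root = ET.fromstring(xml_text)
    return {
        "uuid": root.findtext("uuid"),
        "name": root.findtext("name"),
        "memory_kib": int(root.findtext("memory")),
        "vcpus": int(root.findtext("vcpu")),
        "disks": [d.find("target").get("dev")
                  for d in root.findall("./devices/disk")],
    }
```

Note that libvirt's bare `<memory>` is in KiB, so the 131072 here is the flavor's 128 MiB — an easy unit to misread when comparing against `nova:memory` (MiB) in the metadata block.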
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.319 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.320 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.321 222021 INFO nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Using config drive#033[00m
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.357 222021 DEBUG nova.storage.rbd_utils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] rbd image 14f26d33-78bc-4b9c-9b73-f660998601ab_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:27.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.512 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.513 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.513 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:46:27 np0005593233 nova_compute[222017]: 2026-01-23 09:46:27.513 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ab4209ce-8ecd-48d4-9826-fe7501e19da8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:46:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:27.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.145 222021 INFO nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Creating config drive at /var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab/disk.config#033[00m
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.150 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_raiahyb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.287 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_raiahyb" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.328 222021 DEBUG nova.storage.rbd_utils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] rbd image 14f26d33-78bc-4b9c-9b73-f660998601ab_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.334 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab/disk.config 14f26d33-78bc-4b9c-9b73-f660998601ab_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.544 222021 DEBUG oslo_concurrency.processutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab/disk.config 14f26d33-78bc-4b9c-9b73-f660998601ab_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
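The config-drive path above runs two subprocesses — `mkisofs` to build the ISO (0.137s) and `rbd import` to push it into the `vms` pool (0.210s) — each bracketed by an `oslo_concurrency.processutils` "Running cmd" / `CMD ... returned: <rc> in <t>s` pair. Extracting those result lines gives a quick per-command latency and exit-code report. A sketch matched to this log's format (the regex and `parse_cmd_result` helper are my own, not part of oslo.concurrency):

```python
import re

# Matches the processutils completion lines in the journal above:
#   CMD "<command>" returned: <rc> in <t>s
CMD_RE = re.compile(r'CMD "(?P<cmd>[^"]+)" returned: (?P<rc>\d+) '
                    r'in (?P<secs>[\d.]+)s')

def parse_cmd_result(line):
    """Return (program, exit code, seconds) for a completion line, else None."""
    m = CMD_RE.search(line)
    if not m:
        return None
    program = m.group("cmd").split()[0]
    return program, int(m.group("rc")), float(m.group("secs"))
```

Grouping the results by program name makes the slow external calls obvious — in this boot, the two ~0.5s `ceph mon dump` invocations dominate the command time, not the rbd import itself.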
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.546 222021 INFO nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Deleting local config drive /var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab/disk.config because it was imported into RBD.#033[00m
Jan 23 04:46:28 np0005593233 systemd-machined[190954]: New machine qemu-29-instance-0000003b.
Jan 23 04:46:28 np0005593233 systemd[1]: Started Virtual Machine qemu-29-instance-0000003b.
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.773 222021 DEBUG nova.compute.manager [req-23eb8e09-2ed5-4d36-895c-0f42243bf868 req-4c8db2be-6dcc-46b5-8e22-1f7415b21ff3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.775 222021 DEBUG nova.compute.manager [req-23eb8e09-2ed5-4d36-895c-0f42243bf868 req-4c8db2be-6dcc-46b5-8e22-1f7415b21ff3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing instance network info cache due to event network-changed-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:46:28 np0005593233 nova_compute[222017]: 2026-01-23 09:46:28.776 222021 DEBUG oslo_concurrency.lockutils [req-23eb8e09-2ed5-4d36-895c-0f42243bf868 req-4c8db2be-6dcc-46b5-8e22-1f7415b21ff3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.342 222021 DEBUG nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.343 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.343 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161589.3420298, 14f26d33-78bc-4b9c-9b73-f660998601ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.343 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.348 222021 INFO nova.virt.libvirt.driver [-] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Instance spawned successfully.#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.349 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:46:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:46:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:29.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.619 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.832 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.836 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:46:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:46:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:29.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.851 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.852 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.853 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.853 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.854 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:29 np0005593233 nova_compute[222017]: 2026-01-23 09:46:29.854 222021 DEBUG nova.virt.libvirt.driver [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.190 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.191 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161589.3422878, 14f26d33-78bc-4b9c-9b73-f660998601ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.191 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] VM Started (Lifecycle Event)#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.230 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.249 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.253 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.326 222021 INFO nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Took 5.08 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.326 222021 DEBUG nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.411 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.585 222021 INFO nova.compute.manager [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Took 9.24 seconds to build instance.#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.619 222021 DEBUG oslo_concurrency.lockutils [None req-1b2589c3-7d20-4cea-9576-7947aace6b90 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "14f26d33-78bc-4b9c-9b73-f660998601ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:30 np0005593233 nova_compute[222017]: 2026-01-23 09:46:30.845 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:31 np0005593233 podman[244680]: 2026-01-23 09:46:31.082097726 +0000 UTC m=+0.089542459 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 04:46:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:31.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:31.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:32 np0005593233 nova_compute[222017]: 2026-01-23 09:46:32.473 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updating instance_info_cache with network_info: [{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:32 np0005593233 nova_compute[222017]: 2026-01-23 09:46:32.501 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:46:32 np0005593233 nova_compute[222017]: 2026-01-23 09:46:32.502 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:46:32 np0005593233 nova_compute[222017]: 2026-01-23 09:46:32.502 222021 DEBUG oslo_concurrency.lockutils [req-23eb8e09-2ed5-4d36-895c-0f42243bf868 req-4c8db2be-6dcc-46b5-8e22-1f7415b21ff3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:46:32 np0005593233 nova_compute[222017]: 2026-01-23 09:46:32.503 222021 DEBUG nova.network.neutron [req-23eb8e09-2ed5-4d36-895c-0f42243bf868 req-4c8db2be-6dcc-46b5-8e22-1f7415b21ff3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Refreshing network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:46:32 np0005593233 nova_compute[222017]: 2026-01-23 09:46:32.504 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.612895) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592612955, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 443, "num_deletes": 251, "total_data_size": 478094, "memory_usage": 486696, "flush_reason": "Manual Compaction"}
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592618053, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 315025, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38181, "largest_seqno": 38619, "table_properties": {"data_size": 312605, "index_size": 520, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6165, "raw_average_key_size": 18, "raw_value_size": 307724, "raw_average_value_size": 943, "num_data_blocks": 23, "num_entries": 326, "num_filter_entries": 326, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161574, "oldest_key_time": 1769161574, "file_creation_time": 1769161592, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 5174 microseconds, and 1634 cpu microseconds.
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.618089) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 315025 bytes OK
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.618109) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.619999) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.620016) EVENT_LOG_v1 {"time_micros": 1769161592620010, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.620035) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 475306, prev total WAL file size 475306, number of live WAL files 2.
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.620606) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(307KB)], [72(10MB)]
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592620712, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 11826439, "oldest_snapshot_seqno": -1}
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6066 keys, 10006137 bytes, temperature: kUnknown
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592753535, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 10006137, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9964802, "index_size": 25097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 156654, "raw_average_key_size": 25, "raw_value_size": 9854960, "raw_average_value_size": 1624, "num_data_blocks": 1003, "num_entries": 6066, "num_filter_entries": 6066, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161592, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.753830) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 10006137 bytes
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.755852) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 89.0 rd, 75.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 11.0 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(69.3) write-amplify(31.8) OK, records in: 6576, records dropped: 510 output_compression: NoCompression
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.755879) EVENT_LOG_v1 {"time_micros": 1769161592755866, "job": 44, "event": "compaction_finished", "compaction_time_micros": 132910, "compaction_time_cpu_micros": 48939, "output_level": 6, "num_output_files": 1, "total_output_size": 10006137, "num_input_records": 6576, "num_output_records": 6066, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592756160, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592758499, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.620455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.758642) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.758655) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.758658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.758663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:46:32.758666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:33.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:33.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:34 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Jan 23 04:46:34 np0005593233 nova_compute[222017]: 2026-01-23 09:46:34.623 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:35.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:35.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:35 np0005593233 nova_compute[222017]: 2026-01-23 09:46:35.884 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:36 np0005593233 nova_compute[222017]: 2026-01-23 09:46:36.186 222021 DEBUG nova.network.neutron [req-23eb8e09-2ed5-4d36-895c-0f42243bf868 req-4c8db2be-6dcc-46b5-8e22-1f7415b21ff3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updated VIF entry in instance network info cache for port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:46:36 np0005593233 nova_compute[222017]: 2026-01-23 09:46:36.187 222021 DEBUG nova.network.neutron [req-23eb8e09-2ed5-4d36-895c-0f42243bf868 req-4c8db2be-6dcc-46b5-8e22-1f7415b21ff3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updating instance_info_cache with network_info: [{"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:36 np0005593233 nova_compute[222017]: 2026-01-23 09:46:36.216 222021 DEBUG oslo_concurrency.lockutils [req-23eb8e09-2ed5-4d36-895c-0f42243bf868 req-4c8db2be-6dcc-46b5-8e22-1f7415b21ff3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ab4209ce-8ecd-48d4-9826-fe7501e19da8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:46:36 np0005593233 nova_compute[222017]: 2026-01-23 09:46:36.547 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:36.548 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:46:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:36.551 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:46:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e219 e219: 3 total, 3 up, 3 in
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.460 222021 DEBUG oslo_concurrency.lockutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquiring lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.461 222021 DEBUG oslo_concurrency.lockutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.461 222021 DEBUG oslo_concurrency.lockutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquiring lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.461 222021 DEBUG oslo_concurrency.lockutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.462 222021 DEBUG oslo_concurrency.lockutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.463 222021 INFO nova.compute.manager [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Terminating instance#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.463 222021 DEBUG nova.compute.manager [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:46:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:37.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:37 np0005593233 kernel: tap05c086a7-19 (unregistering): left promiscuous mode
Jan 23 04:46:37 np0005593233 NetworkManager[48871]: <info>  [1769161597.5372] device (tap05c086a7-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:46:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:46:37Z|00180|binding|INFO|Releasing lport 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa from this chassis (sb_readonly=0)
Jan 23 04:46:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:46:37Z|00181|binding|INFO|Setting lport 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa down in Southbound
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.551 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:46:37Z|00182|binding|INFO|Removing iface tap05c086a7-19 ovn-installed in OVS
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.553 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:37.554 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.556 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:37.571 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:51:60 10.100.0.14'], port_security=['fa:16:3e:f5:51:60 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ab4209ce-8ecd-48d4-9826-fe7501e19da8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e98627d8-446e-4b60-8051-8e37123acd76', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c05913e2e5c046bf92e6c8c855833959', 'neutron:revision_number': '4', 'neutron:security_group_ids': '151c9987-f620-4133-b1a2-fc22782378bf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8711ac9b-6a17-48e3-a5cd-eace160ad350, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:46:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:37.573 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa in datapath e98627d8-446e-4b60-8051-8e37123acd76 unbound from our chassis#033[00m
Jan 23 04:46:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:37.578 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e98627d8-446e-4b60-8051-8e37123acd76, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:46:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:37.580 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e4444f4e-7202-47b0-8607-5cce89648a0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:37.581 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76 namespace which is not needed anymore#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.593 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593233 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000036.scope: Deactivated successfully.
Jan 23 04:46:37 np0005593233 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000036.scope: Consumed 19.131s CPU time.
Jan 23 04:46:37 np0005593233 systemd-machined[190954]: Machine qemu-28-instance-00000036 terminated.
Jan 23 04:46:37 np0005593233 kernel: tap05c086a7-19: entered promiscuous mode
Jan 23 04:46:37 np0005593233 NetworkManager[48871]: <info>  [1769161597.6853] manager: (tap05c086a7-19): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Jan 23 04:46:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:46:37Z|00183|binding|INFO|Claiming lport 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa for this chassis.
Jan 23 04:46:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:46:37Z|00184|binding|INFO|05c086a7-19a8-4dfd-9eb7-dff8452eb4aa: Claiming fa:16:3e:f5:51:60 10.100.0.14
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593233 kernel: tap05c086a7-19 (unregistering): left promiscuous mode
Jan 23 04:46:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:37.824 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:51:60 10.100.0.14'], port_security=['fa:16:3e:f5:51:60 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ab4209ce-8ecd-48d4-9826-fe7501e19da8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e98627d8-446e-4b60-8051-8e37123acd76', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c05913e2e5c046bf92e6c8c855833959', 'neutron:revision_number': '4', 'neutron:security_group_ids': '151c9987-f620-4133-b1a2-fc22782378bf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8711ac9b-6a17-48e3-a5cd-eace160ad350, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.844 222021 INFO nova.virt.libvirt.driver [-] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Instance destroyed successfully.#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.845 222021 DEBUG nova.objects.instance [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lazy-loading 'resources' on Instance uuid ab4209ce-8ecd-48d4-9826-fe7501e19da8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:46:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:46:37Z|00185|binding|INFO|Releasing lport 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa from this chassis (sb_readonly=0)
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.860 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e220 e220: 3 total, 3 up, 3 in
Jan 23 04:46:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:37.865 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:51:60 10.100.0.14'], port_security=['fa:16:3e:f5:51:60 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ab4209ce-8ecd-48d4-9826-fe7501e19da8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e98627d8-446e-4b60-8051-8e37123acd76', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c05913e2e5c046bf92e6c8c855833959', 'neutron:revision_number': '4', 'neutron:security_group_ids': '151c9987-f620-4133-b1a2-fc22782378bf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8711ac9b-6a17-48e3-a5cd-eace160ad350, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.865 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.873 222021 DEBUG nova.virt.libvirt.vif [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:44:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1741184475',display_name='tempest-FloatingIPsAssociationTestJSON-server-1741184475',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1741184475',id=54,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:44:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c05913e2e5c046bf92e6c8c855833959',ramdisk_id='',reservation_id='r-e8i000mq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1969273951',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1969273951-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:44:49Z,user_data=None,user_id='1270518e615c4c63a54865bfe906ce5d',uuid=ab4209ce-8ecd-48d4-9826-fe7501e19da8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.874 222021 DEBUG nova.network.os_vif_util [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Converting VIF {"id": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "address": "fa:16:3e:f5:51:60", "network": {"id": "e98627d8-446e-4b60-8051-8e37123acd76", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1442445063-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c05913e2e5c046bf92e6c8c855833959", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05c086a7-19", "ovs_interfaceid": "05c086a7-19a8-4dfd-9eb7-dff8452eb4aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.875 222021 DEBUG nova.network.os_vif_util [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:51:60,bridge_name='br-int',has_traffic_filtering=True,id=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa,network=Network(e98627d8-446e-4b60-8051-8e37123acd76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05c086a7-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.875 222021 DEBUG os_vif [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:51:60,bridge_name='br-int',has_traffic_filtering=True,id=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa,network=Network(e98627d8-446e-4b60-8051-8e37123acd76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05c086a7-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.877 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.878 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05c086a7-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.882 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:46:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:37.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.885 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.890 222021 INFO os_vif [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:51:60,bridge_name='br-int',has_traffic_filtering=True,id=05c086a7-19a8-4dfd-9eb7-dff8452eb4aa,network=Network(e98627d8-446e-4b60-8051-8e37123acd76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05c086a7-19')#033[00m
Jan 23 04:46:37 np0005593233 neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76[243703]: [NOTICE]   (243707) : haproxy version is 2.8.14-c23fe91
Jan 23 04:46:37 np0005593233 neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76[243703]: [NOTICE]   (243707) : path to executable is /usr/sbin/haproxy
Jan 23 04:46:37 np0005593233 neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76[243703]: [ALERT]    (243707) : Current worker (243709) exited with code 143 (Terminated)
Jan 23 04:46:37 np0005593233 neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76[243703]: [WARNING]  (243707) : All workers exited. Exiting... (0)
Jan 23 04:46:37 np0005593233 systemd[1]: libpod-fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989.scope: Deactivated successfully.
Jan 23 04:46:37 np0005593233 podman[244723]: 2026-01-23 09:46:37.943555415 +0000 UTC m=+0.075993421 container died fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:46:37 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989-userdata-shm.mount: Deactivated successfully.
Jan 23 04:46:37 np0005593233 systemd[1]: var-lib-containers-storage-overlay-c886c086d08c69367cc4c9218b52a2ad4c0ee901a27c2c0a2f268b7f4163b112-merged.mount: Deactivated successfully.
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:37.999 222021 DEBUG nova.compute.manager [req-32d30fde-545e-4e08-a933-273af353aa3b req-7ccf4c4c-3d22-498d-962a-60b88129d550 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-vif-unplugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.000 222021 DEBUG oslo_concurrency.lockutils [req-32d30fde-545e-4e08-a933-273af353aa3b req-7ccf4c4c-3d22-498d-962a-60b88129d550 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.001 222021 DEBUG oslo_concurrency.lockutils [req-32d30fde-545e-4e08-a933-273af353aa3b req-7ccf4c4c-3d22-498d-962a-60b88129d550 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.001 222021 DEBUG oslo_concurrency.lockutils [req-32d30fde-545e-4e08-a933-273af353aa3b req-7ccf4c4c-3d22-498d-962a-60b88129d550 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.002 222021 DEBUG nova.compute.manager [req-32d30fde-545e-4e08-a933-273af353aa3b req-7ccf4c4c-3d22-498d-962a-60b88129d550 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] No waiting events found dispatching network-vif-unplugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.003 222021 DEBUG nova.compute.manager [req-32d30fde-545e-4e08-a933-273af353aa3b req-7ccf4c4c-3d22-498d-962a-60b88129d550 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-vif-unplugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:46:38 np0005593233 podman[244723]: 2026-01-23 09:46:38.009828319 +0000 UTC m=+0.142266335 container cleanup fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:46:38 np0005593233 systemd[1]: libpod-conmon-fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989.scope: Deactivated successfully.
Jan 23 04:46:38 np0005593233 podman[244769]: 2026-01-23 09:46:38.120181162 +0000 UTC m=+0.074711585 container remove fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.132 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c66c9f20-f448-4b32-a67c-01101c95b36e]: (4, ('Fri Jan 23 09:46:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76 (fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989)\nfdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989\nFri Jan 23 09:46:38 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76 (fdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989)\nfdf69a72619225d6e590bdfbc31ed6172eb29a24510ab33092df7baeac4a6989\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.134 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f9db73b4-22d0-46dc-bac2-67c93acd6889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.135 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape98627d8-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.137 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:38 np0005593233 kernel: tape98627d8-40: left promiscuous mode
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.163 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.169 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9c8f5275-4e4e-48f9-ae0e-58d0f75d8990]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.185 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4683b5a2-03c4-4632-932e-fb1576353996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.186 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[21e5bbb0-3e8f-454b-ac7b-6c11c09484f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.210 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[30a8299f-ebfe-4820-9960-5e898cf4a5a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539976, 'reachable_time': 36053, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244784, 'error': None, 'target': 'ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:38 np0005593233 systemd[1]: run-netns-ovnmeta\x2de98627d8\x2d446e\x2d4b60\x2d8051\x2d8e37123acd76.mount: Deactivated successfully.
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.213 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e98627d8-446e-4b60-8051-8e37123acd76 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.213 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[e8da0d36-e18f-4845-afa5-e1af102db5d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.214 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa in datapath e98627d8-446e-4b60-8051-8e37123acd76 unbound from our chassis#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.215 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e98627d8-446e-4b60-8051-8e37123acd76, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.216 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f67b1c9f-3448-4a40-887d-e4d09f9179a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.216 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 05c086a7-19a8-4dfd-9eb7-dff8452eb4aa in datapath e98627d8-446e-4b60-8051-8e37123acd76 unbound from our chassis#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.217 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e98627d8-446e-4b60-8051-8e37123acd76, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:46:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:38.217 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d3cfda2d-da33-47f2-b158-6eee20240b71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.586 222021 INFO nova.virt.libvirt.driver [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Deleting instance files /var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8_del#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.587 222021 INFO nova.virt.libvirt.driver [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Deletion of /var/lib/nova/instances/ab4209ce-8ecd-48d4-9826-fe7501e19da8_del complete#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.698 222021 INFO nova.compute.manager [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Took 1.23 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.699 222021 DEBUG oslo.service.loopingcall [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.700 222021 DEBUG nova.compute.manager [-] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:46:38 np0005593233 nova_compute[222017]: 2026-01-23 09:46:38.700 222021 DEBUG nova.network.neutron [-] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:46:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e221 e221: 3 total, 3 up, 3 in
Jan 23 04:46:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:39.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:39 np0005593233 nova_compute[222017]: 2026-01-23 09:46:39.625 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:39.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:40 np0005593233 nova_compute[222017]: 2026-01-23 09:46:40.276 222021 DEBUG nova.compute.manager [req-e5747c07-df0a-4f04-8ebe-e69ad552248f req-e702e135-7dc7-401f-b681-b596ed9a5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-vif-plugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:40 np0005593233 nova_compute[222017]: 2026-01-23 09:46:40.277 222021 DEBUG oslo_concurrency.lockutils [req-e5747c07-df0a-4f04-8ebe-e69ad552248f req-e702e135-7dc7-401f-b681-b596ed9a5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:40 np0005593233 nova_compute[222017]: 2026-01-23 09:46:40.277 222021 DEBUG oslo_concurrency.lockutils [req-e5747c07-df0a-4f04-8ebe-e69ad552248f req-e702e135-7dc7-401f-b681-b596ed9a5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:40 np0005593233 nova_compute[222017]: 2026-01-23 09:46:40.278 222021 DEBUG oslo_concurrency.lockutils [req-e5747c07-df0a-4f04-8ebe-e69ad552248f req-e702e135-7dc7-401f-b681-b596ed9a5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:40 np0005593233 nova_compute[222017]: 2026-01-23 09:46:40.278 222021 DEBUG nova.compute.manager [req-e5747c07-df0a-4f04-8ebe-e69ad552248f req-e702e135-7dc7-401f-b681-b596ed9a5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] No waiting events found dispatching network-vif-plugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:46:40 np0005593233 nova_compute[222017]: 2026-01-23 09:46:40.279 222021 WARNING nova.compute.manager [req-e5747c07-df0a-4f04-8ebe-e69ad552248f req-e702e135-7dc7-401f-b681-b596ed9a5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received unexpected event network-vif-plugged-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:46:41 np0005593233 nova_compute[222017]: 2026-01-23 09:46:41.151 222021 DEBUG nova.network.neutron [-] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:41 np0005593233 nova_compute[222017]: 2026-01-23 09:46:41.181 222021 INFO nova.compute.manager [-] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Took 2.48 seconds to deallocate network for instance.#033[00m
Jan 23 04:46:41 np0005593233 nova_compute[222017]: 2026-01-23 09:46:41.260 222021 DEBUG oslo_concurrency.lockutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:41 np0005593233 nova_compute[222017]: 2026-01-23 09:46:41.261 222021 DEBUG oslo_concurrency.lockutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:41 np0005593233 nova_compute[222017]: 2026-01-23 09:46:41.397 222021 DEBUG oslo_concurrency.processutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:41.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:46:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/975862859' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:46:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:41.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:41 np0005593233 nova_compute[222017]: 2026-01-23 09:46:41.895 222021 DEBUG oslo_concurrency.processutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:41 np0005593233 nova_compute[222017]: 2026-01-23 09:46:41.908 222021 DEBUG nova.compute.provider_tree [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:46:41 np0005593233 nova_compute[222017]: 2026-01-23 09:46:41.942 222021 DEBUG nova.scheduler.client.report [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:46:41 np0005593233 nova_compute[222017]: 2026-01-23 09:46:41.972 222021 DEBUG oslo_concurrency.lockutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.711s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:42 np0005593233 nova_compute[222017]: 2026-01-23 09:46:42.049 222021 INFO nova.scheduler.client.report [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Deleted allocations for instance ab4209ce-8ecd-48d4-9826-fe7501e19da8#033[00m
Jan 23 04:46:42 np0005593233 nova_compute[222017]: 2026-01-23 09:46:42.221 222021 DEBUG oslo_concurrency.lockutils [None req-b0af584d-6321-401c-959d-e8712402f0c5 1270518e615c4c63a54865bfe906ce5d c05913e2e5c046bf92e6c8c855833959 - - default default] Lock "ab4209ce-8ecd-48d4-9826-fe7501e19da8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:42 np0005593233 nova_compute[222017]: 2026-01-23 09:46:42.540 222021 DEBUG nova.compute.manager [req-7c8aa3d4-418b-4013-82b1-88a0396ad6a5 req-ee922ec9-91c0-428c-9fbe-6de049946757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Received event network-vif-deleted-05c086a7-19a8-4dfd-9eb7-dff8452eb4aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:42.649 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:42.650 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:46:42.650 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:42 np0005593233 nova_compute[222017]: 2026-01-23 09:46:42.894 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:43.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:46:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:43.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:46:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e222 e222: 3 total, 3 up, 3 in
Jan 23 04:46:44 np0005593233 nova_compute[222017]: 2026-01-23 09:46:44.627 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.489 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.490 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.522 222021 DEBUG nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:46:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:45.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.618 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.619 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.629 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.630 222021 INFO nova.compute.claims [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.726 222021 DEBUG nova.compute.manager [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.822 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:45 np0005593233 nova_compute[222017]: 2026-01-23 09:46:45.856 222021 INFO nova.compute.manager [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] instance snapshotting#033[00m
Jan 23 04:46:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:45.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:46:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/62295666' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.287 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.299 222021 DEBUG nova.compute.provider_tree [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.331 222021 DEBUG nova.scheduler.client.report [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.376 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.377 222021 DEBUG nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.480 222021 DEBUG nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.481 222021 DEBUG nova.network.neutron [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.496 222021 INFO nova.virt.libvirt.driver [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Beginning live snapshot process#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.537 222021 INFO nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.586 222021 DEBUG nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.797 222021 DEBUG nova.virt.libvirt.imagebackend [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.863 222021 DEBUG nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.866 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.867 222021 INFO nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Creating image(s)#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.908 222021 DEBUG nova.storage.rbd_utils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.942 222021 DEBUG nova.storage.rbd_utils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.976 222021 DEBUG nova.storage.rbd_utils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:46 np0005593233 nova_compute[222017]: 2026-01-23 09:46:46.980 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.027 222021 DEBUG nova.policy [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae77ac206ed246b49262982455564c01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ff15972efaf47c1a5483927aa058ee1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.085 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.086 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.087 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.088 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.123 222021 DEBUG nova.storage.rbd_utils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.127 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.233 222021 DEBUG nova.storage.rbd_utils [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] creating snapshot(42f60db588c44660818d73acd1bb7979) on rbd image(14f26d33-78bc-4b9c-9b73-f660998601ab_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:46:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:47.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.860 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.733s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:47.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.946 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:47 np0005593233 nova_compute[222017]: 2026-01-23 09:46:47.995 222021 DEBUG nova.storage.rbd_utils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] resizing rbd image ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:46:48 np0005593233 nova_compute[222017]: 2026-01-23 09:46:48.125 222021 DEBUG nova.objects.instance [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lazy-loading 'migration_context' on Instance uuid ccb9c4ce-3766-49d2-94e5-88a46bcc07b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:46:48 np0005593233 nova_compute[222017]: 2026-01-23 09:46:48.150 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:46:48 np0005593233 nova_compute[222017]: 2026-01-23 09:46:48.151 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Ensure instance console log exists: /var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:46:48 np0005593233 nova_compute[222017]: 2026-01-23 09:46:48.151 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:48 np0005593233 nova_compute[222017]: 2026-01-23 09:46:48.152 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:48 np0005593233 nova_compute[222017]: 2026-01-23 09:46:48.152 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e223 e223: 3 total, 3 up, 3 in
Jan 23 04:46:48 np0005593233 nova_compute[222017]: 2026-01-23 09:46:48.732 222021 DEBUG nova.storage.rbd_utils [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] cloning vms/14f26d33-78bc-4b9c-9b73-f660998601ab_disk@42f60db588c44660818d73acd1bb7979 to images/b6586179-6c6b-4cff-8136-eeb31a27eb50 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:46:48 np0005593233 nova_compute[222017]: 2026-01-23 09:46:48.887 222021 DEBUG nova.storage.rbd_utils [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] flattening images/b6586179-6c6b-4cff-8136-eeb31a27eb50 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 04:46:49 np0005593233 nova_compute[222017]: 2026-01-23 09:46:49.398 222021 DEBUG nova.storage.rbd_utils [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] removing snapshot(42f60db588c44660818d73acd1bb7979) on rbd image(14f26d33-78bc-4b9c-9b73-f660998601ab_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:46:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:49.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:49 np0005593233 nova_compute[222017]: 2026-01-23 09:46:49.629 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e224 e224: 3 total, 3 up, 3 in
Jan 23 04:46:49 np0005593233 nova_compute[222017]: 2026-01-23 09:46:49.705 222021 DEBUG nova.storage.rbd_utils [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] creating snapshot(snap) on rbd image(b6586179-6c6b-4cff-8136-eeb31a27eb50) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:46:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:49.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:50 np0005593233 podman[245138]: 2026-01-23 09:46:50.167989145 +0000 UTC m=+0.160581660 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:46:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e225 e225: 3 total, 3 up, 3 in
Jan 23 04:46:50 np0005593233 nova_compute[222017]: 2026-01-23 09:46:50.726 222021 DEBUG nova.network.neutron [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Successfully created port: 3db32163-4419-4458-b500-57e797f956a7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:46:51 np0005593233 nova_compute[222017]: 2026-01-23 09:46:51.439 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:51.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:51 np0005593233 nova_compute[222017]: 2026-01-23 09:46:51.635 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:51.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:52 np0005593233 nova_compute[222017]: 2026-01-23 09:46:52.842 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161597.840182, ab4209ce-8ecd-48d4-9826-fe7501e19da8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:46:52 np0005593233 nova_compute[222017]: 2026-01-23 09:46:52.843 222021 INFO nova.compute.manager [-] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:46:52 np0005593233 nova_compute[222017]: 2026-01-23 09:46:52.892 222021 DEBUG nova.compute.manager [None req-4e6b20d8-7874-419b-a4e8-b6308cedd8ab - - - - - -] [instance: ab4209ce-8ecd-48d4-9826-fe7501e19da8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:46:52 np0005593233 nova_compute[222017]: 2026-01-23 09:46:52.949 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:53 np0005593233 nova_compute[222017]: 2026-01-23 09:46:53.517 222021 INFO nova.virt.libvirt.driver [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Snapshot image upload complete#033[00m
Jan 23 04:46:53 np0005593233 nova_compute[222017]: 2026-01-23 09:46:53.518 222021 INFO nova.compute.manager [None req-bc6004d5-b522-4dca-b52a-6c27cc9869c8 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Took 7.66 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 04:46:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:53.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:53.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:54 np0005593233 nova_compute[222017]: 2026-01-23 09:46:54.633 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:54 np0005593233 nova_compute[222017]: 2026-01-23 09:46:54.714 222021 DEBUG nova.network.neutron [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Successfully updated port: 3db32163-4419-4458-b500-57e797f956a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:46:54 np0005593233 nova_compute[222017]: 2026-01-23 09:46:54.747 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "refresh_cache-ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:46:54 np0005593233 nova_compute[222017]: 2026-01-23 09:46:54.748 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquired lock "refresh_cache-ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:46:54 np0005593233 nova_compute[222017]: 2026-01-23 09:46:54.748 222021 DEBUG nova.network.neutron [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:46:54 np0005593233 nova_compute[222017]: 2026-01-23 09:46:54.895 222021 DEBUG nova.compute.manager [req-f753ffc8-6e82-4697-816a-d56dadd3dfbb req-7bd5106d-5ec4-4289-b2bd-b747404ab4d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Received event network-changed-3db32163-4419-4458-b500-57e797f956a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:54 np0005593233 nova_compute[222017]: 2026-01-23 09:46:54.895 222021 DEBUG nova.compute.manager [req-f753ffc8-6e82-4697-816a-d56dadd3dfbb req-7bd5106d-5ec4-4289-b2bd-b747404ab4d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Refreshing instance network info cache due to event network-changed-3db32163-4419-4458-b500-57e797f956a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:46:54 np0005593233 nova_compute[222017]: 2026-01-23 09:46:54.896 222021 DEBUG oslo_concurrency.lockutils [req-f753ffc8-6e82-4697-816a-d56dadd3dfbb req-7bd5106d-5ec4-4289-b2bd-b747404ab4d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:46:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e226 e226: 3 total, 3 up, 3 in
Jan 23 04:46:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:55 np0005593233 nova_compute[222017]: 2026-01-23 09:46:55.578 222021 DEBUG nova.network.neutron [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:46:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:46:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:55.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:46:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:55.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:57.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.598 222021 DEBUG nova.network.neutron [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Updating instance_info_cache with network_info: [{"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.641 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Releasing lock "refresh_cache-ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.642 222021 DEBUG nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Instance network_info: |[{"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.643 222021 DEBUG oslo_concurrency.lockutils [req-f753ffc8-6e82-4697-816a-d56dadd3dfbb req-7bd5106d-5ec4-4289-b2bd-b747404ab4d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.643 222021 DEBUG nova.network.neutron [req-f753ffc8-6e82-4697-816a-d56dadd3dfbb req-7bd5106d-5ec4-4289-b2bd-b747404ab4d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Refreshing network info cache for port 3db32163-4419-4458-b500-57e797f956a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.647 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Start _get_guest_xml network_info=[{"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.654 222021 WARNING nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.669 222021 DEBUG nova.virt.libvirt.host [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.670 222021 DEBUG nova.virt.libvirt.host [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.685 222021 DEBUG nova.virt.libvirt.host [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.686 222021 DEBUG nova.virt.libvirt.host [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.688 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.688 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.689 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.689 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.689 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.689 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.689 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.690 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.690 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.690 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.690 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.691 222021 DEBUG nova.virt.hardware [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.695 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:46:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:57.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:46:57 np0005593233 nova_compute[222017]: 2026-01-23 09:46:57.952 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:46:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2186782611' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.198 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.234 222021 DEBUG nova.storage.rbd_utils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.240 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e227 e227: 3 total, 3 up, 3 in
Jan 23 04:46:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:46:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2448681281' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.719 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.722 222021 DEBUG nova.virt.libvirt.vif [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:46:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1605189062',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1605189062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1605189062',id=60,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ff15972efaf47c1a5483927aa058ee1',ramdisk_id='',reservation_id='r-7rn069r3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1870050002',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:46:46Z,user_data=None,user_id='ae77ac206ed246b49262982455564c01',uuid=ccb9c4ce-3766-49d2-94e5-88a46bcc07b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.723 222021 DEBUG nova.network.os_vif_util [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converting VIF {"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.724 222021 DEBUG nova.network.os_vif_util [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:b3:3f,bridge_name='br-int',has_traffic_filtering=True,id=3db32163-4419-4458-b500-57e797f956a7,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3db32163-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.726 222021 DEBUG nova.objects.instance [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lazy-loading 'pci_devices' on Instance uuid ccb9c4ce-3766-49d2-94e5-88a46bcc07b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.754 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <uuid>ccb9c4ce-3766-49d2-94e5-88a46bcc07b6</uuid>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <name>instance-0000003c</name>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1605189062</nova:name>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:46:57</nova:creationTime>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <nova:user uuid="ae77ac206ed246b49262982455564c01">tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member</nova:user>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <nova:project uuid="6ff15972efaf47c1a5483927aa058ee1">tempest-ImagesOneServerNegativeTestJSON-1870050002</nova:project>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <nova:port uuid="3db32163-4419-4458-b500-57e797f956a7">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <entry name="serial">ccb9c4ce-3766-49d2-94e5-88a46bcc07b6</entry>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <entry name="uuid">ccb9c4ce-3766-49d2-94e5-88a46bcc07b6</entry>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk.config">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:ad:b3:3f"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <target dev="tap3db32163-44"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6/console.log" append="off"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:46:58 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:46:58 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:46:58 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:46:58 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.756 222021 DEBUG nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Preparing to wait for external event network-vif-plugged-3db32163-4419-4458-b500-57e797f956a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.757 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.757 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.757 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.758 222021 DEBUG nova.virt.libvirt.vif [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:46:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1605189062',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1605189062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1605189062',id=60,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ff15972efaf47c1a5483927aa058ee1',ramdisk_id='',reservation_id='r-7rn069r3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-187005
0002',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:46:46Z,user_data=None,user_id='ae77ac206ed246b49262982455564c01',uuid=ccb9c4ce-3766-49d2-94e5-88a46bcc07b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.758 222021 DEBUG nova.network.os_vif_util [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converting VIF {"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.759 222021 DEBUG nova.network.os_vif_util [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:b3:3f,bridge_name='br-int',has_traffic_filtering=True,id=3db32163-4419-4458-b500-57e797f956a7,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3db32163-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.760 222021 DEBUG os_vif [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:b3:3f,bridge_name='br-int',has_traffic_filtering=True,id=3db32163-4419-4458-b500-57e797f956a7,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3db32163-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.761 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.761 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.762 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.765 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.766 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3db32163-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.766 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3db32163-44, col_values=(('external_ids', {'iface-id': '3db32163-4419-4458-b500-57e797f956a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:b3:3f', 'vm-uuid': 'ccb9c4ce-3766-49d2-94e5-88a46bcc07b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.768 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:58 np0005593233 NetworkManager[48871]: <info>  [1769161618.7697] manager: (tap3db32163-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.770 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.783 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.784 222021 INFO os_vif [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:b3:3f,bridge_name='br-int',has_traffic_filtering=True,id=3db32163-4419-4458-b500-57e797f956a7,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3db32163-44')#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.885 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.885 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.886 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No VIF found with MAC fa:16:3e:ad:b3:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.886 222021 INFO nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Using config drive#033[00m
Jan 23 04:46:58 np0005593233 nova_compute[222017]: 2026-01-23 09:46:58.910 222021 DEBUG nova.storage.rbd_utils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e228 e228: 3 total, 3 up, 3 in
Jan 23 04:46:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:59.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:46:59 np0005593233 nova_compute[222017]: 2026-01-23 09:46:59.636 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:46:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:46:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:59.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:47:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.296 222021 INFO nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Creating config drive at /var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6/disk.config#033[00m
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.304 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxjmuuthe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.444 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxjmuuthe" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.478 222021 DEBUG nova.storage.rbd_utils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.482 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6/disk.config ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e229 e229: 3 total, 3 up, 3 in
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.684 222021 DEBUG oslo_concurrency.processutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6/disk.config ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.202s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.685 222021 INFO nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Deleting local config drive /var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6/disk.config because it was imported into RBD.#033[00m
Jan 23 04:47:00 np0005593233 kernel: tap3db32163-44: entered promiscuous mode
Jan 23 04:47:00 np0005593233 NetworkManager[48871]: <info>  [1769161620.7611] manager: (tap3db32163-44): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Jan 23 04:47:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:00Z|00186|binding|INFO|Claiming lport 3db32163-4419-4458-b500-57e797f956a7 for this chassis.
Jan 23 04:47:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:00Z|00187|binding|INFO|3db32163-4419-4458-b500-57e797f956a7: Claiming fa:16:3e:ad:b3:3f 10.100.0.7
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.762 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.789 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:b3:3f 10.100.0.7'], port_security=['fa:16:3e:ad:b3:3f 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ccb9c4ce-3766-49d2-94e5-88a46bcc07b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ff15972efaf47c1a5483927aa058ee1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '579c0db7-85ec-4a28-8798-9415af4416d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58d761e1-c6a0-4a70-8ba3-d112e3c168c9, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=3db32163-4419-4458-b500-57e797f956a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.790 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 3db32163-4419-4458-b500-57e797f956a7 in datapath 3d3b9f7b-3375-48b6-888c-0ff96bc928a5 bound to our chassis#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.791 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3d3b9f7b-3375-48b6-888c-0ff96bc928a5#033[00m
Jan 23 04:47:00 np0005593233 systemd-udevd[245300]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:47:00 np0005593233 NetworkManager[48871]: <info>  [1769161620.8059] device (tap3db32163-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.806 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5110a8ae-7008-404b-89c2-8dff31f5220b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.807 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3d3b9f7b-31 in ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:47:00 np0005593233 NetworkManager[48871]: <info>  [1769161620.8082] device (tap3db32163-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.810 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3d3b9f7b-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.810 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c666345b-6990-4fd4-b7ad-5ace01d07e68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.811 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5d52397a-36cc-4dbc-9a04-19260291f206]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 systemd-machined[190954]: New machine qemu-30-instance-0000003c.
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.823 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[41de8521-3cdf-4ee7-8017-82f24a3d673c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.847 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aec7e482-f430-4a70-9057-6ba42efedb13]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 systemd[1]: Started Virtual Machine qemu-30-instance-0000003c.
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.863 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:00Z|00188|binding|INFO|Setting lport 3db32163-4419-4458-b500-57e797f956a7 ovn-installed in OVS
Jan 23 04:47:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:00Z|00189|binding|INFO|Setting lport 3db32163-4419-4458-b500-57e797f956a7 up in Southbound
Jan 23 04:47:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:00.870 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.881 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2923d6a3-ea65-420e-9db7-59df7a214b63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 NetworkManager[48871]: <info>  [1769161620.8882] manager: (tap3d3b9f7b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.887 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0f931561-4376-4542-ad40-8de2700e21a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.929 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[020e76f6-0889-4dc4-b72d-7d31fe61baf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.931 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[eb324cd5-8a68-40e8-b058-47fbae928dfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 NetworkManager[48871]: <info>  [1769161620.9610] device (tap3d3b9f7b-30): carrier: link connected
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.967 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[90032ba0-6af5-480f-9207-615604daac1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:00.992 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[48cea867-7157-46f3-9e1b-39e4099c282b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d3b9f7b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:f9:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553359, 'reachable_time': 28281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245336, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.008 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fcbe7165-9cf2-424e-807d-6a658c28fff6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:f995'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 553359, 'tstamp': 553359}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245337, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.027 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9f360f78-46b8-494d-b72c-c178f76e02d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d3b9f7b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:f9:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553359, 'reachable_time': 28281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245338, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.060 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a94c4d6e-84e0-4d90-b92f-f93d5aa0e368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.122 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1254fa54-c953-4548-9551-1a17e37adf32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.124 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d3b9f7b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.124 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.125 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d3b9f7b-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:01 np0005593233 kernel: tap3d3b9f7b-30: entered promiscuous mode
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.127 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:01 np0005593233 NetworkManager[48871]: <info>  [1769161621.1280] manager: (tap3d3b9f7b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.130 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.132 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3d3b9f7b-30, col_values=(('external_ids', {'iface-id': '28841559-e121-4a0b-bc47-8aa922ed5fb0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:01 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:01Z|00190|binding|INFO|Releasing lport 28841559-e121-4a0b-bc47-8aa922ed5fb0 from this chassis (sb_readonly=0)
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.135 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.136 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5d98a6b4-a06d-43ff-8d28-f881b256334b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.137 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-3d3b9f7b-3375-48b6-888c-0ff96bc928a5
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.pid.haproxy
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 3d3b9f7b-3375-48b6-888c-0ff96bc928a5
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:01.138 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'env', 'PROCESS_TAG=haproxy-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.147 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.391 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161621.3902106, ccb9c4ce-3766-49d2-94e5-88a46bcc07b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.391 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] VM Started (Lifecycle Event)#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.431 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.436 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161621.3917348, ccb9c4ce-3766-49d2-94e5-88a46bcc07b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.436 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.464 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.468 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:47:01 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.498 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:47:01 np0005593233 podman[245412]: 2026-01-23 09:47:01.573678668 +0000 UTC m=+0.056868456 container create a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:47:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:01.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:01 np0005593233 systemd[1]: Started libpod-conmon-a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6.scope.
Jan 23 04:47:01 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:47:01 np0005593233 podman[245412]: 2026-01-23 09:47:01.543365432 +0000 UTC m=+0.026555230 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:47:01 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cebaaf496bc3747d6119df6e58ca7aea4df3fdcf57991b9e455936ade04d5a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:47:01 np0005593233 podman[245412]: 2026-01-23 09:47:01.65984129 +0000 UTC m=+0.143031148 container init a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 04:47:01 np0005593233 podman[245412]: 2026-01-23 09:47:01.668622341 +0000 UTC m=+0.151812159 container start a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:47:01 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[245427]: [NOTICE]   (245443) : New worker (245450) forked
Jan 23 04:47:01 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[245427]: [NOTICE]   (245443) : Loading success.
Jan 23 04:47:01 np0005593233 podman[245423]: 2026-01-23 09:47:01.727140693 +0000 UTC m=+0.096747956 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 04:47:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:01.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.999 222021 DEBUG nova.network.neutron [req-f753ffc8-6e82-4697-816a-d56dadd3dfbb req-7bd5106d-5ec4-4289-b2bd-b747404ab4d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Updated VIF entry in instance network info cache for port 3db32163-4419-4458-b500-57e797f956a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:01.999 222021 DEBUG nova.network.neutron [req-f753ffc8-6e82-4697-816a-d56dadd3dfbb req-7bd5106d-5ec4-4289-b2bd-b747404ab4d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Updating instance_info_cache with network_info: [{"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.029 222021 DEBUG oslo_concurrency.lockutils [req-f753ffc8-6e82-4697-816a-d56dadd3dfbb req-7bd5106d-5ec4-4289-b2bd-b747404ab4d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.128 222021 DEBUG nova.compute.manager [req-09547f2f-4b41-41c1-b499-57a56d0d1aae req-0af09c41-c9b4-466d-9439-13fef28569ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Received event network-vif-plugged-3db32163-4419-4458-b500-57e797f956a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.128 222021 DEBUG oslo_concurrency.lockutils [req-09547f2f-4b41-41c1-b499-57a56d0d1aae req-0af09c41-c9b4-466d-9439-13fef28569ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.129 222021 DEBUG oslo_concurrency.lockutils [req-09547f2f-4b41-41c1-b499-57a56d0d1aae req-0af09c41-c9b4-466d-9439-13fef28569ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.129 222021 DEBUG oslo_concurrency.lockutils [req-09547f2f-4b41-41c1-b499-57a56d0d1aae req-0af09c41-c9b4-466d-9439-13fef28569ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.130 222021 DEBUG nova.compute.manager [req-09547f2f-4b41-41c1-b499-57a56d0d1aae req-0af09c41-c9b4-466d-9439-13fef28569ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Processing event network-vif-plugged-3db32163-4419-4458-b500-57e797f956a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.131 222021 DEBUG nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.136 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161622.1364481, ccb9c4ce-3766-49d2-94e5-88a46bcc07b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.137 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.141 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.146 222021 INFO nova.virt.libvirt.driver [-] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Instance spawned successfully.#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.147 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.179 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.188 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.196 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.197 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.198 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.199 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.200 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.201 222021 DEBUG nova.virt.libvirt.driver [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.219 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.280 222021 INFO nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Took 15.42 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.281 222021 DEBUG nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.381 222021 INFO nova.compute.manager [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Took 16.80 seconds to build instance.#033[00m
Jan 23 04:47:02 np0005593233 nova_compute[222017]: 2026-01-23 09:47:02.403 222021 DEBUG oslo_concurrency.lockutils [None req-b3167fba-21bb-49c7-8698-b876defa06d0 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.913s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:47:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:03.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:47:03 np0005593233 nova_compute[222017]: 2026-01-23 09:47:03.769 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:03.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e230 e230: 3 total, 3 up, 3 in
Jan 23 04:47:04 np0005593233 podman[245634]: 2026-01-23 09:47:04.569247273 +0000 UTC m=+0.083426425 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 23 04:47:04 np0005593233 nova_compute[222017]: 2026-01-23 09:47:04.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:04 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:47:04 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:47:04 np0005593233 podman[245634]: 2026-01-23 09:47:04.666286946 +0000 UTC m=+0.180466098 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 23 04:47:04 np0005593233 nova_compute[222017]: 2026-01-23 09:47:04.774 222021 DEBUG nova.compute.manager [req-4691fde4-91ea-4e6a-afe9-999426d8ed9e req-5b2b2abb-59eb-4f73-bfb0-7a3ff2683a1f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Received event network-vif-plugged-3db32163-4419-4458-b500-57e797f956a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:04 np0005593233 nova_compute[222017]: 2026-01-23 09:47:04.775 222021 DEBUG oslo_concurrency.lockutils [req-4691fde4-91ea-4e6a-afe9-999426d8ed9e req-5b2b2abb-59eb-4f73-bfb0-7a3ff2683a1f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:04 np0005593233 nova_compute[222017]: 2026-01-23 09:47:04.776 222021 DEBUG oslo_concurrency.lockutils [req-4691fde4-91ea-4e6a-afe9-999426d8ed9e req-5b2b2abb-59eb-4f73-bfb0-7a3ff2683a1f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:04 np0005593233 nova_compute[222017]: 2026-01-23 09:47:04.776 222021 DEBUG oslo_concurrency.lockutils [req-4691fde4-91ea-4e6a-afe9-999426d8ed9e req-5b2b2abb-59eb-4f73-bfb0-7a3ff2683a1f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:04 np0005593233 nova_compute[222017]: 2026-01-23 09:47:04.776 222021 DEBUG nova.compute.manager [req-4691fde4-91ea-4e6a-afe9-999426d8ed9e req-5b2b2abb-59eb-4f73-bfb0-7a3ff2683a1f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] No waiting events found dispatching network-vif-plugged-3db32163-4419-4458-b500-57e797f956a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:47:04 np0005593233 nova_compute[222017]: 2026-01-23 09:47:04.776 222021 WARNING nova.compute.manager [req-4691fde4-91ea-4e6a-afe9-999426d8ed9e req-5b2b2abb-59eb-4f73-bfb0-7a3ff2683a1f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Received unexpected event network-vif-plugged-3db32163-4419-4458-b500-57e797f956a7 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:47:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:05.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:05.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:47:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:47:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:07.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:07.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:08 np0005593233 nova_compute[222017]: 2026-01-23 09:47:08.774 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:09.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:09 np0005593233 nova_compute[222017]: 2026-01-23 09:47:09.642 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:47:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:09.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:47:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:11.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:47:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:11.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:47:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:12 np0005593233 nova_compute[222017]: 2026-01-23 09:47:12.707 222021 DEBUG nova.compute.manager [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:12 np0005593233 nova_compute[222017]: 2026-01-23 09:47:12.758 222021 INFO nova.compute.manager [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] instance snapshotting#033[00m
Jan 23 04:47:13 np0005593233 nova_compute[222017]: 2026-01-23 09:47:13.007 222021 INFO nova.virt.libvirt.driver [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Beginning live snapshot process#033[00m
Jan 23 04:47:13 np0005593233 nova_compute[222017]: 2026-01-23 09:47:13.209 222021 DEBUG nova.virt.libvirt.imagebackend [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 04:47:13 np0005593233 nova_compute[222017]: 2026-01-23 09:47:13.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:13 np0005593233 nova_compute[222017]: 2026-01-23 09:47:13.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:13 np0005593233 nova_compute[222017]: 2026-01-23 09:47:13.548 222021 DEBUG nova.storage.rbd_utils [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] creating snapshot(9b6e0d5178984e1bb57f7b1e4eac1956) on rbd image(ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:47:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:13.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:13 np0005593233 nova_compute[222017]: 2026-01-23 09:47:13.783 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:13.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e231 e231: 3 total, 3 up, 3 in
Jan 23 04:47:14 np0005593233 nova_compute[222017]: 2026-01-23 09:47:14.572 222021 DEBUG nova.storage.rbd_utils [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] cloning vms/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk@9b6e0d5178984e1bb57f7b1e4eac1956 to images/6bf97b3a-7532-45e6-8ddf-ceec1bcd88c6 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:47:14 np0005593233 nova_compute[222017]: 2026-01-23 09:47:14.642 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:14 np0005593233 nova_compute[222017]: 2026-01-23 09:47:14.833 222021 DEBUG nova.storage.rbd_utils [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] flattening images/6bf97b3a-7532-45e6-8ddf-ceec1bcd88c6 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 04:47:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:15 np0005593233 nova_compute[222017]: 2026-01-23 09:47:15.143 222021 DEBUG nova.storage.rbd_utils [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] removing snapshot(9b6e0d5178984e1bb57f7b1e4eac1956) on rbd image(ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:47:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e232 e232: 3 total, 3 up, 3 in
Jan 23 04:47:15 np0005593233 nova_compute[222017]: 2026-01-23 09:47:15.553 222021 DEBUG nova.storage.rbd_utils [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] creating snapshot(snap) on rbd image(6bf97b3a-7532-45e6-8ddf-ceec1bcd88c6) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:47:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:15.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:15.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e233 e233: 3 total, 3 up, 3 in
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Failed to snapshot image: nova.exception.ImageNotFound: Image 6bf97b3a-7532-45e6-8ddf-ceec1bcd88c6 could not be found.
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver glanceclient.exc.HTTPNotFound: HTTP 404 Not Found: No image found with ID 6bf97b3a-7532-45e6-8ddf-ceec1bcd88c6
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver 
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver 
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3082, in snapshot
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     self._image_api.update(context, image_id, metadata,
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1243, in update
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     return session.update(context, image_id, image_info, data=data,
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 693, in update
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     _reraise_translated_image_exception(image_id)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     raise new_exc.with_traceback(exc_trace)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver nova.exception.ImageNotFound: Image 6bf97b3a-7532-45e6-8ddf-ceec1bcd88c6 could not be found.
Jan 23 04:47:16 np0005593233 nova_compute[222017]: 2026-01-23 09:47:16.907 222021 ERROR nova.virt.libvirt.driver #033[00m
Jan 23 04:47:16 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:16Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ad:b3:3f 10.100.0.7
Jan 23 04:47:16 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:16Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ad:b3:3f 10.100.0.7
Jan 23 04:47:17 np0005593233 nova_compute[222017]: 2026-01-23 09:47:17.194 222021 DEBUG nova.storage.rbd_utils [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] removing snapshot(snap) on rbd image(6bf97b3a-7532-45e6-8ddf-ceec1bcd88c6) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:47:17 np0005593233 nova_compute[222017]: 2026-01-23 09:47:17.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e234 e234: 3 total, 3 up, 3 in
Jan 23 04:47:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:17.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:17.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.069 222021 WARNING nova.compute.manager [None req-c96b202e-e1e4-452d-950f-68809626c692 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Image not found during snapshot: nova.exception.ImageNotFound: Image 6bf97b3a-7532-45e6-8ddf-ceec1bcd88c6 could not be found.#033[00m
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.645 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-14f26d33-78bc-4b9c-9b73-f660998601ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.646 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-14f26d33-78bc-4b9c-9b73-f660998601ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.646 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.647 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 14f26d33-78bc-4b9c-9b73-f660998601ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.788 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:18 np0005593233 nova_compute[222017]: 2026-01-23 09:47:18.842 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.221 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.243 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-14f26d33-78bc-4b9c-9b73-f660998601ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.244 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.409 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.410 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.411 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.645 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e235 e235: 3 total, 3 up, 3 in
Jan 23 04:47:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:19.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:47:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1245248991' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.905 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:19.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.992 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000003c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.992 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000003c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.996 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000003b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:47:19 np0005593233 nova_compute[222017]: 2026-01-23 09:47:19.996 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000003b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:47:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.045 222021 DEBUG oslo_concurrency.lockutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.046 222021 DEBUG oslo_concurrency.lockutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.047 222021 DEBUG oslo_concurrency.lockutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.047 222021 DEBUG oslo_concurrency.lockutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.047 222021 DEBUG oslo_concurrency.lockutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.049 222021 INFO nova.compute.manager [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Terminating instance#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.050 222021 DEBUG nova.compute.manager [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:47:20 np0005593233 kernel: tap3db32163-44 (unregistering): left promiscuous mode
Jan 23 04:47:20 np0005593233 NetworkManager[48871]: <info>  [1769161640.2051] device (tap3db32163-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:47:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:20Z|00191|binding|INFO|Releasing lport 3db32163-4419-4458-b500-57e797f956a7 from this chassis (sb_readonly=0)
Jan 23 04:47:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:20Z|00192|binding|INFO|Setting lport 3db32163-4419-4458-b500-57e797f956a7 down in Southbound
Jan 23 04:47:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:20Z|00193|binding|INFO|Removing iface tap3db32163-44 ovn-installed in OVS
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.219 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.227 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:b3:3f 10.100.0.7'], port_security=['fa:16:3e:ad:b3:3f 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ccb9c4ce-3766-49d2-94e5-88a46bcc07b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ff15972efaf47c1a5483927aa058ee1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '579c0db7-85ec-4a28-8798-9415af4416d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58d761e1-c6a0-4a70-8ba3-d112e3c168c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=3db32163-4419-4458-b500-57e797f956a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.229 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 3db32163-4419-4458-b500-57e797f956a7 in datapath 3d3b9f7b-3375-48b6-888c-0ff96bc928a5 unbound from our chassis#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.290 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d3b9f7b-3375-48b6-888c-0ff96bc928a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.292 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2c888534-0e78-40c2-b289-28bcbd95b1e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.293 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 namespace which is not needed anymore#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.300 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.303 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4288MB free_disk=20.85207748413086GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.303 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.303 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.305 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:20 np0005593233 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Jan 23 04:47:20 np0005593233 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003c.scope: Consumed 14.661s CPU time.
Jan 23 04:47:20 np0005593233 systemd-machined[190954]: Machine qemu-30-instance-0000003c terminated.
Jan 23 04:47:20 np0005593233 podman[246141]: 2026-01-23 09:47:20.411767527 +0000 UTC m=+0.098305230 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.438 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 14f26d33-78bc-4b9c-9b73-f660998601ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.439 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance ccb9c4ce-3766-49d2-94e5-88a46bcc07b6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.439 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.439 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:47:20 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[245427]: [NOTICE]   (245443) : haproxy version is 2.8.14-c23fe91
Jan 23 04:47:20 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[245427]: [NOTICE]   (245443) : path to executable is /usr/sbin/haproxy
Jan 23 04:47:20 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[245427]: [WARNING]  (245443) : Exiting Master process...
Jan 23 04:47:20 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[245427]: [WARNING]  (245443) : Exiting Master process...
Jan 23 04:47:20 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[245427]: [ALERT]    (245443) : Current worker (245450) exited with code 143 (Terminated)
Jan 23 04:47:20 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[245427]: [WARNING]  (245443) : All workers exited. Exiting... (0)
Jan 23 04:47:20 np0005593233 systemd[1]: libpod-a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6.scope: Deactivated successfully.
Jan 23 04:47:20 np0005593233 podman[246186]: 2026-01-23 09:47:20.458788361 +0000 UTC m=+0.051776061 container died a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:47:20 np0005593233 systemd[1]: var-lib-containers-storage-overlay-8cebaaf496bc3747d6119df6e58ca7aea4df3fdcf57991b9e455936ade04d5a8-merged.mount: Deactivated successfully.
Jan 23 04:47:20 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6-userdata-shm.mount: Deactivated successfully.
Jan 23 04:47:20 np0005593233 podman[246186]: 2026-01-23 09:47:20.497932079 +0000 UTC m=+0.090919769 container cleanup a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.507 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:20 np0005593233 systemd[1]: libpod-conmon-a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6.scope: Deactivated successfully.
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.553 222021 INFO nova.virt.libvirt.driver [-] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Instance destroyed successfully.#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.554 222021 DEBUG nova.objects.instance [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lazy-loading 'resources' on Instance uuid ccb9c4ce-3766-49d2-94e5-88a46bcc07b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.577 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Acquiring lock "14f26d33-78bc-4b9c-9b73-f660998601ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.577 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "14f26d33-78bc-4b9c-9b73-f660998601ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.577 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Acquiring lock "14f26d33-78bc-4b9c-9b73-f660998601ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.578 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "14f26d33-78bc-4b9c-9b73-f660998601ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.578 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "14f26d33-78bc-4b9c-9b73-f660998601ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.579 222021 INFO nova.compute.manager [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Terminating instance#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.580 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Acquiring lock "refresh_cache-14f26d33-78bc-4b9c-9b73-f660998601ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.580 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Acquired lock "refresh_cache-14f26d33-78bc-4b9c-9b73-f660998601ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.580 222021 DEBUG nova.network.neutron [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.582 222021 DEBUG nova.virt.libvirt.vif [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:46:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1605189062',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1605189062',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1605189062',id=60,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ff15972efaf47c1a5483927aa058ee1',ramdisk_id='',reservation_id='r-7rn069r3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',imag
e_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1870050002',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:18Z,user_data=None,user_id='ae77ac206ed246b49262982455564c01',uuid=ccb9c4ce-3766-49d2-94e5-88a46bcc07b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.582 222021 DEBUG nova.network.os_vif_util [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converting VIF {"id": "3db32163-4419-4458-b500-57e797f956a7", "address": "fa:16:3e:ad:b3:3f", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3db32163-44", "ovs_interfaceid": "3db32163-4419-4458-b500-57e797f956a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.584 222021 DEBUG nova.network.os_vif_util [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:b3:3f,bridge_name='br-int',has_traffic_filtering=True,id=3db32163-4419-4458-b500-57e797f956a7,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3db32163-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.584 222021 DEBUG os_vif [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:b3:3f,bridge_name='br-int',has_traffic_filtering=True,id=3db32163-4419-4458-b500-57e797f956a7,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3db32163-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.588 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.588 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3db32163-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.591 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.593 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.596 222021 INFO os_vif [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:b3:3f,bridge_name='br-int',has_traffic_filtering=True,id=3db32163-4419-4458-b500-57e797f956a7,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3db32163-44')#033[00m
Jan 23 04:47:20 np0005593233 podman[246222]: 2026-01-23 09:47:20.608061326 +0000 UTC m=+0.081205801 container remove a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.620 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f87509-0685-421a-b852-9c6c19cea608]: (4, ('Fri Jan 23 09:47:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 (a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6)\na71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6\nFri Jan 23 09:47:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 (a71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6)\na71d0b7ae3797732aa7e5b32e5aa77d417280be27602a348d3086b50603977d6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.622 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[41d2586f-7f31-43a0-9f2a-d5e68d6e5826]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.623 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d3b9f7b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:20 np0005593233 kernel: tap3d3b9f7b-30: left promiscuous mode
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.631 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.636 222021 DEBUG nova.compute.manager [req-aba47f31-3dbb-458e-aee6-a682bc5dfce8 req-7822200b-156e-4f28-893a-8b6ba8d4415c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Received event network-vif-unplugged-3db32163-4419-4458-b500-57e797f956a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.636 222021 DEBUG oslo_concurrency.lockutils [req-aba47f31-3dbb-458e-aee6-a682bc5dfce8 req-7822200b-156e-4f28-893a-8b6ba8d4415c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.637 222021 DEBUG oslo_concurrency.lockutils [req-aba47f31-3dbb-458e-aee6-a682bc5dfce8 req-7822200b-156e-4f28-893a-8b6ba8d4415c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.637 222021 DEBUG oslo_concurrency.lockutils [req-aba47f31-3dbb-458e-aee6-a682bc5dfce8 req-7822200b-156e-4f28-893a-8b6ba8d4415c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.637 222021 DEBUG nova.compute.manager [req-aba47f31-3dbb-458e-aee6-a682bc5dfce8 req-7822200b-156e-4f28-893a-8b6ba8d4415c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] No waiting events found dispatching network-vif-unplugged-3db32163-4419-4458-b500-57e797f956a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.637 222021 DEBUG nova.compute.manager [req-aba47f31-3dbb-458e-aee6-a682bc5dfce8 req-7822200b-156e-4f28-893a-8b6ba8d4415c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Received event network-vif-unplugged-3db32163-4419-4458-b500-57e797f956a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.643 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0e7da9-4d67-4533-9ae3-2fed497f2bf9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.661 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e9bf10cb-a6ae-4761-8302-e4a09a8d7055]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.663 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c73788ea-3c8d-45da-ac0f-a0203df36013]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.685 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1a82d3b4-1abb-4c46-8e10-c981dd16d999]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553350, 'reachable_time': 33011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246266, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:20 np0005593233 systemd[1]: run-netns-ovnmeta\x2d3d3b9f7b\x2d3375\x2d48b6\x2d888c\x2d0ff96bc928a5.mount: Deactivated successfully.
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.688 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:47:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:20.688 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef0900e-b7a0-439e-bc36-0698279e18f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:20 np0005593233 nova_compute[222017]: 2026-01-23 09:47:20.793 222021 DEBUG nova.network.neutron [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:47:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:47:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2881514672' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.015 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.023 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.048 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.061 222021 INFO nova.virt.libvirt.driver [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Deleting instance files /var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_del#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.062 222021 INFO nova.virt.libvirt.driver [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Deletion of /var/lib/nova/instances/ccb9c4ce-3766-49d2-94e5-88a46bcc07b6_del complete#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.086 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.086 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.169 222021 INFO nova.compute.manager [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Took 1.12 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.170 222021 DEBUG oslo.service.loopingcall [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.170 222021 DEBUG nova.compute.manager [-] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.170 222021 DEBUG nova.network.neutron [-] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.245 222021 DEBUG nova.network.neutron [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.262 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Releasing lock "refresh_cache-14f26d33-78bc-4b9c-9b73-f660998601ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.263 222021 DEBUG nova.compute.manager [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:47:21 np0005593233 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Jan 23 04:47:21 np0005593233 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003b.scope: Consumed 15.882s CPU time.
Jan 23 04:47:21 np0005593233 systemd-machined[190954]: Machine qemu-29-instance-0000003b terminated.
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.499 222021 INFO nova.virt.libvirt.driver [-] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Instance destroyed successfully.#033[00m
Jan 23 04:47:21 np0005593233 nova_compute[222017]: 2026-01-23 09:47:21.499 222021 DEBUG nova.objects.instance [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lazy-loading 'resources' on Instance uuid 14f26d33-78bc-4b9c-9b73-f660998601ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:47:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:21.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:47:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:21.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.009 222021 DEBUG nova.network.neutron [-] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.041 222021 INFO nova.compute.manager [-] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Took 0.87 seconds to deallocate network for instance.#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.108 222021 DEBUG oslo_concurrency.lockutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.109 222021 DEBUG oslo_concurrency.lockutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.161 222021 DEBUG nova.compute.manager [req-63f9fc87-3478-4d46-8c44-30d6075eb29c req-ac4619de-9e01-4f0b-9980-c096e08a7655 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Received event network-vif-deleted-3db32163-4419-4458-b500-57e797f956a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.189 222021 DEBUG oslo_concurrency.processutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.679 222021 DEBUG oslo_concurrency.processutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.687 222021 DEBUG nova.compute.provider_tree [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.749 222021 INFO nova.virt.libvirt.driver [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Deleting instance files /var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab_del#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.751 222021 INFO nova.virt.libvirt.driver [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Deletion of /var/lib/nova/instances/14f26d33-78bc-4b9c-9b73-f660998601ab_del complete#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.761 222021 DEBUG nova.scheduler.client.report [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.808 222021 DEBUG oslo_concurrency.lockutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.913 222021 INFO nova.compute.manager [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Took 1.65 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.914 222021 DEBUG oslo.service.loopingcall [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.914 222021 DEBUG nova.compute.manager [-] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:47:22 np0005593233 nova_compute[222017]: 2026-01-23 09:47:22.914 222021 DEBUG nova.network.neutron [-] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.071 222021 INFO nova.scheduler.client.report [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Deleted allocations for instance ccb9c4ce-3766-49d2-94e5-88a46bcc07b6#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.155 222021 DEBUG nova.compute.manager [req-e2065f0c-4bbf-4405-97dc-e3ca4618170c req-277064a2-73de-4330-8155-90149b77ce28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Received event network-vif-plugged-3db32163-4419-4458-b500-57e797f956a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.156 222021 DEBUG oslo_concurrency.lockutils [req-e2065f0c-4bbf-4405-97dc-e3ca4618170c req-277064a2-73de-4330-8155-90149b77ce28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.156 222021 DEBUG oslo_concurrency.lockutils [req-e2065f0c-4bbf-4405-97dc-e3ca4618170c req-277064a2-73de-4330-8155-90149b77ce28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.157 222021 DEBUG oslo_concurrency.lockutils [req-e2065f0c-4bbf-4405-97dc-e3ca4618170c req-277064a2-73de-4330-8155-90149b77ce28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.157 222021 DEBUG nova.compute.manager [req-e2065f0c-4bbf-4405-97dc-e3ca4618170c req-277064a2-73de-4330-8155-90149b77ce28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] No waiting events found dispatching network-vif-plugged-3db32163-4419-4458-b500-57e797f956a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.158 222021 WARNING nova.compute.manager [req-e2065f0c-4bbf-4405-97dc-e3ca4618170c req-277064a2-73de-4330-8155-90149b77ce28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Received unexpected event network-vif-plugged-3db32163-4419-4458-b500-57e797f956a7 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.309 222021 DEBUG nova.network.neutron [-] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.412 222021 DEBUG oslo_concurrency.lockutils [None req-b580c1f1-d654-4faa-9618-6da08fdcdc57 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "ccb9c4ce-3766-49d2-94e5-88a46bcc07b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.415 222021 DEBUG nova.network.neutron [-] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.449 222021 INFO nova.compute.manager [-] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Took 0.53 seconds to deallocate network for instance.#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.550 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.550 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:23 np0005593233 nova_compute[222017]: 2026-01-23 09:47:23.599 222021 DEBUG oslo_concurrency.processutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:23.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:23.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:47:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/46409325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:47:24 np0005593233 nova_compute[222017]: 2026-01-23 09:47:24.070 222021 DEBUG oslo_concurrency.processutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:24 np0005593233 nova_compute[222017]: 2026-01-23 09:47:24.078 222021 DEBUG nova.compute.provider_tree [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:47:24 np0005593233 nova_compute[222017]: 2026-01-23 09:47:24.100 222021 DEBUG nova.scheduler.client.report [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:47:24 np0005593233 nova_compute[222017]: 2026-01-23 09:47:24.133 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:24 np0005593233 nova_compute[222017]: 2026-01-23 09:47:24.178 222021 INFO nova.scheduler.client.report [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Deleted allocations for instance 14f26d33-78bc-4b9c-9b73-f660998601ab#033[00m
Jan 23 04:47:24 np0005593233 nova_compute[222017]: 2026-01-23 09:47:24.268 222021 DEBUG oslo_concurrency.lockutils [None req-89e0ef67-7be3-4bb1-8e08-2b119330c35a 8d1d7c58442749759ba7dc3a19799796 5d69aaa276f94de98e4011fa17428b40 - - default default] Lock "14f26d33-78bc-4b9c-9b73-f660998601ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:24 np0005593233 nova_compute[222017]: 2026-01-23 09:47:24.647 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e236 e236: 3 total, 3 up, 3 in
Jan 23 04:47:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:25 np0005593233 nova_compute[222017]: 2026-01-23 09:47:25.592 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:25.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:25.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:27.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e237 e237: 3 total, 3 up, 3 in
Jan 23 04:47:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:27.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:28 np0005593233 nova_compute[222017]: 2026-01-23 09:47:28.081 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:29 np0005593233 nova_compute[222017]: 2026-01-23 09:47:29.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e238 e238: 3 total, 3 up, 3 in
Jan 23 04:47:29 np0005593233 nova_compute[222017]: 2026-01-23 09:47:29.651 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:29.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:30.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:30 np0005593233 nova_compute[222017]: 2026-01-23 09:47:30.594 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:31.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:32.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:32 np0005593233 podman[246352]: 2026-01-23 09:47:32.061292279 +0000 UTC m=+0.068091906 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 04:47:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e239 e239: 3 total, 3 up, 3 in
Jan 23 04:47:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:33.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:34.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:34 np0005593233 nova_compute[222017]: 2026-01-23 09:47:34.652 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.276 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "98913b45-a388-4bef-ad03-aed41edbdb44" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.277 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.304 222021 DEBUG nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.418 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.419 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.429 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.430 222021 INFO nova.compute.claims [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:47:35 np0005593233 ceph-mgr[81930]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.546 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161640.5327833, ccb9c4ce-3766-49d2-94e5-88a46bcc07b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.547 222021 INFO nova.compute.manager [-] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.568 222021 DEBUG nova.compute.manager [None req-ab025533-5057-4df6-8791-bc9d53719913 - - - - - -] [instance: ccb9c4ce-3766-49d2-94e5-88a46bcc07b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.583 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:35 np0005593233 nova_compute[222017]: 2026-01-23 09:47:35.623 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:35.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:36.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:47:36 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3918197371' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.079 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.087 222021 DEBUG nova.compute.provider_tree [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.120 222021 DEBUG nova.scheduler.client.report [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.163 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.165 222021 DEBUG nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.228 222021 DEBUG nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.230 222021 DEBUG nova.network.neutron [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.254 222021 INFO nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.283 222021 DEBUG nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.409 222021 DEBUG nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.411 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.412 222021 INFO nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Creating image(s)
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.454 222021 DEBUG nova.storage.rbd_utils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 98913b45-a388-4bef-ad03-aed41edbdb44_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.492 222021 DEBUG nova.storage.rbd_utils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 98913b45-a388-4bef-ad03-aed41edbdb44_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.530 222021 DEBUG nova.storage.rbd_utils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 98913b45-a388-4bef-ad03-aed41edbdb44_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.537 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.577 222021 DEBUG nova.policy [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae77ac206ed246b49262982455564c01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ff15972efaf47c1a5483927aa058ee1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.582 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161641.4962096, 14f26d33-78bc-4b9c-9b73-f660998601ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.583 222021 INFO nova.compute.manager [-] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] VM Stopped (Lifecycle Event)
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.611 222021 DEBUG nova.compute.manager [None req-a654690b-7e76-47bd-8ae2-e2b6c10d6739 - - - - - -] [instance: 14f26d33-78bc-4b9c-9b73-f660998601ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.622 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.623 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.624 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.624 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.655 222021 DEBUG nova.storage.rbd_utils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 98913b45-a388-4bef-ad03-aed41edbdb44_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:47:36 np0005593233 nova_compute[222017]: 2026-01-23 09:47:36.661 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 98913b45-a388-4bef-ad03-aed41edbdb44_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:47:37 np0005593233 nova_compute[222017]: 2026-01-23 09:47:37.018 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 98913b45-a388-4bef-ad03-aed41edbdb44_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.357s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:47:37 np0005593233 nova_compute[222017]: 2026-01-23 09:47:37.104 222021 DEBUG nova.storage.rbd_utils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] resizing rbd image 98913b45-a388-4bef-ad03-aed41edbdb44_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:47:37 np0005593233 nova_compute[222017]: 2026-01-23 09:47:37.230 222021 DEBUG nova.objects.instance [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lazy-loading 'migration_context' on Instance uuid 98913b45-a388-4bef-ad03-aed41edbdb44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:47:37 np0005593233 nova_compute[222017]: 2026-01-23 09:47:37.247 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:47:37 np0005593233 nova_compute[222017]: 2026-01-23 09:47:37.248 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Ensure instance console log exists: /var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:47:37 np0005593233 nova_compute[222017]: 2026-01-23 09:47:37.248 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:47:37 np0005593233 nova_compute[222017]: 2026-01-23 09:47:37.249 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:47:37 np0005593233 nova_compute[222017]: 2026-01-23 09:47:37.249 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:47:37 np0005593233 nova_compute[222017]: 2026-01-23 09:47:37.723 222021 DEBUG nova.network.neutron [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Successfully created port: a0780eca-4f7d-47d8-8242-b0837c25e711 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 04:47:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:37.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:38.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e240 e240: 3 total, 3 up, 3 in
Jan 23 04:47:39 np0005593233 nova_compute[222017]: 2026-01-23 09:47:39.655 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:47:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:39.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:40.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:40 np0005593233 nova_compute[222017]: 2026-01-23 09:47:40.628 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:47:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:41.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:42.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:42.650 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:47:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:42.651 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:47:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:42.651 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:47:43 np0005593233 nova_compute[222017]: 2026-01-23 09:47:43.441 222021 DEBUG nova.network.neutron [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Successfully updated port: a0780eca-4f7d-47d8-8242-b0837c25e711 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 04:47:43 np0005593233 nova_compute[222017]: 2026-01-23 09:47:43.518 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "refresh_cache-98913b45-a388-4bef-ad03-aed41edbdb44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:47:43 np0005593233 nova_compute[222017]: 2026-01-23 09:47:43.519 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquired lock "refresh_cache-98913b45-a388-4bef-ad03-aed41edbdb44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:47:43 np0005593233 nova_compute[222017]: 2026-01-23 09:47:43.519 222021 DEBUG nova.network.neutron [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:47:43 np0005593233 nova_compute[222017]: 2026-01-23 09:47:43.695 222021 DEBUG nova.compute.manager [req-9cd1abc1-435b-4fef-a008-44b97380615e req-91157727-88ea-4cf2-8aa0-e9049b394b70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Received event network-changed-a0780eca-4f7d-47d8-8242-b0837c25e711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:47:43 np0005593233 nova_compute[222017]: 2026-01-23 09:47:43.696 222021 DEBUG nova.compute.manager [req-9cd1abc1-435b-4fef-a008-44b97380615e req-91157727-88ea-4cf2-8aa0-e9049b394b70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Refreshing instance network info cache due to event network-changed-a0780eca-4f7d-47d8-8242-b0837c25e711. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:47:43 np0005593233 nova_compute[222017]: 2026-01-23 09:47:43.696 222021 DEBUG oslo_concurrency.lockutils [req-9cd1abc1-435b-4fef-a008-44b97380615e req-91157727-88ea-4cf2-8aa0-e9049b394b70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-98913b45-a388-4bef-ad03-aed41edbdb44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:47:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:43 np0005593233 nova_compute[222017]: 2026-01-23 09:47:43.741 222021 DEBUG nova.network.neutron [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:47:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:43.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:44.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e241 e241: 3 total, 3 up, 3 in
Jan 23 04:47:44 np0005593233 nova_compute[222017]: 2026-01-23 09:47:44.659 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:47:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:45.562 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:47:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:45.563 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 04:47:45 np0005593233 nova_compute[222017]: 2026-01-23 09:47:45.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:47:45 np0005593233 nova_compute[222017]: 2026-01-23 09:47:45.631 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:47:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:47:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:45.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:47:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:47:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:46.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.524 222021 DEBUG nova.network.neutron [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Updating instance_info_cache with network_info: [{"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.775 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Releasing lock "refresh_cache-98913b45-a388-4bef-ad03-aed41edbdb44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.775 222021 DEBUG nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Instance network_info: |[{"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.776 222021 DEBUG oslo_concurrency.lockutils [req-9cd1abc1-435b-4fef-a008-44b97380615e req-91157727-88ea-4cf2-8aa0-e9049b394b70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-98913b45-a388-4bef-ad03-aed41edbdb44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.776 222021 DEBUG nova.network.neutron [req-9cd1abc1-435b-4fef-a008-44b97380615e req-91157727-88ea-4cf2-8aa0-e9049b394b70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Refreshing network info cache for port a0780eca-4f7d-47d8-8242-b0837c25e711 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.780 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Start _get_guest_xml network_info=[{"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.788 222021 WARNING nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.800 222021 DEBUG nova.virt.libvirt.host [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.802 222021 DEBUG nova.virt.libvirt.host [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.809 222021 DEBUG nova.virt.libvirt.host [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.810 222021 DEBUG nova.virt.libvirt.host [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.811 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.812 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.812 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.813 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.813 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.813 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.813 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.814 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.814 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.814 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.814 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.815 222021 DEBUG nova.virt.hardware [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:47:46 np0005593233 nova_compute[222017]: 2026-01-23 09:47:46.819 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:47:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2388971212' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.390 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.570s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.426 222021 DEBUG nova.storage.rbd_utils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 98913b45-a388-4bef-ad03-aed41edbdb44_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.432 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:47.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:47:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/625521465' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.904 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.906 222021 DEBUG nova.virt.libvirt.vif [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:47:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-316501491',display_name='tempest-ImagesOneServerNegativeTestJSON-server-316501491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-316501491',id=61,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ff15972efaf47c1a5483927aa058ee1',ramdisk_id='',reservation_id='r-kit0jq0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1870050002',owner_u
ser_name='tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:47:36Z,user_data=None,user_id='ae77ac206ed246b49262982455564c01',uuid=98913b45-a388-4bef-ad03-aed41edbdb44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.906 222021 DEBUG nova.network.os_vif_util [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converting VIF {"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.908 222021 DEBUG nova.network.os_vif_util [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:67:66,bridge_name='br-int',has_traffic_filtering=True,id=a0780eca-4f7d-47d8-8242-b0837c25e711,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0780eca-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.909 222021 DEBUG nova.objects.instance [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 98913b45-a388-4bef-ad03-aed41edbdb44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.975 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <uuid>98913b45-a388-4bef-ad03-aed41edbdb44</uuid>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <name>instance-0000003d</name>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-316501491</nova:name>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:47:46</nova:creationTime>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <nova:user uuid="ae77ac206ed246b49262982455564c01">tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member</nova:user>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <nova:project uuid="6ff15972efaf47c1a5483927aa058ee1">tempest-ImagesOneServerNegativeTestJSON-1870050002</nova:project>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <nova:port uuid="a0780eca-4f7d-47d8-8242-b0837c25e711">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <entry name="serial">98913b45-a388-4bef-ad03-aed41edbdb44</entry>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <entry name="uuid">98913b45-a388-4bef-ad03-aed41edbdb44</entry>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/98913b45-a388-4bef-ad03-aed41edbdb44_disk">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/98913b45-a388-4bef-ad03-aed41edbdb44_disk.config">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:11:67:66"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <target dev="tapa0780eca-4f"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44/console.log" append="off"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:47:47 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:47:47 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:47:47 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:47:47 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.977 222021 DEBUG nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Preparing to wait for external event network-vif-plugged-a0780eca-4f7d-47d8-8242-b0837c25e711 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.978 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.978 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.979 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.979 222021 DEBUG nova.virt.libvirt.vif [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:47:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-316501491',display_name='tempest-ImagesOneServerNegativeTestJSON-server-316501491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-316501491',id=61,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ff15972efaf47c1a5483927aa058ee1',ramdisk_id='',reservation_id='r-kit0jq0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1870050002',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:47:36Z,user_data=None,user_id='ae77ac206ed246b49262982455564c01',uuid=98913b45-a388-4bef-ad03-aed41edbdb44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.980 222021 DEBUG nova.network.os_vif_util [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converting VIF {"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.980 222021 DEBUG nova.network.os_vif_util [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:67:66,bridge_name='br-int',has_traffic_filtering=True,id=a0780eca-4f7d-47d8-8242-b0837c25e711,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0780eca-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.981 222021 DEBUG os_vif [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:67:66,bridge_name='br-int',has_traffic_filtering=True,id=a0780eca-4f7d-47d8-8242-b0837c25e711,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0780eca-4f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.982 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.982 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.983 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.987 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.987 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0780eca-4f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:47 np0005593233 nova_compute[222017]: 2026-01-23 09:47:47.988 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa0780eca-4f, col_values=(('external_ids', {'iface-id': 'a0780eca-4f7d-47d8-8242-b0837c25e711', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:67:66', 'vm-uuid': '98913b45-a388-4bef-ad03-aed41edbdb44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:48 np0005593233 NetworkManager[48871]: <info>  [1769161668.0196] manager: (tapa0780eca-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 23 04:47:48 np0005593233 nova_compute[222017]: 2026-01-23 09:47:48.020 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:48 np0005593233 nova_compute[222017]: 2026-01-23 09:47:48.023 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:47:48 np0005593233 nova_compute[222017]: 2026-01-23 09:47:48.027 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:48 np0005593233 nova_compute[222017]: 2026-01-23 09:47:48.029 222021 INFO os_vif [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:67:66,bridge_name='br-int',has_traffic_filtering=True,id=a0780eca-4f7d-47d8-8242-b0837c25e711,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0780eca-4f')#033[00m
Jan 23 04:47:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:48.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:48.565 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:48 np0005593233 nova_compute[222017]: 2026-01-23 09:47:48.642 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:47:48 np0005593233 nova_compute[222017]: 2026-01-23 09:47:48.642 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:47:48 np0005593233 nova_compute[222017]: 2026-01-23 09:47:48.643 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No VIF found with MAC fa:16:3e:11:67:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:47:48 np0005593233 nova_compute[222017]: 2026-01-23 09:47:48.643 222021 INFO nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Using config drive#033[00m
Jan 23 04:47:48 np0005593233 nova_compute[222017]: 2026-01-23 09:47:48.675 222021 DEBUG nova.storage.rbd_utils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 98913b45-a388-4bef-ad03-aed41edbdb44_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.404 222021 INFO nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Creating config drive at /var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44/disk.config#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.409 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp38bk_nvp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.452 222021 DEBUG nova.network.neutron [req-9cd1abc1-435b-4fef-a008-44b97380615e req-91157727-88ea-4cf2-8aa0-e9049b394b70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Updated VIF entry in instance network info cache for port a0780eca-4f7d-47d8-8242-b0837c25e711. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.454 222021 DEBUG nova.network.neutron [req-9cd1abc1-435b-4fef-a008-44b97380615e req-91157727-88ea-4cf2-8aa0-e9049b394b70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Updating instance_info_cache with network_info: [{"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.510 222021 DEBUG oslo_concurrency.lockutils [req-9cd1abc1-435b-4fef-a008-44b97380615e req-91157727-88ea-4cf2-8aa0-e9049b394b70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-98913b45-a388-4bef-ad03-aed41edbdb44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.563 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp38bk_nvp" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.595 222021 DEBUG nova.storage.rbd_utils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 98913b45-a388-4bef-ad03-aed41edbdb44_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.603 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44/disk.config 98913b45-a388-4bef-ad03-aed41edbdb44_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.661 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:49.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.933 222021 DEBUG oslo_concurrency.processutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44/disk.config 98913b45-a388-4bef-ad03-aed41edbdb44_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.330s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:49 np0005593233 nova_compute[222017]: 2026-01-23 09:47:49.934 222021 INFO nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Deleting local config drive /var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44/disk.config because it was imported into RBD.#033[00m
Jan 23 04:47:50 np0005593233 kernel: tapa0780eca-4f: entered promiscuous mode
Jan 23 04:47:50 np0005593233 NetworkManager[48871]: <info>  [1769161670.0131] manager: (tapa0780eca-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.014 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:50Z|00194|binding|INFO|Claiming lport a0780eca-4f7d-47d8-8242-b0837c25e711 for this chassis.
Jan 23 04:47:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:50Z|00195|binding|INFO|a0780eca-4f7d-47d8-8242-b0837c25e711: Claiming fa:16:3e:11:67:66 10.100.0.4
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.025 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:67:66 10.100.0.4'], port_security=['fa:16:3e:11:67:66 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '98913b45-a388-4bef-ad03-aed41edbdb44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ff15972efaf47c1a5483927aa058ee1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '579c0db7-85ec-4a28-8798-9415af4416d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58d761e1-c6a0-4a70-8ba3-d112e3c168c9, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=a0780eca-4f7d-47d8-8242-b0837c25e711) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.026 140224 INFO neutron.agent.ovn.metadata.agent [-] Port a0780eca-4f7d-47d8-8242-b0837c25e711 in datapath 3d3b9f7b-3375-48b6-888c-0ff96bc928a5 bound to our chassis#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.028 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3d3b9f7b-3375-48b6-888c-0ff96bc928a5#033[00m
Jan 23 04:47:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:50Z|00196|binding|INFO|Setting lport a0780eca-4f7d-47d8-8242-b0837c25e711 ovn-installed in OVS
Jan 23 04:47:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:50Z|00197|binding|INFO|Setting lport a0780eca-4f7d-47d8-8242-b0837c25e711 up in Southbound
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.035 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:50.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.038 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.046 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eddc108c-e1da-4c7f-930d-ea8a2d1ad947]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.047 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3d3b9f7b-31 in ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:47:50 np0005593233 systemd-udevd[246692]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.050 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3d3b9f7b-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.050 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6db522d2-72d5-417b-ae9d-863eb0a811a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.051 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ea893ec5-9412-462e-8f1d-53e202524e17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 NetworkManager[48871]: <info>  [1769161670.0697] device (tapa0780eca-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:47:50 np0005593233 NetworkManager[48871]: <info>  [1769161670.0705] device (tapa0780eca-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.065 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[1f666edb-6295-4bb2-a559-efb8cbd7eb4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 systemd-machined[190954]: New machine qemu-31-instance-0000003d.
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.083 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d7451821-20b5-4dc1-9992-9fff1998c6cf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 systemd[1]: Started Virtual Machine qemu-31-instance-0000003d.
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.124 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b990663d-2879-45ba-b1c3-6fae272a73d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 NetworkManager[48871]: <info>  [1769161670.1316] manager: (tap3d3b9f7b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/97)
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.130 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0c93c05a-f3f5-4516-ba8a-e04c0a642d11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 systemd-udevd[246698]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.169 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9d31d1d0-debe-43d3-9884-15efc9e12918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.173 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9b353d00-b579-4090-9965-e8b86c923c39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 NetworkManager[48871]: <info>  [1769161670.2038] device (tap3d3b9f7b-30): carrier: link connected
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.212 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ba584480-dcae-4c27-860d-2ae51d1bab20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.235 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[852ab46a-3a5c-4ae6-a4bb-21db1ea575af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d3b9f7b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:f9:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558283, 'reachable_time': 32572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246727, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.254 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce21f55b-b378-4756-b7fc-9b90a2a604fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:f995'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 558283, 'tstamp': 558283}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246728, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.276 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ba532c34-0008-49f0-8714-861eac17d526]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d3b9f7b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:f9:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558283, 'reachable_time': 32572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246729, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.320 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[77f5e77a-a3fb-4cd6-8e22-fbde4783f28f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.402 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c4534545-2a08-43e4-a04e-a52a51e8d824]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.404 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d3b9f7b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.405 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.405 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d3b9f7b-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.407 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:50 np0005593233 kernel: tap3d3b9f7b-30: entered promiscuous mode
Jan 23 04:47:50 np0005593233 NetworkManager[48871]: <info>  [1769161670.4088] manager: (tap3d3b9f7b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.411 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3d3b9f7b-30, col_values=(('external_ids', {'iface-id': '28841559-e121-4a0b-bc47-8aa922ed5fb0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.412 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:47:50Z|00198|binding|INFO|Releasing lport 28841559-e121-4a0b-bc47-8aa922ed5fb0 from this chassis (sb_readonly=0)
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.429 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.431 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.432 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a2927a48-cec9-462c-bae7-c35cf9fbdc47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.433 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-3d3b9f7b-3375-48b6-888c-0ff96bc928a5
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.pid.haproxy
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 3d3b9f7b-3375-48b6-888c-0ff96bc928a5
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:47:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:47:50.433 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'env', 'PROCESS_TAG=haproxy-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.669 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161670.667889, 98913b45-a388-4bef-ad03-aed41edbdb44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.669 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] VM Started (Lifecycle Event)#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.678 222021 DEBUG nova.compute.manager [req-7d1141ef-aa38-49b2-bc6c-cb2c31a37c89 req-bcd2eccc-86c0-47c6-a832-983f178cc24f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Received event network-vif-plugged-a0780eca-4f7d-47d8-8242-b0837c25e711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.678 222021 DEBUG oslo_concurrency.lockutils [req-7d1141ef-aa38-49b2-bc6c-cb2c31a37c89 req-bcd2eccc-86c0-47c6-a832-983f178cc24f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.678 222021 DEBUG oslo_concurrency.lockutils [req-7d1141ef-aa38-49b2-bc6c-cb2c31a37c89 req-bcd2eccc-86c0-47c6-a832-983f178cc24f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.679 222021 DEBUG oslo_concurrency.lockutils [req-7d1141ef-aa38-49b2-bc6c-cb2c31a37c89 req-bcd2eccc-86c0-47c6-a832-983f178cc24f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.679 222021 DEBUG nova.compute.manager [req-7d1141ef-aa38-49b2-bc6c-cb2c31a37c89 req-bcd2eccc-86c0-47c6-a832-983f178cc24f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Processing event network-vif-plugged-a0780eca-4f7d-47d8-8242-b0837c25e711 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.679 222021 DEBUG nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.684 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.688 222021 INFO nova.virt.libvirt.driver [-] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Instance spawned successfully.#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.688 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.706 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.710 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.733 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.734 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.735 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.735 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.736 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.737 222021 DEBUG nova.virt.libvirt.driver [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.742 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.742 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161670.668276, 98913b45-a388-4bef-ad03-aed41edbdb44 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.742 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.773 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.778 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161670.6832395, 98913b45-a388-4bef-ad03-aed41edbdb44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.778 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.806 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.810 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.847 222021 INFO nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Took 14.44 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.848 222021 DEBUG nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.848 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:47:50 np0005593233 podman[246801]: 2026-01-23 09:47:50.884683226 +0000 UTC m=+0.054669073 container create fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.931 222021 INFO nova.compute.manager [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Took 15.55 seconds to build instance.#033[00m
Jan 23 04:47:50 np0005593233 systemd[1]: Started libpod-conmon-fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab.scope.
Jan 23 04:47:50 np0005593233 podman[246801]: 2026-01-23 09:47:50.856323856 +0000 UTC m=+0.026309723 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:47:50 np0005593233 nova_compute[222017]: 2026-01-23 09:47:50.957 222021 DEBUG oslo_concurrency.lockutils [None req-55801909-e347-4280-837e-40ec0d5a326d ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:50 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:47:50 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac7089aece0f76e55d95b92fa517b92b4714ccf30361b3f2359f9b00771c1df6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:47:51 np0005593233 podman[246801]: 2026-01-23 09:47:51.006199979 +0000 UTC m=+0.176185826 container init fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:47:51 np0005593233 podman[246801]: 2026-01-23 09:47:51.014568378 +0000 UTC m=+0.184554255 container start fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:47:51 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[246821]: [NOTICE]   (246839) : New worker (246847) forked
Jan 23 04:47:51 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[246821]: [NOTICE]   (246839) : Loading success.
Jan 23 04:47:51 np0005593233 podman[246812]: 2026-01-23 09:47:51.052635565 +0000 UTC m=+0.119595258 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:47:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:51.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:52.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:52 np0005593233 nova_compute[222017]: 2026-01-23 09:47:52.801 222021 DEBUG nova.compute.manager [req-04acf53b-465b-4687-a36b-c0c06f8f546c req-c57a8e5e-cab4-4a3f-bd1c-6642d2008dde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Received event network-vif-plugged-a0780eca-4f7d-47d8-8242-b0837c25e711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:52 np0005593233 nova_compute[222017]: 2026-01-23 09:47:52.801 222021 DEBUG oslo_concurrency.lockutils [req-04acf53b-465b-4687-a36b-c0c06f8f546c req-c57a8e5e-cab4-4a3f-bd1c-6642d2008dde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:52 np0005593233 nova_compute[222017]: 2026-01-23 09:47:52.802 222021 DEBUG oslo_concurrency.lockutils [req-04acf53b-465b-4687-a36b-c0c06f8f546c req-c57a8e5e-cab4-4a3f-bd1c-6642d2008dde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:52 np0005593233 nova_compute[222017]: 2026-01-23 09:47:52.802 222021 DEBUG oslo_concurrency.lockutils [req-04acf53b-465b-4687-a36b-c0c06f8f546c req-c57a8e5e-cab4-4a3f-bd1c-6642d2008dde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:52 np0005593233 nova_compute[222017]: 2026-01-23 09:47:52.802 222021 DEBUG nova.compute.manager [req-04acf53b-465b-4687-a36b-c0c06f8f546c req-c57a8e5e-cab4-4a3f-bd1c-6642d2008dde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] No waiting events found dispatching network-vif-plugged-a0780eca-4f7d-47d8-8242-b0837c25e711 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:47:52 np0005593233 nova_compute[222017]: 2026-01-23 09:47:52.802 222021 WARNING nova.compute.manager [req-04acf53b-465b-4687-a36b-c0c06f8f546c req-c57a8e5e-cab4-4a3f-bd1c-6642d2008dde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Received unexpected event network-vif-plugged-a0780eca-4f7d-47d8-8242-b0837c25e711 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:47:53 np0005593233 nova_compute[222017]: 2026-01-23 09:47:53.020 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:53 np0005593233 nova_compute[222017]: 2026-01-23 09:47:53.506 222021 DEBUG nova.compute.manager [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:53 np0005593233 nova_compute[222017]: 2026-01-23 09:47:53.579 222021 INFO nova.compute.manager [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] instance snapshotting#033[00m
Jan 23 04:47:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:53.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:54 np0005593233 nova_compute[222017]: 2026-01-23 09:47:54.014 222021 INFO nova.virt.libvirt.driver [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Beginning live snapshot process#033[00m
Jan 23 04:47:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:54.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:54 np0005593233 nova_compute[222017]: 2026-01-23 09:47:54.291 222021 DEBUG nova.virt.libvirt.imagebackend [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 04:47:54 np0005593233 nova_compute[222017]: 2026-01-23 09:47:54.663 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593233 nova_compute[222017]: 2026-01-23 09:47:54.726 222021 DEBUG nova.storage.rbd_utils [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] creating snapshot(d337b8462d604d45b31b784591bb7ff9) on rbd image(98913b45-a388-4bef-ad03-aed41edbdb44_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:47:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e242 e242: 3 total, 3 up, 3 in
Jan 23 04:47:55 np0005593233 nova_compute[222017]: 2026-01-23 09:47:55.229 222021 DEBUG nova.storage.rbd_utils [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] cloning vms/98913b45-a388-4bef-ad03-aed41edbdb44_disk@d337b8462d604d45b31b784591bb7ff9 to images/4741efb0-ac27-4eb3-af8a-63929f9ec3e2 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:47:55 np0005593233 nova_compute[222017]: 2026-01-23 09:47:55.525 222021 DEBUG nova.storage.rbd_utils [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] flattening images/4741efb0-ac27-4eb3-af8a-63929f9ec3e2 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 04:47:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:55.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:47:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:56.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:47:56 np0005593233 nova_compute[222017]: 2026-01-23 09:47:56.252 222021 DEBUG nova.storage.rbd_utils [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] removing snapshot(d337b8462d604d45b31b784591bb7ff9) on rbd image(98913b45-a388-4bef-ad03-aed41edbdb44_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:47:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e243 e243: 3 total, 3 up, 3 in
Jan 23 04:47:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:57.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:58 np0005593233 nova_compute[222017]: 2026-01-23 09:47:58.022 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:58.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:58 np0005593233 nova_compute[222017]: 2026-01-23 09:47:58.298 222021 DEBUG nova.storage.rbd_utils [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] creating snapshot(snap) on rbd image(4741efb0-ac27-4eb3-af8a-63929f9ec3e2) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:47:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e244 e244: 3 total, 3 up, 3 in
Jan 23 04:47:59 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.666 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:47:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:47:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:59.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Failed to snapshot image: nova.exception.ImageNotFound: Image 4741efb0-ac27-4eb3-af8a-63929f9ec3e2 could not be found.
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver glanceclient.exc.HTTPNotFound: HTTP 404 Not Found: No image found with ID 4741efb0-ac27-4eb3-af8a-63929f9ec3e2
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver 
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver 
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3082, in snapshot
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     self._image_api.update(context, image_id, metadata,
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1243, in update
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     return session.update(context, image_id, image_info, data=data,
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 693, in update
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     _reraise_translated_image_exception(image_id)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     raise new_exc.with_traceback(exc_trace)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver nova.exception.ImageNotFound: Image 4741efb0-ac27-4eb3-af8a-63929f9ec3e2 could not be found.
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:47:59.999 222021 ERROR nova.virt.libvirt.driver #033[00m
Jan 23 04:48:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:00.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:00 np0005593233 nova_compute[222017]: 2026-01-23 09:48:00.518 222021 DEBUG nova.storage.rbd_utils [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] removing snapshot(snap) on rbd image(4741efb0-ac27-4eb3-af8a-63929f9ec3e2) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:48:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e245 e245: 3 total, 3 up, 3 in
Jan 23 04:48:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:01.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:02.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:03 np0005593233 nova_compute[222017]: 2026-01-23 09:48:03.024 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:03 np0005593233 podman[247033]: 2026-01-23 09:48:03.07106184 +0000 UTC m=+0.080544503 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 04:48:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:03.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:48:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:04.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:48:04 np0005593233 nova_compute[222017]: 2026-01-23 09:48:04.670 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e246 e246: 3 total, 3 up, 3 in
Jan 23 04:48:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:05 np0005593233 nova_compute[222017]: 2026-01-23 09:48:05.672 222021 WARNING nova.compute.manager [None req-d56c3ec7-2694-49a4-a238-6aa216585ce9 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Image not found during snapshot: nova.exception.ImageNotFound: Image 4741efb0-ac27-4eb3-af8a-63929f9ec3e2 could not be found.#033[00m
Jan 23 04:48:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:05.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:06.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:06Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:11:67:66 10.100.0.4
Jan 23 04:48:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:06Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:11:67:66 10.100.0.4
Jan 23 04:48:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:07.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:08 np0005593233 nova_compute[222017]: 2026-01-23 09:48:08.028 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:08.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:09 np0005593233 nova_compute[222017]: 2026-01-23 09:48:09.143 222021 DEBUG oslo_concurrency.lockutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "98913b45-a388-4bef-ad03-aed41edbdb44" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:09 np0005593233 nova_compute[222017]: 2026-01-23 09:48:09.143 222021 DEBUG oslo_concurrency.lockutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:09 np0005593233 nova_compute[222017]: 2026-01-23 09:48:09.144 222021 DEBUG oslo_concurrency.lockutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:09 np0005593233 nova_compute[222017]: 2026-01-23 09:48:09.144 222021 DEBUG oslo_concurrency.lockutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:09 np0005593233 nova_compute[222017]: 2026-01-23 09:48:09.145 222021 DEBUG oslo_concurrency.lockutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:09 np0005593233 nova_compute[222017]: 2026-01-23 09:48:09.147 222021 INFO nova.compute.manager [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Terminating instance#033[00m
Jan 23 04:48:09 np0005593233 nova_compute[222017]: 2026-01-23 09:48:09.149 222021 DEBUG nova.compute.manager [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:48:09 np0005593233 nova_compute[222017]: 2026-01-23 09:48:09.670 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:09.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 e247: 3 total, 3 up, 3 in
Jan 23 04:48:09 np0005593233 kernel: tapa0780eca-4f (unregistering): left promiscuous mode
Jan 23 04:48:10 np0005593233 NetworkManager[48871]: <info>  [1769161690.0014] device (tapa0780eca-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:48:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:10 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:10Z|00199|binding|INFO|Releasing lport a0780eca-4f7d-47d8-8242-b0837c25e711 from this chassis (sb_readonly=0)
Jan 23 04:48:10 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:10Z|00200|binding|INFO|Setting lport a0780eca-4f7d-47d8-8242-b0837c25e711 down in Southbound
Jan 23 04:48:10 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:10Z|00201|binding|INFO|Removing iface tapa0780eca-4f ovn-installed in OVS
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.051 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.069 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:10.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.072 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:67:66 10.100.0.4'], port_security=['fa:16:3e:11:67:66 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '98913b45-a388-4bef-ad03-aed41edbdb44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ff15972efaf47c1a5483927aa058ee1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '579c0db7-85ec-4a28-8798-9415af4416d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58d761e1-c6a0-4a70-8ba3-d112e3c168c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=a0780eca-4f7d-47d8-8242-b0837c25e711) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.075 140224 INFO neutron.agent.ovn.metadata.agent [-] Port a0780eca-4f7d-47d8-8242-b0837c25e711 in datapath 3d3b9f7b-3375-48b6-888c-0ff96bc928a5 unbound from our chassis#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.077 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d3b9f7b-3375-48b6-888c-0ff96bc928a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.078 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[566cd143-1b9c-4252-969f-59b1250a712d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.080 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 namespace which is not needed anymore#033[00m
Jan 23 04:48:10 np0005593233 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Jan 23 04:48:10 np0005593233 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000003d.scope: Consumed 13.963s CPU time.
Jan 23 04:48:10 np0005593233 systemd-machined[190954]: Machine qemu-31-instance-0000003d terminated.
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.173 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.177 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.187 222021 INFO nova.virt.libvirt.driver [-] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Instance destroyed successfully.#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.189 222021 DEBUG nova.objects.instance [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lazy-loading 'resources' on Instance uuid 98913b45-a388-4bef-ad03-aed41edbdb44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.207 222021 DEBUG nova.virt.libvirt.vif [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-316501491',display_name='tempest-ImagesOneServerNegativeTestJSON-server-316501491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-316501491',id=61,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ff15972efaf47c1a5483927aa058ee1',ramdisk_id='',reservation_id='r-kit0jq0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1870050002',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:48:05Z,user_data=None,user_id='ae77ac206ed246b49262982455564c01',uuid=98913b45-a388-4bef-ad03-aed41edbdb44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.209 222021 DEBUG nova.network.os_vif_util [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converting VIF {"id": "a0780eca-4f7d-47d8-8242-b0837c25e711", "address": "fa:16:3e:11:67:66", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0780eca-4f", "ovs_interfaceid": "a0780eca-4f7d-47d8-8242-b0837c25e711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.210 222021 DEBUG nova.network.os_vif_util [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:67:66,bridge_name='br-int',has_traffic_filtering=True,id=a0780eca-4f7d-47d8-8242-b0837c25e711,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0780eca-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.211 222021 DEBUG os_vif [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:67:66,bridge_name='br-int',has_traffic_filtering=True,id=a0780eca-4f7d-47d8-8242-b0837c25e711,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0780eca-4f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.213 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.213 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0780eca-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.215 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.217 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.220 222021 INFO os_vif [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:67:66,bridge_name='br-int',has_traffic_filtering=True,id=a0780eca-4f7d-47d8-8242-b0837c25e711,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0780eca-4f')#033[00m
Jan 23 04:48:10 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[246821]: [NOTICE]   (246839) : haproxy version is 2.8.14-c23fe91
Jan 23 04:48:10 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[246821]: [NOTICE]   (246839) : path to executable is /usr/sbin/haproxy
Jan 23 04:48:10 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[246821]: [WARNING]  (246839) : Exiting Master process...
Jan 23 04:48:10 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[246821]: [ALERT]    (246839) : Current worker (246847) exited with code 143 (Terminated)
Jan 23 04:48:10 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[246821]: [WARNING]  (246839) : All workers exited. Exiting... (0)
Jan 23 04:48:10 np0005593233 systemd[1]: libpod-fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab.scope: Deactivated successfully.
Jan 23 04:48:10 np0005593233 podman[247086]: 2026-01-23 09:48:10.346343121 +0000 UTC m=+0.154853146 container died fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:48:10 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab-userdata-shm.mount: Deactivated successfully.
Jan 23 04:48:10 np0005593233 systemd[1]: var-lib-containers-storage-overlay-ac7089aece0f76e55d95b92fa517b92b4714ccf30361b3f2359f9b00771c1df6-merged.mount: Deactivated successfully.
Jan 23 04:48:10 np0005593233 podman[247086]: 2026-01-23 09:48:10.432359138 +0000 UTC m=+0.240869163 container cleanup fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:48:10 np0005593233 systemd[1]: libpod-conmon-fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab.scope: Deactivated successfully.
Jan 23 04:48:10 np0005593233 podman[247141]: 2026-01-23 09:48:10.715212701 +0000 UTC m=+0.257134589 container remove fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.723 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4ee8bb9a-51d9-4904-99f8-03f6392bd3b3]: (4, ('Fri Jan 23 09:48:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 (fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab)\nfb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab\nFri Jan 23 09:48:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 (fb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab)\nfb7317b352573cc1119c31a5a3d5d3e87c56f9eb3f70ac9d0598975e99812fab\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.725 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[94de2834-b517-488a-ae73-4832fdbf36a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.726 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d3b9f7b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.728 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:10 np0005593233 kernel: tap3d3b9f7b-30: left promiscuous mode
Jan 23 04:48:10 np0005593233 nova_compute[222017]: 2026-01-23 09:48:10.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.748 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9f67d3-dbc9-4d21-9e3b-3570b5ac0d50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.767 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[abdd94fe-759e-4236-8ec8-6a17d9e4193e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.769 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9614fc6d-bc6b-4461-a69d-cae240b609de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.790 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e968be51-73c1-401b-b483-a44a3aeae468]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 558274, 'reachable_time': 43974, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247157, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.793 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:48:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:10.793 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[42339793-5208-4c10-b014-252147120f69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:10 np0005593233 systemd[1]: run-netns-ovnmeta\x2d3d3b9f7b\x2d3375\x2d48b6\x2d888c\x2d0ff96bc928a5.mount: Deactivated successfully.
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.607 222021 DEBUG nova.compute.manager [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Received event network-vif-unplugged-a0780eca-4f7d-47d8-8242-b0837c25e711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.607 222021 DEBUG oslo_concurrency.lockutils [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.607 222021 DEBUG oslo_concurrency.lockutils [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.607 222021 DEBUG oslo_concurrency.lockutils [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.608 222021 DEBUG nova.compute.manager [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] No waiting events found dispatching network-vif-unplugged-a0780eca-4f7d-47d8-8242-b0837c25e711 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.608 222021 DEBUG nova.compute.manager [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Received event network-vif-unplugged-a0780eca-4f7d-47d8-8242-b0837c25e711 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.608 222021 DEBUG nova.compute.manager [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Received event network-vif-plugged-a0780eca-4f7d-47d8-8242-b0837c25e711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.608 222021 DEBUG oslo_concurrency.lockutils [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.608 222021 DEBUG oslo_concurrency.lockutils [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.608 222021 DEBUG oslo_concurrency.lockutils [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.609 222021 DEBUG nova.compute.manager [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] No waiting events found dispatching network-vif-plugged-a0780eca-4f7d-47d8-8242-b0837c25e711 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:11 np0005593233 nova_compute[222017]: 2026-01-23 09:48:11.609 222021 WARNING nova.compute.manager [req-8d066887-e1f3-4c43-9a13-697327df681c req-ba2dfedb-59c1-4abd-91ea-3186d739c7b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Received unexpected event network-vif-plugged-a0780eca-4f7d-47d8-8242-b0837c25e711 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:48:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:11.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:12.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:12 np0005593233 nova_compute[222017]: 2026-01-23 09:48:12.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:13 np0005593233 nova_compute[222017]: 2026-01-23 09:48:13.398 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:13.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:14 np0005593233 nova_compute[222017]: 2026-01-23 09:48:14.002 222021 INFO nova.virt.libvirt.driver [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Deleting instance files /var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44_del#033[00m
Jan 23 04:48:14 np0005593233 nova_compute[222017]: 2026-01-23 09:48:14.003 222021 INFO nova.virt.libvirt.driver [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Deletion of /var/lib/nova/instances/98913b45-a388-4bef-ad03-aed41edbdb44_del complete#033[00m
Jan 23 04:48:14 np0005593233 nova_compute[222017]: 2026-01-23 09:48:14.077 222021 INFO nova.compute.manager [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Took 4.93 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:48:14 np0005593233 nova_compute[222017]: 2026-01-23 09:48:14.077 222021 DEBUG oslo.service.loopingcall [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:48:14 np0005593233 nova_compute[222017]: 2026-01-23 09:48:14.078 222021 DEBUG nova.compute.manager [-] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:48:14 np0005593233 nova_compute[222017]: 2026-01-23 09:48:14.078 222021 DEBUG nova.network.neutron [-] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:48:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:14.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:14 np0005593233 nova_compute[222017]: 2026-01-23 09:48:14.674 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.138 222021 DEBUG nova.network.neutron [-] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.160 222021 INFO nova.compute.manager [-] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Took 1.08 seconds to deallocate network for instance.#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.217 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.229 222021 DEBUG oslo_concurrency.lockutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.230 222021 DEBUG oslo_concurrency.lockutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.256 222021 DEBUG nova.compute.manager [req-10c99cd2-371f-416d-ad06-fb78dd486f38 req-17c7f154-ec14-4f8b-910e-759c647d09c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Received event network-vif-deleted-a0780eca-4f7d-47d8-8242-b0837c25e711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.305 222021 DEBUG oslo_concurrency.processutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:48:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1362221208' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.766 222021 DEBUG oslo_concurrency.processutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.775 222021 DEBUG nova.compute.provider_tree [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:48:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:48:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:15.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.804 222021 DEBUG nova.scheduler.client.report [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.834 222021 DEBUG oslo_concurrency.lockutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.865 222021 INFO nova.scheduler.client.report [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Deleted allocations for instance 98913b45-a388-4bef-ad03-aed41edbdb44#033[00m
Jan 23 04:48:15 np0005593233 nova_compute[222017]: 2026-01-23 09:48:15.948 222021 DEBUG oslo_concurrency.lockutils [None req-e22a4c00-286a-4b45-89e4-3ccab625eb94 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "98913b45-a388-4bef-ad03-aed41edbdb44" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:48:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:16.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:48:16 np0005593233 nova_compute[222017]: 2026-01-23 09:48:16.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:48:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:17.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:18.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.412 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.412 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.413 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.460 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.461 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.461 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.461 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.461 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:19.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:48:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3564097539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:48:19 np0005593233 nova_compute[222017]: 2026-01-23 09:48:19.924 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:48:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:20.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.113 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.115 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4658MB free_disk=20.922149658203125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.115 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.116 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.226 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.226 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.292 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:48:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1928877678' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.764 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.771 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.790 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.810 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:48:20 np0005593233 nova_compute[222017]: 2026-01-23 09:48:20.811 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:21 np0005593233 nova_compute[222017]: 2026-01-23 09:48:21.784 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:21 np0005593233 nova_compute[222017]: 2026-01-23 09:48:21.784 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:21 np0005593233 nova_compute[222017]: 2026-01-23 09:48:21.785 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:48:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.003000086s ======
Jan 23 04:48:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:21.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000086s
Jan 23 04:48:22 np0005593233 podman[247358]: 2026-01-23 09:48:22.084373231 +0000 UTC m=+0.094914043 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 04:48:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:48:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:22.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:48:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:48:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:48:22 np0005593233 nova_compute[222017]: 2026-01-23 09:48:22.793 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "5f158757-86a9-4040-b642-e1068e1ae821" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:22 np0005593233 nova_compute[222017]: 2026-01-23 09:48:22.793 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:22 np0005593233 nova_compute[222017]: 2026-01-23 09:48:22.817 222021 DEBUG nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:48:22 np0005593233 nova_compute[222017]: 2026-01-23 09:48:22.902 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:22 np0005593233 nova_compute[222017]: 2026-01-23 09:48:22.903 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:22 np0005593233 nova_compute[222017]: 2026-01-23 09:48:22.910 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:48:22 np0005593233 nova_compute[222017]: 2026-01-23 09:48:22.910 222021 INFO nova.compute.claims [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.010 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:48:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1361621461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.472 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.478 222021 DEBUG nova.compute.provider_tree [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.507 222021 DEBUG nova.scheduler.client.report [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.545 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.547 222021 DEBUG nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.601 222021 DEBUG nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.602 222021 DEBUG nova.network.neutron [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.638 222021 INFO nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.657 222021 DEBUG nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.763 222021 DEBUG nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.764 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.765 222021 INFO nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Creating image(s)#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.796 222021 DEBUG nova.storage.rbd_utils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 5f158757-86a9-4040-b642-e1068e1ae821_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:48:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:23.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.832 222021 DEBUG nova.storage.rbd_utils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 5f158757-86a9-4040-b642-e1068e1ae821_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.868 222021 DEBUG nova.storage.rbd_utils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 5f158757-86a9-4040-b642-e1068e1ae821_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.874 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.910 222021 DEBUG nova.policy [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae77ac206ed246b49262982455564c01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ff15972efaf47c1a5483927aa058ee1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.960 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.961 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.962 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:23 np0005593233 nova_compute[222017]: 2026-01-23 09:48:23.962 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:24 np0005593233 nova_compute[222017]: 2026-01-23 09:48:24.001 222021 DEBUG nova.storage.rbd_utils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 5f158757-86a9-4040-b642-e1068e1ae821_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:24 np0005593233 nova_compute[222017]: 2026-01-23 09:48:24.006 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 5f158757-86a9-4040-b642-e1068e1ae821_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:24.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:24 np0005593233 nova_compute[222017]: 2026-01-23 09:48:24.679 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:24 np0005593233 nova_compute[222017]: 2026-01-23 09:48:24.867 222021 DEBUG nova.network.neutron [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Successfully created port: 51a06cf4-3fc8-455e-9d00-9b73e111f082 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:48:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.186 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161690.1847858, 98913b45-a388-4bef-ad03-aed41edbdb44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.187 222021 INFO nova.compute.manager [-] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.216 222021 DEBUG nova.compute.manager [None req-211e5b41-894e-4024-a626-2a09c465b11b - - - - - -] [instance: 98913b45-a388-4bef-ad03-aed41edbdb44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.254 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:48:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:25.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.866 222021 DEBUG nova.network.neutron [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Successfully updated port: 51a06cf4-3fc8-455e-9d00-9b73e111f082 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.881 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "refresh_cache-5f158757-86a9-4040-b642-e1068e1ae821" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.882 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquired lock "refresh_cache-5f158757-86a9-4040-b642-e1068e1ae821" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.882 222021 DEBUG nova.network.neutron [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.981 222021 DEBUG nova.compute.manager [req-d0c75503-25d8-4402-9220-b5468107b04d req-a915ddb0-5589-4ad8-8b5c-dfa8f6c76b36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Received event network-changed-51a06cf4-3fc8-455e-9d00-9b73e111f082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.981 222021 DEBUG nova.compute.manager [req-d0c75503-25d8-4402-9220-b5468107b04d req-a915ddb0-5589-4ad8-8b5c-dfa8f6c76b36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Refreshing instance network info cache due to event network-changed-51a06cf4-3fc8-455e-9d00-9b73e111f082. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:48:25 np0005593233 nova_compute[222017]: 2026-01-23 09:48:25.982 222021 DEBUG oslo_concurrency.lockutils [req-d0c75503-25d8-4402-9220-b5468107b04d req-a915ddb0-5589-4ad8-8b5c-dfa8f6c76b36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-5f158757-86a9-4040-b642-e1068e1ae821" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:48:26 np0005593233 nova_compute[222017]: 2026-01-23 09:48:26.071 222021 DEBUG nova.network.neutron [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:48:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:48:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:26.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:48:26 np0005593233 nova_compute[222017]: 2026-01-23 09:48:26.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:26 np0005593233 nova_compute[222017]: 2026-01-23 09:48:26.652 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 5f158757-86a9-4040-b642-e1068e1ae821_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.646s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:26 np0005593233 nova_compute[222017]: 2026-01-23 09:48:26.749 222021 DEBUG nova.storage.rbd_utils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] resizing rbd image 5f158757-86a9-4040-b642-e1068e1ae821_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.250 222021 DEBUG nova.objects.instance [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f158757-86a9-4040-b642-e1068e1ae821 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.269 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.269 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Ensure instance console log exists: /var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.270 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.270 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.270 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.476 222021 DEBUG nova.network.neutron [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Updating instance_info_cache with network_info: [{"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.506 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Releasing lock "refresh_cache-5f158757-86a9-4040-b642-e1068e1ae821" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.506 222021 DEBUG nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Instance network_info: |[{"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.507 222021 DEBUG oslo_concurrency.lockutils [req-d0c75503-25d8-4402-9220-b5468107b04d req-a915ddb0-5589-4ad8-8b5c-dfa8f6c76b36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-5f158757-86a9-4040-b642-e1068e1ae821" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.507 222021 DEBUG nova.network.neutron [req-d0c75503-25d8-4402-9220-b5468107b04d req-a915ddb0-5589-4ad8-8b5c-dfa8f6c76b36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Refreshing network info cache for port 51a06cf4-3fc8-455e-9d00-9b73e111f082 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.510 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Start _get_guest_xml network_info=[{"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.515 222021 WARNING nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.520 222021 DEBUG nova.virt.libvirt.host [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.521 222021 DEBUG nova.virt.libvirt.host [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.523 222021 DEBUG nova.virt.libvirt.host [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.524 222021 DEBUG nova.virt.libvirt.host [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.525 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.525 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.526 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.526 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.526 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.526 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.526 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.527 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.527 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.527 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.527 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.527 222021 DEBUG nova.virt.hardware [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.531 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:27.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:48:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/665985014' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:48:27 np0005593233 nova_compute[222017]: 2026-01-23 09:48:27.998 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:28 np0005593233 nova_compute[222017]: 2026-01-23 09:48:28.035 222021 DEBUG nova.storage.rbd_utils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 5f158757-86a9-4040-b642-e1068e1ae821_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:28 np0005593233 nova_compute[222017]: 2026-01-23 09:48:28.042 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:28.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:48:28 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1587366486' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:48:28 np0005593233 nova_compute[222017]: 2026-01-23 09:48:28.517 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:28 np0005593233 nova_compute[222017]: 2026-01-23 09:48:28.519 222021 DEBUG nova.virt.libvirt.vif [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1456391854',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1456391854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1456391854',id=64,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ff15972efaf47c1a5483927aa058ee1',ramdisk_id='',reservation_id='r-39ec3ibf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1870050002',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:48:23Z,user_data=None,user_id='ae77ac206ed246b49262982455564c01',uuid=5f158757-86a9-4040-b642-e1068e1ae821,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:48:28 np0005593233 nova_compute[222017]: 2026-01-23 09:48:28.519 222021 DEBUG nova.network.os_vif_util [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converting VIF {"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:28 np0005593233 nova_compute[222017]: 2026-01-23 09:48:28.521 222021 DEBUG nova.network.os_vif_util [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:93:55,bridge_name='br-int',has_traffic_filtering=True,id=51a06cf4-3fc8-455e-9d00-9b73e111f082,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a06cf4-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:28 np0005593233 nova_compute[222017]: 2026-01-23 09:48:28.522 222021 DEBUG nova.objects.instance [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f158757-86a9-4040-b642-e1068e1ae821 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.633 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <uuid>5f158757-86a9-4040-b642-e1068e1ae821</uuid>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <name>instance-00000040</name>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1456391854</nova:name>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:48:27</nova:creationTime>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <nova:user uuid="ae77ac206ed246b49262982455564c01">tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member</nova:user>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <nova:project uuid="6ff15972efaf47c1a5483927aa058ee1">tempest-ImagesOneServerNegativeTestJSON-1870050002</nova:project>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <nova:port uuid="51a06cf4-3fc8-455e-9d00-9b73e111f082">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <entry name="serial">5f158757-86a9-4040-b642-e1068e1ae821</entry>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <entry name="uuid">5f158757-86a9-4040-b642-e1068e1ae821</entry>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/5f158757-86a9-4040-b642-e1068e1ae821_disk">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/5f158757-86a9-4040-b642-e1068e1ae821_disk.config">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:97:93:55"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <target dev="tap51a06cf4-3f"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821/console.log" append="off"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:48:29 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:48:29 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:48:29 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:48:29 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.634 222021 DEBUG nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Preparing to wait for external event network-vif-plugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.635 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "5f158757-86a9-4040-b642-e1068e1ae821-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.635 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.635 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.636 222021 DEBUG nova.virt.libvirt.vif [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1456391854',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1456391854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1456391854',id=64,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ff15972efaf47c1a5483927aa058ee1',ramdisk_id='',reservation_id='r-39ec3ibf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1870050002',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:48:23Z,user_data=None,user_id='ae77ac206ed246b49262982455564c01',uuid=5f158757-86a9-4040-b642-e1068e1ae821,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.636 222021 DEBUG nova.network.os_vif_util [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converting VIF {"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.637 222021 DEBUG nova.network.os_vif_util [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:93:55,bridge_name='br-int',has_traffic_filtering=True,id=51a06cf4-3fc8-455e-9d00-9b73e111f082,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a06cf4-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.637 222021 DEBUG os_vif [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:93:55,bridge_name='br-int',has_traffic_filtering=True,id=51a06cf4-3fc8-455e-9d00-9b73e111f082,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a06cf4-3f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.638 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.639 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.641 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.642 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51a06cf4-3f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.642 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap51a06cf4-3f, col_values=(('external_ids', {'iface-id': '51a06cf4-3fc8-455e-9d00-9b73e111f082', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:93:55', 'vm-uuid': '5f158757-86a9-4040-b642-e1068e1ae821'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.644 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:29 np0005593233 NetworkManager[48871]: <info>  [1769161709.6451] manager: (tap51a06cf4-3f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.648 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.652 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.653 222021 INFO os_vif [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:93:55,bridge_name='br-int',has_traffic_filtering=True,id=51a06cf4-3fc8-455e-9d00-9b73e111f082,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a06cf4-3f')#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.681 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.710 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.711 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.711 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] No VIF found with MAC fa:16:3e:97:93:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.711 222021 INFO nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Using config drive#033[00m
Jan 23 04:48:29 np0005593233 nova_compute[222017]: 2026-01-23 09:48:29.741 222021 DEBUG nova.storage.rbd_utils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 5f158757-86a9-4040-b642-e1068e1ae821_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:48:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:29.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:48:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:30.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:30 np0005593233 nova_compute[222017]: 2026-01-23 09:48:30.200 222021 INFO nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Creating config drive at /var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821/disk.config#033[00m
Jan 23 04:48:30 np0005593233 nova_compute[222017]: 2026-01-23 09:48:30.205 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe85upvyi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:30 np0005593233 nova_compute[222017]: 2026-01-23 09:48:30.344 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe85upvyi" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:30 np0005593233 nova_compute[222017]: 2026-01-23 09:48:30.385 222021 DEBUG nova.storage.rbd_utils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] rbd image 5f158757-86a9-4040-b642-e1068e1ae821_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:30 np0005593233 nova_compute[222017]: 2026-01-23 09:48:30.390 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821/disk.config 5f158757-86a9-4040-b642-e1068e1ae821_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.025 222021 DEBUG oslo_concurrency.processutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821/disk.config 5f158757-86a9-4040-b642-e1068e1ae821_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.636s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.026 222021 INFO nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Deleting local config drive /var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821/disk.config because it was imported into RBD.#033[00m
Jan 23 04:48:31 np0005593233 kernel: tap51a06cf4-3f: entered promiscuous mode
Jan 23 04:48:31 np0005593233 NetworkManager[48871]: <info>  [1769161711.0930] manager: (tap51a06cf4-3f): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Jan 23 04:48:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:31Z|00202|binding|INFO|Claiming lport 51a06cf4-3fc8-455e-9d00-9b73e111f082 for this chassis.
Jan 23 04:48:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:31Z|00203|binding|INFO|51a06cf4-3fc8-455e-9d00-9b73e111f082: Claiming fa:16:3e:97:93:55 10.100.0.8
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.093 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.102 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:93:55 10.100.0.8'], port_security=['fa:16:3e:97:93:55 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5f158757-86a9-4040-b642-e1068e1ae821', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ff15972efaf47c1a5483927aa058ee1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '579c0db7-85ec-4a28-8798-9415af4416d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58d761e1-c6a0-4a70-8ba3-d112e3c168c9, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=51a06cf4-3fc8-455e-9d00-9b73e111f082) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.104 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 51a06cf4-3fc8-455e-9d00-9b73e111f082 in datapath 3d3b9f7b-3375-48b6-888c-0ff96bc928a5 bound to our chassis#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.107 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3d3b9f7b-3375-48b6-888c-0ff96bc928a5#033[00m
Jan 23 04:48:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:31Z|00204|binding|INFO|Setting lport 51a06cf4-3fc8-455e-9d00-9b73e111f082 ovn-installed in OVS
Jan 23 04:48:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:31Z|00205|binding|INFO|Setting lport 51a06cf4-3fc8-455e-9d00-9b73e111f082 up in Southbound
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.111 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.115 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:31 np0005593233 systemd-udevd[247707]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.121 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[28da4210-b072-4964-a708-49eec33e161a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.122 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3d3b9f7b-31 in ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.127 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3d3b9f7b-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.128 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fc78988c-7bd8-4ea3-b002-a06586e69c2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.129 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[96b55980-bfe0-404c-94bf-427c78d1243d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 NetworkManager[48871]: <info>  [1769161711.1353] device (tap51a06cf4-3f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:48:31 np0005593233 NetworkManager[48871]: <info>  [1769161711.1366] device (tap51a06cf4-3f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:48:31 np0005593233 systemd-machined[190954]: New machine qemu-32-instance-00000040.
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.141 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[c3d7f6d0-fff3-4d92-81f1-f96b59425ce9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 systemd[1]: Started Virtual Machine qemu-32-instance-00000040.
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.154 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3927210b-9b3e-43d6-938d-0a397d44a13b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.191 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[21672868-c57c-46dd-83a5-a3528b6579d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.196 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d82de029-d772-4b4d-adfe-56b44934274f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 NetworkManager[48871]: <info>  [1769161711.1978] manager: (tap3d3b9f7b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/101)
Jan 23 04:48:31 np0005593233 systemd-udevd[247719]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.228 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[97cf2e09-e1ab-4d73-ad56-d9f93230ebd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.232 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2a00a7cc-67c9-4572-babb-ef343242d30f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 NetworkManager[48871]: <info>  [1769161711.2577] device (tap3d3b9f7b-30): carrier: link connected
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.261 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c56a6a54-525c-4860-900d-450fb2cf2350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.280 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[32521831-470d-4e1a-a2d1-b5e75bd73aab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d3b9f7b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:f9:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562388, 'reachable_time': 42800, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247790, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.296 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3a06e12d-d3e1-4b2c-8c0a-7db58c198e78]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:f995'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 562388, 'tstamp': 562388}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247791, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.318 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2f1b3627-ac8d-469d-87bc-16d7c061cf67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3d3b9f7b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:f9:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562388, 'reachable_time': 42800, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247792, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.355 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[41b7caff-861c-4a0f-b000-a96c394be6b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.436 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7ba129-df05-4bb0-ac6e-03cefd544da0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.437 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d3b9f7b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.437 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.438 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d3b9f7b-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:31 np0005593233 kernel: tap3d3b9f7b-30: entered promiscuous mode
Jan 23 04:48:31 np0005593233 NetworkManager[48871]: <info>  [1769161711.4405] manager: (tap3d3b9f7b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.439 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.442 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.446 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3d3b9f7b-30, col_values=(('external_ids', {'iface-id': '28841559-e121-4a0b-bc47-8aa922ed5fb0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.447 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:31Z|00206|binding|INFO|Releasing lport 28841559-e121-4a0b-bc47-8aa922ed5fb0 from this chassis (sb_readonly=0)
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.450 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.462 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[955db984-0c19-4c42-a43a-7dde1469e586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.464 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-3d3b9f7b-3375-48b6-888c-0ff96bc928a5
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.464 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.pid.haproxy
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 3d3b9f7b-3375-48b6-888c-0ff96bc928a5
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:48:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:31.465 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'env', 'PROCESS_TAG=haproxy-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3d3b9f7b-3375-48b6-888c-0ff96bc928a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:48:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:31.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:31 np0005593233 podman[247849]: 2026-01-23 09:48:31.850793776 +0000 UTC m=+0.058633086 container create 78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:48:31 np0005593233 systemd[1]: Started libpod-conmon-78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85.scope.
Jan 23 04:48:31 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:48:31 np0005593233 podman[247849]: 2026-01-23 09:48:31.818890245 +0000 UTC m=+0.026729565 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:48:31 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55122916df8bc02ae8369d37e59e664ce76e96514881bc539c79baba2246d795/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.919 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161711.919, 5f158757-86a9-4040-b642-e1068e1ae821 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.920 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] VM Started (Lifecycle Event)#033[00m
Jan 23 04:48:31 np0005593233 podman[247849]: 2026-01-23 09:48:31.9345756 +0000 UTC m=+0.142414910 container init 78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:48:31 np0005593233 podman[247849]: 2026-01-23 09:48:31.941143638 +0000 UTC m=+0.148982948 container start 78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.950 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.955 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161711.9192016, 5f158757-86a9-4040-b642-e1068e1ae821 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.955 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:48:31 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[247879]: [NOTICE]   (247884) : New worker (247886) forked
Jan 23 04:48:31 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[247879]: [NOTICE]   (247884) : Loading success.
Jan 23 04:48:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.982 222021 DEBUG nova.compute.manager [req-d668c0e1-6f4a-4221-ab06-e7e1423882a8 req-dbe60326-b18c-4eb0-8a7f-61b9deca60eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Received event network-vif-plugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.983 222021 DEBUG oslo_concurrency.lockutils [req-d668c0e1-6f4a-4221-ab06-e7e1423882a8 req-dbe60326-b18c-4eb0-8a7f-61b9deca60eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5f158757-86a9-4040-b642-e1068e1ae821-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.983 222021 DEBUG oslo_concurrency.lockutils [req-d668c0e1-6f4a-4221-ab06-e7e1423882a8 req-dbe60326-b18c-4eb0-8a7f-61b9deca60eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.984 222021 DEBUG oslo_concurrency.lockutils [req-d668c0e1-6f4a-4221-ab06-e7e1423882a8 req-dbe60326-b18c-4eb0-8a7f-61b9deca60eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.984 222021 DEBUG nova.compute.manager [req-d668c0e1-6f4a-4221-ab06-e7e1423882a8 req-dbe60326-b18c-4eb0-8a7f-61b9deca60eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Processing event network-vif-plugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.986 222021 DEBUG nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.988 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.993 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.995 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161711.9922154, 5f158757-86a9-4040-b642-e1068e1ae821 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:48:31 np0005593233 nova_compute[222017]: 2026-01-23 09:48:31.995 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.000 222021 INFO nova.virt.libvirt.driver [-] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Instance spawned successfully.#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.001 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.044 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.049 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.053 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.054 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.054 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.055 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.055 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.056 222021 DEBUG nova.virt.libvirt.driver [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.092 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:48:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:32.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.117 222021 INFO nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Took 8.35 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.118 222021 DEBUG nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.194 222021 INFO nova.compute.manager [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Took 9.31 seconds to build instance.#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.215 222021 DEBUG oslo_concurrency.lockutils [None req-66a14149-f83b-4372-8dea-30db439533c3 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.258 222021 DEBUG nova.network.neutron [req-d0c75503-25d8-4402-9220-b5468107b04d req-a915ddb0-5589-4ad8-8b5c-dfa8f6c76b36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Updated VIF entry in instance network info cache for port 51a06cf4-3fc8-455e-9d00-9b73e111f082. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.259 222021 DEBUG nova.network.neutron [req-d0c75503-25d8-4402-9220-b5468107b04d req-a915ddb0-5589-4ad8-8b5c-dfa8f6c76b36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Updating instance_info_cache with network_info: [{"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:32 np0005593233 nova_compute[222017]: 2026-01-23 09:48:32.287 222021 DEBUG oslo_concurrency.lockutils [req-d0c75503-25d8-4402-9220-b5468107b04d req-a915ddb0-5589-4ad8-8b5c-dfa8f6c76b36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-5f158757-86a9-4040-b642-e1068e1ae821" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:48:33 np0005593233 nova_compute[222017]: 2026-01-23 09:48:33.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:33 np0005593233 nova_compute[222017]: 2026-01-23 09:48:33.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:48:33 np0005593233 nova_compute[222017]: 2026-01-23 09:48:33.480 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:48:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:33.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:34 np0005593233 podman[247895]: 2026-01-23 09:48:34.07290065 +0000 UTC m=+0.086392229 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 23 04:48:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:48:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:34.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.136 222021 DEBUG nova.compute.manager [req-e006520b-8ca4-449a-9ef2-0ccca16f1c85 req-586e2831-244f-49e4-8aee-f8bb7f0dc6ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Received event network-vif-plugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.137 222021 DEBUG oslo_concurrency.lockutils [req-e006520b-8ca4-449a-9ef2-0ccca16f1c85 req-586e2831-244f-49e4-8aee-f8bb7f0dc6ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5f158757-86a9-4040-b642-e1068e1ae821-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.137 222021 DEBUG oslo_concurrency.lockutils [req-e006520b-8ca4-449a-9ef2-0ccca16f1c85 req-586e2831-244f-49e4-8aee-f8bb7f0dc6ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.137 222021 DEBUG oslo_concurrency.lockutils [req-e006520b-8ca4-449a-9ef2-0ccca16f1c85 req-586e2831-244f-49e4-8aee-f8bb7f0dc6ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.137 222021 DEBUG nova.compute.manager [req-e006520b-8ca4-449a-9ef2-0ccca16f1c85 req-586e2831-244f-49e4-8aee-f8bb7f0dc6ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] No waiting events found dispatching network-vif-plugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.138 222021 WARNING nova.compute.manager [req-e006520b-8ca4-449a-9ef2-0ccca16f1c85 req-586e2831-244f-49e4-8aee-f8bb7f0dc6ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Received unexpected event network-vif-plugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.647 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.824 222021 DEBUG oslo_concurrency.lockutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "5f158757-86a9-4040-b642-e1068e1ae821" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.825 222021 DEBUG oslo_concurrency.lockutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.826 222021 DEBUG oslo_concurrency.lockutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "5f158757-86a9-4040-b642-e1068e1ae821-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.827 222021 DEBUG oslo_concurrency.lockutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.827 222021 DEBUG oslo_concurrency.lockutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.829 222021 INFO nova.compute.manager [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Terminating instance#033[00m
Jan 23 04:48:34 np0005593233 nova_compute[222017]: 2026-01-23 09:48:34.830 222021 DEBUG nova.compute.manager [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:48:34 np0005593233 kernel: tap51a06cf4-3f (unregistering): left promiscuous mode
Jan 23 04:48:34 np0005593233 NetworkManager[48871]: <info>  [1769161714.9969] device (tap51a06cf4-3f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.006 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:35Z|00207|binding|INFO|Releasing lport 51a06cf4-3fc8-455e-9d00-9b73e111f082 from this chassis (sb_readonly=0)
Jan 23 04:48:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:35Z|00208|binding|INFO|Setting lport 51a06cf4-3fc8-455e-9d00-9b73e111f082 down in Southbound
Jan 23 04:48:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:35Z|00209|binding|INFO|Removing iface tap51a06cf4-3f ovn-installed in OVS
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.010 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.031 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:35 np0005593233 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000040.scope: Deactivated successfully.
Jan 23 04:48:35 np0005593233 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000040.scope: Consumed 3.793s CPU time.
Jan 23 04:48:35 np0005593233 systemd-machined[190954]: Machine qemu-32-instance-00000040 terminated.
Jan 23 04:48:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.074 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:93:55 10.100.0.8'], port_security=['fa:16:3e:97:93:55 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5f158757-86a9-4040-b642-e1068e1ae821', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ff15972efaf47c1a5483927aa058ee1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '579c0db7-85ec-4a28-8798-9415af4416d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58d761e1-c6a0-4a70-8ba3-d112e3c168c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=51a06cf4-3fc8-455e-9d00-9b73e111f082) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.075 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 51a06cf4-3fc8-455e-9d00-9b73e111f082 in datapath 3d3b9f7b-3375-48b6-888c-0ff96bc928a5 unbound from our chassis#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.077 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d3b9f7b-3375-48b6-888c-0ff96bc928a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.079 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[618aa217-4c0d-4b57-b8ce-9405cea61cdb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.079 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 namespace which is not needed anymore#033[00m
Jan 23 04:48:35 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[247879]: [NOTICE]   (247884) : haproxy version is 2.8.14-c23fe91
Jan 23 04:48:35 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[247879]: [NOTICE]   (247884) : path to executable is /usr/sbin/haproxy
Jan 23 04:48:35 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[247879]: [WARNING]  (247884) : Exiting Master process...
Jan 23 04:48:35 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[247879]: [WARNING]  (247884) : Exiting Master process...
Jan 23 04:48:35 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[247879]: [ALERT]    (247884) : Current worker (247886) exited with code 143 (Terminated)
Jan 23 04:48:35 np0005593233 neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5[247879]: [WARNING]  (247884) : All workers exited. Exiting... (0)
Jan 23 04:48:35 np0005593233 systemd[1]: libpod-78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85.scope: Deactivated successfully.
Jan 23 04:48:35 np0005593233 podman[247941]: 2026-01-23 09:48:35.250216651 +0000 UTC m=+0.055672261 container died 78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.274 222021 INFO nova.virt.libvirt.driver [-] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Instance destroyed successfully.#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.275 222021 DEBUG nova.objects.instance [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lazy-loading 'resources' on Instance uuid 5f158757-86a9-4040-b642-e1068e1ae821 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:35 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85-userdata-shm.mount: Deactivated successfully.
Jan 23 04:48:35 np0005593233 systemd[1]: var-lib-containers-storage-overlay-55122916df8bc02ae8369d37e59e664ce76e96514881bc539c79baba2246d795-merged.mount: Deactivated successfully.
Jan 23 04:48:35 np0005593233 podman[247941]: 2026-01-23 09:48:35.294171027 +0000 UTC m=+0.099626617 container cleanup 78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 04:48:35 np0005593233 systemd[1]: libpod-conmon-78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85.scope: Deactivated successfully.
Jan 23 04:48:35 np0005593233 podman[247979]: 2026-01-23 09:48:35.365161646 +0000 UTC m=+0.045922643 container remove 78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.372 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4d2e08-69f9-469d-9d69-1d4e1eac6ffb]: (4, ('Fri Jan 23 09:48:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 (78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85)\n78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85\nFri Jan 23 09:48:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 (78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85)\n78312035a8cbc46171d475834043bdded899f7cb4fba5e14510727c75c7f6a85\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.374 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5fbcfb53-fdc9-4e66-85b9-cce17367eba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.375 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d3b9f7b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.443 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:35 np0005593233 kernel: tap3d3b9f7b-30: left promiscuous mode
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.462 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.464 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d0aebf-8181-4237-a88e-a8d27ca7cbfa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.486 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc95436-6c5f-4c83-85c7-bd62a102a10f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.487 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[554c660d-6d45-4ce2-acb0-4e1eba4e0c44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.502 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f5264a19-c8c7-4fd5-a588-a292d6062eb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562381, 'reachable_time': 16866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247997, 'error': None, 'target': 'ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.506 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3d3b9f7b-3375-48b6-888c-0ff96bc928a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:48:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:35.506 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[19e26d3a-7b4d-461c-b065-3602d470f3d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:35 np0005593233 systemd[1]: run-netns-ovnmeta\x2d3d3b9f7b\x2d3375\x2d48b6\x2d888c\x2d0ff96bc928a5.mount: Deactivated successfully.
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.536 222021 DEBUG nova.virt.libvirt.vif [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1456391854',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1456391854',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1456391854',id=64,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:48:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ff15972efaf47c1a5483927aa058ee1',ramdisk_id='',reservation_id='r-39ec3ibf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',imag
e_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-1870050002',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-1870050002-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:48:32Z,user_data=None,user_id='ae77ac206ed246b49262982455564c01',uuid=5f158757-86a9-4040-b642-e1068e1ae821,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.537 222021 DEBUG nova.network.os_vif_util [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converting VIF {"id": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "address": "fa:16:3e:97:93:55", "network": {"id": "3d3b9f7b-3375-48b6-888c-0ff96bc928a5", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1277530286-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff15972efaf47c1a5483927aa058ee1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a06cf4-3f", "ovs_interfaceid": "51a06cf4-3fc8-455e-9d00-9b73e111f082", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.538 222021 DEBUG nova.network.os_vif_util [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:93:55,bridge_name='br-int',has_traffic_filtering=True,id=51a06cf4-3fc8-455e-9d00-9b73e111f082,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a06cf4-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.538 222021 DEBUG os_vif [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:93:55,bridge_name='br-int',has_traffic_filtering=True,id=51a06cf4-3fc8-455e-9d00-9b73e111f082,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a06cf4-3f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.540 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.541 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51a06cf4-3f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.542 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.544 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:35 np0005593233 nova_compute[222017]: 2026-01-23 09:48:35.547 222021 INFO os_vif [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:93:55,bridge_name='br-int',has_traffic_filtering=True,id=51a06cf4-3fc8-455e-9d00-9b73e111f082,network=Network(3d3b9f7b-3375-48b6-888c-0ff96bc928a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a06cf4-3f')#033[00m
Jan 23 04:48:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:35.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:36.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:36 np0005593233 nova_compute[222017]: 2026-01-23 09:48:36.236 222021 DEBUG nova.compute.manager [req-0b5368d7-2454-41f3-a692-7992bfcbfe31 req-4504a22f-ccab-4dfa-a199-7196a6bcec1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Received event network-vif-unplugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:36 np0005593233 nova_compute[222017]: 2026-01-23 09:48:36.237 222021 DEBUG oslo_concurrency.lockutils [req-0b5368d7-2454-41f3-a692-7992bfcbfe31 req-4504a22f-ccab-4dfa-a199-7196a6bcec1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5f158757-86a9-4040-b642-e1068e1ae821-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:36 np0005593233 nova_compute[222017]: 2026-01-23 09:48:36.237 222021 DEBUG oslo_concurrency.lockutils [req-0b5368d7-2454-41f3-a692-7992bfcbfe31 req-4504a22f-ccab-4dfa-a199-7196a6bcec1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:36 np0005593233 nova_compute[222017]: 2026-01-23 09:48:36.238 222021 DEBUG oslo_concurrency.lockutils [req-0b5368d7-2454-41f3-a692-7992bfcbfe31 req-4504a22f-ccab-4dfa-a199-7196a6bcec1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:36 np0005593233 nova_compute[222017]: 2026-01-23 09:48:36.238 222021 DEBUG nova.compute.manager [req-0b5368d7-2454-41f3-a692-7992bfcbfe31 req-4504a22f-ccab-4dfa-a199-7196a6bcec1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] No waiting events found dispatching network-vif-unplugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:36 np0005593233 nova_compute[222017]: 2026-01-23 09:48:36.239 222021 DEBUG nova.compute.manager [req-0b5368d7-2454-41f3-a692-7992bfcbfe31 req-4504a22f-ccab-4dfa-a199-7196a6bcec1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Received event network-vif-unplugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:48:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:37.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:37 np0005593233 nova_compute[222017]: 2026-01-23 09:48:37.832 222021 INFO nova.virt.libvirt.driver [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Deleting instance files /var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821_del#033[00m
Jan 23 04:48:37 np0005593233 nova_compute[222017]: 2026-01-23 09:48:37.833 222021 INFO nova.virt.libvirt.driver [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Deletion of /var/lib/nova/instances/5f158757-86a9-4040-b642-e1068e1ae821_del complete#033[00m
Jan 23 04:48:37 np0005593233 nova_compute[222017]: 2026-01-23 09:48:37.891 222021 INFO nova.compute.manager [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Took 3.06 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:48:37 np0005593233 nova_compute[222017]: 2026-01-23 09:48:37.892 222021 DEBUG oslo.service.loopingcall [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:48:37 np0005593233 nova_compute[222017]: 2026-01-23 09:48:37.892 222021 DEBUG nova.compute.manager [-] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:48:37 np0005593233 nova_compute[222017]: 2026-01-23 09:48:37.892 222021 DEBUG nova.network.neutron [-] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:48:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:48:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:38.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.363 222021 DEBUG nova.compute.manager [req-6fbfa4c2-9a96-4f18-8e70-1c1c64ec2922 req-332860f5-36e0-4400-baf6-f29977462cbc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Received event network-vif-plugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.364 222021 DEBUG oslo_concurrency.lockutils [req-6fbfa4c2-9a96-4f18-8e70-1c1c64ec2922 req-332860f5-36e0-4400-baf6-f29977462cbc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5f158757-86a9-4040-b642-e1068e1ae821-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.364 222021 DEBUG oslo_concurrency.lockutils [req-6fbfa4c2-9a96-4f18-8e70-1c1c64ec2922 req-332860f5-36e0-4400-baf6-f29977462cbc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.365 222021 DEBUG oslo_concurrency.lockutils [req-6fbfa4c2-9a96-4f18-8e70-1c1c64ec2922 req-332860f5-36e0-4400-baf6-f29977462cbc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.365 222021 DEBUG nova.compute.manager [req-6fbfa4c2-9a96-4f18-8e70-1c1c64ec2922 req-332860f5-36e0-4400-baf6-f29977462cbc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] No waiting events found dispatching network-vif-plugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.365 222021 WARNING nova.compute.manager [req-6fbfa4c2-9a96-4f18-8e70-1c1c64ec2922 req-332860f5-36e0-4400-baf6-f29977462cbc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Received unexpected event network-vif-plugged-51a06cf4-3fc8-455e-9d00-9b73e111f082 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.767 222021 DEBUG nova.network.neutron [-] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.800 222021 INFO nova.compute.manager [-] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Took 0.91 seconds to deallocate network for instance.#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.873 222021 DEBUG oslo_concurrency.lockutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.875 222021 DEBUG oslo_concurrency.lockutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:38 np0005593233 nova_compute[222017]: 2026-01-23 09:48:38.882 222021 DEBUG nova.compute.manager [req-16f9a9e7-67bc-4486-be6f-3a31c44f7376 req-52eb3fe7-7a1e-417a-99a0-ec1c59f28adb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Received event network-vif-deleted-51a06cf4-3fc8-455e-9d00-9b73e111f082 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.038 222021 DEBUG oslo_concurrency.processutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:48:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:48:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1420710697' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.562 222021 DEBUG oslo_concurrency.processutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.569 222021 DEBUG nova.compute.provider_tree [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.586 222021 DEBUG nova.scheduler.client.report [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.605 222021 DEBUG oslo_concurrency.lockutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.630 222021 INFO nova.scheduler.client.report [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Deleted allocations for instance 5f158757-86a9-4040-b642-e1068e1ae821#033[00m
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:39 np0005593233 nova_compute[222017]: 2026-01-23 09:48:39.708 222021 DEBUG oslo_concurrency.lockutils [None req-f52e2cde-b7f9-4df7-9f13-fb9f0f8acfd2 ae77ac206ed246b49262982455564c01 6ff15972efaf47c1a5483927aa058ee1 - - default default] Lock "5f158757-86a9-4040-b642-e1068e1ae821" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.883s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:39.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:40.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:40 np0005593233 nova_compute[222017]: 2026-01-23 09:48:40.542 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:48:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:41.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:48:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:42.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:42.651 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:42.652 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:42.652 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:48:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:43.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:48:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:44.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:48:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/388984485' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:48:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:48:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/388984485' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:48:44 np0005593233 nova_compute[222017]: 2026-01-23 09:48:44.689 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:45 np0005593233 nova_compute[222017]: 2026-01-23 09:48:45.544 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:45.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:45.870 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:48:45 np0005593233 nova_compute[222017]: 2026-01-23 09:48:45.871 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:45.871 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:48:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:46.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:47.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:47.872 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.022 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquiring lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.022 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.096 222021 DEBUG nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:48:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:48.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.228 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.229 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.239 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.240 222021 INFO nova.compute.claims [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.410 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:48:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/891412677' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.842 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.849 222021 DEBUG nova.compute.provider_tree [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.880 222021 DEBUG nova.scheduler.client.report [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.925 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.926 222021 DEBUG nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.997 222021 DEBUG nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:48:48 np0005593233 nova_compute[222017]: 2026-01-23 09:48:48.997 222021 DEBUG nova.network.neutron [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.048 222021 INFO nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.083 222021 DEBUG nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.197 222021 DEBUG nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.199 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.199 222021 INFO nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Creating image(s)#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.230 222021 DEBUG nova.storage.rbd_utils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] rbd image 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.262 222021 DEBUG nova.storage.rbd_utils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] rbd image 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.299 222021 DEBUG nova.storage.rbd_utils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] rbd image 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.303 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.375 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.376 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.377 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.377 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.406 222021 DEBUG nova.storage.rbd_utils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] rbd image 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.410 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.502 222021 DEBUG nova.policy [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'deace84b7e09475d8d5c83d51f8309b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9aca988de81c4bb08a3d641f83b7a61a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:48:49 np0005593233 nova_compute[222017]: 2026-01-23 09:48:49.691 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:48:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:49.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:48:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:48:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:50.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.269 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161715.2682707, 5f158757-86a9-4040-b642-e1068e1ae821 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.270 222021 INFO nova.compute.manager [-] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.321 222021 DEBUG nova.compute.manager [None req-053598c9-e094-497b-b4ff-0dfe664c7623 - - - - - -] [instance: 5f158757-86a9-4040-b642-e1068e1ae821] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.337 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.928s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.419 222021 DEBUG nova.storage.rbd_utils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] resizing rbd image 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.546 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.585 222021 DEBUG nova.objects.instance [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lazy-loading 'migration_context' on Instance uuid 6988af6b-5dd8-4567-813b-70a1aa7c8fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.603 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.604 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Ensure instance console log exists: /var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.605 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.605 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:50 np0005593233 nova_compute[222017]: 2026-01-23 09:48:50.605 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:51.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:52 np0005593233 nova_compute[222017]: 2026-01-23 09:48:52.065 222021 DEBUG nova.network.neutron [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Successfully created port: 090f7d0a-4ee9-450e-9734-dfd6065896aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:48:52 np0005593233 nova_compute[222017]: 2026-01-23 09:48:52.067 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:48:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:52.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:48:53 np0005593233 podman[248227]: 2026-01-23 09:48:53.132196689 +0000 UTC m=+0.130335875 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:48:53 np0005593233 nova_compute[222017]: 2026-01-23 09:48:53.607 222021 DEBUG nova.network.neutron [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Successfully updated port: 090f7d0a-4ee9-450e-9734-dfd6065896aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:48:53 np0005593233 nova_compute[222017]: 2026-01-23 09:48:53.628 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquiring lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:48:53 np0005593233 nova_compute[222017]: 2026-01-23 09:48:53.628 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquired lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:48:53 np0005593233 nova_compute[222017]: 2026-01-23 09:48:53.628 222021 DEBUG nova.network.neutron [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:48:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:53.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:53 np0005593233 nova_compute[222017]: 2026-01-23 09:48:53.992 222021 DEBUG nova.network.neutron [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:48:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:54.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:54 np0005593233 nova_compute[222017]: 2026-01-23 09:48:54.675 222021 DEBUG nova.compute.manager [req-29f2e3a7-d448-47a5-a9a8-764c6f5fda9c req-fbc998e6-403e-4cdd-a9a8-f5a99a58d5cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received event network-changed-090f7d0a-4ee9-450e-9734-dfd6065896aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:54 np0005593233 nova_compute[222017]: 2026-01-23 09:48:54.676 222021 DEBUG nova.compute.manager [req-29f2e3a7-d448-47a5-a9a8-764c6f5fda9c req-fbc998e6-403e-4cdd-a9a8-f5a99a58d5cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Refreshing instance network info cache due to event network-changed-090f7d0a-4ee9-450e-9734-dfd6065896aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:48:54 np0005593233 nova_compute[222017]: 2026-01-23 09:48:54.677 222021 DEBUG oslo_concurrency.lockutils [req-29f2e3a7-d448-47a5-a9a8-764c6f5fda9c req-fbc998e6-403e-4cdd-a9a8-f5a99a58d5cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:48:54 np0005593233 nova_compute[222017]: 2026-01-23 09:48:54.693 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.220 222021 DEBUG nova.network.neutron [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Updating instance_info_cache with network_info: [{"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.245 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Releasing lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.246 222021 DEBUG nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Instance network_info: |[{"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.246 222021 DEBUG oslo_concurrency.lockutils [req-29f2e3a7-d448-47a5-a9a8-764c6f5fda9c req-fbc998e6-403e-4cdd-a9a8-f5a99a58d5cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.247 222021 DEBUG nova.network.neutron [req-29f2e3a7-d448-47a5-a9a8-764c6f5fda9c req-fbc998e6-403e-4cdd-a9a8-f5a99a58d5cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Refreshing network info cache for port 090f7d0a-4ee9-450e-9734-dfd6065896aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.251 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Start _get_guest_xml network_info=[{"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.256 222021 WARNING nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.261 222021 DEBUG nova.virt.libvirt.host [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.262 222021 DEBUG nova.virt.libvirt.host [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.271 222021 DEBUG nova.virt.libvirt.host [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.272 222021 DEBUG nova.virt.libvirt.host [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.273 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.273 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.274 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.274 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.274 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.274 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.275 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.275 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.275 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.275 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.276 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.276 222021 DEBUG nova.virt.hardware [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.279 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.548 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:55.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:48:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2434669006' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:48:55 np0005593233 nova_compute[222017]: 2026-01-23 09:48:55.986 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.707s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.023 222021 DEBUG nova.storage.rbd_utils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] rbd image 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.028 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:48:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:56.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:48:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:48:56 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/177145051' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.511 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.514 222021 DEBUG nova.virt.libvirt.vif [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:48:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1586343522',display_name='tempest-ServersTestJSON-server-1586343522',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1586343522',id=65,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCp6hK0B05xlXD/dhcR3NsmbySJ4P7z9MbBsTLkt4dHjxwcFH1te3nWco0gMvxEk6d1xVXvl47p1g7CRpEkxPZyayJ/f6INIA+/5KP+Pbb0yahRd11NwlbaCXk7SIxDxEA==',key_name='tempest-keypair-1810134559',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9aca988de81c4bb08a3d641f83b7a61a',ramdisk_id='',reservation_id='r-pvp12m0k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-196898738',owner_user_name='tempest-ServersTestJSON-196898738-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:48:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='deace84b7e09475d8d5c83d51f8309b5',uuid=6988af6b-5dd8-4567-813b-70a1aa7c8fc7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.514 222021 DEBUG nova.network.os_vif_util [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Converting VIF {"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.516 222021 DEBUG nova.network.os_vif_util [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:3a:00,bridge_name='br-int',has_traffic_filtering=True,id=090f7d0a-4ee9-450e-9734-dfd6065896aa,network=Network(9d2afc7b-0ec7-4513-9848-c17b0e23eafc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090f7d0a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.519 222021 DEBUG nova.objects.instance [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lazy-loading 'pci_devices' on Instance uuid 6988af6b-5dd8-4567-813b-70a1aa7c8fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.541 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <uuid>6988af6b-5dd8-4567-813b-70a1aa7c8fc7</uuid>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <name>instance-00000041</name>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersTestJSON-server-1586343522</nova:name>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:48:55</nova:creationTime>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <nova:user uuid="deace84b7e09475d8d5c83d51f8309b5">tempest-ServersTestJSON-196898738-project-member</nova:user>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <nova:project uuid="9aca988de81c4bb08a3d641f83b7a61a">tempest-ServersTestJSON-196898738</nova:project>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <nova:port uuid="090f7d0a-4ee9-450e-9734-dfd6065896aa">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <entry name="serial">6988af6b-5dd8-4567-813b-70a1aa7c8fc7</entry>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <entry name="uuid">6988af6b-5dd8-4567-813b-70a1aa7c8fc7</entry>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk.config">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:f7:3a:00"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <target dev="tap090f7d0a-4e"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7/console.log" append="off"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:48:56 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:48:56 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:48:56 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:48:56 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.542 222021 DEBUG nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Preparing to wait for external event network-vif-plugged-090f7d0a-4ee9-450e-9734-dfd6065896aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.543 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquiring lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.543 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.543 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.544 222021 DEBUG nova.virt.libvirt.vif [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:48:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1586343522',display_name='tempest-ServersTestJSON-server-1586343522',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1586343522',id=65,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCp6hK0B05xlXD/dhcR3NsmbySJ4P7z9MbBsTLkt4dHjxwcFH1te3nWco0gMvxEk6d1xVXvl47p1g7CRpEkxPZyayJ/f6INIA+/5KP+Pbb0yahRd11NwlbaCXk7SIxDxEA==',key_name='tempest-keypair-1810134559',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9aca988de81c4bb08a3d641f83b7a61a',ramdisk_id='',reservation_id='r-pvp12m0k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-196898738',owner_user_name='tempest-ServersTestJSON-196898738-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:48:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='deace84b7e09475d8d5c83d51f8309b5',uuid=6988af6b-5dd8-4567-813b-70a1aa7c8fc7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.545 222021 DEBUG nova.network.os_vif_util [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Converting VIF {"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.545 222021 DEBUG nova.network.os_vif_util [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:3a:00,bridge_name='br-int',has_traffic_filtering=True,id=090f7d0a-4ee9-450e-9734-dfd6065896aa,network=Network(9d2afc7b-0ec7-4513-9848-c17b0e23eafc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090f7d0a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.547 222021 DEBUG os_vif [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:3a:00,bridge_name='br-int',has_traffic_filtering=True,id=090f7d0a-4ee9-450e-9734-dfd6065896aa,network=Network(9d2afc7b-0ec7-4513-9848-c17b0e23eafc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090f7d0a-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.547 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.548 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.548 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.552 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.552 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap090f7d0a-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.553 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap090f7d0a-4e, col_values=(('external_ids', {'iface-id': '090f7d0a-4ee9-450e-9734-dfd6065896aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:3a:00', 'vm-uuid': '6988af6b-5dd8-4567-813b-70a1aa7c8fc7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:56 np0005593233 NetworkManager[48871]: <info>  [1769161736.5560] manager: (tap090f7d0a-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.557 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.566 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.568 222021 INFO os_vif [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:3a:00,bridge_name='br-int',has_traffic_filtering=True,id=090f7d0a-4ee9-450e-9734-dfd6065896aa,network=Network(9d2afc7b-0ec7-4513-9848-c17b0e23eafc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090f7d0a-4e')#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.660 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.660 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.660 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] No VIF found with MAC fa:16:3e:f7:3a:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.661 222021 INFO nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Using config drive#033[00m
Jan 23 04:48:56 np0005593233 nova_compute[222017]: 2026-01-23 09:48:56.692 222021 DEBUG nova.storage.rbd_utils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] rbd image 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:57 np0005593233 nova_compute[222017]: 2026-01-23 09:48:57.133 222021 DEBUG nova.network.neutron [req-29f2e3a7-d448-47a5-a9a8-764c6f5fda9c req-fbc998e6-403e-4cdd-a9a8-f5a99a58d5cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Updated VIF entry in instance network info cache for port 090f7d0a-4ee9-450e-9734-dfd6065896aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:48:57 np0005593233 nova_compute[222017]: 2026-01-23 09:48:57.133 222021 DEBUG nova.network.neutron [req-29f2e3a7-d448-47a5-a9a8-764c6f5fda9c req-fbc998e6-403e-4cdd-a9a8-f5a99a58d5cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Updating instance_info_cache with network_info: [{"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:57 np0005593233 nova_compute[222017]: 2026-01-23 09:48:57.167 222021 DEBUG oslo_concurrency.lockutils [req-29f2e3a7-d448-47a5-a9a8-764c6f5fda9c req-fbc998e6-403e-4cdd-a9a8-f5a99a58d5cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:48:57 np0005593233 nova_compute[222017]: 2026-01-23 09:48:57.309 222021 INFO nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Creating config drive at /var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7/disk.config#033[00m
Jan 23 04:48:57 np0005593233 nova_compute[222017]: 2026-01-23 09:48:57.316 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr8qkyabn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:57 np0005593233 nova_compute[222017]: 2026-01-23 09:48:57.458 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr8qkyabn" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:57 np0005593233 nova_compute[222017]: 2026-01-23 09:48:57.498 222021 DEBUG nova.storage.rbd_utils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] rbd image 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:48:57 np0005593233 nova_compute[222017]: 2026-01-23 09:48:57.502 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7/disk.config 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:57.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:58.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:58 np0005593233 nova_compute[222017]: 2026-01-23 09:48:58.393 222021 DEBUG oslo_concurrency.processutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7/disk.config 6988af6b-5dd8-4567-813b-70a1aa7c8fc7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.891s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:58 np0005593233 nova_compute[222017]: 2026-01-23 09:48:58.395 222021 INFO nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Deleting local config drive /var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7/disk.config because it was imported into RBD.#033[00m
Jan 23 04:48:58 np0005593233 kernel: tap090f7d0a-4e: entered promiscuous mode
Jan 23 04:48:58 np0005593233 NetworkManager[48871]: <info>  [1769161738.4652] manager: (tap090f7d0a-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Jan 23 04:48:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:58Z|00210|binding|INFO|Claiming lport 090f7d0a-4ee9-450e-9734-dfd6065896aa for this chassis.
Jan 23 04:48:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:58Z|00211|binding|INFO|090f7d0a-4ee9-450e-9734-dfd6065896aa: Claiming fa:16:3e:f7:3a:00 10.100.0.8
Jan 23 04:48:58 np0005593233 nova_compute[222017]: 2026-01-23 09:48:58.466 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.480 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:3a:00 10.100.0.8'], port_security=['fa:16:3e:f7:3a:00 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6988af6b-5dd8-4567-813b-70a1aa7c8fc7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d2afc7b-0ec7-4513-9848-c17b0e23eafc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9aca988de81c4bb08a3d641f83b7a61a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5477f264-5b8d-4a73-9791-91122a34b9dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d9a9f05-1f65-420c-848c-b1045a4704ac, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=090f7d0a-4ee9-450e-9734-dfd6065896aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.481 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 090f7d0a-4ee9-450e-9734-dfd6065896aa in datapath 9d2afc7b-0ec7-4513-9848-c17b0e23eafc bound to our chassis#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.483 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9d2afc7b-0ec7-4513-9848-c17b0e23eafc#033[00m
Jan 23 04:48:58 np0005593233 systemd-udevd[248392]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.497 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e3990c65-d48a-4656-8058-295b54db84e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.498 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9d2afc7b-01 in ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.500 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9d2afc7b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.500 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b4fabd1f-f645-472d-bece-44a678500b7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.501 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[04c80cdf-3947-44ee-9f00-d24cc06b19b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 systemd-machined[190954]: New machine qemu-33-instance-00000041.
Jan 23 04:48:58 np0005593233 NetworkManager[48871]: <info>  [1769161738.5122] device (tap090f7d0a-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:48:58 np0005593233 NetworkManager[48871]: <info>  [1769161738.5130] device (tap090f7d0a-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.513 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0fe6a6-c812-4fda-a80a-5e4ed0e55c42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 systemd[1]: Started Virtual Machine qemu-33-instance-00000041.
Jan 23 04:48:58 np0005593233 nova_compute[222017]: 2026-01-23 09:48:58.538 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.543 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[66ecc0b1-a716-455a-a5a6-bb924e958813]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 nova_compute[222017]: 2026-01-23 09:48:58.548 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:58Z|00212|binding|INFO|Setting lport 090f7d0a-4ee9-450e-9734-dfd6065896aa ovn-installed in OVS
Jan 23 04:48:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:58Z|00213|binding|INFO|Setting lport 090f7d0a-4ee9-450e-9734-dfd6065896aa up in Southbound
Jan 23 04:48:58 np0005593233 nova_compute[222017]: 2026-01-23 09:48:58.552 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.578 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f87056e7-8c7b-4c98-bb0b-93dcc48cec62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 systemd-udevd[248396]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.585 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d07dc728-4afa-4b02-9440-fdab74088646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 NetworkManager[48871]: <info>  [1769161738.5866] manager: (tap9d2afc7b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/105)
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.628 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[745d9785-758c-4590-8f34-cf67e5abdeb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.632 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[566dc2a0-723a-48a8-bf73-fc90223faee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 NetworkManager[48871]: <info>  [1769161738.6668] device (tap9d2afc7b-00): carrier: link connected
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.672 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[1f1faf25-6023-489c-be71-6489c893fc9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.694 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa72e38-1c7e-4c77-834d-2bcb4590de48]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9d2afc7b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:e8:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565129, 'reachable_time': 38278, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248425, 'error': None, 'target': 'ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.715 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[deb9a918-6616-4d73-8c26-b726795add12]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:e8d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565129, 'tstamp': 565129}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248427, 'error': None, 'target': 'ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.731 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fe571c82-0831-4c08-8d6b-75351f11d183]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9d2afc7b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:e8:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565129, 'reachable_time': 38278, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248441, 'error': None, 'target': 'ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.764 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a269df6f-d590-4727-8e08-7b34b57579db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.823 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4cd3ae-d0a8-44df-ab36-d84bfa842189]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.824 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d2afc7b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.824 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.824 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d2afc7b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:58 np0005593233 nova_compute[222017]: 2026-01-23 09:48:58.826 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:58 np0005593233 kernel: tap9d2afc7b-00: entered promiscuous mode
Jan 23 04:48:58 np0005593233 NetworkManager[48871]: <info>  [1769161738.8271] manager: (tap9d2afc7b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.832 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9d2afc7b-00, col_values=(('external_ids', {'iface-id': '4c77d495-b829-4b4b-bc6b-063710c0d194'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:58 np0005593233 nova_compute[222017]: 2026-01-23 09:48:58.833 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:48:58Z|00214|binding|INFO|Releasing lport 4c77d495-b829-4b4b-bc6b-063710c0d194 from this chassis (sb_readonly=0)
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.834 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9d2afc7b-0ec7-4513-9848-c17b0e23eafc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9d2afc7b-0ec7-4513-9848-c17b0e23eafc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.835 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[036e2cf7-75eb-49d6-8e37-e9354766ad12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.836 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-9d2afc7b-0ec7-4513-9848-c17b0e23eafc
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/9d2afc7b-0ec7-4513-9848-c17b0e23eafc.pid.haproxy
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 9d2afc7b-0ec7-4513-9848-c17b0e23eafc
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:48:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:48:58.837 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc', 'env', 'PROCESS_TAG=haproxy-9d2afc7b-0ec7-4513-9848-c17b0e23eafc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9d2afc7b-0ec7-4513-9848-c17b0e23eafc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:48:58 np0005593233 nova_compute[222017]: 2026-01-23 09:48:58.847 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.011 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161739.010296, 6988af6b-5dd8-4567-813b-70a1aa7c8fc7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.012 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] VM Started (Lifecycle Event)#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.046 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.054 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161739.0107815, 6988af6b-5dd8-4567-813b-70a1aa7c8fc7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.054 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.077 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.081 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.109 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.218 222021 DEBUG nova.compute.manager [req-2d6b769f-c5f9-48bc-8251-f020f45cdd14 req-996d9c2d-8b93-40c2-b431-dd6d7a09d8d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received event network-vif-plugged-090f7d0a-4ee9-450e-9734-dfd6065896aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.219 222021 DEBUG oslo_concurrency.lockutils [req-2d6b769f-c5f9-48bc-8251-f020f45cdd14 req-996d9c2d-8b93-40c2-b431-dd6d7a09d8d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.219 222021 DEBUG oslo_concurrency.lockutils [req-2d6b769f-c5f9-48bc-8251-f020f45cdd14 req-996d9c2d-8b93-40c2-b431-dd6d7a09d8d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.220 222021 DEBUG oslo_concurrency.lockutils [req-2d6b769f-c5f9-48bc-8251-f020f45cdd14 req-996d9c2d-8b93-40c2-b431-dd6d7a09d8d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:59 np0005593233 podman[248501]: 2026-01-23 09:48:59.220335691 +0000 UTC m=+0.052997145 container create 60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.220 222021 DEBUG nova.compute.manager [req-2d6b769f-c5f9-48bc-8251-f020f45cdd14 req-996d9c2d-8b93-40c2-b431-dd6d7a09d8d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Processing event network-vif-plugged-090f7d0a-4ee9-450e-9734-dfd6065896aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.221 222021 DEBUG nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.226 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161739.2259784, 6988af6b-5dd8-4567-813b-70a1aa7c8fc7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.226 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.228 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.231 222021 INFO nova.virt.libvirt.driver [-] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Instance spawned successfully.#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.232 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.255 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:59 np0005593233 systemd[1]: Started libpod-conmon-60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6.scope.
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.264 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.268 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.269 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.269 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.270 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.270 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.270 222021 DEBUG nova.virt.libvirt.driver [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:48:59 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:48:59 np0005593233 podman[248501]: 2026-01-23 09:48:59.194564355 +0000 UTC m=+0.027225819 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:48:59 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d71f7b7b12837394f2e1d9087855393514c37cb5541b73d62959d5b24cac612d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:48:59 np0005593233 podman[248501]: 2026-01-23 09:48:59.309032565 +0000 UTC m=+0.141694039 container init 60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:48:59 np0005593233 podman[248501]: 2026-01-23 09:48:59.314478801 +0000 UTC m=+0.147140255 container start 60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.335 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:48:59 np0005593233 neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc[248517]: [NOTICE]   (248521) : New worker (248523) forked
Jan 23 04:48:59 np0005593233 neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc[248517]: [NOTICE]   (248521) : Loading success.
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.358 222021 INFO nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Took 10.16 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.359 222021 DEBUG nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.427 222021 INFO nova.compute.manager [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Took 11.27 seconds to build instance.#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.456 222021 DEBUG oslo_concurrency.lockutils [None req-1651de8a-de53-4a8b-896c-3b2f70003979 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:59 np0005593233 nova_compute[222017]: 2026-01-23 09:48:59.695 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:48:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:59.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:00.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:01 np0005593233 nova_compute[222017]: 2026-01-23 09:49:01.363 222021 DEBUG nova.compute.manager [req-2faa8f63-3ab3-4de0-989b-cdd1d76843b6 req-f00cf9f6-f64b-4d1b-9ab0-729cd16df28a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received event network-vif-plugged-090f7d0a-4ee9-450e-9734-dfd6065896aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:01 np0005593233 nova_compute[222017]: 2026-01-23 09:49:01.365 222021 DEBUG oslo_concurrency.lockutils [req-2faa8f63-3ab3-4de0-989b-cdd1d76843b6 req-f00cf9f6-f64b-4d1b-9ab0-729cd16df28a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:01 np0005593233 nova_compute[222017]: 2026-01-23 09:49:01.365 222021 DEBUG oslo_concurrency.lockutils [req-2faa8f63-3ab3-4de0-989b-cdd1d76843b6 req-f00cf9f6-f64b-4d1b-9ab0-729cd16df28a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:01 np0005593233 nova_compute[222017]: 2026-01-23 09:49:01.365 222021 DEBUG oslo_concurrency.lockutils [req-2faa8f63-3ab3-4de0-989b-cdd1d76843b6 req-f00cf9f6-f64b-4d1b-9ab0-729cd16df28a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:01 np0005593233 nova_compute[222017]: 2026-01-23 09:49:01.366 222021 DEBUG nova.compute.manager [req-2faa8f63-3ab3-4de0-989b-cdd1d76843b6 req-f00cf9f6-f64b-4d1b-9ab0-729cd16df28a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] No waiting events found dispatching network-vif-plugged-090f7d0a-4ee9-450e-9734-dfd6065896aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:49:01 np0005593233 nova_compute[222017]: 2026-01-23 09:49:01.366 222021 WARNING nova.compute.manager [req-2faa8f63-3ab3-4de0-989b-cdd1d76843b6 req-f00cf9f6-f64b-4d1b-9ab0-729cd16df28a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received unexpected event network-vif-plugged-090f7d0a-4ee9-450e-9734-dfd6065896aa for instance with vm_state active and task_state None.#033[00m
Jan 23 04:49:01 np0005593233 nova_compute[222017]: 2026-01-23 09:49:01.557 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:01.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:02 np0005593233 NetworkManager[48871]: <info>  [1769161742.2151] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Jan 23 04:49:02 np0005593233 NetworkManager[48871]: <info>  [1769161742.2161] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 23 04:49:02 np0005593233 nova_compute[222017]: 2026-01-23 09:49:02.211 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:02.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:02 np0005593233 nova_compute[222017]: 2026-01-23 09:49:02.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:02 np0005593233 ovn_controller[130653]: 2026-01-23T09:49:02Z|00215|binding|INFO|Releasing lport 4c77d495-b829-4b4b-bc6b-063710c0d194 from this chassis (sb_readonly=0)
Jan 23 04:49:02 np0005593233 nova_compute[222017]: 2026-01-23 09:49:02.383 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:03 np0005593233 nova_compute[222017]: 2026-01-23 09:49:03.540 222021 DEBUG nova.compute.manager [req-8c89f022-51eb-41b7-b59b-90e4323a791e req-eb171c2a-0a53-4a74-9c83-33a1b82ef635 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received event network-changed-090f7d0a-4ee9-450e-9734-dfd6065896aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:03 np0005593233 nova_compute[222017]: 2026-01-23 09:49:03.541 222021 DEBUG nova.compute.manager [req-8c89f022-51eb-41b7-b59b-90e4323a791e req-eb171c2a-0a53-4a74-9c83-33a1b82ef635 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Refreshing instance network info cache due to event network-changed-090f7d0a-4ee9-450e-9734-dfd6065896aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:49:03 np0005593233 nova_compute[222017]: 2026-01-23 09:49:03.541 222021 DEBUG oslo_concurrency.lockutils [req-8c89f022-51eb-41b7-b59b-90e4323a791e req-eb171c2a-0a53-4a74-9c83-33a1b82ef635 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:49:03 np0005593233 nova_compute[222017]: 2026-01-23 09:49:03.541 222021 DEBUG oslo_concurrency.lockutils [req-8c89f022-51eb-41b7-b59b-90e4323a791e req-eb171c2a-0a53-4a74-9c83-33a1b82ef635 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:49:03 np0005593233 nova_compute[222017]: 2026-01-23 09:49:03.541 222021 DEBUG nova.network.neutron [req-8c89f022-51eb-41b7-b59b-90e4323a791e req-eb171c2a-0a53-4a74-9c83-33a1b82ef635 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Refreshing network info cache for port 090f7d0a-4ee9-450e-9734-dfd6065896aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:49:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:03.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:04.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:04 np0005593233 nova_compute[222017]: 2026-01-23 09:49:04.736 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:05 np0005593233 podman[248533]: 2026-01-23 09:49:05.064728309 +0000 UTC m=+0.069202048 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 04:49:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:05 np0005593233 nova_compute[222017]: 2026-01-23 09:49:05.478 222021 DEBUG nova.network.neutron [req-8c89f022-51eb-41b7-b59b-90e4323a791e req-eb171c2a-0a53-4a74-9c83-33a1b82ef635 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Updated VIF entry in instance network info cache for port 090f7d0a-4ee9-450e-9734-dfd6065896aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:49:05 np0005593233 nova_compute[222017]: 2026-01-23 09:49:05.478 222021 DEBUG nova.network.neutron [req-8c89f022-51eb-41b7-b59b-90e4323a791e req-eb171c2a-0a53-4a74-9c83-33a1b82ef635 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Updating instance_info_cache with network_info: [{"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:49:05 np0005593233 nova_compute[222017]: 2026-01-23 09:49:05.507 222021 DEBUG oslo_concurrency.lockutils [req-8c89f022-51eb-41b7-b59b-90e4323a791e req-eb171c2a-0a53-4a74-9c83-33a1b82ef635 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:49:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:05.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:06.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:06 np0005593233 nova_compute[222017]: 2026-01-23 09:49:06.560 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:07.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:08.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:09 np0005593233 nova_compute[222017]: 2026-01-23 09:49:09.738 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:09.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:10.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:11 np0005593233 nova_compute[222017]: 2026-01-23 09:49:11.564 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:11.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:49:12Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:3a:00 10.100.0.8
Jan 23 04:49:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:49:12Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:3a:00 10.100.0.8
Jan 23 04:49:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:12.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:13 np0005593233 nova_compute[222017]: 2026-01-23 09:49:13.408 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:13.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:14 np0005593233 nova_compute[222017]: 2026-01-23 09:49:14.189 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:14.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:14 np0005593233 nova_compute[222017]: 2026-01-23 09:49:14.780 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:15.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:49:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:16.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:49:16 np0005593233 nova_compute[222017]: 2026-01-23 09:49:16.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:16 np0005593233 nova_compute[222017]: 2026-01-23 09:49:16.568 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:17 np0005593233 nova_compute[222017]: 2026-01-23 09:49:17.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:17.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:18.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:19 np0005593233 nova_compute[222017]: 2026-01-23 09:49:19.783 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:19.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:20.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:20 np0005593233 nova_compute[222017]: 2026-01-23 09:49:20.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:20 np0005593233 nova_compute[222017]: 2026-01-23 09:49:20.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:49:20 np0005593233 nova_compute[222017]: 2026-01-23 09:49:20.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.152 222021 DEBUG oslo_concurrency.lockutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquiring lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.153 222021 DEBUG oslo_concurrency.lockutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.153 222021 DEBUG oslo_concurrency.lockutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquiring lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.153 222021 DEBUG oslo_concurrency.lockutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.154 222021 DEBUG oslo_concurrency.lockutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.155 222021 INFO nova.compute.manager [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Terminating instance#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.156 222021 DEBUG nova.compute.manager [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.220 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.220 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:49:21 np0005593233 kernel: tap090f7d0a-4e (unregistering): left promiscuous mode
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.221 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.221 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6988af6b-5dd8-4567-813b-70a1aa7c8fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:49:21 np0005593233 NetworkManager[48871]: <info>  [1769161761.2251] device (tap090f7d0a-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:49:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:49:21Z|00216|binding|INFO|Releasing lport 090f7d0a-4ee9-450e-9734-dfd6065896aa from this chassis (sb_readonly=0)
Jan 23 04:49:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:49:21Z|00217|binding|INFO|Setting lport 090f7d0a-4ee9-450e-9734-dfd6065896aa down in Southbound
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.234 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:49:21Z|00218|binding|INFO|Removing iface tap090f7d0a-4e ovn-installed in OVS
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.236 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.258 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:3a:00 10.100.0.8'], port_security=['fa:16:3e:f7:3a:00 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6988af6b-5dd8-4567-813b-70a1aa7c8fc7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d2afc7b-0ec7-4513-9848-c17b0e23eafc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9aca988de81c4bb08a3d641f83b7a61a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5477f264-5b8d-4a73-9791-91122a34b9dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2d9a9f05-1f65-420c-848c-b1045a4704ac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=090f7d0a-4ee9-450e-9734-dfd6065896aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.259 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 090f7d0a-4ee9-450e-9734-dfd6065896aa in datapath 9d2afc7b-0ec7-4513-9848-c17b0e23eafc unbound from our chassis#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.261 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9d2afc7b-0ec7-4513-9848-c17b0e23eafc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.263 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[71fbddef-563b-4941-b3a6-1bffd7fdbc90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.263 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc namespace which is not needed anymore#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.264 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000041.scope: Deactivated successfully.
Jan 23 04:49:21 np0005593233 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000041.scope: Consumed 14.909s CPU time.
Jan 23 04:49:21 np0005593233 systemd-machined[190954]: Machine qemu-33-instance-00000041 terminated.
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.470 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.475 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.483 222021 INFO nova.virt.libvirt.driver [-] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Instance destroyed successfully.#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.483 222021 DEBUG nova.objects.instance [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lazy-loading 'resources' on Instance uuid 6988af6b-5dd8-4567-813b-70a1aa7c8fc7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:49:21 np0005593233 neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc[248517]: [NOTICE]   (248521) : haproxy version is 2.8.14-c23fe91
Jan 23 04:49:21 np0005593233 neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc[248517]: [NOTICE]   (248521) : path to executable is /usr/sbin/haproxy
Jan 23 04:49:21 np0005593233 neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc[248517]: [WARNING]  (248521) : Exiting Master process...
Jan 23 04:49:21 np0005593233 neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc[248517]: [ALERT]    (248521) : Current worker (248523) exited with code 143 (Terminated)
Jan 23 04:49:21 np0005593233 neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc[248517]: [WARNING]  (248521) : All workers exited. Exiting... (0)
Jan 23 04:49:21 np0005593233 systemd[1]: libpod-60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6.scope: Deactivated successfully.
Jan 23 04:49:21 np0005593233 podman[248579]: 2026-01-23 09:49:21.526207078 +0000 UTC m=+0.056466486 container died 60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.570 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6-userdata-shm.mount: Deactivated successfully.
Jan 23 04:49:21 np0005593233 systemd[1]: var-lib-containers-storage-overlay-d71f7b7b12837394f2e1d9087855393514c37cb5541b73d62959d5b24cac612d-merged.mount: Deactivated successfully.
Jan 23 04:49:21 np0005593233 podman[248579]: 2026-01-23 09:49:21.586004248 +0000 UTC m=+0.116263656 container cleanup 60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:49:21 np0005593233 systemd[1]: libpod-conmon-60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6.scope: Deactivated successfully.
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.608 222021 DEBUG nova.virt.libvirt.vif [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:48:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1586343522',display_name='tempest-ServersTestJSON-server-1586343522',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1586343522',id=65,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCp6hK0B05xlXD/dhcR3NsmbySJ4P7z9MbBsTLkt4dHjxwcFH1te3nWco0gMvxEk6d1xVXvl47p1g7CRpEkxPZyayJ/f6INIA+/5KP+Pbb0yahRd11NwlbaCXk7SIxDxEA==',key_name='tempest-keypair-1810134559',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:48:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9aca988de81c4bb08a3d641f83b7a61a',ramdisk_id='',reservation_id='r-pvp12m0k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-196898738',owner_user_name='tempest-ServersTestJSON-196898738-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:48:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='deace84b7e09475d8d5c83d51f8309b5',uuid=6988af6b-5dd8-4567-813b-70a1aa7c8fc7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.608 222021 DEBUG nova.network.os_vif_util [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Converting VIF {"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.610 222021 DEBUG nova.network.os_vif_util [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:3a:00,bridge_name='br-int',has_traffic_filtering=True,id=090f7d0a-4ee9-450e-9734-dfd6065896aa,network=Network(9d2afc7b-0ec7-4513-9848-c17b0e23eafc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090f7d0a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.612 222021 DEBUG os_vif [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:3a:00,bridge_name='br-int',has_traffic_filtering=True,id=090f7d0a-4ee9-450e-9734-dfd6065896aa,network=Network(9d2afc7b-0ec7-4513-9848-c17b0e23eafc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090f7d0a-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.617 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap090f7d0a-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.618 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.620 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.623 222021 INFO os_vif [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:3a:00,bridge_name='br-int',has_traffic_filtering=True,id=090f7d0a-4ee9-450e-9734-dfd6065896aa,network=Network(9d2afc7b-0ec7-4513-9848-c17b0e23eafc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090f7d0a-4e')#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.738 222021 DEBUG nova.compute.manager [req-694280b4-2c4e-489d-914a-cd569cc588e3 req-35147eac-7268-4f35-9004-cf13936a7029 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received event network-vif-unplugged-090f7d0a-4ee9-450e-9734-dfd6065896aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.739 222021 DEBUG oslo_concurrency.lockutils [req-694280b4-2c4e-489d-914a-cd569cc588e3 req-35147eac-7268-4f35-9004-cf13936a7029 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.739 222021 DEBUG oslo_concurrency.lockutils [req-694280b4-2c4e-489d-914a-cd569cc588e3 req-35147eac-7268-4f35-9004-cf13936a7029 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.740 222021 DEBUG oslo_concurrency.lockutils [req-694280b4-2c4e-489d-914a-cd569cc588e3 req-35147eac-7268-4f35-9004-cf13936a7029 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.740 222021 DEBUG nova.compute.manager [req-694280b4-2c4e-489d-914a-cd569cc588e3 req-35147eac-7268-4f35-9004-cf13936a7029 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] No waiting events found dispatching network-vif-unplugged-090f7d0a-4ee9-450e-9734-dfd6065896aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.741 222021 DEBUG nova.compute.manager [req-694280b4-2c4e-489d-914a-cd569cc588e3 req-35147eac-7268-4f35-9004-cf13936a7029 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received event network-vif-unplugged-090f7d0a-4ee9-450e-9734-dfd6065896aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:49:21 np0005593233 podman[248616]: 2026-01-23 09:49:21.764128228 +0000 UTC m=+0.151163421 container remove 60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.773 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[940bc5ba-6e0d-475d-b9be-a929f91fef0d]: (4, ('Fri Jan 23 09:49:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc (60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6)\n60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6\nFri Jan 23 09:49:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc (60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6)\n60d42863847a9104c164f11289d4c0c08669a2de78f4c4baa4d1cf59e2f627a6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.775 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f30db12b-c827-4e75-bd21-80086e852744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.776 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d2afc7b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 kernel: tap9d2afc7b-00: left promiscuous mode
Jan 23 04:49:21 np0005593233 nova_compute[222017]: 2026-01-23 09:49:21.793 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.798 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9c6aa31d-3c81-4e19-b4b6-94f9f6c465ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.812 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3a0beccc-d3a0-4fa5-8a74-501a67f63d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.813 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bba68620-bd0a-40ed-95ef-938b1b87f243]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.828 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[106cdebf-2981-4868-abce-f063499d69b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565120, 'reachable_time': 42183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248649, 'error': None, 'target': 'ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.831 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9d2afc7b-0ec7-4513-9848-c17b0e23eafc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:49:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:21.831 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ef4fdf-6c69-465d-a228-127d944a5b42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:21 np0005593233 systemd[1]: run-netns-ovnmeta\x2d9d2afc7b\x2d0ec7\x2d4513\x2d9848\x2dc17b0e23eafc.mount: Deactivated successfully.
Jan 23 04:49:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:21.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:22 np0005593233 nova_compute[222017]: 2026-01-23 09:49:22.172 222021 INFO nova.virt.libvirt.driver [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Deleting instance files /var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7_del#033[00m
Jan 23 04:49:22 np0005593233 nova_compute[222017]: 2026-01-23 09:49:22.173 222021 INFO nova.virt.libvirt.driver [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Deletion of /var/lib/nova/instances/6988af6b-5dd8-4567-813b-70a1aa7c8fc7_del complete#033[00m
Jan 23 04:49:22 np0005593233 nova_compute[222017]: 2026-01-23 09:49:22.258 222021 INFO nova.compute.manager [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Took 1.10 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:49:22 np0005593233 nova_compute[222017]: 2026-01-23 09:49:22.259 222021 DEBUG oslo.service.loopingcall [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:49:22 np0005593233 nova_compute[222017]: 2026-01-23 09:49:22.260 222021 DEBUG nova.compute.manager [-] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:49:22 np0005593233 nova_compute[222017]: 2026-01-23 09:49:22.260 222021 DEBUG nova.network.neutron [-] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:49:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:22.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:22 np0005593233 nova_compute[222017]: 2026-01-23 09:49:22.886 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.317 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Updating instance_info_cache with network_info: [{"id": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "address": "fa:16:3e:f7:3a:00", "network": {"id": "9d2afc7b-0ec7-4513-9848-c17b0e23eafc", "bridge": "br-int", "label": "tempest-ServersTestJSON-1788652075-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9aca988de81c4bb08a3d641f83b7a61a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090f7d0a-4e", "ovs_interfaceid": "090f7d0a-4ee9-450e-9734-dfd6065896aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.373 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-6988af6b-5dd8-4567-813b-70a1aa7c8fc7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.374 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.374 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.374 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.375 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.375 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.375 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.416 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.416 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:23.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.973 222021 DEBUG nova.network.neutron [-] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.981 222021 DEBUG nova.compute.manager [req-c1d5dc94-08ea-4f18-85de-6046ce7866af req-a0fc62ea-8d4e-4086-ad58-698f19d8ef8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received event network-vif-plugged-090f7d0a-4ee9-450e-9734-dfd6065896aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.981 222021 DEBUG oslo_concurrency.lockutils [req-c1d5dc94-08ea-4f18-85de-6046ce7866af req-a0fc62ea-8d4e-4086-ad58-698f19d8ef8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.982 222021 DEBUG oslo_concurrency.lockutils [req-c1d5dc94-08ea-4f18-85de-6046ce7866af req-a0fc62ea-8d4e-4086-ad58-698f19d8ef8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.982 222021 DEBUG oslo_concurrency.lockutils [req-c1d5dc94-08ea-4f18-85de-6046ce7866af req-a0fc62ea-8d4e-4086-ad58-698f19d8ef8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.983 222021 DEBUG nova.compute.manager [req-c1d5dc94-08ea-4f18-85de-6046ce7866af req-a0fc62ea-8d4e-4086-ad58-698f19d8ef8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] No waiting events found dispatching network-vif-plugged-090f7d0a-4ee9-450e-9734-dfd6065896aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.983 222021 WARNING nova.compute.manager [req-c1d5dc94-08ea-4f18-85de-6046ce7866af req-a0fc62ea-8d4e-4086-ad58-698f19d8ef8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received unexpected event network-vif-plugged-090f7d0a-4ee9-450e-9734-dfd6065896aa for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:49:23 np0005593233 nova_compute[222017]: 2026-01-23 09:49:23.991 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.575s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.004 222021 INFO nova.compute.manager [-] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Took 1.74 seconds to deallocate network for instance.#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.099 222021 DEBUG oslo_concurrency.lockutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.100 222021 DEBUG oslo_concurrency.lockutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:24 np0005593233 podman[248672]: 2026-01-23 09:49:24.116123623 +0000 UTC m=+0.108508832 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.136 222021 DEBUG nova.scheduler.client.report [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.157 222021 DEBUG nova.compute.manager [req-5a5e7a24-55b5-4c8e-b74d-b3c2b097c75b req-5c177dea-e968-4307-af0a-e1f22433752d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Received event network-vif-deleted-090f7d0a-4ee9-450e-9734-dfd6065896aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.172 222021 DEBUG nova.scheduler.client.report [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.173 222021 DEBUG nova.compute.provider_tree [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.199 222021 DEBUG nova.scheduler.client.report [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.228 222021 DEBUG nova.scheduler.client.report [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.235 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.236 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4692MB free_disk=20.95269775390625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.236 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:24.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.300 222021 DEBUG oslo_concurrency.processutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:49:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2568678511' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.741 222021 DEBUG oslo_concurrency.processutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.751 222021 DEBUG nova.compute.provider_tree [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.779 222021 DEBUG nova.scheduler.client.report [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.814 222021 DEBUG oslo_concurrency.lockutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.818 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:24 np0005593233 nova_compute[222017]: 2026-01-23 09:49:24.820 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.231 222021 INFO nova.scheduler.client.report [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Deleted allocations for instance 6988af6b-5dd8-4567-813b-70a1aa7c8fc7#033[00m
Jan 23 04:49:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.374 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.375 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.393 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.433 222021 DEBUG oslo_concurrency.lockutils [None req-caacb557-8eb3-4bf3-b66c-1a0dcaab38a1 deace84b7e09475d8d5c83d51f8309b5 9aca988de81c4bb08a3d641f83b7a61a - - default default] Lock "6988af6b-5dd8-4567-813b-70a1aa7c8fc7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:49:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3798347238' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.839 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.846 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.864 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.889 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:49:25 np0005593233 nova_compute[222017]: 2026-01-23 09:49:25.890 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:25.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:26.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:26 np0005593233 nova_compute[222017]: 2026-01-23 09:49:26.621 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:49:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:27.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:49:28 np0005593233 nova_compute[222017]: 2026-01-23 09:49:28.193 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:28.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:28 np0005593233 nova_compute[222017]: 2026-01-23 09:49:28.417 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:29 np0005593233 nova_compute[222017]: 2026-01-23 09:49:29.822 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:29.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:30.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:31 np0005593233 nova_compute[222017]: 2026-01-23 09:49:31.623 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:31.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:32.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:32 np0005593233 nova_compute[222017]: 2026-01-23 09:49:32.884 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:32 np0005593233 nova_compute[222017]: 2026-01-23 09:49:32.885 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:49:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:49:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:49:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:33.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:34.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:34 np0005593233 nova_compute[222017]: 2026-01-23 09:49:34.849 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:35.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:36 np0005593233 podman[248876]: 2026-01-23 09:49:36.085850275 +0000 UTC m=+0.090966950 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 04:49:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:36.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:36 np0005593233 nova_compute[222017]: 2026-01-23 09:49:36.482 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161761.480642, 6988af6b-5dd8-4567-813b-70a1aa7c8fc7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:49:36 np0005593233 nova_compute[222017]: 2026-01-23 09:49:36.483 222021 INFO nova.compute.manager [-] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:49:36 np0005593233 nova_compute[222017]: 2026-01-23 09:49:36.539 222021 DEBUG nova.compute.manager [None req-36d6a60b-6c16-4bb0-aa77-c991fe5aba52 - - - - - -] [instance: 6988af6b-5dd8-4567-813b-70a1aa7c8fc7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:49:36 np0005593233 nova_compute[222017]: 2026-01-23 09:49:36.627 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:37.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:38.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.365707) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779365755, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 2511, "num_deletes": 263, "total_data_size": 5695977, "memory_usage": 5778880, "flush_reason": "Manual Compaction"}
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779419452, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 3733236, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38624, "largest_seqno": 41130, "table_properties": {"data_size": 3722897, "index_size": 6641, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21975, "raw_average_key_size": 21, "raw_value_size": 3702066, "raw_average_value_size": 3549, "num_data_blocks": 286, "num_entries": 1043, "num_filter_entries": 1043, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161593, "oldest_key_time": 1769161593, "file_creation_time": 1769161779, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 53884 microseconds, and 9108 cpu microseconds.
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.419585) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 3733236 bytes OK
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.419608) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.422302) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.422320) EVENT_LOG_v1 {"time_micros": 1769161779422313, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.422341) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 5684617, prev total WAL file size 5705216, number of live WAL files 2.
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.424434) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(3645KB)], [75(9771KB)]
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779424567, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 13739373, "oldest_snapshot_seqno": -1}
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6574 keys, 11839989 bytes, temperature: kUnknown
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779624003, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 11839989, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11793384, "index_size": 29085, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16453, "raw_key_size": 168293, "raw_average_key_size": 25, "raw_value_size": 11672781, "raw_average_value_size": 1775, "num_data_blocks": 1165, "num_entries": 6574, "num_filter_entries": 6574, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161779, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.624320) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 11839989 bytes
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.628492) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 68.9 rd, 59.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 9.5 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(6.9) write-amplify(3.2) OK, records in: 7109, records dropped: 535 output_compression: NoCompression
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.628522) EVENT_LOG_v1 {"time_micros": 1769161779628510, "job": 46, "event": "compaction_finished", "compaction_time_micros": 199513, "compaction_time_cpu_micros": 37478, "output_level": 6, "num_output_files": 1, "total_output_size": 11839989, "num_input_records": 7109, "num_output_records": 6574, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779629709, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779632054, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.424197) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.632176) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.632186) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.632190) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.632194) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:49:39.632199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593233 nova_compute[222017]: 2026-01-23 09:49:39.851 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:39.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:49:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:49:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:40.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:41 np0005593233 nova_compute[222017]: 2026-01-23 09:49:41.632 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:41.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:42.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:42.652 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:42.653 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:42.654 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:43.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:49:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:44.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:49:44 np0005593233 nova_compute[222017]: 2026-01-23 09:49:44.854 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:45.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:46.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:46 np0005593233 nova_compute[222017]: 2026-01-23 09:49:46.636 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:47.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:48.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:49 np0005593233 nova_compute[222017]: 2026-01-23 09:49:49.893 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:49.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:49:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:50.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:49:51 np0005593233 nova_compute[222017]: 2026-01-23 09:49:51.640 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:51.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:52.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:52.610 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:49:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:52.611 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:49:52 np0005593233 nova_compute[222017]: 2026-01-23 09:49:52.611 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:49:53.614 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:53.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:54.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:54 np0005593233 nova_compute[222017]: 2026-01-23 09:49:54.896 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:55 np0005593233 podman[248948]: 2026-01-23 09:49:55.118879987 +0000 UTC m=+0.128691668 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 04:49:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:55.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:49:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:56.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:49:56 np0005593233 nova_compute[222017]: 2026-01-23 09:49:56.644 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:57.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:58.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:59 np0005593233 nova_compute[222017]: 2026-01-23 09:49:59.897 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:49:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:59.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:00.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 04:50:01 np0005593233 nova_compute[222017]: 2026-01-23 09:50:01.692 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:01.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:50:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:02.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:50:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:03.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:04.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:04 np0005593233 nova_compute[222017]: 2026-01-23 09:50:04.900 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:05.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:50:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:06.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:50:06 np0005593233 nova_compute[222017]: 2026-01-23 09:50:06.695 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:07 np0005593233 podman[248978]: 2026-01-23 09:50:07.046875335 +0000 UTC m=+0.059020367 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 04:50:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:07.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:08.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:09 np0005593233 nova_compute[222017]: 2026-01-23 09:50:09.903 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:09.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.034 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "c59b553e-fba3-4556-8bfb-574ef73b361e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.035 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.068 222021 DEBUG nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.189 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.189 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.202 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.203 222021 INFO nova.compute.claims [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:50:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.346 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:10.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1415694543' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.827 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.834 222021 DEBUG nova.compute.provider_tree [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.855 222021 DEBUG nova.scheduler.client.report [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.895 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.895 222021 DEBUG nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.954 222021 DEBUG nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.955 222021 DEBUG nova.network.neutron [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:50:10 np0005593233 nova_compute[222017]: 2026-01-23 09:50:10.992 222021 INFO nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.027 222021 DEBUG nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.196 222021 DEBUG nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.200 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.201 222021 INFO nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Creating image(s)#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.236 222021 DEBUG nova.storage.rbd_utils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image c59b553e-fba3-4556-8bfb-574ef73b361e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.277 222021 DEBUG nova.storage.rbd_utils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image c59b553e-fba3-4556-8bfb-574ef73b361e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.317 222021 DEBUG nova.storage.rbd_utils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image c59b553e-fba3-4556-8bfb-574ef73b361e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.322 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.364 222021 DEBUG nova.policy [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28a7a778c8ab486fb586e81bb84113be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61df91981c55482fa5c9a64686c79f9e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.425 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.427 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.427 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.428 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.466 222021 DEBUG nova.storage.rbd_utils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image c59b553e-fba3-4556-8bfb-574ef73b361e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.473 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c59b553e-fba3-4556-8bfb-574ef73b361e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:11 np0005593233 nova_compute[222017]: 2026-01-23 09:50:11.698 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:50:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:11.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.100 222021 DEBUG nova.network.neutron [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Successfully created port: 12a2295e-315b-425f-aec6-0c57f15c2edf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.126 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c59b553e-fba3-4556-8bfb-574ef73b361e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.653s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.213 222021 DEBUG nova.storage.rbd_utils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] resizing rbd image c59b553e-fba3-4556-8bfb-574ef73b361e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:50:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:12.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.516 222021 DEBUG nova.objects.instance [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'migration_context' on Instance uuid c59b553e-fba3-4556-8bfb-574ef73b361e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.531 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.531 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Ensure instance console log exists: /var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.532 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.532 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.532 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.949 222021 DEBUG nova.network.neutron [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Successfully updated port: 12a2295e-315b-425f-aec6-0c57f15c2edf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.970 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-c59b553e-fba3-4556-8bfb-574ef73b361e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.971 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-c59b553e-fba3-4556-8bfb-574ef73b361e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:12 np0005593233 nova_compute[222017]: 2026-01-23 09:50:12.971 222021 DEBUG nova.network.neutron [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:50:13 np0005593233 nova_compute[222017]: 2026-01-23 09:50:13.206 222021 DEBUG nova.network.neutron [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:50:13 np0005593233 nova_compute[222017]: 2026-01-23 09:50:13.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:13 np0005593233 nova_compute[222017]: 2026-01-23 09:50:13.499 222021 DEBUG nova.compute.manager [req-99dc3625-b0e7-4f1a-aaf4-c94a001ea1fb req-4fe2ec54-e465-4b2e-b580-5e25cdd9cca8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Received event network-changed-12a2295e-315b-425f-aec6-0c57f15c2edf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:13 np0005593233 nova_compute[222017]: 2026-01-23 09:50:13.500 222021 DEBUG nova.compute.manager [req-99dc3625-b0e7-4f1a-aaf4-c94a001ea1fb req-4fe2ec54-e465-4b2e-b580-5e25cdd9cca8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Refreshing instance network info cache due to event network-changed-12a2295e-315b-425f-aec6-0c57f15c2edf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:50:13 np0005593233 nova_compute[222017]: 2026-01-23 09:50:13.500 222021 DEBUG oslo_concurrency.lockutils [req-99dc3625-b0e7-4f1a-aaf4-c94a001ea1fb req-4fe2ec54-e465-4b2e-b580-5e25cdd9cca8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c59b553e-fba3-4556-8bfb-574ef73b361e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:13.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.271 222021 DEBUG nova.network.neutron [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Updating instance_info_cache with network_info: [{"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.292 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-c59b553e-fba3-4556-8bfb-574ef73b361e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.292 222021 DEBUG nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Instance network_info: |[{"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.293 222021 DEBUG oslo_concurrency.lockutils [req-99dc3625-b0e7-4f1a-aaf4-c94a001ea1fb req-4fe2ec54-e465-4b2e-b580-5e25cdd9cca8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c59b553e-fba3-4556-8bfb-574ef73b361e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.293 222021 DEBUG nova.network.neutron [req-99dc3625-b0e7-4f1a-aaf4-c94a001ea1fb req-4fe2ec54-e465-4b2e-b580-5e25cdd9cca8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Refreshing network info cache for port 12a2295e-315b-425f-aec6-0c57f15c2edf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.297 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Start _get_guest_xml network_info=[{"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.301 222021 WARNING nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.307 222021 DEBUG nova.virt.libvirt.host [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.308 222021 DEBUG nova.virt.libvirt.host [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.315 222021 DEBUG nova.virt.libvirt.host [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.315 222021 DEBUG nova.virt.libvirt.host [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.317 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.317 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.318 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.318 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.318 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.319 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.319 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.319 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.320 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.320 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.320 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.321 222021 DEBUG nova.virt.hardware [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.324 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:14.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:14 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1002680193' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.793 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.823 222021 DEBUG nova.storage.rbd_utils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image c59b553e-fba3-4556-8bfb-574ef73b361e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.828 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:14 np0005593233 nova_compute[222017]: 2026-01-23 09:50:14.906 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1464231786' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.299 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.302 222021 DEBUG nova.virt.libvirt.vif [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:50:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1759327834',display_name='tempest-DeleteServersTestJSON-server-1759327834',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1759327834',id=68,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-v040bm82',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:11Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=c59b553e-fba3-4556-8bfb-574ef73b361e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.303 222021 DEBUG nova.network.os_vif_util [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.305 222021 DEBUG nova.network.os_vif_util [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:bf:99,bridge_name='br-int',has_traffic_filtering=True,id=12a2295e-315b-425f-aec6-0c57f15c2edf,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12a2295e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.308 222021 DEBUG nova.objects.instance [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_devices' on Instance uuid c59b553e-fba3-4556-8bfb-574ef73b361e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.349 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <uuid>c59b553e-fba3-4556-8bfb-574ef73b361e</uuid>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <name>instance-00000044</name>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <nova:name>tempest-DeleteServersTestJSON-server-1759327834</nova:name>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:50:14</nova:creationTime>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <nova:user uuid="28a7a778c8ab486fb586e81bb84113be">tempest-DeleteServersTestJSON-944070453-project-member</nova:user>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <nova:project uuid="61df91981c55482fa5c9a64686c79f9e">tempest-DeleteServersTestJSON-944070453</nova:project>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <nova:port uuid="12a2295e-315b-425f-aec6-0c57f15c2edf">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <entry name="serial">c59b553e-fba3-4556-8bfb-574ef73b361e</entry>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <entry name="uuid">c59b553e-fba3-4556-8bfb-574ef73b361e</entry>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/c59b553e-fba3-4556-8bfb-574ef73b361e_disk">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/c59b553e-fba3-4556-8bfb-574ef73b361e_disk.config">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:80:bf:99"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <target dev="tap12a2295e-31"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e/console.log" append="off"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:50:15 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:50:15 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:50:15 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:50:15 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.350 222021 DEBUG nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Preparing to wait for external event network-vif-plugged-12a2295e-315b-425f-aec6-0c57f15c2edf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.351 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.351 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.351 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.352 222021 DEBUG nova.virt.libvirt.vif [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:50:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1759327834',display_name='tempest-DeleteServersTestJSON-server-1759327834',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1759327834',id=68,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-v040bm82',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:11Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=c59b553e-fba3-4556-8bfb-574ef73b361e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.352 222021 DEBUG nova.network.os_vif_util [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.353 222021 DEBUG nova.network.os_vif_util [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:bf:99,bridge_name='br-int',has_traffic_filtering=True,id=12a2295e-315b-425f-aec6-0c57f15c2edf,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12a2295e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.353 222021 DEBUG os_vif [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:bf:99,bridge_name='br-int',has_traffic_filtering=True,id=12a2295e-315b-425f-aec6-0c57f15c2edf,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12a2295e-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.354 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.354 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.355 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.359 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.359 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12a2295e-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.360 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap12a2295e-31, col_values=(('external_ids', {'iface-id': '12a2295e-315b-425f-aec6-0c57f15c2edf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:bf:99', 'vm-uuid': 'c59b553e-fba3-4556-8bfb-574ef73b361e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.362 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:15 np0005593233 NetworkManager[48871]: <info>  [1769161815.3636] manager: (tap12a2295e-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.364 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.374 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.376 222021 INFO os_vif [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:bf:99,bridge_name='br-int',has_traffic_filtering=True,id=12a2295e-315b-425f-aec6-0c57f15c2edf,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12a2295e-31')#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.447 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.448 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.448 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No VIF found with MAC fa:16:3e:80:bf:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.449 222021 INFO nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Using config drive#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.481 222021 DEBUG nova.storage.rbd_utils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image c59b553e-fba3-4556-8bfb-574ef73b361e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.929 222021 DEBUG nova.network.neutron [req-99dc3625-b0e7-4f1a-aaf4-c94a001ea1fb req-4fe2ec54-e465-4b2e-b580-5e25cdd9cca8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Updated VIF entry in instance network info cache for port 12a2295e-315b-425f-aec6-0c57f15c2edf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:50:15 np0005593233 nova_compute[222017]: 2026-01-23 09:50:15.931 222021 DEBUG nova.network.neutron [req-99dc3625-b0e7-4f1a-aaf4-c94a001ea1fb req-4fe2ec54-e465-4b2e-b580-5e25cdd9cca8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Updating instance_info_cache with network_info: [{"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:50:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:15.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:50:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:16.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:16 np0005593233 nova_compute[222017]: 2026-01-23 09:50:16.690 222021 DEBUG oslo_concurrency.lockutils [req-99dc3625-b0e7-4f1a-aaf4-c94a001ea1fb req-4fe2ec54-e465-4b2e-b580-5e25cdd9cca8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c59b553e-fba3-4556-8bfb-574ef73b361e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:17.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:50:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:18.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:50:18 np0005593233 nova_compute[222017]: 2026-01-23 09:50:18.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:19 np0005593233 nova_compute[222017]: 2026-01-23 09:50:19.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:19 np0005593233 nova_compute[222017]: 2026-01-23 09:50:19.909 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:50:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:19.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:50:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.376 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:20.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.411 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.412 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.412 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.412 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.412 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.579 222021 INFO nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Creating config drive at /var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e/disk.config#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.586 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi0e3hpze execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.728 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi0e3hpze" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.763 222021 DEBUG nova.storage.rbd_utils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image c59b553e-fba3-4556-8bfb-574ef73b361e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.768 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e/disk.config c59b553e-fba3-4556-8bfb-574ef73b361e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2916599866' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:20 np0005593233 nova_compute[222017]: 2026-01-23 09:50:20.935 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.015 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000044 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.015 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000044 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.226 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.229 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4695MB free_disk=20.876388549804688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.230 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.231 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.329 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance c59b553e-fba3-4556-8bfb-574ef73b361e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.330 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.331 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.379 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.648 222021 DEBUG oslo_concurrency.processutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e/disk.config c59b553e-fba3-4556-8bfb-574ef73b361e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.880s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.650 222021 INFO nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Deleting local config drive /var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e/disk.config because it was imported into RBD.#033[00m
Jan 23 04:50:21 np0005593233 kernel: tap12a2295e-31: entered promiscuous mode
Jan 23 04:50:21 np0005593233 NetworkManager[48871]: <info>  [1769161821.7170] manager: (tap12a2295e-31): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Jan 23 04:50:21 np0005593233 systemd-udevd[249359]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:50:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:21Z|00219|binding|INFO|Claiming lport 12a2295e-315b-425f-aec6-0c57f15c2edf for this chassis.
Jan 23 04:50:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:21Z|00220|binding|INFO|12a2295e-315b-425f-aec6-0c57f15c2edf: Claiming fa:16:3e:80:bf:99 10.100.0.14
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.771 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.774 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:bf:99 10.100.0.14'], port_security=['fa:16:3e:80:bf:99 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c59b553e-fba3-4556-8bfb-574ef73b361e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=12a2295e-315b-425f-aec6-0c57f15c2edf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.775 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 12a2295e-315b-425f-aec6-0c57f15c2edf in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 bound to our chassis#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.777 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3788149-efcd-4940-8a8f-e21af0a56a06#033[00m
Jan 23 04:50:21 np0005593233 NetworkManager[48871]: <info>  [1769161821.7791] device (tap12a2295e-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:50:21 np0005593233 NetworkManager[48871]: <info>  [1769161821.7801] device (tap12a2295e-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.795 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a5afca46-030b-4454-a48b-99b7ec06ced2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.797 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3788149-e1 in ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:50:21 np0005593233 systemd-machined[190954]: New machine qemu-34-instance-00000044.
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.802 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3788149-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.802 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe84c32-c273-4b3b-8c8b-0758f1c6e21b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.803 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[55ed7cb9-bda7-4e71-82ec-5047285141b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.824 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[d5ed17e3-4b2f-4d81-ad26-a479c8986cdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 systemd[1]: Started Virtual Machine qemu-34-instance-00000044.
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.850 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:21Z|00221|binding|INFO|Setting lport 12a2295e-315b-425f-aec6-0c57f15c2edf ovn-installed in OVS
Jan 23 04:50:21 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:21Z|00222|binding|INFO|Setting lport 12a2295e-315b-425f-aec6-0c57f15c2edf up in Southbound
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.854 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.856 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc747ad-fc3b-4d9e-a8fd-4824102f22ee]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:21 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3399372386' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.903 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9c9ef7-a016-4d6d-b6b9-ea729f72f30f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.910 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:21 np0005593233 NetworkManager[48871]: <info>  [1769161821.9187] manager: (tapa3788149-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/111)
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.917 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[34b57266-e85e-418e-b500-94c01c61e178]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.921 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.943 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.958 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3cd658b1-283f-46ac-92fd-7f528d625de9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.962 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d8f8a5c1-34f4-4fb8-9071-1e4a27d4266e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 NetworkManager[48871]: <info>  [1769161821.9896] device (tapa3788149-e0): carrier: link connected
Jan 23 04:50:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:21.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:21.995 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a823edc8-42dc-484e-bb22-60fded49bf3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.997 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:50:21 np0005593233 nova_compute[222017]: 2026-01-23 09:50:21.998 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.021 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe82a2e-5b0a-4b0c-9c61-77839a4f78b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573462, 'reachable_time': 44888, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249397, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.043 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[99267534-7b87-4e70-a05c-5386f85365db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:ddff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 573462, 'tstamp': 573462}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249398, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.066 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[66e7ca1b-9196-4ccb-aec5-209b4b0f33e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573462, 'reachable_time': 44888, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249399, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.116 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[91dcb0fc-3f82-4e1d-8f26-55102549e5e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.167 222021 DEBUG nova.compute.manager [req-5463332e-aafa-4ed6-a012-7aab2418004e req-5a9e3725-c455-4620-beba-efdfa5c317a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Received event network-vif-plugged-12a2295e-315b-425f-aec6-0c57f15c2edf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.168 222021 DEBUG oslo_concurrency.lockutils [req-5463332e-aafa-4ed6-a012-7aab2418004e req-5a9e3725-c455-4620-beba-efdfa5c317a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.168 222021 DEBUG oslo_concurrency.lockutils [req-5463332e-aafa-4ed6-a012-7aab2418004e req-5a9e3725-c455-4620-beba-efdfa5c317a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.168 222021 DEBUG oslo_concurrency.lockutils [req-5463332e-aafa-4ed6-a012-7aab2418004e req-5a9e3725-c455-4620-beba-efdfa5c317a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.169 222021 DEBUG nova.compute.manager [req-5463332e-aafa-4ed6-a012-7aab2418004e req-5a9e3725-c455-4620-beba-efdfa5c317a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Processing event network-vif-plugged-12a2295e-315b-425f-aec6-0c57f15c2edf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.200 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[00168e0d-fee2-4156-bd1b-3ba8ca42301b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.202 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.203 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.204 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3788149-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:22 np0005593233 NetworkManager[48871]: <info>  [1769161822.2071] manager: (tapa3788149-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 23 04:50:22 np0005593233 kernel: tapa3788149-e0: entered promiscuous mode
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.206 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.209 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3788149-e0, col_values=(('external_ids', {'iface-id': 'd6ce7fd1-128d-488f-94e6-68332f7a8a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:22Z|00223|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.211 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.227 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.228 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.229 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6a122a40-fb68-42a8-a828-6288323077d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.230 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:22.231 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'env', 'PROCESS_TAG=haproxy-a3788149-efcd-4940-8a8f-e21af0a56a06', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3788149-efcd-4940-8a8f-e21af0a56a06.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:50:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:22.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.644 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161822.6436174, c59b553e-fba3-4556-8bfb-574ef73b361e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.645 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] VM Started (Lifecycle Event)#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.648 222021 DEBUG nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.652 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.657 222021 INFO nova.virt.libvirt.driver [-] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Instance spawned successfully.#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.658 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.673 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.681 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.685 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.686 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.686 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.687 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.687 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.688 222021 DEBUG nova.virt.libvirt.driver [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.715 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.716 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161822.6440156, c59b553e-fba3-4556-8bfb-574ef73b361e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.716 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:50:22 np0005593233 podman[249473]: 2026-01-23 09:50:22.716490169 +0000 UTC m=+0.066666496 container create 990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.744 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.749 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161822.6520517, c59b553e-fba3-4556-8bfb-574ef73b361e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.750 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.756 222021 INFO nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Took 11.56 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.757 222021 DEBUG nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:22 np0005593233 systemd[1]: Started libpod-conmon-990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5.scope.
Jan 23 04:50:22 np0005593233 podman[249473]: 2026-01-23 09:50:22.682761535 +0000 UTC m=+0.032937882 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.790 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.794 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:50:22 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:50:22 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bed9b4acd2f91c0f73c4d40a85efaa0708fdc7025d755582d7e3f4d49ba7e607/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.828 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:50:22 np0005593233 podman[249473]: 2026-01-23 09:50:22.829302563 +0000 UTC m=+0.179478920 container init 990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:50:22 np0005593233 podman[249473]: 2026-01-23 09:50:22.837593119 +0000 UTC m=+0.187769446 container start 990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.841 222021 INFO nova.compute.manager [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Took 12.70 seconds to build instance.#033[00m
Jan 23 04:50:22 np0005593233 nova_compute[222017]: 2026-01-23 09:50:22.859 222021 DEBUG oslo_concurrency.lockutils [None req-240656b6-5261-4ca5-b027-51d9137bc676 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.824s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:22 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[249489]: [NOTICE]   (249493) : New worker (249495) forked
Jan 23 04:50:22 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[249489]: [NOTICE]   (249493) : Loading success.
Jan 23 04:50:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:23.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:23 np0005593233 nova_compute[222017]: 2026-01-23 09:50:23.999 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:23.999 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:23.999 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.222 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-c59b553e-fba3-4556-8bfb-574ef73b361e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.222 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-c59b553e-fba3-4556-8bfb-574ef73b361e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.222 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.223 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c59b553e-fba3-4556-8bfb-574ef73b361e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:50:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:24.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.599 222021 DEBUG nova.compute.manager [req-1aabffca-db43-4c78-b137-7f3a9b07b8f7 req-53412c9f-f304-452a-bdb7-4240066d00d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Received event network-vif-plugged-12a2295e-315b-425f-aec6-0c57f15c2edf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.599 222021 DEBUG oslo_concurrency.lockutils [req-1aabffca-db43-4c78-b137-7f3a9b07b8f7 req-53412c9f-f304-452a-bdb7-4240066d00d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.600 222021 DEBUG oslo_concurrency.lockutils [req-1aabffca-db43-4c78-b137-7f3a9b07b8f7 req-53412c9f-f304-452a-bdb7-4240066d00d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.600 222021 DEBUG oslo_concurrency.lockutils [req-1aabffca-db43-4c78-b137-7f3a9b07b8f7 req-53412c9f-f304-452a-bdb7-4240066d00d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.600 222021 DEBUG nova.compute.manager [req-1aabffca-db43-4c78-b137-7f3a9b07b8f7 req-53412c9f-f304-452a-bdb7-4240066d00d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] No waiting events found dispatching network-vif-plugged-12a2295e-315b-425f-aec6-0c57f15c2edf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.601 222021 WARNING nova.compute.manager [req-1aabffca-db43-4c78-b137-7f3a9b07b8f7 req-53412c9f-f304-452a-bdb7-4240066d00d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Received unexpected event network-vif-plugged-12a2295e-315b-425f-aec6-0c57f15c2edf for instance with vm_state active and task_state None.#033[00m
Jan 23 04:50:24 np0005593233 nova_compute[222017]: 2026-01-23 09:50:24.912 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.173 222021 DEBUG oslo_concurrency.lockutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "c59b553e-fba3-4556-8bfb-574ef73b361e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.173 222021 DEBUG oslo_concurrency.lockutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.174 222021 DEBUG oslo_concurrency.lockutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.174 222021 DEBUG oslo_concurrency.lockutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.174 222021 DEBUG oslo_concurrency.lockutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.175 222021 INFO nova.compute.manager [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Terminating instance#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.176 222021 DEBUG nova.compute.manager [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:50:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:25 np0005593233 kernel: tap12a2295e-31 (unregistering): left promiscuous mode
Jan 23 04:50:25 np0005593233 NetworkManager[48871]: <info>  [1769161825.2799] device (tap12a2295e-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.292 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:25 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:25Z|00224|binding|INFO|Releasing lport 12a2295e-315b-425f-aec6-0c57f15c2edf from this chassis (sb_readonly=0)
Jan 23 04:50:25 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:25Z|00225|binding|INFO|Setting lport 12a2295e-315b-425f-aec6-0c57f15c2edf down in Southbound
Jan 23 04:50:25 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:25Z|00226|binding|INFO|Removing iface tap12a2295e-31 ovn-installed in OVS
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.296 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.317 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:25 np0005593233 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000044.scope: Deactivated successfully.
Jan 23 04:50:25 np0005593233 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000044.scope: Consumed 3.443s CPU time.
Jan 23 04:50:25 np0005593233 systemd-machined[190954]: Machine qemu-34-instance-00000044 terminated.
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.378 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.424 222021 INFO nova.virt.libvirt.driver [-] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Instance destroyed successfully.#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.425 222021 DEBUG nova.objects.instance [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'resources' on Instance uuid c59b553e-fba3-4556-8bfb-574ef73b361e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:25 np0005593233 podman[249506]: 2026-01-23 09:50:25.42780029 +0000 UTC m=+0.114616736 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:50:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:25.958 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:bf:99 10.100.0.14'], port_security=['fa:16:3e:80:bf:99 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c59b553e-fba3-4556-8bfb-574ef73b361e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=12a2295e-315b-425f-aec6-0c57f15c2edf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:25.960 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 12a2295e-315b-425f-aec6-0c57f15c2edf in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis#033[00m
Jan 23 04:50:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:25.961 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:50:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:25.963 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d07a2dbe-f121-4b5b-b436-abcbe0e413a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:25.963 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace which is not needed anymore#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.987 222021 DEBUG nova.virt.libvirt.vif [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:50:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1759327834',display_name='tempest-DeleteServersTestJSON-server-1759327834',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1759327834',id=68,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-v040bm82',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:50:22Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=c59b553e-fba3-4556-8bfb-574ef73b361e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.988 222021 DEBUG nova.network.os_vif_util [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.989 222021 DEBUG nova.network.os_vif_util [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:bf:99,bridge_name='br-int',has_traffic_filtering=True,id=12a2295e-315b-425f-aec6-0c57f15c2edf,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12a2295e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.990 222021 DEBUG os_vif [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:bf:99,bridge_name='br-int',has_traffic_filtering=True,id=12a2295e-315b-425f-aec6-0c57f15c2edf,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12a2295e-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.993 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.993 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12a2295e-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.995 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:25 np0005593233 nova_compute[222017]: 2026-01-23 09:50:25.997 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:26 np0005593233 nova_compute[222017]: 2026-01-23 09:50:26.000 222021 INFO os_vif [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:bf:99,bridge_name='br-int',has_traffic_filtering=True,id=12a2295e-315b-425f-aec6-0c57f15c2edf,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12a2295e-31')#033[00m
Jan 23 04:50:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:50:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:25.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:50:26 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[249489]: [NOTICE]   (249493) : haproxy version is 2.8.14-c23fe91
Jan 23 04:50:26 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[249489]: [NOTICE]   (249493) : path to executable is /usr/sbin/haproxy
Jan 23 04:50:26 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[249489]: [WARNING]  (249493) : Exiting Master process...
Jan 23 04:50:26 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[249489]: [ALERT]    (249493) : Current worker (249495) exited with code 143 (Terminated)
Jan 23 04:50:26 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[249489]: [WARNING]  (249493) : All workers exited. Exiting... (0)
Jan 23 04:50:26 np0005593233 systemd[1]: libpod-990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5.scope: Deactivated successfully.
Jan 23 04:50:26 np0005593233 podman[249578]: 2026-01-23 09:50:26.107160202 +0000 UTC m=+0.044412870 container died 990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:50:26 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5-userdata-shm.mount: Deactivated successfully.
Jan 23 04:50:26 np0005593233 systemd[1]: var-lib-containers-storage-overlay-bed9b4acd2f91c0f73c4d40a85efaa0708fdc7025d755582d7e3f4d49ba7e607-merged.mount: Deactivated successfully.
Jan 23 04:50:26 np0005593233 podman[249578]: 2026-01-23 09:50:26.150757828 +0000 UTC m=+0.088010476 container cleanup 990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:50:26 np0005593233 systemd[1]: libpod-conmon-990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5.scope: Deactivated successfully.
Jan 23 04:50:26 np0005593233 podman[249612]: 2026-01-23 09:50:26.227194662 +0000 UTC m=+0.047287462 container remove 990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:50:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:26.234 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[363baf6f-a6cc-47bd-a2b3-e1c2512886a2]: (4, ('Fri Jan 23 09:50:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5)\n990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5\nFri Jan 23 09:50:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5)\n990aa23433f6c5f64a1ccdf3cde0c14e24228352836fb025322b3ad975177ba5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:26.237 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9bae7029-1116-46b3-800d-ae1acb605a66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:26.238 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:26 np0005593233 nova_compute[222017]: 2026-01-23 09:50:26.240 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:26 np0005593233 kernel: tapa3788149-e0: left promiscuous mode
Jan 23 04:50:26 np0005593233 nova_compute[222017]: 2026-01-23 09:50:26.265 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:26.268 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e92b47-adfd-4bcb-967e-94436a8e5b92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:26.283 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[261bfe57-bd9a-4938-b6e9-ddca28f7530a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:26.284 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce95445d-8ba5-453d-a79b-72f1f34583f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:26.306 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[38ae633e-02f3-479f-beae-8f5a7d0dd828]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573452, 'reachable_time': 34057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249628, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:26 np0005593233 systemd[1]: run-netns-ovnmeta\x2da3788149\x2defcd\x2d4940\x2d8a8f\x2de21af0a56a06.mount: Deactivated successfully.
Jan 23 04:50:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:26.311 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:50:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:26.311 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd09833-4728-4de0-9b4d-7d79cf383595]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:50:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:26.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:50:26 np0005593233 nova_compute[222017]: 2026-01-23 09:50:26.486 222021 INFO nova.virt.libvirt.driver [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Deleting instance files /var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e_del#033[00m
Jan 23 04:50:26 np0005593233 nova_compute[222017]: 2026-01-23 09:50:26.488 222021 INFO nova.virt.libvirt.driver [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Deletion of /var/lib/nova/instances/c59b553e-fba3-4556-8bfb-574ef73b361e_del complete#033[00m
Jan 23 04:50:27 np0005593233 nova_compute[222017]: 2026-01-23 09:50:27.635 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Updating instance_info_cache with network_info: [{"id": "12a2295e-315b-425f-aec6-0c57f15c2edf", "address": "fa:16:3e:80:bf:99", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12a2295e-31", "ovs_interfaceid": "12a2295e-315b-425f-aec6-0c57f15c2edf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:28.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:28.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:29 np0005593233 nova_compute[222017]: 2026-01-23 09:50:29.218 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-c59b553e-fba3-4556-8bfb-574ef73b361e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:29 np0005593233 nova_compute[222017]: 2026-01-23 09:50:29.218 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:50:29 np0005593233 nova_compute[222017]: 2026-01-23 09:50:29.218 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:29 np0005593233 nova_compute[222017]: 2026-01-23 09:50:29.219 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:29 np0005593233 nova_compute[222017]: 2026-01-23 09:50:29.361 222021 INFO nova.compute.manager [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Took 4.19 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:50:29 np0005593233 nova_compute[222017]: 2026-01-23 09:50:29.362 222021 DEBUG oslo.service.loopingcall [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:50:29 np0005593233 nova_compute[222017]: 2026-01-23 09:50:29.362 222021 DEBUG nova.compute.manager [-] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:50:29 np0005593233 nova_compute[222017]: 2026-01-23 09:50:29.363 222021 DEBUG nova.network.neutron [-] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:50:29 np0005593233 nova_compute[222017]: 2026-01-23 09:50:29.913 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:30.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:30 np0005593233 nova_compute[222017]: 2026-01-23 09:50:30.367 222021 DEBUG nova.compute.manager [req-37b6a546-27f3-4925-addc-4091fc84af08 req-20923cb4-c035-4d72-8a3a-03fc0d69fc1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Received event network-vif-unplugged-12a2295e-315b-425f-aec6-0c57f15c2edf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:30 np0005593233 nova_compute[222017]: 2026-01-23 09:50:30.368 222021 DEBUG oslo_concurrency.lockutils [req-37b6a546-27f3-4925-addc-4091fc84af08 req-20923cb4-c035-4d72-8a3a-03fc0d69fc1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:30 np0005593233 nova_compute[222017]: 2026-01-23 09:50:30.368 222021 DEBUG oslo_concurrency.lockutils [req-37b6a546-27f3-4925-addc-4091fc84af08 req-20923cb4-c035-4d72-8a3a-03fc0d69fc1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:30 np0005593233 nova_compute[222017]: 2026-01-23 09:50:30.368 222021 DEBUG oslo_concurrency.lockutils [req-37b6a546-27f3-4925-addc-4091fc84af08 req-20923cb4-c035-4d72-8a3a-03fc0d69fc1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:30 np0005593233 nova_compute[222017]: 2026-01-23 09:50:30.368 222021 DEBUG nova.compute.manager [req-37b6a546-27f3-4925-addc-4091fc84af08 req-20923cb4-c035-4d72-8a3a-03fc0d69fc1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] No waiting events found dispatching network-vif-unplugged-12a2295e-315b-425f-aec6-0c57f15c2edf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:30 np0005593233 nova_compute[222017]: 2026-01-23 09:50:30.369 222021 DEBUG nova.compute.manager [req-37b6a546-27f3-4925-addc-4091fc84af08 req-20923cb4-c035-4d72-8a3a-03fc0d69fc1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Received event network-vif-unplugged-12a2295e-315b-425f-aec6-0c57f15c2edf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:50:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:30.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:30 np0005593233 nova_compute[222017]: 2026-01-23 09:50:30.997 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:32.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:32.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.562 222021 DEBUG nova.network.neutron [-] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.588 222021 INFO nova.compute.manager [-] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Took 3.23 seconds to deallocate network for instance.#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.638 222021 DEBUG nova.compute.manager [req-b602cc8b-46c6-4591-9787-15eff348b616 req-bdac86fc-f1fb-490b-8f43-2a971c384ee5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Received event network-vif-plugged-12a2295e-315b-425f-aec6-0c57f15c2edf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.639 222021 DEBUG oslo_concurrency.lockutils [req-b602cc8b-46c6-4591-9787-15eff348b616 req-bdac86fc-f1fb-490b-8f43-2a971c384ee5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.639 222021 DEBUG oslo_concurrency.lockutils [req-b602cc8b-46c6-4591-9787-15eff348b616 req-bdac86fc-f1fb-490b-8f43-2a971c384ee5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.640 222021 DEBUG oslo_concurrency.lockutils [req-b602cc8b-46c6-4591-9787-15eff348b616 req-bdac86fc-f1fb-490b-8f43-2a971c384ee5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.640 222021 DEBUG nova.compute.manager [req-b602cc8b-46c6-4591-9787-15eff348b616 req-bdac86fc-f1fb-490b-8f43-2a971c384ee5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] No waiting events found dispatching network-vif-plugged-12a2295e-315b-425f-aec6-0c57f15c2edf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.641 222021 WARNING nova.compute.manager [req-b602cc8b-46c6-4591-9787-15eff348b616 req-bdac86fc-f1fb-490b-8f43-2a971c384ee5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Received unexpected event network-vif-plugged-12a2295e-315b-425f-aec6-0c57f15c2edf for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.657 222021 DEBUG oslo_concurrency.lockutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.658 222021 DEBUG oslo_concurrency.lockutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.673 222021 DEBUG nova.compute.manager [req-3ef2f44a-aa83-4cf4-ae95-df729e3fde20 req-92546315-cc67-4266-8173-da9fbf55a0c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Received event network-vif-deleted-12a2295e-315b-425f-aec6-0c57f15c2edf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:32 np0005593233 nova_compute[222017]: 2026-01-23 09:50:32.732 222021 DEBUG oslo_concurrency.processutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/352487586' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:33 np0005593233 nova_compute[222017]: 2026-01-23 09:50:33.244 222021 DEBUG oslo_concurrency.processutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:33 np0005593233 nova_compute[222017]: 2026-01-23 09:50:33.251 222021 DEBUG nova.compute.provider_tree [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:50:33 np0005593233 nova_compute[222017]: 2026-01-23 09:50:33.599 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:33 np0005593233 nova_compute[222017]: 2026-01-23 09:50:33.644 222021 DEBUG nova.scheduler.client.report [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:50:33 np0005593233 nova_compute[222017]: 2026-01-23 09:50:33.732 222021 DEBUG oslo_concurrency.lockutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:33 np0005593233 nova_compute[222017]: 2026-01-23 09:50:33.765 222021 INFO nova.scheduler.client.report [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Deleted allocations for instance c59b553e-fba3-4556-8bfb-574ef73b361e#033[00m
Jan 23 04:50:33 np0005593233 nova_compute[222017]: 2026-01-23 09:50:33.871 222021 DEBUG oslo_concurrency.lockutils [None req-892da5cc-da6f-45d7-ac62-070888c205c7 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "c59b553e-fba3-4556-8bfb-574ef73b361e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:34.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:50:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:34.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:50:34 np0005593233 nova_compute[222017]: 2026-01-23 09:50:34.915 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:36 np0005593233 nova_compute[222017]: 2026-01-23 09:50:36.000 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:36.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:36.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:50:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:38.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:50:38 np0005593233 podman[249651]: 2026-01-23 09:50:38.044818457 +0000 UTC m=+0.058485302 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:50:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:38.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:39.761668) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839761715, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 803, "num_deletes": 250, "total_data_size": 1567034, "memory_usage": 1583264, "flush_reason": "Manual Compaction"}
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839769659, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 668299, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41135, "largest_seqno": 41933, "table_properties": {"data_size": 665073, "index_size": 1070, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8668, "raw_average_key_size": 20, "raw_value_size": 658303, "raw_average_value_size": 1567, "num_data_blocks": 49, "num_entries": 420, "num_filter_entries": 420, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161779, "oldest_key_time": 1769161779, "file_creation_time": 1769161839, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 8041 microseconds, and 3186 cpu microseconds.
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:39.769706) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 668299 bytes OK
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:39.769727) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:39.771064) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:39.771077) EVENT_LOG_v1 {"time_micros": 1769161839771073, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:39.771093) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 1562807, prev total WAL file size 1562807, number of live WAL files 2.
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:39.772020) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323535' seq:72057594037927935, type:22 .. '6D6772737461740031353036' seq:0, type:0; will stop at (end)
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(652KB)], [78(11MB)]
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839772051, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 12508288, "oldest_snapshot_seqno": -1}
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6508 keys, 9021501 bytes, temperature: kUnknown
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839892827, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 9021501, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8979378, "index_size": 24733, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16325, "raw_key_size": 167147, "raw_average_key_size": 25, "raw_value_size": 8863984, "raw_average_value_size": 1362, "num_data_blocks": 985, "num_entries": 6508, "num_filter_entries": 6508, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161839, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:50:39 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:50:39 np0005593233 nova_compute[222017]: 2026-01-23 09:50:39.917 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 04:50:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:40.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:39.893116) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 9021501 bytes
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:40.062271) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.5 rd, 74.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 11.3 +0.0 blob) out(8.6 +0.0 blob), read-write-amplify(32.2) write-amplify(13.5) OK, records in: 6994, records dropped: 486 output_compression: NoCompression
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:40.062335) EVENT_LOG_v1 {"time_micros": 1769161840062313, "job": 48, "event": "compaction_finished", "compaction_time_micros": 120900, "compaction_time_cpu_micros": 22794, "output_level": 6, "num_output_files": 1, "total_output_size": 9021501, "num_input_records": 6994, "num_output_records": 6508, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161840062845, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161840066252, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:39.771863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:40.066442) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:40.066451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:40.066452) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:40.066455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:50:40.066457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.105 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.106 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.133 222021 DEBUG nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.223 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.223 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.230 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.231 222021 INFO nova.compute.claims [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.382 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:40.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.424 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161825.423104, c59b553e-fba3-4556-8bfb-574ef73b361e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.425 222021 INFO nova.compute.manager [-] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.451 222021 DEBUG nova.compute.manager [None req-630bd8eb-3999-4079-96f6-2678ebe20ea5 - - - - - -] [instance: c59b553e-fba3-4556-8bfb-574ef73b361e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:40 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/894813455' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.847 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.857 222021 DEBUG nova.compute.provider_tree [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.874 222021 DEBUG nova.scheduler.client.report [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.898 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.900 222021 DEBUG nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.950 222021 DEBUG nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.951 222021 DEBUG nova.network.neutron [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:50:40 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.976 222021 INFO nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:40.999 222021 DEBUG nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.119 222021 DEBUG nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.121 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.121 222021 INFO nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Creating image(s)#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.159 222021 DEBUG nova.storage.rbd_utils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 463b029c-94eb-4160-9199-43759bb23b61_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.192 222021 DEBUG nova.storage.rbd_utils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 463b029c-94eb-4160-9199-43759bb23b61_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.222 222021 DEBUG nova.storage.rbd_utils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 463b029c-94eb-4160-9199-43759bb23b61_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.228 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.311 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.312 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.313 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.314 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.349 222021 DEBUG nova.storage.rbd_utils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 463b029c-94eb-4160-9199-43759bb23b61_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.356 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 463b029c-94eb-4160-9199-43759bb23b61_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.472 222021 DEBUG nova.policy [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28a7a778c8ab486fb586e81bb84113be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61df91981c55482fa5c9a64686c79f9e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.728 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 463b029c-94eb-4160-9199-43759bb23b61_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.372s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.807 222021 DEBUG nova.storage.rbd_utils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] resizing rbd image 463b029c-94eb-4160-9199-43759bb23b61_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.919 222021 DEBUG nova.objects.instance [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'migration_context' on Instance uuid 463b029c-94eb-4160-9199-43759bb23b61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.946 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.947 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Ensure instance console log exists: /var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.948 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.948 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:41 np0005593233 nova_compute[222017]: 2026-01-23 09:50:41.949 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:42.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:42.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:42.653 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:42.653 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:42.653 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:42 np0005593233 nova_compute[222017]: 2026-01-23 09:50:42.855 222021 DEBUG nova.network.neutron [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Successfully created port: aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:50:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:50:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:50:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:44.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:50:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:44.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:50:44 np0005593233 nova_compute[222017]: 2026-01-23 09:50:44.921 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:45 np0005593233 nova_compute[222017]: 2026-01-23 09:50:45.109 222021 DEBUG nova.network.neutron [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Successfully updated port: aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:50:45 np0005593233 nova_compute[222017]: 2026-01-23 09:50:45.133 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-463b029c-94eb-4160-9199-43759bb23b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:45 np0005593233 nova_compute[222017]: 2026-01-23 09:50:45.134 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-463b029c-94eb-4160-9199-43759bb23b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:45 np0005593233 nova_compute[222017]: 2026-01-23 09:50:45.134 222021 DEBUG nova.network.neutron [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:50:45 np0005593233 nova_compute[222017]: 2026-01-23 09:50:45.256 222021 DEBUG nova.compute.manager [req-93aeb6fe-6d83-441e-8efd-a4fef7ce6666 req-8c40eb66-c933-4ca5-9317-ba5dd7499743 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Received event network-changed-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:45 np0005593233 nova_compute[222017]: 2026-01-23 09:50:45.256 222021 DEBUG nova.compute.manager [req-93aeb6fe-6d83-441e-8efd-a4fef7ce6666 req-8c40eb66-c933-4ca5-9317-ba5dd7499743 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Refreshing instance network info cache due to event network-changed-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:50:45 np0005593233 nova_compute[222017]: 2026-01-23 09:50:45.257 222021 DEBUG oslo_concurrency.lockutils [req-93aeb6fe-6d83-441e-8efd-a4fef7ce6666 req-8c40eb66-c933-4ca5-9317-ba5dd7499743 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-463b029c-94eb-4160-9199-43759bb23b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:45 np0005593233 nova_compute[222017]: 2026-01-23 09:50:45.423 222021 DEBUG nova.network.neutron [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:50:46 np0005593233 nova_compute[222017]: 2026-01-23 09:50:46.006 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:46.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:46.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:46 np0005593233 nova_compute[222017]: 2026-01-23 09:50:46.978 222021 DEBUG nova.network.neutron [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Updating instance_info_cache with network_info: [{"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.318 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-463b029c-94eb-4160-9199-43759bb23b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.318 222021 DEBUG nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Instance network_info: |[{"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.319 222021 DEBUG oslo_concurrency.lockutils [req-93aeb6fe-6d83-441e-8efd-a4fef7ce6666 req-8c40eb66-c933-4ca5-9317-ba5dd7499743 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-463b029c-94eb-4160-9199-43759bb23b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.319 222021 DEBUG nova.network.neutron [req-93aeb6fe-6d83-441e-8efd-a4fef7ce6666 req-8c40eb66-c933-4ca5-9317-ba5dd7499743 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Refreshing network info cache for port aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.321 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Start _get_guest_xml network_info=[{"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.327 222021 WARNING nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.331 222021 DEBUG nova.virt.libvirt.host [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.332 222021 DEBUG nova.virt.libvirt.host [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.338 222021 DEBUG nova.virt.libvirt.host [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.339 222021 DEBUG nova.virt.libvirt.host [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.340 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.340 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.341 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.341 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.341 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.341 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.341 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.342 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.342 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.342 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.342 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.342 222021 DEBUG nova.virt.hardware [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.345 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3415581460' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.799 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.827 222021 DEBUG nova.storage.rbd_utils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 463b029c-94eb-4160-9199-43759bb23b61_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:47 np0005593233 nova_compute[222017]: 2026-01-23 09:50:47.832 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:48.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.291 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.293 222021 DEBUG nova.virt.libvirt.vif [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:50:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1560667876',display_name='tempest-DeleteServersTestJSON-server-1560667876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1560667876',id=71,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-xa7x09sv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSO
N-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:41Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=463b029c-94eb-4160-9199-43759bb23b61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.293 222021 DEBUG nova.network.os_vif_util [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.294 222021 DEBUG nova.network.os_vif_util [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:e2:cc,bridge_name='br-int',has_traffic_filtering=True,id=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa3a69c3-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.295 222021 DEBUG nova.objects.instance [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_devices' on Instance uuid 463b029c-94eb-4160-9199-43759bb23b61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.316 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <uuid>463b029c-94eb-4160-9199-43759bb23b61</uuid>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <name>instance-00000047</name>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <nova:name>tempest-DeleteServersTestJSON-server-1560667876</nova:name>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:50:47</nova:creationTime>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <nova:user uuid="28a7a778c8ab486fb586e81bb84113be">tempest-DeleteServersTestJSON-944070453-project-member</nova:user>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <nova:project uuid="61df91981c55482fa5c9a64686c79f9e">tempest-DeleteServersTestJSON-944070453</nova:project>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <nova:port uuid="aa3a69c3-ff94-4b3a-92ce-3e1f7709e284">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <entry name="serial">463b029c-94eb-4160-9199-43759bb23b61</entry>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <entry name="uuid">463b029c-94eb-4160-9199-43759bb23b61</entry>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/463b029c-94eb-4160-9199-43759bb23b61_disk">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/463b029c-94eb-4160-9199-43759bb23b61_disk.config">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:34:e2:cc"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <target dev="tapaa3a69c3-ff"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61/console.log" append="off"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:50:48 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:50:48 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:50:48 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:50:48 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.318 222021 DEBUG nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Preparing to wait for external event network-vif-plugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.319 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.319 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.319 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.320 222021 DEBUG nova.virt.libvirt.vif [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:50:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1560667876',display_name='tempest-DeleteServersTestJSON-server-1560667876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1560667876',id=71,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-xa7x09sv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:41Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=463b029c-94eb-4160-9199-43759bb23b61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.320 222021 DEBUG nova.network.os_vif_util [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.321 222021 DEBUG nova.network.os_vif_util [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:e2:cc,bridge_name='br-int',has_traffic_filtering=True,id=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa3a69c3-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.321 222021 DEBUG os_vif [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:e2:cc,bridge_name='br-int',has_traffic_filtering=True,id=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa3a69c3-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.322 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.322 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.323 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.327 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.327 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa3a69c3-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.328 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaa3a69c3-ff, col_values=(('external_ids', {'iface-id': 'aa3a69c3-ff94-4b3a-92ce-3e1f7709e284', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:e2:cc', 'vm-uuid': '463b029c-94eb-4160-9199-43759bb23b61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.370 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:48 np0005593233 NetworkManager[48871]: <info>  [1769161848.3725] manager: (tapaa3a69c3-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.375 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.377 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.378 222021 INFO os_vif [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:e2:cc,bridge_name='br-int',has_traffic_filtering=True,id=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa3a69c3-ff')#033[00m
Jan 23 04:50:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:48.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.579 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.580 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.580 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No VIF found with MAC fa:16:3e:34:e2:cc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.581 222021 INFO nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Using config drive#033[00m
Jan 23 04:50:48 np0005593233 nova_compute[222017]: 2026-01-23 09:50:48.608 222021 DEBUG nova.storage.rbd_utils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 463b029c-94eb-4160-9199-43759bb23b61_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:49 np0005593233 nova_compute[222017]: 2026-01-23 09:50:49.751 222021 INFO nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Creating config drive at /var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61/disk.config#033[00m
Jan 23 04:50:49 np0005593233 nova_compute[222017]: 2026-01-23 09:50:49.757 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsbo8m0yc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:49 np0005593233 nova_compute[222017]: 2026-01-23 09:50:49.793 222021 DEBUG nova.network.neutron [req-93aeb6fe-6d83-441e-8efd-a4fef7ce6666 req-8c40eb66-c933-4ca5-9317-ba5dd7499743 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Updated VIF entry in instance network info cache for port aa3a69c3-ff94-4b3a-92ce-3e1f7709e284. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:50:49 np0005593233 nova_compute[222017]: 2026-01-23 09:50:49.794 222021 DEBUG nova.network.neutron [req-93aeb6fe-6d83-441e-8efd-a4fef7ce6666 req-8c40eb66-c933-4ca5-9317-ba5dd7499743 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Updating instance_info_cache with network_info: [{"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:49 np0005593233 nova_compute[222017]: 2026-01-23 09:50:49.841 222021 DEBUG oslo_concurrency.lockutils [req-93aeb6fe-6d83-441e-8efd-a4fef7ce6666 req-8c40eb66-c933-4ca5-9317-ba5dd7499743 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-463b029c-94eb-4160-9199-43759bb23b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:49 np0005593233 nova_compute[222017]: 2026-01-23 09:50:49.900 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsbo8m0yc" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:50.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.126 222021 DEBUG nova.storage.rbd_utils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 463b029c-94eb-4160-9199-43759bb23b61_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.132 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61/disk.config 463b029c-94eb-4160-9199-43759bb23b61_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.160 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.170 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.171 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.195 222021 DEBUG nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:50:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.287 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.288 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.298 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.299 222021 INFO nova.compute.claims [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:50:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:50.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.455 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.562 222021 DEBUG oslo_concurrency.processutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61/disk.config 463b029c-94eb-4160-9199-43759bb23b61_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.563 222021 INFO nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Deleting local config drive /var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61/disk.config because it was imported into RBD.#033[00m
Jan 23 04:50:50 np0005593233 kernel: tapaa3a69c3-ff: entered promiscuous mode
Jan 23 04:50:50 np0005593233 NetworkManager[48871]: <info>  [1769161850.6282] manager: (tapaa3a69c3-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Jan 23 04:50:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:50Z|00227|binding|INFO|Claiming lport aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 for this chassis.
Jan 23 04:50:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:50Z|00228|binding|INFO|aa3a69c3-ff94-4b3a-92ce-3e1f7709e284: Claiming fa:16:3e:34:e2:cc 10.100.0.12
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.630 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:50Z|00229|binding|INFO|Setting lport aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 ovn-installed in OVS
Jan 23 04:50:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:50Z|00230|binding|INFO|Setting lport aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 up in Southbound
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.655 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.657 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:e2:cc 10.100.0.12'], port_security=['fa:16:3e:34:e2:cc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '463b029c-94eb-4160-9199-43759bb23b61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.659 140224 INFO neutron.agent.ovn.metadata.agent [-] Port aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 bound to our chassis#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.661 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3788149-efcd-4940-8a8f-e21af0a56a06#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.670 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:50 np0005593233 systemd-udevd[250145]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.679 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f4ee96-a60e-4276-aa1b-7ad94fcb8972]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.682 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3788149-e1 in ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.684 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3788149-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.684 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4140998f-ac5b-435e-953a-a8aa41221cf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.685 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c353000a-c368-452b-9da6-95c0d4a7a586]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 NetworkManager[48871]: <info>  [1769161850.6921] device (tapaa3a69c3-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:50:50 np0005593233 NetworkManager[48871]: <info>  [1769161850.6934] device (tapaa3a69c3-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.700 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[0afb8ad0-dcd1-403a-80fa-ef7cc2e62fef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 systemd-machined[190954]: New machine qemu-35-instance-00000047.
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.759 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[99a1132e-4fcb-42c1-9a9a-9d48f898253e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 systemd[1]: Started Virtual Machine qemu-35-instance-00000047.
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.823 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f1710823-b9a3-4dab-83e2-84d2b4128e7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 NetworkManager[48871]: <info>  [1769161850.8343] manager: (tapa3788149-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/115)
Jan 23 04:50:50 np0005593233 systemd-udevd[250150]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.835 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3877e479-2ceb-4cb8-9597-1f57fae55901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.871 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3d88efa8-2240-423b-9814-f8fdb94c188d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.874 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fb557d80-1d09-452c-9b32-b5850e2b5447]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 NetworkManager[48871]: <info>  [1769161850.9008] device (tapa3788149-e0): carrier: link connected
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.908 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a7269ef9-7b7b-40b7-8035-8bc62acb0690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.919 222021 DEBUG nova.compute.manager [req-b4b33048-1dd7-434a-a100-359242351628 req-40d6204e-119e-4593-8879-40cf4b9461f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Received event network-vif-plugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.919 222021 DEBUG oslo_concurrency.lockutils [req-b4b33048-1dd7-434a-a100-359242351628 req-40d6204e-119e-4593-8879-40cf4b9461f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.919 222021 DEBUG oslo_concurrency.lockutils [req-b4b33048-1dd7-434a-a100-359242351628 req-40d6204e-119e-4593-8879-40cf4b9461f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.920 222021 DEBUG oslo_concurrency.lockutils [req-b4b33048-1dd7-434a-a100-359242351628 req-40d6204e-119e-4593-8879-40cf4b9461f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:50 np0005593233 nova_compute[222017]: 2026-01-23 09:50:50.920 222021 DEBUG nova.compute.manager [req-b4b33048-1dd7-434a-a100-359242351628 req-40d6204e-119e-4593-8879-40cf4b9461f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Processing event network-vif-plugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.929 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d86978a1-f69b-4349-a9e5-ee354f48ae8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576353, 'reachable_time': 27427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250179, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.950 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0785fa8a-3e94-4fcf-8c92-bae89cecdfe5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:ddff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 576353, 'tstamp': 576353}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250180, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:50.972 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d457de47-523d-4b38-86c3-96b92f967dce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576353, 'reachable_time': 27427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250181, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.019 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1e0497-68b9-43b9-becb-4e43e91dbce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3707987799' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.051 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.595s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.060 222021 DEBUG nova.compute.provider_tree [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.092 222021 DEBUG nova.scheduler.client.report [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.093 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0591cd-f660-462c-b760-6bf5d8f7a97a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.095 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.095 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.096 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3788149-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:51 np0005593233 NetworkManager[48871]: <info>  [1769161851.0992] manager: (tapa3788149-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Jan 23 04:50:51 np0005593233 kernel: tapa3788149-e0: entered promiscuous mode
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.098 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.102 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.102 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3788149-e0, col_values=(('external_ids', {'iface-id': 'd6ce7fd1-128d-488f-94e6-68332f7a8a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.103 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:51 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:51Z|00231|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.153 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.155 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.157 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.869s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.157 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8843424f-f93c-4196-8ca0-7760dd65cef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.158 222021 DEBUG nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.159 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:50:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:51.161 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'env', 'PROCESS_TAG=haproxy-a3788149-efcd-4940-8a8f-e21af0a56a06', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3788149-efcd-4940-8a8f-e21af0a56a06.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.227 222021 DEBUG nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.228 222021 DEBUG nova.network.neutron [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.254 222021 INFO nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.277 222021 DEBUG nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.402 222021 DEBUG nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.404 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.404 222021 INFO nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Creating image(s)#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.433 222021 DEBUG nova.storage.rbd_utils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image d0bb0470-cc5c-4b6f-be0d-20839267c340_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:51 np0005593233 podman[250258]: 2026-01-23 09:50:51.558119186 +0000 UTC m=+0.023223425 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.772 222021 DEBUG nova.storage.rbd_utils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image d0bb0470-cc5c-4b6f-be0d-20839267c340_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.806 222021 DEBUG nova.storage.rbd_utils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image d0bb0470-cc5c-4b6f-be0d-20839267c340_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.811 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.847 222021 DEBUG nova.policy [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77cda1e9a0404425a06c34637e696603', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '390d19f683334995a5268cf9b4d5e464', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.903 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.904 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.905 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.906 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.943 222021 DEBUG nova.storage.rbd_utils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image d0bb0470-cc5c-4b6f-be0d-20839267c340_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:51 np0005593233 nova_compute[222017]: 2026-01-23 09:50:51.949 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d0bb0470-cc5c-4b6f-be0d-20839267c340_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:50:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:52.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:50:52 np0005593233 podman[250258]: 2026-01-23 09:50:52.325408691 +0000 UTC m=+0.790512900 container create b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:50:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:52.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:52 np0005593233 systemd[1]: Started libpod-conmon-b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a.scope.
Jan 23 04:50:52 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:50:52 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39bfbfde616ad4316f81afc61e9b47526a1b385ed40a3ac6ca9894eb859cfa4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.540 222021 DEBUG nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.542 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161852.5392828, 463b029c-94eb-4160-9199-43759bb23b61 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.543 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] VM Started (Lifecycle Event)#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.546 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.550 222021 INFO nova.virt.libvirt.driver [-] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Instance spawned successfully.#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.550 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.572 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.577 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.587 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.587 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.588 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.588 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.589 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.589 222021 DEBUG nova.virt.libvirt.driver [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.613 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.613 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161852.5414534, 463b029c-94eb-4160-9199-43759bb23b61 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.613 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.649 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.654 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161852.5473015, 463b029c-94eb-4160-9199-43759bb23b61 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.654 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.677 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.683 222021 INFO nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Took 11.56 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.683 222021 DEBUG nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.685 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.718 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:50:52 np0005593233 podman[250258]: 2026-01-23 09:50:52.720640975 +0000 UTC m=+1.185745204 container init b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:50:52 np0005593233 podman[250258]: 2026-01-23 09:50:52.728221811 +0000 UTC m=+1.193326010 container start b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:50:52 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[250365]: [NOTICE]   (250370) : New worker (250372) forked
Jan 23 04:50:52 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[250365]: [NOTICE]   (250370) : Loading success.
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.771 222021 INFO nova.compute.manager [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Took 12.58 seconds to build instance.#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.793 222021 DEBUG oslo_concurrency.lockutils [None req-f154a3f8-8c4f-45b6-a544-080c8a95b461 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:52 np0005593233 nova_compute[222017]: 2026-01-23 09:50:52.981 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d0bb0470-cc5c-4b6f-be0d-20839267c340_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.031 222021 DEBUG nova.compute.manager [req-8ffe2354-446f-46b7-97a3-26037f7b227e req-411c62a8-e9b2-4f56-9869-a71c598ffed5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Received event network-vif-plugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.033 222021 DEBUG oslo_concurrency.lockutils [req-8ffe2354-446f-46b7-97a3-26037f7b227e req-411c62a8-e9b2-4f56-9869-a71c598ffed5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.033 222021 DEBUG oslo_concurrency.lockutils [req-8ffe2354-446f-46b7-97a3-26037f7b227e req-411c62a8-e9b2-4f56-9869-a71c598ffed5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.033 222021 DEBUG oslo_concurrency.lockutils [req-8ffe2354-446f-46b7-97a3-26037f7b227e req-411c62a8-e9b2-4f56-9869-a71c598ffed5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.034 222021 DEBUG nova.compute.manager [req-8ffe2354-446f-46b7-97a3-26037f7b227e req-411c62a8-e9b2-4f56-9869-a71c598ffed5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] No waiting events found dispatching network-vif-plugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.034 222021 WARNING nova.compute.manager [req-8ffe2354-446f-46b7-97a3-26037f7b227e req-411c62a8-e9b2-4f56-9869-a71c598ffed5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Received unexpected event network-vif-plugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.093 222021 DEBUG nova.storage.rbd_utils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] resizing rbd image d0bb0470-cc5c-4b6f-be0d-20839267c340_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.210 222021 DEBUG nova.network.neutron [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Successfully created port: e137f0ac-1409-48af-9c44-4d589d8b9bf9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.219 222021 DEBUG nova.objects.instance [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'migration_context' on Instance uuid d0bb0470-cc5c-4b6f-be0d-20839267c340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.238 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.239 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Ensure instance console log exists: /var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.239 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.240 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.240 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:53 np0005593233 nova_compute[222017]: 2026-01-23 09:50:53.370 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:50:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:54.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:50:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:54.292 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.292 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:54.293 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:50:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:54.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.734 222021 DEBUG nova.network.neutron [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Successfully updated port: e137f0ac-1409-48af-9c44-4d589d8b9bf9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.753 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.754 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.755 222021 DEBUG nova.network.neutron [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.834 222021 DEBUG nova.compute.manager [req-efe9aa31-f44b-42e5-9e01-b3f4e08cb071 req-4989a220-8a10-4470-8e62-59ccc8a155ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.835 222021 DEBUG nova.compute.manager [req-efe9aa31-f44b-42e5-9e01-b3f4e08cb071 req-4989a220-8a10-4470-8e62-59ccc8a155ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing instance network info cache due to event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.836 222021 DEBUG oslo_concurrency.lockutils [req-efe9aa31-f44b-42e5-9e01-b3f4e08cb071 req-4989a220-8a10-4470-8e62-59ccc8a155ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.926 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:54 np0005593233 nova_compute[222017]: 2026-01-23 09:50:54.939 222021 DEBUG nova.network.neutron [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:50:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3355357516' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.093 222021 DEBUG nova.network.neutron [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.123 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.123 222021 DEBUG nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Instance network_info: |[{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.125 222021 DEBUG oslo_concurrency.lockutils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.125 222021 DEBUG oslo_concurrency.lockutils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.126 222021 DEBUG oslo_concurrency.lockutils [req-efe9aa31-f44b-42e5-9e01-b3f4e08cb071 req-4989a220-8a10-4470-8e62-59ccc8a155ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.126 222021 DEBUG nova.network.neutron [req-efe9aa31-f44b-42e5-9e01-b3f4e08cb071 req-4989a220-8a10-4470-8e62-59ccc8a155ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.129 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Start _get_guest_xml network_info=[{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.148 222021 WARNING nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.155 222021 DEBUG nova.objects.instance [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'flavor' on Instance uuid 463b029c-94eb-4160-9199-43759bb23b61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.163 222021 DEBUG nova.virt.libvirt.host [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.164 222021 DEBUG nova.virt.libvirt.host [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.168 222021 DEBUG nova.virt.libvirt.host [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.169 222021 DEBUG nova.virt.libvirt.host [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.170 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.171 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.171 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.171 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.171 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.171 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.172 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.172 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.172 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.172 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.172 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.173 222021 DEBUG nova.virt.hardware [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.176 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.233 222021 DEBUG oslo_concurrency.lockutils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:50:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:56.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:50:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:50:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:56.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:50:56 np0005593233 podman[250523]: 2026-01-23 09:50:56.496374581 +0000 UTC m=+0.105391811 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.497 222021 DEBUG oslo_concurrency.lockutils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.498 222021 DEBUG oslo_concurrency.lockutils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.498 222021 INFO nova.compute.manager [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Attaching volume 97e56534-c8f3-4b8d-bb6b-37ecd5ac131a to /dev/vdb#033[00m
Jan 23 04:50:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:56 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1055028614' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.674 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.711 222021 DEBUG nova.storage.rbd_utils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image d0bb0470-cc5c-4b6f-be0d-20839267c340_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.716 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.757 222021 DEBUG os_brick.utils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.759 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.776 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.776 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[6ca31d63-6bde-41e0-bfe7-79c5fda6f4cd]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.778 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.789 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.789 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[33aed70e-8b7f-48e8-bb88-617666aeb795]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.791 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.807 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.807 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[dabfd989-963f-493b-8842-f65c375acf2a]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.810 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[3b617fff-a4fc-4e9b-9d35-6cb79f46b4f8]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.810 222021 DEBUG oslo_concurrency.processutils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.839 222021 DEBUG oslo_concurrency.processutils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.842 222021 DEBUG os_brick.initiator.connectors.lightos [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.842 222021 DEBUG os_brick.initiator.connectors.lightos [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.843 222021 DEBUG os_brick.initiator.connectors.lightos [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.843 222021 DEBUG os_brick.utils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] <== get_connector_properties: return (85ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:50:56 np0005593233 nova_compute[222017]: 2026-01-23 09:50:56.843 222021 DEBUG nova.virt.block_device [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Updating existing volume attachment record: 38765fd5-b063-42f9-92b7-1b569932cb85 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:50:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/457895385' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.212 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.214 222021 DEBUG nova.virt.libvirt.vif [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-687494396',display_name='tempest-tempest.common.compute-instance-687494396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-687494396',id=72,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBISF6L8g87ZfxLrm8Wwm+gzemsck5aetIhd8gCsjpNrTc2Fv/no3h23xzReyi9tgvOePkWLat/BN4ukRmY5i9SKOoCvqi25H2ncCjSqcqS+cT6X1PkedlTAGxBrEwc2adg==',key_name='tempest-keypair-1775870371',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-ab0xdcxv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=d0bb0470-cc5c-4b6f-be0d-20839267c340,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.214 222021 DEBUG nova.network.os_vif_util [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.216 222021 DEBUG nova.network.os_vif_util [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:14:59,bridge_name='br-int',has_traffic_filtering=True,id=e137f0ac-1409-48af-9c44-4d589d8b9bf9,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape137f0ac-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.218 222021 DEBUG nova.objects.instance [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'pci_devices' on Instance uuid d0bb0470-cc5c-4b6f-be0d-20839267c340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.238 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <uuid>d0bb0470-cc5c-4b6f-be0d-20839267c340</uuid>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <name>instance-00000048</name>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <nova:name>tempest-tempest.common.compute-instance-687494396</nova:name>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:50:56</nova:creationTime>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <nova:port uuid="e137f0ac-1409-48af-9c44-4d589d8b9bf9">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <entry name="serial">d0bb0470-cc5c-4b6f-be0d-20839267c340</entry>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <entry name="uuid">d0bb0470-cc5c-4b6f-be0d-20839267c340</entry>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/d0bb0470-cc5c-4b6f-be0d-20839267c340_disk">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/d0bb0470-cc5c-4b6f-be0d-20839267c340_disk.config">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:60:14:59"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <target dev="tape137f0ac-14"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/console.log" append="off"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:50:57 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:50:57 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.238 222021 DEBUG nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Preparing to wait for external event network-vif-plugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.239 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.239 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.239 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.240 222021 DEBUG nova.virt.libvirt.vif [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-687494396',display_name='tempest-tempest.common.compute-instance-687494396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-687494396',id=72,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBISF6L8g87ZfxLrm8Wwm+gzemsck5aetIhd8gCsjpNrTc2Fv/no3h23xzReyi9tgvOePkWLat/BN4ukRmY5i9SKOoCvqi25H2ncCjSqcqS+cT6X1PkedlTAGxBrEwc2adg==',key_name='tempest-keypair-1775870371',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-ab0xdcxv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=d0bb0470-cc5c-4b6f-be0d-20839267c340,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.240 222021 DEBUG nova.network.os_vif_util [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.241 222021 DEBUG nova.network.os_vif_util [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:14:59,bridge_name='br-int',has_traffic_filtering=True,id=e137f0ac-1409-48af-9c44-4d589d8b9bf9,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape137f0ac-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.241 222021 DEBUG os_vif [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:14:59,bridge_name='br-int',has_traffic_filtering=True,id=e137f0ac-1409-48af-9c44-4d589d8b9bf9,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape137f0ac-14') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.242 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.242 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.242 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.245 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.246 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape137f0ac-14, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.246 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape137f0ac-14, col_values=(('external_ids', {'iface-id': 'e137f0ac-1409-48af-9c44-4d589d8b9bf9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:14:59', 'vm-uuid': 'd0bb0470-cc5c-4b6f-be0d-20839267c340'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.247 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:57 np0005593233 NetworkManager[48871]: <info>  [1769161857.2495] manager: (tape137f0ac-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.250 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.255 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.256 222021 INFO os_vif [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:14:59,bridge_name='br-int',has_traffic_filtering=True,id=e137f0ac-1409-48af-9c44-4d589d8b9bf9,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape137f0ac-14')#033[00m
Jan 23 04:50:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:57.295 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.336 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.337 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.337 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:60:14:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.339 222021 INFO nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Using config drive#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.375 222021 DEBUG nova.storage.rbd_utils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image d0bb0470-cc5c-4b6f-be0d-20839267c340_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.681 222021 DEBUG nova.network.neutron [req-efe9aa31-f44b-42e5-9e01-b3f4e08cb071 req-4989a220-8a10-4470-8e62-59ccc8a155ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updated VIF entry in instance network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.682 222021 DEBUG nova.network.neutron [req-efe9aa31-f44b-42e5-9e01-b3f4e08cb071 req-4989a220-8a10-4470-8e62-59ccc8a155ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.707 222021 DEBUG oslo_concurrency.lockutils [req-efe9aa31-f44b-42e5-9e01-b3f4e08cb071 req-4989a220-8a10-4470-8e62-59ccc8a155ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.765 222021 DEBUG nova.objects.instance [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'flavor' on Instance uuid 463b029c-94eb-4160-9199-43759bb23b61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.798 222021 DEBUG nova.virt.libvirt.driver [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Attempting to attach volume 97e56534-c8f3-4b8d-bb6b-37ecd5ac131a with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.802 222021 DEBUG nova.virt.libvirt.guest [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-97e56534-c8f3-4b8d-bb6b-37ecd5ac131a">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  </source>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <auth username="openstack">
Jan 23 04:50:57 np0005593233 nova_compute[222017]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  </auth>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:50:57 np0005593233 nova_compute[222017]:  <serial>97e56534-c8f3-4b8d-bb6b-37ecd5ac131a</serial>
Jan 23 04:50:57 np0005593233 nova_compute[222017]: </disk>
Jan 23 04:50:57 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.826 222021 INFO nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Creating config drive at /var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/disk.config#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.836 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp84dmnydu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:57 np0005593233 nova_compute[222017]: 2026-01-23 09:50:57.978 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp84dmnydu" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.020 222021 DEBUG nova.storage.rbd_utils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image d0bb0470-cc5c-4b6f-be0d-20839267c340_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.025 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/disk.config d0bb0470-cc5c-4b6f-be0d-20839267c340_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.072 222021 DEBUG nova.virt.libvirt.driver [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.072 222021 DEBUG nova.virt.libvirt.driver [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.072 222021 DEBUG nova.virt.libvirt.driver [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.073 222021 DEBUG nova.virt.libvirt.driver [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No VIF found with MAC fa:16:3e:34:e2:cc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.205 222021 DEBUG oslo_concurrency.processutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/disk.config d0bb0470-cc5c-4b6f-be0d-20839267c340_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.180s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.206 222021 INFO nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Deleting local config drive /var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/disk.config because it was imported into RBD.#033[00m
Jan 23 04:50:58 np0005593233 kernel: tape137f0ac-14: entered promiscuous mode
Jan 23 04:50:58 np0005593233 NetworkManager[48871]: <info>  [1769161858.2660] manager: (tape137f0ac-14): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.267 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:58Z|00232|binding|INFO|Claiming lport e137f0ac-1409-48af-9c44-4d589d8b9bf9 for this chassis.
Jan 23 04:50:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:58Z|00233|binding|INFO|e137f0ac-1409-48af-9c44-4d589d8b9bf9: Claiming fa:16:3e:60:14:59 10.100.0.5
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.280 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:14:59 10.100.0.5'], port_security=['fa:16:3e:60:14:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd0bb0470-cc5c-4b6f-be0d-20839267c340', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0547d145-6526-47bb-a492-48772f700715', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=e137f0ac-1409-48af-9c44-4d589d8b9bf9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.280 140224 INFO neutron.agent.ovn.metadata.agent [-] Port e137f0ac-1409-48af-9c44-4d589d8b9bf9 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 bound to our chassis#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.282 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.298 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d787ac-3034-4c49-8a24-22d5497d9c95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.300 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7808328e-21 in ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.302 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7808328e-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.303 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[640df122-48e9-44b1-aada-8f1c93a92aa3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.303 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[234e8717-0475-4246-aaa5-9f0daddfdc2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 systemd-udevd[250694]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:50:58 np0005593233 systemd-machined[190954]: New machine qemu-36-instance-00000048.
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.322 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[e98de650-214c-484a-8b30-e3604555d35a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 NetworkManager[48871]: <info>  [1769161858.3319] device (tape137f0ac-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:50:58 np0005593233 NetworkManager[48871]: <info>  [1769161858.3329] device (tape137f0ac-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:50:58 np0005593233 systemd[1]: Started Virtual Machine qemu-36-instance-00000048.
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.343 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:58Z|00234|binding|INFO|Setting lport e137f0ac-1409-48af-9c44-4d589d8b9bf9 ovn-installed in OVS
Jan 23 04:50:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:58Z|00235|binding|INFO|Setting lport e137f0ac-1409-48af-9c44-4d589d8b9bf9 up in Southbound
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.349 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.351 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a8824570-e50f-4a3c-98b8-abc45c024dbc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:50:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:58.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.395 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2188cc76-d091-4369-96ea-da163ca328c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 NetworkManager[48871]: <info>  [1769161858.4092] manager: (tap7808328e-20): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.407 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea51113-9ab4-4b2e-9a57-aa138947f60f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.422 222021 DEBUG oslo_concurrency.lockutils [None req-9445cb87-f9c5-463a-9745-858fd95ed017 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:50:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:58.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.449 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[349abfac-2085-482e-befb-ee462c6f509a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.453 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3843062d-cb19-4a6e-a1ab-3adbb9fb96e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 NetworkManager[48871]: <info>  [1769161858.4893] device (tap7808328e-20): carrier: link connected
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.496 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[05197987-012d-494f-9109-83d1e834aaaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.519 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6be88a01-cb2e-40af-8183-4d5911c87e3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577112, 'reachable_time': 35670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250725, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.536 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[24b9262e-9aea-4418-86e0-2e073f9df8af]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:22ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577112, 'tstamp': 577112}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250726, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.555 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c017e130-ba3d-4446-a7da-5f157dfb967e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577112, 'reachable_time': 35670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250727, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.585 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[feb2bcdc-eca0-4a63-b5a6-bc48bdfcca96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.681 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[045f03c3-7414-4d0f-b739-58f20659f46b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.684 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.684 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.685 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:58 np0005593233 NetworkManager[48871]: <info>  [1769161858.6888] manager: (tap7808328e-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 23 04:50:58 np0005593233 kernel: tap7808328e-20: entered promiscuous mode
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.691 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.693 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:50:58Z|00236|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.694 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.708 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.711 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7808328e-22f9-46df-ac06-f8c3d6ad10c4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7808328e-22f9-46df-ac06-f8c3d6ad10c4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.712 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[69037cd5-b09c-4142-ae2b-c4c735b65272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.713 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-7808328e-22f9-46df-ac06-f8c3d6ad10c4
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/7808328e-22f9-46df-ac06-f8c3d6ad10c4.pid.haproxy
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 7808328e-22f9-46df-ac06-f8c3d6ad10c4
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:50:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:50:58.715 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'env', 'PROCESS_TAG=haproxy-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7808328e-22f9-46df-ac06-f8c3d6ad10c4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.998 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161858.9979565, d0bb0470-cc5c-4b6f-be0d-20839267c340 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:58 np0005593233 nova_compute[222017]: 2026-01-23 09:50:58.999 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] VM Started (Lifecycle Event)#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.029 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.035 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161858.9981651, d0bb0470-cc5c-4b6f-be0d-20839267c340 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.036 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.075 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.081 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.109 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:50:59 np0005593233 podman[250801]: 2026-01-23 09:50:59.140850845 +0000 UTC m=+0.046772687 container create cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:50:59 np0005593233 systemd[1]: Started libpod-conmon-cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b.scope.
Jan 23 04:50:59 np0005593233 podman[250801]: 2026-01-23 09:50:59.116555271 +0000 UTC m=+0.022477133 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.220 222021 DEBUG nova.compute.manager [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-plugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.220 222021 DEBUG oslo_concurrency.lockutils [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.220 222021 DEBUG oslo_concurrency.lockutils [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.221 222021 DEBUG oslo_concurrency.lockutils [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.221 222021 DEBUG nova.compute.manager [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Processing event network-vif-plugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.221 222021 DEBUG nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:50:59 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.226 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161859.2263114, d0bb0470-cc5c-4b6f-be0d-20839267c340 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.227 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.228 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:50:59 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b400941d92e4d10aafea7a67e56f5e99475e7147881026c4e913ac4c54e791ed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.243 222021 INFO nova.virt.libvirt.driver [-] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Instance spawned successfully.#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.244 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.251 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:59 np0005593233 podman[250801]: 2026-01-23 09:50:59.255667566 +0000 UTC m=+0.161589408 container init cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.258 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:50:59 np0005593233 podman[250801]: 2026-01-23 09:50:59.264724465 +0000 UTC m=+0.170646307 container start cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.267 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.268 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.268 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.269 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.269 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.269 222021 DEBUG nova.virt.libvirt.driver [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.287 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:50:59 np0005593233 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[250816]: [NOTICE]   (250820) : New worker (250822) forked
Jan 23 04:50:59 np0005593233 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[250816]: [NOTICE]   (250820) : Loading success.
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.341 222021 DEBUG oslo_concurrency.lockutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.342 222021 DEBUG oslo_concurrency.lockutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.342 222021 DEBUG oslo_concurrency.lockutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.342 222021 DEBUG oslo_concurrency.lockutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.342 222021 DEBUG oslo_concurrency.lockutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.343 222021 INFO nova.compute.manager [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Terminating instance#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.344 222021 DEBUG nova.compute.manager [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.366 222021 INFO nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Took 7.96 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.367 222021 DEBUG nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.439 222021 INFO nova.compute.manager [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Took 9.18 seconds to build instance.#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.466 222021 DEBUG oslo_concurrency.lockutils [None req-c53859a5-3777-42bb-8531-65f6740e3bc9 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:59 np0005593233 nova_compute[222017]: 2026-01-23 09:50:59.929 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 kernel: tapaa3a69c3-ff (unregistering): left promiscuous mode
Jan 23 04:51:00 np0005593233 NetworkManager[48871]: <info>  [1769161860.1097] device (tapaa3a69c3-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:51:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:00Z|00237|binding|INFO|Releasing lport aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 from this chassis (sb_readonly=0)
Jan 23 04:51:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:00Z|00238|binding|INFO|Setting lport aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 down in Southbound
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.117 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:00Z|00239|binding|INFO|Removing iface tapaa3a69c3-ff ovn-installed in OVS
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.118 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.129 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:e2:cc 10.100.0.12'], port_security=['fa:16:3e:34:e2:cc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '463b029c-94eb-4160-9199-43759bb23b61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.131 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.133 140224 INFO neutron.agent.ovn.metadata.agent [-] Port aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.134 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.136 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c9cc63e7-dcde-4b1e-b574-8292a10ad625]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.137 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace which is not needed anymore#033[00m
Jan 23 04:51:00 np0005593233 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000047.scope: Deactivated successfully.
Jan 23 04:51:00 np0005593233 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000047.scope: Consumed 6.589s CPU time.
Jan 23 04:51:00 np0005593233 systemd-machined[190954]: Machine qemu-35-instance-00000047 terminated.
Jan 23 04:51:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:00.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:00 np0005593233 NetworkManager[48871]: <info>  [1769161860.3704] manager: (tapaa3a69c3-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/121)
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.375 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.383 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.415 222021 INFO nova.virt.libvirt.driver [-] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Instance destroyed successfully.#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.416 222021 DEBUG nova.objects.instance [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'resources' on Instance uuid 463b029c-94eb-4160-9199-43759bb23b61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.439 222021 DEBUG nova.virt.libvirt.vif [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:50:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1560667876',display_name='tempest-DeleteServersTestJSON-server-1560667876',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1560667876',id=71,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-xa7x09sv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:50:52Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=463b029c-94eb-4160-9199-43759bb23b61,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.440 222021 DEBUG nova.network.os_vif_util [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "address": "fa:16:3e:34:e2:cc", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa3a69c3-ff", "ovs_interfaceid": "aa3a69c3-ff94-4b3a-92ce-3e1f7709e284", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.441 222021 DEBUG nova.network.os_vif_util [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:e2:cc,bridge_name='br-int',has_traffic_filtering=True,id=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa3a69c3-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.442 222021 DEBUG os_vif [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:e2:cc,bridge_name='br-int',has_traffic_filtering=True,id=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa3a69c3-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:51:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:00.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.446 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.446 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa3a69c3-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.448 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.450 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.453 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.456 222021 INFO os_vif [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:e2:cc,bridge_name='br-int',has_traffic_filtering=True,id=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa3a69c3-ff')#033[00m
Jan 23 04:51:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:00 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[250365]: [NOTICE]   (250370) : haproxy version is 2.8.14-c23fe91
Jan 23 04:51:00 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[250365]: [NOTICE]   (250370) : path to executable is /usr/sbin/haproxy
Jan 23 04:51:00 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[250365]: [WARNING]  (250370) : Exiting Master process...
Jan 23 04:51:00 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[250365]: [ALERT]    (250370) : Current worker (250372) exited with code 143 (Terminated)
Jan 23 04:51:00 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[250365]: [WARNING]  (250370) : All workers exited. Exiting... (0)
Jan 23 04:51:00 np0005593233 systemd[1]: libpod-b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a.scope: Deactivated successfully.
Jan 23 04:51:00 np0005593233 podman[250852]: 2026-01-23 09:51:00.482206763 +0000 UTC m=+0.247711719 container died b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:51:00 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a-userdata-shm.mount: Deactivated successfully.
Jan 23 04:51:00 np0005593233 systemd[1]: var-lib-containers-storage-overlay-39bfbfde616ad4316f81afc61e9b47526a1b385ed40a3ac6ca9894eb859cfa4a-merged.mount: Deactivated successfully.
Jan 23 04:51:00 np0005593233 podman[250852]: 2026-01-23 09:51:00.546482959 +0000 UTC m=+0.311987905 container cleanup b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:51:00 np0005593233 systemd[1]: libpod-conmon-b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a.scope: Deactivated successfully.
Jan 23 04:51:00 np0005593233 podman[250910]: 2026-01-23 09:51:00.626968949 +0000 UTC m=+0.054476228 container remove b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.635 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dd0be873-0bf2-409b-b688-9e360238e495]: (4, ('Fri Jan 23 09:51:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a)\nb394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a\nFri Jan 23 09:51:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (b394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a)\nb394a961ac60bb0d0b37fc7824b7a74d8e74c67fc7131133fa7bb9c0e5587b6a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.637 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[650be46e-7ae5-4b07-bcfd-e5251d830953]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.639 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.641 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 kernel: tapa3788149-e0: left promiscuous mode
Jan 23 04:51:00 np0005593233 nova_compute[222017]: 2026-01-23 09:51:00.659 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.659 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ea7f00-95d5-4a08-a976-47675202ce0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.681 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[91b87a50-ca7b-4fd2-97df-18495bb545a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.683 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[89672216-1e1f-4d0a-803e-2ee2425e9894]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.709 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[78e1d4b9-c6ee-41ea-b06f-0d5e9ebc7191]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576343, 'reachable_time': 42384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250925, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:00 np0005593233 systemd[1]: run-netns-ovnmeta\x2da3788149\x2defcd\x2d4940\x2d8a8f\x2de21af0a56a06.mount: Deactivated successfully.
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.712 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:51:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:00.712 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[6818ca12-fa5c-4ce9-8b5f-9533679ac6ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.022 222021 INFO nova.virt.libvirt.driver [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Deleting instance files /var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61_del#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.023 222021 INFO nova.virt.libvirt.driver [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Deletion of /var/lib/nova/instances/463b029c-94eb-4160-9199-43759bb23b61_del complete#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.109 222021 INFO nova.compute.manager [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Took 1.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.110 222021 DEBUG oslo.service.loopingcall [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.111 222021 DEBUG nova.compute.manager [-] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.111 222021 DEBUG nova.network.neutron [-] [instance: 463b029c-94eb-4160-9199-43759bb23b61] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.348 222021 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-plugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.348 222021 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.349 222021 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.349 222021 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.349 222021 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] No waiting events found dispatching network-vif-plugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.350 222021 WARNING nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received unexpected event network-vif-plugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.350 222021 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Received event network-vif-unplugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.350 222021 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.351 222021 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.351 222021 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.351 222021 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] No waiting events found dispatching network-vif-unplugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.352 222021 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Received event network-vif-unplugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.352 222021 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Received event network-vif-plugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.352 222021 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "463b029c-94eb-4160-9199-43759bb23b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.353 222021 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.353 222021 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.353 222021 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] No waiting events found dispatching network-vif-plugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:51:01 np0005593233 nova_compute[222017]: 2026-01-23 09:51:01.354 222021 WARNING nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Received unexpected event network-vif-plugged-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 for instance with vm_state active and task_state deleting.
Jan 23 04:51:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:02.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:02.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:02 np0005593233 NetworkManager[48871]: <info>  [1769161862.4492] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 23 04:51:02 np0005593233 nova_compute[222017]: 2026-01-23 09:51:02.448 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:02 np0005593233 NetworkManager[48871]: <info>  [1769161862.4499] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Jan 23 04:51:02 np0005593233 nova_compute[222017]: 2026-01-23 09:51:02.576 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:02 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:02Z|00240|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:51:02 np0005593233 nova_compute[222017]: 2026-01-23 09:51:02.596 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:03 np0005593233 nova_compute[222017]: 2026-01-23 09:51:03.239 222021 DEBUG nova.network.neutron [-] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:51:03 np0005593233 nova_compute[222017]: 2026-01-23 09:51:03.365 222021 DEBUG nova.compute.manager [req-5952bcc0-f322-40bf-8c29-8b60230a67b7 req-a0b255f0-3ef6-48bf-b759-6cf2239b591e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Received event network-vif-deleted-aa3a69c3-ff94-4b3a-92ce-3e1f7709e284 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:51:03 np0005593233 nova_compute[222017]: 2026-01-23 09:51:03.366 222021 INFO nova.compute.manager [req-5952bcc0-f322-40bf-8c29-8b60230a67b7 req-a0b255f0-3ef6-48bf-b759-6cf2239b591e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Neutron deleted interface aa3a69c3-ff94-4b3a-92ce-3e1f7709e284; detaching it from the instance and deleting it from the info cache
Jan 23 04:51:03 np0005593233 nova_compute[222017]: 2026-01-23 09:51:03.367 222021 DEBUG nova.network.neutron [req-5952bcc0-f322-40bf-8c29-8b60230a67b7 req-a0b255f0-3ef6-48bf-b759-6cf2239b591e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:51:03 np0005593233 nova_compute[222017]: 2026-01-23 09:51:03.550 222021 DEBUG nova.compute.manager [req-5952bcc0-f322-40bf-8c29-8b60230a67b7 req-a0b255f0-3ef6-48bf-b759-6cf2239b591e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Detach interface failed, port_id=aa3a69c3-ff94-4b3a-92ce-3e1f7709e284, reason: Instance 463b029c-94eb-4160-9199-43759bb23b61 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 04:51:03 np0005593233 nova_compute[222017]: 2026-01-23 09:51:03.558 222021 INFO nova.compute.manager [-] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Took 2.45 seconds to deallocate network for instance.
Jan 23 04:51:03 np0005593233 nova_compute[222017]: 2026-01-23 09:51:03.852 222021 INFO nova.compute.manager [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Took 0.29 seconds to detach 1 volumes for instance.
Jan 23 04:51:03 np0005593233 nova_compute[222017]: 2026-01-23 09:51:03.932 222021 DEBUG oslo_concurrency.lockutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:03 np0005593233 nova_compute[222017]: 2026-01-23 09:51:03.933 222021 DEBUG oslo_concurrency.lockutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:04 np0005593233 nova_compute[222017]: 2026-01-23 09:51:04.011 222021 DEBUG oslo_concurrency.processutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:51:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:04.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/702792103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:04.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:04 np0005593233 nova_compute[222017]: 2026-01-23 09:51:04.470 222021 DEBUG oslo_concurrency.processutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:51:04 np0005593233 nova_compute[222017]: 2026-01-23 09:51:04.481 222021 DEBUG nova.compute.provider_tree [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:51:04 np0005593233 nova_compute[222017]: 2026-01-23 09:51:04.542 222021 DEBUG nova.scheduler.client.report [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:51:04 np0005593233 nova_compute[222017]: 2026-01-23 09:51:04.579 222021 DEBUG oslo_concurrency.lockutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:04 np0005593233 nova_compute[222017]: 2026-01-23 09:51:04.661 222021 INFO nova.scheduler.client.report [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Deleted allocations for instance 463b029c-94eb-4160-9199-43759bb23b61
Jan 23 04:51:04 np0005593233 nova_compute[222017]: 2026-01-23 09:51:04.792 222021 DEBUG oslo_concurrency.lockutils [None req-37b5822f-5227-458c-92c2-f21d827d863f 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "463b029c-94eb-4160-9199-43759bb23b61" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.450s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:04 np0005593233 nova_compute[222017]: 2026-01-23 09:51:04.932 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:05 np0005593233 nova_compute[222017]: 2026-01-23 09:51:05.449 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:06.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:06.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:07 np0005593233 nova_compute[222017]: 2026-01-23 09:51:07.290 222021 DEBUG nova.compute.manager [req-7da2b4c0-6d12-4c83-989b-d1936a74b7ac req-df321e8b-30a4-4120-a5f6-a39fec15d794 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:51:07 np0005593233 nova_compute[222017]: 2026-01-23 09:51:07.291 222021 DEBUG nova.compute.manager [req-7da2b4c0-6d12-4c83-989b-d1936a74b7ac req-df321e8b-30a4-4120-a5f6-a39fec15d794 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing instance network info cache due to event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:51:07 np0005593233 nova_compute[222017]: 2026-01-23 09:51:07.292 222021 DEBUG oslo_concurrency.lockutils [req-7da2b4c0-6d12-4c83-989b-d1936a74b7ac req-df321e8b-30a4-4120-a5f6-a39fec15d794 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:51:07 np0005593233 nova_compute[222017]: 2026-01-23 09:51:07.292 222021 DEBUG oslo_concurrency.lockutils [req-7da2b4c0-6d12-4c83-989b-d1936a74b7ac req-df321e8b-30a4-4120-a5f6-a39fec15d794 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:51:07 np0005593233 nova_compute[222017]: 2026-01-23 09:51:07.293 222021 DEBUG nova.network.neutron [req-7da2b4c0-6d12-4c83-989b-d1936a74b7ac req-df321e8b-30a4-4120-a5f6-a39fec15d794 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 04:51:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:08.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:08.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:09 np0005593233 podman[250951]: 2026-01-23 09:51:09.064978536 +0000 UTC m=+0.068393145 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:51:09 np0005593233 nova_compute[222017]: 2026-01-23 09:51:09.851 222021 DEBUG nova.network.neutron [req-7da2b4c0-6d12-4c83-989b-d1936a74b7ac req-df321e8b-30a4-4120-a5f6-a39fec15d794 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updated VIF entry in instance network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 04:51:09 np0005593233 nova_compute[222017]: 2026-01-23 09:51:09.854 222021 DEBUG nova.network.neutron [req-7da2b4c0-6d12-4c83-989b-d1936a74b7ac req-df321e8b-30a4-4120-a5f6-a39fec15d794 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:51:09 np0005593233 nova_compute[222017]: 2026-01-23 09:51:09.887 222021 DEBUG oslo_concurrency.lockutils [req-7da2b4c0-6d12-4c83-989b-d1936a74b7ac req-df321e8b-30a4-4120-a5f6-a39fec15d794 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:51:09 np0005593233 nova_compute[222017]: 2026-01-23 09:51:09.934 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:10.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:10 np0005593233 nova_compute[222017]: 2026-01-23 09:51:10.451 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:10.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:12.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:12.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:13 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:13Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:60:14:59 10.100.0.5
Jan 23 04:51:13 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:13Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:14:59 10.100.0.5
Jan 23 04:51:14 np0005593233 nova_compute[222017]: 2026-01-23 09:51:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:51:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:14.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:14.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:14 np0005593233 nova_compute[222017]: 2026-01-23 09:51:14.855 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "2994e136-9897-43dc-ae85-799cd5a7ff47" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:14 np0005593233 nova_compute[222017]: 2026-01-23 09:51:14.856 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:14 np0005593233 nova_compute[222017]: 2026-01-23 09:51:14.884 222021 DEBUG nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:51:14 np0005593233 nova_compute[222017]: 2026-01-23 09:51:14.937 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:14 np0005593233 nova_compute[222017]: 2026-01-23 09:51:14.991 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:14 np0005593233 nova_compute[222017]: 2026-01-23 09:51:14.991 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:14 np0005593233 nova_compute[222017]: 2026-01-23 09:51:14.998 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:51:14 np0005593233 nova_compute[222017]: 2026-01-23 09:51:14.998 222021 INFO nova.compute.claims [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Claim successful on node compute-1.ctlplane.example.com
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.180 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.386 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161860.3850005, 463b029c-94eb-4160-9199-43759bb23b61 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.387 222021 INFO nova.compute.manager [-] [instance: 463b029c-94eb-4160-9199-43759bb23b61] VM Stopped (Lifecycle Event)
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.439 222021 DEBUG nova.compute.manager [None req-0a38faca-350e-43e4-92ac-19d32975f904 - - - - - -] [instance: 463b029c-94eb-4160-9199-43759bb23b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.478 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3673466893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.680 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.686 222021 DEBUG nova.compute.provider_tree [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.702 222021 DEBUG nova.scheduler.client.report [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.725 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.726 222021 DEBUG nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.774 222021 DEBUG nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.775 222021 DEBUG nova.network.neutron [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.798 222021 INFO nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.826 222021 DEBUG nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.940 222021 DEBUG nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.942 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.943 222021 INFO nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Creating image(s)#033[00m
Jan 23 04:51:15 np0005593233 nova_compute[222017]: 2026-01-23 09:51:15.982 222021 DEBUG nova.storage.rbd_utils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2994e136-9897-43dc-ae85-799cd5a7ff47_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.023 222021 DEBUG nova.storage.rbd_utils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2994e136-9897-43dc-ae85-799cd5a7ff47_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.058 222021 DEBUG nova.storage.rbd_utils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2994e136-9897-43dc-ae85-799cd5a7ff47_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.063 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.133 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.134 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.135 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.135 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.163 222021 DEBUG nova.storage.rbd_utils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2994e136-9897-43dc-ae85-799cd5a7ff47_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.168 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2994e136-9897-43dc-ae85-799cd5a7ff47_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:16.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:16.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.535 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2994e136-9897-43dc-ae85-799cd5a7ff47_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.367s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.626 222021 DEBUG nova.storage.rbd_utils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] resizing rbd image 2994e136-9897-43dc-ae85-799cd5a7ff47_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.731 222021 DEBUG nova.objects.instance [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'migration_context' on Instance uuid 2994e136-9897-43dc-ae85-799cd5a7ff47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.760 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.761 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Ensure instance console log exists: /var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.761 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.762 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.762 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:16 np0005593233 nova_compute[222017]: 2026-01-23 09:51:16.877 222021 DEBUG nova.policy [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28a7a778c8ab486fb586e81bb84113be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61df91981c55482fa5c9a64686c79f9e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:51:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:51:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.8 total, 600.0 interval#012Cumulative writes: 26K writes, 101K keys, 26K commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.03 MB/s#012Cumulative WAL: 26K writes, 9012 syncs, 2.89 writes per sync, written: 0.09 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8944 writes, 32K keys, 8944 commit groups, 1.0 writes per commit group, ingest: 33.83 MB, 0.06 MB/s#012Interval WAL: 8944 writes, 3554 syncs, 2.52 writes per sync, written: 0.03 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 04:51:18 np0005593233 nova_compute[222017]: 2026-01-23 09:51:18.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:18.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:18.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:19 np0005593233 nova_compute[222017]: 2026-01-23 09:51:19.652 222021 DEBUG nova.network.neutron [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Successfully created port: 792a3d37-021b-4048-b93c-f7a4682a56e2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:51:19 np0005593233 nova_compute[222017]: 2026-01-23 09:51:19.940 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:20 np0005593233 nova_compute[222017]: 2026-01-23 09:51:20.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:20 np0005593233 nova_compute[222017]: 2026-01-23 09:51:20.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:20 np0005593233 nova_compute[222017]: 2026-01-23 09:51:20.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:51:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:20.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:20 np0005593233 nova_compute[222017]: 2026-01-23 09:51:20.480 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:20.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:21 np0005593233 nova_compute[222017]: 2026-01-23 09:51:21.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:21 np0005593233 nova_compute[222017]: 2026-01-23 09:51:21.421 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:21 np0005593233 nova_compute[222017]: 2026-01-23 09:51:21.422 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:21 np0005593233 nova_compute[222017]: 2026-01-23 09:51:21.422 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:21 np0005593233 nova_compute[222017]: 2026-01-23 09:51:21.422 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:51:21 np0005593233 nova_compute[222017]: 2026-01-23 09:51:21.423 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:21 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3014972364' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:21 np0005593233 nova_compute[222017]: 2026-01-23 09:51:21.888 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:21 np0005593233 nova_compute[222017]: 2026-01-23 09:51:21.980 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000048 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:51:21 np0005593233 nova_compute[222017]: 2026-01-23 09:51:21.981 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000048 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.147 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.149 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4424MB free_disk=20.892486572265625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.149 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.150 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.358 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance d0bb0470-cc5c-4b6f-be0d-20839267c340 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.359 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 2994e136-9897-43dc-ae85-799cd5a7ff47 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.359 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.359 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:51:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:22.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.438 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:22.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2056395388' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.901 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.909 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.933 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.979 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:51:22 np0005593233 nova_compute[222017]: 2026-01-23 09:51:22.979 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:23 np0005593233 nova_compute[222017]: 2026-01-23 09:51:23.979 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:23 np0005593233 nova_compute[222017]: 2026-01-23 09:51:23.980 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:51:23 np0005593233 nova_compute[222017]: 2026-01-23 09:51:23.980 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.008 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.334 222021 DEBUG nova.network.neutron [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Successfully updated port: 792a3d37-021b-4048-b93c-f7a4682a56e2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:51:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:24.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:24.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.639 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-2994e136-9897-43dc-ae85-799cd5a7ff47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.640 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-2994e136-9897-43dc-ae85-799cd5a7ff47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.640 222021 DEBUG nova.network.neutron [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.790 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.791 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.791 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.791 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d0bb0470-cc5c-4b6f-be0d-20839267c340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.812 222021 DEBUG nova.compute.manager [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Received event network-changed-792a3d37-021b-4048-b93c-f7a4682a56e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.812 222021 DEBUG nova.compute.manager [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Refreshing instance network info cache due to event network-changed-792a3d37-021b-4048-b93c-f7a4682a56e2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.812 222021 DEBUG oslo_concurrency.lockutils [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-2994e136-9897-43dc-ae85-799cd5a7ff47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:24 np0005593233 nova_compute[222017]: 2026-01-23 09:51:24.941 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:25 np0005593233 nova_compute[222017]: 2026-01-23 09:51:25.482 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:25 np0005593233 nova_compute[222017]: 2026-01-23 09:51:25.698 222021 DEBUG nova.network.neutron [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:51:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:26.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:26.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:27 np0005593233 podman[251204]: 2026-01-23 09:51:27.137935313 +0000 UTC m=+0.140246559 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:51:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:28.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:28.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.148 222021 DEBUG nova.network.neutron [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Updating instance_info_cache with network_info: [{"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.176 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-2994e136-9897-43dc-ae85-799cd5a7ff47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.177 222021 DEBUG nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Instance network_info: |[{"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.177 222021 DEBUG oslo_concurrency.lockutils [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-2994e136-9897-43dc-ae85-799cd5a7ff47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.177 222021 DEBUG nova.network.neutron [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Refreshing network info cache for port 792a3d37-021b-4048-b93c-f7a4682a56e2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.181 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Start _get_guest_xml network_info=[{"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.187 222021 WARNING nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.193 222021 DEBUG nova.virt.libvirt.host [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.194 222021 DEBUG nova.virt.libvirt.host [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.201 222021 DEBUG nova.virt.libvirt.host [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.202 222021 DEBUG nova.virt.libvirt.host [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.203 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.204 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.204 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.204 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.205 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.205 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.205 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.206 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.206 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.206 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.206 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.207 222021 DEBUG nova.virt.hardware [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.210 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:51:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1129181476' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.661 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.707 222021 DEBUG nova.storage.rbd_utils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2994e136-9897-43dc-ae85-799cd5a7ff47_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.713 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:29 np0005593233 nova_compute[222017]: 2026-01-23 09:51:29.945 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:51:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1062854653' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.191 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.193 222021 DEBUG nova.virt.libvirt.vif [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:51:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1912230762',display_name='tempest-DeleteServersTestJSON-server-1912230762',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1912230762',id=75,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-k6tz7oqn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:51:15Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=2994e136-9897-43dc-ae85-799cd5a7ff47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.194 222021 DEBUG nova.network.os_vif_util [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.197 222021 DEBUG nova.network.os_vif_util [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:3a:07,bridge_name='br-int',has_traffic_filtering=True,id=792a3d37-021b-4048-b93c-f7a4682a56e2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap792a3d37-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.198 222021 DEBUG nova.objects.instance [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_devices' on Instance uuid 2994e136-9897-43dc-ae85-799cd5a7ff47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.221 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <uuid>2994e136-9897-43dc-ae85-799cd5a7ff47</uuid>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <name>instance-0000004b</name>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <nova:name>tempest-DeleteServersTestJSON-server-1912230762</nova:name>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:51:29</nova:creationTime>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <nova:user uuid="28a7a778c8ab486fb586e81bb84113be">tempest-DeleteServersTestJSON-944070453-project-member</nova:user>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <nova:project uuid="61df91981c55482fa5c9a64686c79f9e">tempest-DeleteServersTestJSON-944070453</nova:project>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <nova:port uuid="792a3d37-021b-4048-b93c-f7a4682a56e2">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <entry name="serial">2994e136-9897-43dc-ae85-799cd5a7ff47</entry>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <entry name="uuid">2994e136-9897-43dc-ae85-799cd5a7ff47</entry>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/2994e136-9897-43dc-ae85-799cd5a7ff47_disk">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/2994e136-9897-43dc-ae85-799cd5a7ff47_disk.config">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:c2:3a:07"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <target dev="tap792a3d37-02"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47/console.log" append="off"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:51:30 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:51:30 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:51:30 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:51:30 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.223 222021 DEBUG nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Preparing to wait for external event network-vif-plugged-792a3d37-021b-4048-b93c-f7a4682a56e2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.224 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.224 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.225 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.225 222021 DEBUG nova.virt.libvirt.vif [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:51:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1912230762',display_name='tempest-DeleteServersTestJSON-server-1912230762',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1912230762',id=75,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-k6tz7oqn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:51:15Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=2994e136-9897-43dc-ae85-799cd5a7ff47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.226 222021 DEBUG nova.network.os_vif_util [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.226 222021 DEBUG nova.network.os_vif_util [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:3a:07,bridge_name='br-int',has_traffic_filtering=True,id=792a3d37-021b-4048-b93c-f7a4682a56e2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap792a3d37-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.227 222021 DEBUG os_vif [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:3a:07,bridge_name='br-int',has_traffic_filtering=True,id=792a3d37-021b-4048-b93c-f7a4682a56e2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap792a3d37-02') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.227 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.228 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.228 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.233 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.234 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap792a3d37-02, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.234 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap792a3d37-02, col_values=(('external_ids', {'iface-id': '792a3d37-021b-4048-b93c-f7a4682a56e2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:3a:07', 'vm-uuid': '2994e136-9897-43dc-ae85-799cd5a7ff47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:30 np0005593233 NetworkManager[48871]: <info>  [1769161890.2786] manager: (tap792a3d37-02): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.283 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.285 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.286 222021 INFO os_vif [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:3a:07,bridge_name='br-int',has_traffic_filtering=True,id=792a3d37-021b-4048-b93c-f7a4682a56e2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap792a3d37-02')#033[00m
Jan 23 04:51:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:30.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.423 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.423 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.424 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No VIF found with MAC fa:16:3e:c2:3a:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.424 222021 INFO nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Using config drive#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.456 222021 DEBUG nova.storage.rbd_utils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2994e136-9897-43dc-ae85-799cd5a7ff47_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:30.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.738 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.941 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.941 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.943 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:30 np0005593233 nova_compute[222017]: 2026-01-23 09:51:30.943 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:31 np0005593233 nova_compute[222017]: 2026-01-23 09:51:31.997 222021 INFO nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Creating config drive at /var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47/disk.config#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.004 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpahxkwc0a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.147 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpahxkwc0a" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.187 222021 DEBUG nova.storage.rbd_utils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2994e136-9897-43dc-ae85-799cd5a7ff47_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.215 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47/disk.config 2994e136-9897-43dc-ae85-799cd5a7ff47_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.392 222021 DEBUG oslo_concurrency.processutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47/disk.config 2994e136-9897-43dc-ae85-799cd5a7ff47_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.177s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.394 222021 INFO nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Deleting local config drive /var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47/disk.config because it was imported into RBD.#033[00m
Jan 23 04:51:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:32.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:32 np0005593233 kernel: tap792a3d37-02: entered promiscuous mode
Jan 23 04:51:32 np0005593233 NetworkManager[48871]: <info>  [1769161892.4705] manager: (tap792a3d37-02): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 23 04:51:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:32Z|00241|binding|INFO|Claiming lport 792a3d37-021b-4048-b93c-f7a4682a56e2 for this chassis.
Jan 23 04:51:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:32Z|00242|binding|INFO|792a3d37-021b-4048-b93c-f7a4682a56e2: Claiming fa:16:3e:c2:3a:07 10.100.0.4
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.471 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.480 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:3a:07 10.100.0.4'], port_security=['fa:16:3e:c2:3a:07 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2994e136-9897-43dc-ae85-799cd5a7ff47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=792a3d37-021b-4048-b93c-f7a4682a56e2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.482 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 792a3d37-021b-4048-b93c-f7a4682a56e2 in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 bound to our chassis#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.483 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3788149-efcd-4940-8a8f-e21af0a56a06#033[00m
Jan 23 04:51:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:32Z|00243|binding|INFO|Setting lport 792a3d37-021b-4048-b93c-f7a4682a56e2 ovn-installed in OVS
Jan 23 04:51:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:32Z|00244|binding|INFO|Setting lport 792a3d37-021b-4048-b93c-f7a4682a56e2 up in Southbound
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.491 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.494 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.499 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[37486dda-dd0c-4414-8a79-9231bd414760]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.500 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3788149-e1 in ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:51:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:32.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.503 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3788149-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.504 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3ccd40cb-dcf3-4f4f-b8cc-ba22fffd4e5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.506 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f6019acc-18ce-4838-a437-e412ea6da786]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 systemd-udevd[251364]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:51:32 np0005593233 NetworkManager[48871]: <info>  [1769161892.5206] device (tap792a3d37-02): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:51:32 np0005593233 NetworkManager[48871]: <info>  [1769161892.5219] device (tap792a3d37-02): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.520 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1da83e-a27c-4472-b9c0-b8506a8026c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 systemd-machined[190954]: New machine qemu-37-instance-0000004b.
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.537 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[70add69b-bd08-4158-ae5b-8096d7e45bb6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 systemd[1]: Started Virtual Machine qemu-37-instance-0000004b.
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.570 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e61d5fe0-0f14-4d43-b4af-be51fd929a87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.575 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[974a81fb-b8d4-4d21-b431-7619d661043f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 NetworkManager[48871]: <info>  [1769161892.5767] manager: (tapa3788149-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/126)
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.615 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e9131904-1659-4119-ba01-f0f2e430ea16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.617 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9946c1-1ff1-4336-bbe8-ba7582f0cbd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 NetworkManager[48871]: <info>  [1769161892.6408] device (tapa3788149-e0): carrier: link connected
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.652 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca45a4e-5d17-49b4-8573-d680bec6ffbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.672 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3938d3f5-f405-452e-9c06-7b6e2b0adaef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580527, 'reachable_time': 27791, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251398, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.693 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3047818c-bdb8-4e03-b55a-311a27e7adc7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:ddff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580527, 'tstamp': 580527}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251399, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.713 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2ead5469-7be1-44d2-bd26-3b89d1fb7b29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580527, 'reachable_time': 27791, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251400, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.748 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9809b530-a367-41e2-94f0-d86541fa9dc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.812 222021 DEBUG nova.compute.manager [req-680e6836-7b4c-441b-aa9d-75c244745095 req-19b87fc9-d962-41ab-b86e-7ae39c6c2a3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Received event network-vif-plugged-792a3d37-021b-4048-b93c-f7a4682a56e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.812 222021 DEBUG oslo_concurrency.lockutils [req-680e6836-7b4c-441b-aa9d-75c244745095 req-19b87fc9-d962-41ab-b86e-7ae39c6c2a3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.813 222021 DEBUG oslo_concurrency.lockutils [req-680e6836-7b4c-441b-aa9d-75c244745095 req-19b87fc9-d962-41ab-b86e-7ae39c6c2a3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.813 222021 DEBUG oslo_concurrency.lockutils [req-680e6836-7b4c-441b-aa9d-75c244745095 req-19b87fc9-d962-41ab-b86e-7ae39c6c2a3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.813 222021 DEBUG nova.compute.manager [req-680e6836-7b4c-441b-aa9d-75c244745095 req-19b87fc9-d962-41ab-b86e-7ae39c6c2a3d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Processing event network-vif-plugged-792a3d37-021b-4048-b93c-f7a4682a56e2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.829 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e3cf0a2f-f2de-49e0-8246-ca8055d1624b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.831 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.831 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.832 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3788149-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.834 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:32 np0005593233 kernel: tapa3788149-e0: entered promiscuous mode
Jan 23 04:51:32 np0005593233 NetworkManager[48871]: <info>  [1769161892.8369] manager: (tapa3788149-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.838 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.839 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3788149-e0, col_values=(('external_ids', {'iface-id': 'd6ce7fd1-128d-488f-94e6-68332f7a8a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:32 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:32Z|00245|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.842 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.858 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.860 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.861 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2dcc674c-1903-426c-a23b-35aa09293a5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.862 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:51:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:32.863 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'env', 'PROCESS_TAG=haproxy-a3788149-efcd-4940-8a8f-e21af0a56a06', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3788149-efcd-4940-8a8f-e21af0a56a06.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.952 222021 DEBUG nova.compute.manager [req-31e8ac00-ff59-4b98-8841-85e534a51bbc req-9d463314-f86d-4185-9d26-820f091a4a09 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.957 222021 DEBUG nova.compute.manager [req-31e8ac00-ff59-4b98-8841-85e534a51bbc req-9d463314-f86d-4185-9d26-820f091a4a09 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing instance network info cache due to event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.957 222021 DEBUG oslo_concurrency.lockutils [req-31e8ac00-ff59-4b98-8841-85e534a51bbc req-9d463314-f86d-4185-9d26-820f091a4a09 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.957 222021 DEBUG oslo_concurrency.lockutils [req-31e8ac00-ff59-4b98-8841-85e534a51bbc req-9d463314-f86d-4185-9d26-820f091a4a09 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:51:32 np0005593233 nova_compute[222017]: 2026-01-23 09:51:32.958 222021 DEBUG nova.network.neutron [req-31e8ac00-ff59-4b98-8841-85e534a51bbc req-9d463314-f86d-4185-9d26-820f091a4a09 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.063 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161893.0631857, 2994e136-9897-43dc-ae85-799cd5a7ff47 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.064 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] VM Started (Lifecycle Event)
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.068 222021 DEBUG nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.073 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.077 222021 INFO nova.virt.libvirt.driver [-] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Instance spawned successfully.
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.078 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.120 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.129 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.135 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.136 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.137 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.138 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.138 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.139 222021 DEBUG nova.virt.libvirt.driver [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.196 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.199 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161893.0671515, 2994e136-9897-43dc-ae85-799cd5a7ff47 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.200 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] VM Paused (Lifecycle Event)
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.240 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.243 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161893.0734546, 2994e136-9897-43dc-ae85-799cd5a7ff47 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.244 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] VM Resumed (Lifecycle Event)
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.249 222021 INFO nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Took 17.31 seconds to spawn the instance on the hypervisor.
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.250 222021 DEBUG nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:33 np0005593233 podman[251472]: 2026-01-23 09:51:33.299175851 +0000 UTC m=+0.065877763 container create 7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.300 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.313 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:51:33 np0005593233 systemd[1]: Started libpod-conmon-7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db.scope.
Jan 23 04:51:33 np0005593233 podman[251472]: 2026-01-23 09:51:33.266415175 +0000 UTC m=+0.033117147 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.379 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:51:33 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.387 222021 INFO nova.compute.manager [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Took 18.44 seconds to build instance.
Jan 23 04:51:33 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e27398a07abe95c83e67395f05b66e539c4fd703e2f3adddabda904c21ec88b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:33 np0005593233 podman[251472]: 2026-01-23 09:51:33.404297935 +0000 UTC m=+0.170999897 container init 7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:51:33 np0005593233 podman[251472]: 2026-01-23 09:51:33.410102121 +0000 UTC m=+0.176804063 container start 7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:51:33 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[251487]: [NOTICE]   (251491) : New worker (251493) forked
Jan 23 04:51:33 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[251487]: [NOTICE]   (251491) : Loading success.
Jan 23 04:51:33 np0005593233 nova_compute[222017]: 2026-01-23 09:51:33.443 222021 DEBUG oslo_concurrency.lockutils [None req-4e727082-da64-4f9e-a3ff-d45ef1d2c383 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:34 np0005593233 nova_compute[222017]: 2026-01-23 09:51:34.280 222021 DEBUG nova.network.neutron [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Updated VIF entry in instance network info cache for port 792a3d37-021b-4048-b93c-f7a4682a56e2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 04:51:34 np0005593233 nova_compute[222017]: 2026-01-23 09:51:34.281 222021 DEBUG nova.network.neutron [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Updating instance_info_cache with network_info: [{"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:51:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:34.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:34.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:34 np0005593233 nova_compute[222017]: 2026-01-23 09:51:34.681 222021 DEBUG oslo_concurrency.lockutils [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-2994e136-9897-43dc-ae85-799cd5a7ff47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:51:34 np0005593233 nova_compute[222017]: 2026-01-23 09:51:34.954 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.056 222021 INFO nova.compute.manager [None req-f6c9dd95-5209-493e-b4e3-787524f3b6b5 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Pausing
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.057 222021 DEBUG nova.objects.instance [None req-f6c9dd95-5209-493e-b4e3-787524f3b6b5 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'flavor' on Instance uuid 2994e136-9897-43dc-ae85-799cd5a7ff47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.161 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161895.1614802, 2994e136-9897-43dc-ae85-799cd5a7ff47 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.162 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] VM Paused (Lifecycle Event)
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.164 222021 DEBUG nova.compute.manager [None req-f6c9dd95-5209-493e-b4e3-787524f3b6b5 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.219 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.225 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.248 222021 DEBUG nova.compute.manager [req-0a35356e-b925-45b7-b7ea-53418c6ea49e req-e4505709-d377-4ce5-a713-85b7acc72632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Received event network-vif-plugged-792a3d37-021b-4048-b93c-f7a4682a56e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.248 222021 DEBUG oslo_concurrency.lockutils [req-0a35356e-b925-45b7-b7ea-53418c6ea49e req-e4505709-d377-4ce5-a713-85b7acc72632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.249 222021 DEBUG oslo_concurrency.lockutils [req-0a35356e-b925-45b7-b7ea-53418c6ea49e req-e4505709-d377-4ce5-a713-85b7acc72632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.250 222021 DEBUG oslo_concurrency.lockutils [req-0a35356e-b925-45b7-b7ea-53418c6ea49e req-e4505709-d377-4ce5-a713-85b7acc72632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.250 222021 DEBUG nova.compute.manager [req-0a35356e-b925-45b7-b7ea-53418c6ea49e req-e4505709-d377-4ce5-a713-85b7acc72632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] No waiting events found dispatching network-vif-plugged-792a3d37-021b-4048-b93c-f7a4682a56e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.251 222021 WARNING nova.compute.manager [req-0a35356e-b925-45b7-b7ea-53418c6ea49e req-e4505709-d377-4ce5-a713-85b7acc72632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Received unexpected event network-vif-plugged-792a3d37-021b-4048-b93c-f7a4682a56e2 for instance with vm_state active and task_state pausing.
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.277 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:35 np0005593233 nova_compute[222017]: 2026-01-23 09:51:35.283 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 23 04:51:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:35Z|00246|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:51:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:35Z|00247|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.082 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:51:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:36.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.451 222021 DEBUG oslo_concurrency.lockutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "2994e136-9897-43dc-ae85-799cd5a7ff47" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.452 222021 DEBUG oslo_concurrency.lockutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.453 222021 DEBUG oslo_concurrency.lockutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.453 222021 DEBUG oslo_concurrency.lockutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.453 222021 DEBUG oslo_concurrency.lockutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.454 222021 INFO nova.compute.manager [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Terminating instance
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.455 222021 DEBUG nova.compute.manager [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 04:51:36 np0005593233 kernel: tap792a3d37-02 (unregistering): left promiscuous mode
Jan 23 04:51:36 np0005593233 NetworkManager[48871]: <info>  [1769161896.5022] device (tap792a3d37-02): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.508 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:36 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:36Z|00248|binding|INFO|Releasing lport 792a3d37-021b-4048-b93c-f7a4682a56e2 from this chassis (sb_readonly=0)
Jan 23 04:51:36 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:36Z|00249|binding|INFO|Setting lport 792a3d37-021b-4048-b93c-f7a4682a56e2 down in Southbound
Jan 23 04:51:36 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:36Z|00250|binding|INFO|Removing iface tap792a3d37-02 ovn-installed in OVS
Jan 23 04:51:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:36.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.511 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.518 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:3a:07 10.100.0.4'], port_security=['fa:16:3e:c2:3a:07 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2994e136-9897-43dc-ae85-799cd5a7ff47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=792a3d37-021b-4048-b93c-f7a4682a56e2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.520 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 792a3d37-021b-4048-b93c-f7a4682a56e2 in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.522 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.524 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e3604071-bdd3-4166-af28-cf0ed975d10a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.525 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace which is not needed anymore
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.531 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:36 np0005593233 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Jan 23 04:51:36 np0005593233 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004b.scope: Consumed 2.769s CPU time.
Jan 23 04:51:36 np0005593233 systemd-machined[190954]: Machine qemu-37-instance-0000004b terminated.
Jan 23 04:51:36 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[251487]: [NOTICE]   (251491) : haproxy version is 2.8.14-c23fe91
Jan 23 04:51:36 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[251487]: [NOTICE]   (251491) : path to executable is /usr/sbin/haproxy
Jan 23 04:51:36 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[251487]: [WARNING]  (251491) : Exiting Master process...
Jan 23 04:51:36 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[251487]: [WARNING]  (251491) : Exiting Master process...
Jan 23 04:51:36 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[251487]: [ALERT]    (251491) : Current worker (251493) exited with code 143 (Terminated)
Jan 23 04:51:36 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[251487]: [WARNING]  (251491) : All workers exited. Exiting... (0)
Jan 23 04:51:36 np0005593233 systemd[1]: libpod-7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db.scope: Deactivated successfully.
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.680 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:36 np0005593233 podman[251527]: 2026-01-23 09:51:36.682869866 +0000 UTC m=+0.045809690 container died 7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.696 222021 INFO nova.virt.libvirt.driver [-] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Instance destroyed successfully.#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.696 222021 DEBUG nova.objects.instance [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'resources' on Instance uuid 2994e136-9897-43dc-ae85-799cd5a7ff47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:36 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db-userdata-shm.mount: Deactivated successfully.
Jan 23 04:51:36 np0005593233 systemd[1]: var-lib-containers-storage-overlay-e27398a07abe95c83e67395f05b66e539c4fd703e2f3adddabda904c21ec88b5-merged.mount: Deactivated successfully.
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.721 222021 DEBUG nova.virt.libvirt.vif [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:51:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1912230762',display_name='tempest-DeleteServersTestJSON-server-1912230762',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1912230762',id=75,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:51:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-k6tz7oqn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:51:35Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=2994e136-9897-43dc-ae85-799cd5a7ff47,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.722 222021 DEBUG nova.network.os_vif_util [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "792a3d37-021b-4048-b93c-f7a4682a56e2", "address": "fa:16:3e:c2:3a:07", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap792a3d37-02", "ovs_interfaceid": "792a3d37-021b-4048-b93c-f7a4682a56e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:36 np0005593233 podman[251527]: 2026-01-23 09:51:36.724019612 +0000 UTC m=+0.086959436 container cleanup 7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.723 222021 DEBUG nova.network.os_vif_util [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:3a:07,bridge_name='br-int',has_traffic_filtering=True,id=792a3d37-021b-4048-b93c-f7a4682a56e2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap792a3d37-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.724 222021 DEBUG os_vif [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:3a:07,bridge_name='br-int',has_traffic_filtering=True,id=792a3d37-021b-4048-b93c-f7a4682a56e2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap792a3d37-02') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.728 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.729 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap792a3d37-02, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.731 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.733 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:36 np0005593233 systemd[1]: libpod-conmon-7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db.scope: Deactivated successfully.
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.736 222021 INFO os_vif [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:3a:07,bridge_name='br-int',has_traffic_filtering=True,id=792a3d37-021b-4048-b93c-f7a4682a56e2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap792a3d37-02')#033[00m
Jan 23 04:51:36 np0005593233 podman[251569]: 2026-01-23 09:51:36.793095036 +0000 UTC m=+0.045499151 container remove 7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.801 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e35f84-310b-4a4f-b8c3-25ebf5b3e4f2]: (4, ('Fri Jan 23 09:51:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db)\n7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db\nFri Jan 23 09:51:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db)\n7893ac66bbaf1b926f98ed0b703d4c0da56fb2355a474de1657a6fdb7ed599db\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.804 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b0439ca9-ce8f-4263-bd5c-b48b52e1e0ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.805 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.808 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:36 np0005593233 kernel: tapa3788149-e0: left promiscuous mode
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.828 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.831 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ea86519c-1daa-457a-a03d-a615577dcf1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.852 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5778b955-2754-4dbc-b763-893daf9cbd3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.854 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d51242da-d51c-4808-abc5-03b29c2aa939]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.876 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[95f8d8fa-33f6-47e4-a641-7644769ca2bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580519, 'reachable_time': 28872, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251603, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.880 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:51:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:36.880 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[4d3c0f01-6272-426d-ba69-67b1f65a861f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:36 np0005593233 systemd[1]: run-netns-ovnmeta\x2da3788149\x2defcd\x2d4940\x2d8a8f\x2de21af0a56a06.mount: Deactivated successfully.
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.937 222021 DEBUG nova.network.neutron [req-31e8ac00-ff59-4b98-8841-85e534a51bbc req-9d463314-f86d-4185-9d26-820f091a4a09 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updated VIF entry in instance network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.938 222021 DEBUG nova.network.neutron [req-31e8ac00-ff59-4b98-8841-85e534a51bbc req-9d463314-f86d-4185-9d26-820f091a4a09 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:36 np0005593233 nova_compute[222017]: 2026-01-23 09:51:36.966 222021 DEBUG oslo_concurrency.lockutils [req-31e8ac00-ff59-4b98-8841-85e534a51bbc req-9d463314-f86d-4185-9d26-820f091a4a09 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.229 222021 INFO nova.virt.libvirt.driver [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Deleting instance files /var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47_del#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.230 222021 INFO nova.virt.libvirt.driver [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Deletion of /var/lib/nova/instances/2994e136-9897-43dc-ae85-799cd5a7ff47_del complete#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.299 222021 INFO nova.compute.manager [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Took 0.84 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.300 222021 DEBUG oslo.service.loopingcall [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.300 222021 DEBUG nova.compute.manager [-] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.301 222021 DEBUG nova.network.neutron [-] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.342 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.342 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.390 222021 DEBUG nova.compute.manager [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Received event network-vif-unplugged-792a3d37-021b-4048-b93c-f7a4682a56e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.390 222021 DEBUG oslo_concurrency.lockutils [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.391 222021 DEBUG oslo_concurrency.lockutils [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.391 222021 DEBUG oslo_concurrency.lockutils [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.391 222021 DEBUG nova.compute.manager [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] No waiting events found dispatching network-vif-unplugged-792a3d37-021b-4048-b93c-f7a4682a56e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.391 222021 DEBUG nova.compute.manager [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Received event network-vif-unplugged-792a3d37-021b-4048-b93c-f7a4682a56e2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.391 222021 DEBUG nova.compute.manager [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Received event network-vif-plugged-792a3d37-021b-4048-b93c-f7a4682a56e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.392 222021 DEBUG oslo_concurrency.lockutils [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.392 222021 DEBUG oslo_concurrency.lockutils [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.392 222021 DEBUG oslo_concurrency.lockutils [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.392 222021 DEBUG nova.compute.manager [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] No waiting events found dispatching network-vif-plugged-792a3d37-021b-4048-b93c-f7a4682a56e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:37 np0005593233 nova_compute[222017]: 2026-01-23 09:51:37.392 222021 WARNING nova.compute.manager [req-28ee0efb-17a6-4059-8fb1-1a1fedf65b95 req-959af47c-4c7c-4a0d-a062-ca792f35e660 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Received unexpected event network-vif-plugged-792a3d37-021b-4048-b93c-f7a4682a56e2 for instance with vm_state paused and task_state deleting.#033[00m
Jan 23 04:51:38 np0005593233 nova_compute[222017]: 2026-01-23 09:51:38.400 222021 DEBUG nova.network.neutron [-] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:38.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:38 np0005593233 nova_compute[222017]: 2026-01-23 09:51:38.445 222021 INFO nova.compute.manager [-] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Took 1.14 seconds to deallocate network for instance.#033[00m
Jan 23 04:51:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:38 np0005593233 nova_compute[222017]: 2026-01-23 09:51:38.512 222021 DEBUG oslo_concurrency.lockutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:38.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:38 np0005593233 nova_compute[222017]: 2026-01-23 09:51:38.513 222021 DEBUG oslo_concurrency.lockutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:38 np0005593233 nova_compute[222017]: 2026-01-23 09:51:38.530 222021 DEBUG nova.compute.manager [req-adfccab1-c310-4f2e-ad8f-4da415eb0818 req-c9b1037b-bf7a-40bc-8693-9d0f4c3b1fa7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Received event network-vif-deleted-792a3d37-021b-4048-b93c-f7a4682a56e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:38 np0005593233 nova_compute[222017]: 2026-01-23 09:51:38.609 222021 DEBUG oslo_concurrency.processutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3167954203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:39 np0005593233 nova_compute[222017]: 2026-01-23 09:51:39.087 222021 DEBUG oslo_concurrency.processutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:39 np0005593233 nova_compute[222017]: 2026-01-23 09:51:39.098 222021 DEBUG nova.compute.provider_tree [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:51:39 np0005593233 nova_compute[222017]: 2026-01-23 09:51:39.131 222021 DEBUG nova.scheduler.client.report [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:51:39 np0005593233 nova_compute[222017]: 2026-01-23 09:51:39.171 222021 DEBUG oslo_concurrency.lockutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:39 np0005593233 nova_compute[222017]: 2026-01-23 09:51:39.203 222021 INFO nova.scheduler.client.report [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Deleted allocations for instance 2994e136-9897-43dc-ae85-799cd5a7ff47#033[00m
Jan 23 04:51:39 np0005593233 nova_compute[222017]: 2026-01-23 09:51:39.301 222021 DEBUG oslo_concurrency.lockutils [None req-76b54871-f174-4abf-b1d4-c495a0e18ae0 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2994e136-9897-43dc-ae85-799cd5a7ff47" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:39 np0005593233 nova_compute[222017]: 2026-01-23 09:51:39.957 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:40 np0005593233 podman[251627]: 2026-01-23 09:51:40.046030975 +0000 UTC m=+0.058712899 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:51:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:40.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:40.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:40 np0005593233 nova_compute[222017]: 2026-01-23 09:51:40.630 222021 DEBUG oslo_concurrency.lockutils [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "interface-d0bb0470-cc5c-4b6f-be0d-20839267c340-af80eab2-c3b9-439d-baae-ee0d90b6cdda" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:40 np0005593233 nova_compute[222017]: 2026-01-23 09:51:40.631 222021 DEBUG oslo_concurrency.lockutils [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-d0bb0470-cc5c-4b6f-be0d-20839267c340-af80eab2-c3b9-439d-baae-ee0d90b6cdda" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:40 np0005593233 nova_compute[222017]: 2026-01-23 09:51:40.631 222021 DEBUG nova.objects.instance [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'flavor' on Instance uuid d0bb0470-cc5c-4b6f-be0d-20839267c340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:41 np0005593233 nova_compute[222017]: 2026-01-23 09:51:41.439 222021 DEBUG nova.objects.instance [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'pci_requests' on Instance uuid d0bb0470-cc5c-4b6f-be0d-20839267c340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:41 np0005593233 nova_compute[222017]: 2026-01-23 09:51:41.733 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:42 np0005593233 nova_compute[222017]: 2026-01-23 09:51:42.104 222021 DEBUG nova.network.neutron [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:51:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:42.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:42.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:42.654 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:42.654 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:42.655 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:42 np0005593233 nova_compute[222017]: 2026-01-23 09:51:42.811 222021 DEBUG nova.policy [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77cda1e9a0404425a06c34637e696603', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '390d19f683334995a5268cf9b4d5e464', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:51:43 np0005593233 nova_compute[222017]: 2026-01-23 09:51:43.460 222021 DEBUG nova.compute.manager [req-f539c36e-b1ed-4621-9799-8f79f5657b91 req-ae27e069-7468-4639-8323-4595c0328616 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:43 np0005593233 nova_compute[222017]: 2026-01-23 09:51:43.460 222021 DEBUG nova.compute.manager [req-f539c36e-b1ed-4621-9799-8f79f5657b91 req-ae27e069-7468-4639-8323-4595c0328616 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing instance network info cache due to event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:51:43 np0005593233 nova_compute[222017]: 2026-01-23 09:51:43.461 222021 DEBUG oslo_concurrency.lockutils [req-f539c36e-b1ed-4621-9799-8f79f5657b91 req-ae27e069-7468-4639-8323-4595c0328616 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:43 np0005593233 nova_compute[222017]: 2026-01-23 09:51:43.461 222021 DEBUG oslo_concurrency.lockutils [req-f539c36e-b1ed-4621-9799-8f79f5657b91 req-ae27e069-7468-4639-8323-4595c0328616 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:43 np0005593233 nova_compute[222017]: 2026-01-23 09:51:43.462 222021 DEBUG nova.network.neutron [req-f539c36e-b1ed-4621-9799-8f79f5657b91 req-ae27e069-7468-4639-8323-4595c0328616 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:51:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:44.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:44.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:44 np0005593233 nova_compute[222017]: 2026-01-23 09:51:44.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.090 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "2821a66b-54cd-4ffc-9b8f-317909716a0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.091 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.114 222021 DEBUG nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.164 222021 DEBUG nova.network.neutron [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Successfully updated port: af80eab2-c3b9-439d-baae-ee0d90b6cdda _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.214 222021 DEBUG oslo_concurrency.lockutils [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.230 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.231 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.237 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.238 222021 INFO nova.compute.claims [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.302 222021 DEBUG nova.compute.manager [req-2321b1fc-25cf-4110-908a-e89332e9cda1 req-67682703-17a9-4c61-bea9-d80a042de5e6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-changed-af80eab2-c3b9-439d-baae-ee0d90b6cdda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.302 222021 DEBUG nova.compute.manager [req-2321b1fc-25cf-4110-908a-e89332e9cda1 req-67682703-17a9-4c61-bea9-d80a042de5e6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing instance network info cache due to event network-changed-af80eab2-c3b9-439d-baae-ee0d90b6cdda. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.302 222021 DEBUG oslo_concurrency.lockutils [req-2321b1fc-25cf-4110-908a-e89332e9cda1 req-67682703-17a9-4c61-bea9-d80a042de5e6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:45 np0005593233 nova_compute[222017]: 2026-01-23 09:51:45.478 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2245273769' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.045 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.568s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.053 222021 DEBUG nova.compute.provider_tree [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.081 222021 DEBUG nova.scheduler.client.report [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.120 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.122 222021 DEBUG nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.183 222021 DEBUG nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.184 222021 DEBUG nova.network.neutron [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.212 222021 INFO nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.261 222021 DEBUG nova.network.neutron [req-f539c36e-b1ed-4621-9799-8f79f5657b91 req-ae27e069-7468-4639-8323-4595c0328616 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updated VIF entry in instance network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.262 222021 DEBUG nova.network.neutron [req-f539c36e-b1ed-4621-9799-8f79f5657b91 req-ae27e069-7468-4639-8323-4595c0328616 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.281 222021 DEBUG nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.304 222021 DEBUG oslo_concurrency.lockutils [req-f539c36e-b1ed-4621-9799-8f79f5657b91 req-ae27e069-7468-4639-8323-4595c0328616 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.306 222021 DEBUG oslo_concurrency.lockutils [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.306 222021 DEBUG nova.network.neutron [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:51:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:46.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.449 222021 DEBUG nova.policy [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28a7a778c8ab486fb586e81bb84113be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61df91981c55482fa5c9a64686c79f9e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:51:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.526 222021 DEBUG nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:51:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:46.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.527 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.528 222021 INFO nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Creating image(s)#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.557 222021 DEBUG nova.storage.rbd_utils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.585 222021 DEBUG nova.storage.rbd_utils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.611 222021 DEBUG nova.storage.rbd_utils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.615 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.686 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.687 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.687 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.688 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.711 222021 DEBUG nova.storage.rbd_utils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.716 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.744 222021 WARNING nova.network.neutron [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] 7808328e-22f9-46df-ac06-f8c3d6ad10c4 already exists in list: networks containing: ['7808328e-22f9-46df-ac06-f8c3d6ad10c4']. ignoring it#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.749 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:46 np0005593233 nova_compute[222017]: 2026-01-23 09:51:46.983 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.267s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:47 np0005593233 nova_compute[222017]: 2026-01-23 09:51:47.075 222021 DEBUG nova.storage.rbd_utils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] resizing rbd image 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:51:47 np0005593233 nova_compute[222017]: 2026-01-23 09:51:47.184 222021 DEBUG nova.objects.instance [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'migration_context' on Instance uuid 2821a66b-54cd-4ffc-9b8f-317909716a0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:47 np0005593233 nova_compute[222017]: 2026-01-23 09:51:47.215 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:51:47 np0005593233 nova_compute[222017]: 2026-01-23 09:51:47.216 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Ensure instance console log exists: /var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:51:47 np0005593233 nova_compute[222017]: 2026-01-23 09:51:47.216 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:47 np0005593233 nova_compute[222017]: 2026-01-23 09:51:47.217 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:47 np0005593233 nova_compute[222017]: 2026-01-23 09:51:47.217 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:47 np0005593233 nova_compute[222017]: 2026-01-23 09:51:47.735 222021 DEBUG nova.network.neutron [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Successfully created port: 0190cadd-3cd8-481a-b2c9-bc91136daab2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:51:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:48.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:48.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.384 222021 DEBUG nova.network.neutron [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Successfully updated port: 0190cadd-3cd8-481a-b2c9-bc91136daab2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.404 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.404 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.405 222021 DEBUG nova.network.neutron [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.551 222021 DEBUG nova.compute.manager [req-b0bdc698-7590-4621-a452-6cb48e17ba10 req-32293601-ca8d-4cb2-a403-a1f704e6db97 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Received event network-changed-0190cadd-3cd8-481a-b2c9-bc91136daab2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.552 222021 DEBUG nova.compute.manager [req-b0bdc698-7590-4621-a452-6cb48e17ba10 req-32293601-ca8d-4cb2-a403-a1f704e6db97 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Refreshing instance network info cache due to event network-changed-0190cadd-3cd8-481a-b2c9-bc91136daab2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.552 222021 DEBUG oslo_concurrency.lockutils [req-b0bdc698-7590-4621-a452-6cb48e17ba10 req-32293601-ca8d-4cb2-a403-a1f704e6db97 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.709 222021 DEBUG nova.network.neutron [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.991 222021 DEBUG nova.network.neutron [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:49 np0005593233 nova_compute[222017]: 2026-01-23 09:51:49.994 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.016 222021 DEBUG oslo_concurrency.lockutils [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.018 222021 DEBUG oslo_concurrency.lockutils [req-2321b1fc-25cf-4110-908a-e89332e9cda1 req-67682703-17a9-4c61-bea9-d80a042de5e6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.018 222021 DEBUG nova.network.neutron [req-2321b1fc-25cf-4110-908a-e89332e9cda1 req-67682703-17a9-4c61-bea9-d80a042de5e6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing network info cache for port af80eab2-c3b9-439d-baae-ee0d90b6cdda _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.023 222021 DEBUG nova.virt.libvirt.vif [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-687494396',display_name='tempest-tempest.common.compute-instance-687494396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-687494396',id=72,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBISF6L8g87ZfxLrm8Wwm+gzemsck5aetIhd8gCsjpNrTc2Fv/no3h23xzReyi9tgvOePkWLat/BN4ukRmY5i9SKOoCvqi25H2ncCjSqcqS+cT6X1PkedlTAGxBrEwc2adg==',key_name='tempest-keypair-1775870371',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-ab0xdcxv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:50:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=d0bb0470-cc5c-4b6f-be0d-20839267c340,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.024 222021 DEBUG nova.network.os_vif_util [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.025 222021 DEBUG nova.network.os_vif_util [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:38:92,bridge_name='br-int',has_traffic_filtering=True,id=af80eab2-c3b9-439d-baae-ee0d90b6cdda,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaf80eab2-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.026 222021 DEBUG os_vif [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:38:92,bridge_name='br-int',has_traffic_filtering=True,id=af80eab2-c3b9-439d-baae-ee0d90b6cdda,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaf80eab2-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.027 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.027 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.030 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.030 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf80eab2-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.031 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf80eab2-c3, col_values=(('external_ids', {'iface-id': 'af80eab2-c3b9-439d-baae-ee0d90b6cdda', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:38:92', 'vm-uuid': 'd0bb0470-cc5c-4b6f-be0d-20839267c340'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.032 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 NetworkManager[48871]: <info>  [1769161910.0334] manager: (tapaf80eab2-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.035 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.040 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.042 222021 INFO os_vif [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:38:92,bridge_name='br-int',has_traffic_filtering=True,id=af80eab2-c3b9-439d-baae-ee0d90b6cdda,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaf80eab2-c3')#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.043 222021 DEBUG nova.virt.libvirt.vif [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-687494396',display_name='tempest-tempest.common.compute-instance-687494396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-687494396',id=72,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBISF6L8g87ZfxLrm8Wwm+gzemsck5aetIhd8gCsjpNrTc2Fv/no3h23xzReyi9tgvOePkWLat/BN4ukRmY5i9SKOoCvqi25H2ncCjSqcqS+cT6X1PkedlTAGxBrEwc2adg==',key_name='tempest-keypair-1775870371',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-ab0xdcxv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:50:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=d0bb0470-cc5c-4b6f-be0d-20839267c340,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.044 222021 DEBUG nova.network.os_vif_util [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.045 222021 DEBUG nova.network.os_vif_util [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:38:92,bridge_name='br-int',has_traffic_filtering=True,id=af80eab2-c3b9-439d-baae-ee0d90b6cdda,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaf80eab2-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.049 222021 DEBUG nova.virt.libvirt.guest [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] attach device xml: <interface type="ethernet">
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <mac address="fa:16:3e:7d:38:92"/>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <model type="virtio"/>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <mtu size="1442"/>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <target dev="tapaf80eab2-c3"/>
Jan 23 04:51:50 np0005593233 nova_compute[222017]: </interface>
Jan 23 04:51:50 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:51:50 np0005593233 kernel: tapaf80eab2-c3: entered promiscuous mode
Jan 23 04:51:50 np0005593233 NetworkManager[48871]: <info>  [1769161910.0656] manager: (tapaf80eab2-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/129)
Jan 23 04:51:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:50Z|00251|binding|INFO|Claiming lport af80eab2-c3b9-439d-baae-ee0d90b6cdda for this chassis.
Jan 23 04:51:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:50Z|00252|binding|INFO|af80eab2-c3b9-439d-baae-ee0d90b6cdda: Claiming fa:16:3e:7d:38:92 10.100.0.12
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.066 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:50Z|00253|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.086 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:38:92 10.100.0.12'], port_security=['fa:16:3e:7d:38:92 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-2098344084', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd0bb0470-cc5c-4b6f-be0d-20839267c340', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-2098344084', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=af80eab2-c3b9-439d-baae-ee0d90b6cdda) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.087 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.088 140224 INFO neutron.agent.ovn.metadata.agent [-] Port af80eab2-c3b9-439d-baae-ee0d90b6cdda in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 bound to our chassis#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.090 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.101 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 systemd-udevd[251842]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.109 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f466df-9628-423a-9b4d-92ef2f55c3e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:50 np0005593233 NetworkManager[48871]: <info>  [1769161910.1230] device (tapaf80eab2-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:51:50 np0005593233 NetworkManager[48871]: <info>  [1769161910.1240] device (tapaf80eab2-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:51:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:50Z|00254|binding|INFO|Setting lport af80eab2-c3b9-439d-baae-ee0d90b6cdda ovn-installed in OVS
Jan 23 04:51:50 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:50Z|00255|binding|INFO|Setting lport af80eab2-c3b9-439d-baae-ee0d90b6cdda up in Southbound
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.129 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.149 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[1fbd6c99-36f3-496f-8555-b4b4a025e75e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.153 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a6bd3e76-e42e-40cc-a002-57cf2b46ed40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.178 222021 DEBUG nova.virt.libvirt.driver [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.178 222021 DEBUG nova.virt.libvirt.driver [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.179 222021 DEBUG nova.virt.libvirt.driver [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:60:14:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.179 222021 DEBUG nova.virt.libvirt.driver [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:7d:38:92, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.199 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6f345290-25ca-4b61-aa6b-eb5566401ad1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.212 222021 DEBUG nova.virt.libvirt.guest [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <nova:name>tempest-tempest.common.compute-instance-687494396</nova:name>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <nova:creationTime>2026-01-23 09:51:50</nova:creationTime>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <nova:flavor name="m1.nano">
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    <nova:memory>128</nova:memory>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    <nova:disk>1</nova:disk>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    <nova:swap>0</nova:swap>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  </nova:flavor>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <nova:owner>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  </nova:owner>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  <nova:ports>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    <nova:port uuid="e137f0ac-1409-48af-9c44-4d589d8b9bf9">
Jan 23 04:51:50 np0005593233 nova_compute[222017]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    </nova:port>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    <nova:port uuid="af80eab2-c3b9-439d-baae-ee0d90b6cdda">
Jan 23 04:51:50 np0005593233 nova_compute[222017]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:    </nova:port>
Jan 23 04:51:50 np0005593233 nova_compute[222017]:  </nova:ports>
Jan 23 04:51:50 np0005593233 nova_compute[222017]: </nova:instance>
Jan 23 04:51:50 np0005593233 nova_compute[222017]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.224 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fd011c70-0250-4eef-b9c6-296390169a23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577112, 'reachable_time': 35670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251849, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.243 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[948ba14a-3fa7-4797-aa06-215995440b40]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577125, 'tstamp': 577125}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251850, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577130, 'tstamp': 577130}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251850, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.244 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.246 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.248 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.248 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.249 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:50.249 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.249 222021 DEBUG oslo_concurrency.lockutils [None req-77928296-b6e0-47cd-a626-8dd4cdc2affb 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-d0bb0470-cc5c-4b6f-be0d-20839267c340-af80eab2-c3b9-439d-baae-ee0d90b6cdda" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:50 np0005593233 nova_compute[222017]: 2026-01-23 09:51:50.250 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:50.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:50.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.478 222021 DEBUG nova.network.neutron [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Updating instance_info_cache with network_info: [{"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:51 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:51Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:38:92 10.100.0.12
Jan 23 04:51:51 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:51Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:38:92 10.100.0.12
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.502 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.503 222021 DEBUG nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Instance network_info: |[{"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.503 222021 DEBUG oslo_concurrency.lockutils [req-b0bdc698-7590-4621-a452-6cb48e17ba10 req-32293601-ca8d-4cb2-a403-a1f704e6db97 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.503 222021 DEBUG nova.network.neutron [req-b0bdc698-7590-4621-a452-6cb48e17ba10 req-32293601-ca8d-4cb2-a403-a1f704e6db97 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Refreshing network info cache for port 0190cadd-3cd8-481a-b2c9-bc91136daab2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.505 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Start _get_guest_xml network_info=[{"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.510 222021 WARNING nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.517 222021 DEBUG nova.virt.libvirt.host [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.517 222021 DEBUG nova.virt.libvirt.host [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.525 222021 DEBUG nova.virt.libvirt.host [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.526 222021 DEBUG nova.virt.libvirt.host [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.527 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.527 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.528 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.528 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.528 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.529 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.529 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.529 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.529 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.530 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.530 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.530 222021 DEBUG nova.virt.hardware [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.533 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.694 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161896.6931918, 2994e136-9897-43dc-ae85-799cd5a7ff47 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.695 222021 INFO nova.compute.manager [-] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.701 222021 DEBUG nova.compute.manager [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-plugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.702 222021 DEBUG oslo_concurrency.lockutils [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.702 222021 DEBUG oslo_concurrency.lockutils [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.702 222021 DEBUG oslo_concurrency.lockutils [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.702 222021 DEBUG nova.compute.manager [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] No waiting events found dispatching network-vif-plugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.703 222021 WARNING nova.compute.manager [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received unexpected event network-vif-plugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda for instance with vm_state active and task_state None.#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.703 222021 DEBUG nova.compute.manager [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-plugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.703 222021 DEBUG oslo_concurrency.lockutils [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.703 222021 DEBUG oslo_concurrency.lockutils [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.703 222021 DEBUG oslo_concurrency.lockutils [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.704 222021 DEBUG nova.compute.manager [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] No waiting events found dispatching network-vif-plugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.704 222021 WARNING nova.compute.manager [req-68b7350e-6687-429f-8ab5-590f584e11c2 req-271f22b8-f8d4-451d-9d2e-361fb8f02043 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received unexpected event network-vif-plugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda for instance with vm_state active and task_state None.#033[00m
Jan 23 04:51:51 np0005593233 nova_compute[222017]: 2026-01-23 09:51:51.725 222021 DEBUG nova.compute.manager [None req-f9a94a01-f5a8-420a-82c7-7056ee4999a4 - - - - - -] [instance: 2994e136-9897-43dc-ae85-799cd5a7ff47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:51:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1651587413' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.023 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.053 222021 DEBUG nova.storage.rbd_utils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.058 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.262 222021 DEBUG oslo_concurrency.lockutils [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "interface-d0bb0470-cc5c-4b6f-be0d-20839267c340-af80eab2-c3b9-439d-baae-ee0d90b6cdda" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.263 222021 DEBUG oslo_concurrency.lockutils [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-d0bb0470-cc5c-4b6f-be0d-20839267c340-af80eab2-c3b9-439d-baae-ee0d90b6cdda" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.284 222021 DEBUG nova.objects.instance [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'flavor' on Instance uuid d0bb0470-cc5c-4b6f-be0d-20839267c340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.313 222021 DEBUG nova.virt.libvirt.vif [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-687494396',display_name='tempest-tempest.common.compute-instance-687494396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-687494396',id=72,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBISF6L8g87ZfxLrm8Wwm+gzemsck5aetIhd8gCsjpNrTc2Fv/no3h23xzReyi9tgvOePkWLat/BN4ukRmY5i9SKOoCvqi25H2ncCjSqcqS+cT6X1PkedlTAGxBrEwc2adg==',key_name='tempest-keypair-1775870371',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-ab0xdcxv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:50:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=d0bb0470-cc5c-4b6f-be0d-20839267c340,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.313 222021 DEBUG nova.network.os_vif_util [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.314 222021 DEBUG nova.network.os_vif_util [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:38:92,bridge_name='br-int',has_traffic_filtering=True,id=af80eab2-c3b9-439d-baae-ee0d90b6cdda,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaf80eab2-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.318 222021 DEBUG nova.virt.libvirt.guest [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:7d:38:92"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaf80eab2-c3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.321 222021 DEBUG nova.virt.libvirt.guest [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:7d:38:92"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaf80eab2-c3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.324 222021 DEBUG nova.virt.libvirt.driver [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Attempting to detach device tapaf80eab2-c3 from instance d0bb0470-cc5c-4b6f-be0d-20839267c340 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.325 222021 DEBUG nova.virt.libvirt.guest [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] detach device xml: <interface type="ethernet">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <mac address="fa:16:3e:7d:38:92"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <model type="virtio"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <mtu size="1442"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <target dev="tapaf80eab2-c3"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: </interface>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.332 222021 DEBUG nova.virt.libvirt.guest [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:7d:38:92"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaf80eab2-c3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.337 222021 DEBUG nova.virt.libvirt.guest [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:7d:38:92"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaf80eab2-c3"/></interface>not found in domain: <domain type='kvm' id='36'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <name>instance-00000048</name>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <uuid>d0bb0470-cc5c-4b6f-be0d-20839267c340</uuid>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:name>tempest-tempest.common.compute-instance-687494396</nova:name>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:creationTime>2026-01-23 09:51:50</nova:creationTime>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:flavor name="m1.nano">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:memory>128</nova:memory>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:disk>1</nova:disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:swap>0</nova:swap>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </nova:flavor>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:owner>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </nova:owner>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:ports>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:port uuid="e137f0ac-1409-48af-9c44-4d589d8b9bf9">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </nova:port>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:port uuid="af80eab2-c3b9-439d-baae-ee0d90b6cdda">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </nova:port>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </nova:ports>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: </nova:instance>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <memory unit='KiB'>131072</memory>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <resource>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <partition>/machine</partition>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </resource>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <sysinfo type='smbios'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='serial'>d0bb0470-cc5c-4b6f-be0d-20839267c340</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='uuid'>d0bb0470-cc5c-4b6f-be0d-20839267c340</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <boot dev='hd'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <smbios mode='sysinfo'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <vmcoreinfo state='on'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <feature policy='require' name='x2apic'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <feature policy='require' name='vme'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <clock offset='utc'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <timer name='hpet' present='no'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <on_reboot>restart</on_reboot>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <on_crash>destroy</on_crash>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <disk type='network' device='disk'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <auth username='openstack'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source protocol='rbd' name='vms/d0bb0470-cc5c-4b6f-be0d-20839267c340_disk' index='2'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev='vda' bus='virtio'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='virtio-disk0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <disk type='network' device='cdrom'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <auth username='openstack'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source protocol='rbd' name='vms/d0bb0470-cc5c-4b6f-be0d-20839267c340_disk.config' index='1'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev='sda' bus='sata'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <readonly/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='sata0-0-0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pcie.0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='1' port='0x10'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='2' port='0x11'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='3' port='0x12'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.3'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='4' port='0x13'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.4'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='5' port='0x14'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.5'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='6' port='0x15'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.6'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='7' port='0x16'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.7'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='8' port='0x17'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.8'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='9' port='0x18'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.9'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='10' port='0x19'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.10'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='11' port='0x1a'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.11'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='12' port='0x1b'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.12'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='13' port='0x1c'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.13'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='14' port='0x1d'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.14'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='15' port='0x1e'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.15'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='16' port='0x1f'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.16'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='17' port='0x20'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.17'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='18' port='0x21'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.18'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='19' port='0x22'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.19'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='20' port='0x23'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.20'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='21' port='0x24'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.21'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='22' port='0x25'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.22'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='23' port='0x26'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.23'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='24' port='0x27'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.24'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='25' port='0x28'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.25'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-pci-bridge'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.26'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='usb'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='sata' index='0'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='ide'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <interface type='ethernet'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <mac address='fa:16:3e:60:14:59'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev='tape137f0ac-14'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model type='virtio'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <mtu size='1442'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='net0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <interface type='ethernet'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <mac address='fa:16:3e:7d:38:92'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev='tapaf80eab2-c3'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model type='virtio'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <mtu size='1442'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='net1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <serial type='pty'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source path='/dev/pts/1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <log file='/var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/console.log' append='off'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target type='isa-serial' port='0'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <model name='isa-serial'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </target>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='serial0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <console type='pty' tty='/dev/pts/1'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source path='/dev/pts/1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <log file='/var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/console.log' append='off'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target type='serial' port='0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='serial0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </console>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <input type='tablet' bus='usb'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='input0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </input>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <input type='mouse' bus='ps2'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='input1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </input>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <input type='keyboard' bus='ps2'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='input2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </input>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <listen type='address' address='::0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </graphics>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <audio id='1' type='none'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='video0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <watchdog model='itco' action='reset'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='watchdog0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </watchdog>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <memballoon model='virtio'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <stats period='10'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='balloon0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <rng model='virtio'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='rng0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <label>system_u:system_r:svirt_t:s0:c499,c659</label>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c499,c659</imagelabel>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </seclabel>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <label>+107:+107</label>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </seclabel>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.340 222021 INFO nova.virt.libvirt.driver [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully detached device tapaf80eab2-c3 from instance d0bb0470-cc5c-4b6f-be0d-20839267c340 from the persistent domain config.
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.341 222021 DEBUG nova.virt.libvirt.driver [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] (1/8): Attempting to detach device tapaf80eab2-c3 with device alias net1 from instance d0bb0470-cc5c-4b6f-be0d-20839267c340 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.341 222021 DEBUG nova.virt.libvirt.guest [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] detach device xml: <interface type="ethernet">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <mac address="fa:16:3e:7d:38:92"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <model type="virtio"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <mtu size="1442"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <target dev="tapaf80eab2-c3"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: </interface>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 04:51:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:52.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:52 np0005593233 kernel: tapaf80eab2-c3 (unregistering): left promiscuous mode
Jan 23 04:51:52 np0005593233 NetworkManager[48871]: <info>  [1769161912.4752] device (tapaf80eab2-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:51:52 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:52Z|00256|binding|INFO|Releasing lport af80eab2-c3b9-439d-baae-ee0d90b6cdda from this chassis (sb_readonly=0)
Jan 23 04:51:52 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:52Z|00257|binding|INFO|Setting lport af80eab2-c3b9-439d-baae-ee0d90b6cdda down in Southbound
Jan 23 04:51:52 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:52Z|00258|binding|INFO|Removing iface tapaf80eab2-c3 ovn-installed in OVS
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.487 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Received event <DeviceRemovedEvent: 1769161912.4840863, d0bb0470-cc5c-4b6f-be0d-20839267c340 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.488 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.490 222021 DEBUG nova.virt.libvirt.driver [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Start waiting for the detach event from libvirt for device tapaf80eab2-c3 with device alias net1 for instance d0bb0470-cc5c-4b6f-be0d-20839267c340 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.490 222021 DEBUG nova.virt.libvirt.guest [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:7d:38:92"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaf80eab2-c3"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 04:51:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:51:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1736278220' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.494 222021 DEBUG nova.virt.libvirt.guest [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:7d:38:92"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaf80eab2-c3"/></interface>not found in domain: <domain type='kvm' id='36'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <name>instance-00000048</name>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <uuid>d0bb0470-cc5c-4b6f-be0d-20839267c340</uuid>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:name>tempest-tempest.common.compute-instance-687494396</nova:name>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:creationTime>2026-01-23 09:51:50</nova:creationTime>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:flavor name="m1.nano">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:memory>128</nova:memory>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:disk>1</nova:disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:swap>0</nova:swap>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </nova:flavor>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:owner>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </nova:owner>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:ports>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:port uuid="e137f0ac-1409-48af-9c44-4d589d8b9bf9">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </nova:port>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:port uuid="af80eab2-c3b9-439d-baae-ee0d90b6cdda">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </nova:port>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </nova:ports>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: </nova:instance>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <memory unit='KiB'>131072</memory>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <resource>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <partition>/machine</partition>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </resource>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <sysinfo type='smbios'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='serial'>d0bb0470-cc5c-4b6f-be0d-20839267c340</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='uuid'>d0bb0470-cc5c-4b6f-be0d-20839267c340</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <boot dev='hd'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <smbios mode='sysinfo'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <vmcoreinfo state='on'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <feature policy='require' name='x2apic'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <feature policy='require' name='vme'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <clock offset='utc'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <timer name='hpet' present='no'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <on_reboot>restart</on_reboot>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <on_crash>destroy</on_crash>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <disk type='network' device='disk'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <auth username='openstack'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source protocol='rbd' name='vms/d0bb0470-cc5c-4b6f-be0d-20839267c340_disk' index='2'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev='vda' bus='virtio'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='virtio-disk0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <disk type='network' device='cdrom'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <auth username='openstack'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source protocol='rbd' name='vms/d0bb0470-cc5c-4b6f-be0d-20839267c340_disk.config' index='1'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev='sda' bus='sata'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <readonly/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='sata0-0-0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pcie.0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='1' port='0x10'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='2' port='0x11'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='3' port='0x12'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.3'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='4' port='0x13'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.4'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='5' port='0x14'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.5'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='6' port='0x15'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.6'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='7' port='0x16'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.7'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='8' port='0x17'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.8'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='9' port='0x18'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.9'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='10' port='0x19'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.10'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='11' port='0x1a'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.11'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='12' port='0x1b'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.12'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='13' port='0x1c'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.13'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='14' port='0x1d'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.14'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='15' port='0x1e'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.15'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='16' port='0x1f'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.16'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='17' port='0x20'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.17'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='18' port='0x21'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.18'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='19' port='0x22'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.19'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='20' port='0x23'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.20'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='21' port='0x24'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.21'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='22' port='0x25'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.22'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='23' port='0x26'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.23'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='24' port='0x27'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.24'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-root-port'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target chassis='25' port='0x28'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.25'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model name='pcie-pci-bridge'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='pci.26'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='usb'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type='sata' index='0'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='ide'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </controller>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <interface type='ethernet'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <mac address='fa:16:3e:60:14:59'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev='tape137f0ac-14'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model type='virtio'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <mtu size='1442'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='net0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <serial type='pty'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source path='/dev/pts/1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <log file='/var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/console.log' append='off'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target type='isa-serial' port='0'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <model name='isa-serial'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </target>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='serial0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <console type='pty' tty='/dev/pts/1'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source path='/dev/pts/1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <log file='/var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340/console.log' append='off'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target type='serial' port='0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='serial0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </console>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <input type='tablet' bus='usb'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='input0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </input>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <input type='mouse' bus='ps2'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='input1'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </input>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <input type='keyboard' bus='ps2'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='input2'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </input>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <listen type='address' address='::0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </graphics>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <audio id='1' type='none'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='video0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <watchdog model='itco' action='reset'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='watchdog0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </watchdog>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <memballoon model='virtio'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <stats period='10'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='balloon0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <rng model='virtio'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <alias name='rng0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <label>system_u:system_r:svirt_t:s0:c499,c659</label>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c499,c659</imagelabel>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </seclabel>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <label>+107:+107</label>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </seclabel>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.497 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:38:92 10.100.0.12'], port_security=['fa:16:3e:7d:38:92 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-2098344084', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd0bb0470-cc5c-4b6f-be0d-20839267c340', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-2098344084', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=af80eab2-c3b9-439d-baae-ee0d90b6cdda) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.498 140224 INFO neutron.agent.ovn.metadata.agent [-] Port af80eab2-c3b9-439d-baae-ee0d90b6cdda in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 unbound from our chassis#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.496 222021 INFO nova.virt.libvirt.driver [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully detached device tapaf80eab2-c3 from instance d0bb0470-cc5c-4b6f-be0d-20839267c340 from the live domain config.#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.497 222021 DEBUG nova.virt.libvirt.vif [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-687494396',display_name='tempest-tempest.common.compute-instance-687494396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-687494396',id=72,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBISF6L8g87ZfxLrm8Wwm+gzemsck5aetIhd8gCsjpNrTc2Fv/no3h23xzReyi9tgvOePkWLat/BN4ukRmY5i9SKOoCvqi25H2ncCjSqcqS+cT6X1PkedlTAGxBrEwc2adg==',key_name='tempest-keypair-1775870371',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-ab0xdcxv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:50:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=d0bb0470-cc5c-4b6f-be0d-20839267c340,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.497 222021 DEBUG nova.network.os_vif_util [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.498 222021 DEBUG nova.network.os_vif_util [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:38:92,bridge_name='br-int',has_traffic_filtering=True,id=af80eab2-c3b9-439d-baae-ee0d90b6cdda,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaf80eab2-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.498 222021 DEBUG os_vif [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:38:92,bridge_name='br-int',has_traffic_filtering=True,id=af80eab2-c3b9-439d-baae-ee0d90b6cdda,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaf80eab2-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.500 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.501 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.501 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf80eab2-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.506 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.508 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.514 222021 INFO os_vif [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:38:92,bridge_name='br-int',has_traffic_filtering=True,id=af80eab2-c3b9-439d-baae-ee0d90b6cdda,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaf80eab2-c3')#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.516 222021 DEBUG nova.virt.libvirt.guest [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:name>tempest-tempest.common.compute-instance-687494396</nova:name>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:creationTime>2026-01-23 09:51:52</nova:creationTime>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:flavor name="m1.nano">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:memory>128</nova:memory>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:disk>1</nova:disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:swap>0</nova:swap>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </nova:flavor>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:owner>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </nova:owner>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <nova:ports>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:port uuid="e137f0ac-1409-48af-9c44-4d589d8b9bf9">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </nova:port>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </nova:ports>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: </nova:instance>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.525 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[46bdf1c8-a057-4372-b1d1-9f297211f663]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.535 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:52.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.537 222021 DEBUG nova.virt.libvirt.vif [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:51:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-840112235',display_name='tempest-DeleteServersTestJSON-server-840112235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-840112235',id=76,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-l10ffb6i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-9
44070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:51:46Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=2821a66b-54cd-4ffc-9b8f-317909716a0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.538 222021 DEBUG nova.network.os_vif_util [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.539 222021 DEBUG nova.network.os_vif_util [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:e8:95,bridge_name='br-int',has_traffic_filtering=True,id=0190cadd-3cd8-481a-b2c9-bc91136daab2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0190cadd-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.541 222021 DEBUG nova.objects.instance [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_devices' on Instance uuid 2821a66b-54cd-4ffc-9b8f-317909716a0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.567 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b9693baa-c7e2-40eb-beba-47129a75ca3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.571 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[adbd29a4-73cf-486d-b62b-9535caccec68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.573 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <uuid>2821a66b-54cd-4ffc-9b8f-317909716a0c</uuid>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <name>instance-0000004c</name>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:name>tempest-DeleteServersTestJSON-server-840112235</nova:name>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:51:51</nova:creationTime>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <nova:user uuid="28a7a778c8ab486fb586e81bb84113be">tempest-DeleteServersTestJSON-944070453-project-member</nova:user>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <nova:project uuid="61df91981c55482fa5c9a64686c79f9e">tempest-DeleteServersTestJSON-944070453</nova:project>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <nova:port uuid="0190cadd-3cd8-481a-b2c9-bc91136daab2">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name="serial">2821a66b-54cd-4ffc-9b8f-317909716a0c</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name="uuid">2821a66b-54cd-4ffc-9b8f-317909716a0c</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/2821a66b-54cd-4ffc-9b8f-317909716a0c_disk">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/2821a66b-54cd-4ffc-9b8f-317909716a0c_disk.config">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:fa:e8:95"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <target dev="tap0190cadd-3c"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c/console.log" append="off"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:51:52 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:51:52 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:51:52 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.573 222021 DEBUG nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Preparing to wait for external event network-vif-plugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.574 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.574 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.574 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.575 222021 DEBUG nova.virt.libvirt.vif [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:51:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-840112235',display_name='tempest-DeleteServersTestJSON-server-840112235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-840112235',id=76,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-l10ffb6i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServers
TestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:51:46Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=2821a66b-54cd-4ffc-9b8f-317909716a0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.576 222021 DEBUG nova.network.os_vif_util [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.576 222021 DEBUG nova.network.os_vif_util [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:e8:95,bridge_name='br-int',has_traffic_filtering=True,id=0190cadd-3cd8-481a-b2c9-bc91136daab2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0190cadd-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.577 222021 DEBUG os_vif [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:e8:95,bridge_name='br-int',has_traffic_filtering=True,id=0190cadd-3cd8-481a-b2c9-bc91136daab2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0190cadd-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.577 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.578 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.578 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.580 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.581 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0190cadd-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.581 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0190cadd-3c, col_values=(('external_ids', {'iface-id': '0190cadd-3cd8-481a-b2c9-bc91136daab2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:e8:95', 'vm-uuid': '2821a66b-54cd-4ffc-9b8f-317909716a0c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:52 np0005593233 NetworkManager[48871]: <info>  [1769161912.5846] manager: (tap0190cadd-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.585 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.588 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.589 222021 INFO os_vif [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:e8:95,bridge_name='br-int',has_traffic_filtering=True,id=0190cadd-3cd8-481a-b2c9-bc91136daab2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0190cadd-3c')#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.611 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[99bfc716-f702-49a8-a696-0b2cc618b623]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.630 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c64ad995-4326-43c9-9b92-c86bd4996558]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577112, 'reachable_time': 35670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251925, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.649 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[791ee75f-2185-4524-aa9b-d8f84fc7f1d4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577125, 'tstamp': 577125}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251926, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577130, 'tstamp': 577130}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251926, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.652 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.655 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.659 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.659 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.659 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No VIF found with MAC fa:16:3e:fa:e8:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.659 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.660 222021 INFO nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Using config drive#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.660 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.661 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:52.662 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.686 222021 DEBUG nova.storage.rbd_utils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.691 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.826 222021 DEBUG nova.network.neutron [req-2321b1fc-25cf-4110-908a-e89332e9cda1 req-67682703-17a9-4c61-bea9-d80a042de5e6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updated VIF entry in instance network info cache for port af80eab2-c3b9-439d-baae-ee0d90b6cdda. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.827 222021 DEBUG nova.network.neutron [req-2321b1fc-25cf-4110-908a-e89332e9cda1 req-67682703-17a9-4c61-bea9-d80a042de5e6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "address": "fa:16:3e:7d:38:92", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf80eab2-c3", "ovs_interfaceid": "af80eab2-c3b9-439d-baae-ee0d90b6cdda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:52 np0005593233 nova_compute[222017]: 2026-01-23 09:51:52.850 222021 DEBUG oslo_concurrency.lockutils [req-2321b1fc-25cf-4110-908a-e89332e9cda1 req-67682703-17a9-4c61-bea9-d80a042de5e6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.090 222021 DEBUG nova.network.neutron [req-b0bdc698-7590-4621-a452-6cb48e17ba10 req-32293601-ca8d-4cb2-a403-a1f704e6db97 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Updated VIF entry in instance network info cache for port 0190cadd-3cd8-481a-b2c9-bc91136daab2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.091 222021 DEBUG nova.network.neutron [req-b0bdc698-7590-4621-a452-6cb48e17ba10 req-32293601-ca8d-4cb2-a403-a1f704e6db97 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Updating instance_info_cache with network_info: [{"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.127 222021 DEBUG oslo_concurrency.lockutils [req-b0bdc698-7590-4621-a452-6cb48e17ba10 req-32293601-ca8d-4cb2-a403-a1f704e6db97 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.793 222021 INFO nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Creating config drive at /var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c/disk.config#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.799 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7wy1_27n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.834 222021 DEBUG nova.compute.manager [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-unplugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.835 222021 DEBUG oslo_concurrency.lockutils [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.835 222021 DEBUG oslo_concurrency.lockutils [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.836 222021 DEBUG oslo_concurrency.lockutils [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.836 222021 DEBUG nova.compute.manager [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] No waiting events found dispatching network-vif-unplugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.836 222021 WARNING nova.compute.manager [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received unexpected event network-vif-unplugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda for instance with vm_state active and task_state None.#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.837 222021 DEBUG nova.compute.manager [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-plugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.837 222021 DEBUG oslo_concurrency.lockutils [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.837 222021 DEBUG oslo_concurrency.lockutils [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.838 222021 DEBUG oslo_concurrency.lockutils [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.838 222021 DEBUG nova.compute.manager [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] No waiting events found dispatching network-vif-plugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.838 222021 WARNING nova.compute.manager [req-0c6f0b55-0f38-474e-b671-03f40d99c99e req-6492dd51-dd61-43ea-aa4a-9fbe24869e12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received unexpected event network-vif-plugged-af80eab2-c3b9-439d-baae-ee0d90b6cdda for instance with vm_state active and task_state None.#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.939 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7wy1_27n" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.971 222021 DEBUG nova.storage.rbd_utils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:53 np0005593233 nova_compute[222017]: 2026-01-23 09:51:53.976 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c/disk.config 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:54.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:51:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:54.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:51:54 np0005593233 nova_compute[222017]: 2026-01-23 09:51:54.563 222021 DEBUG oslo_concurrency.processutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c/disk.config 2821a66b-54cd-4ffc-9b8f-317909716a0c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.587s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:54 np0005593233 nova_compute[222017]: 2026-01-23 09:51:54.564 222021 INFO nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Deleting local config drive /var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c/disk.config because it was imported into RBD.#033[00m
Jan 23 04:51:54 np0005593233 kernel: tap0190cadd-3c: entered promiscuous mode
Jan 23 04:51:54 np0005593233 NetworkManager[48871]: <info>  [1769161914.6322] manager: (tap0190cadd-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/131)
Jan 23 04:51:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:54Z|00259|binding|INFO|Claiming lport 0190cadd-3cd8-481a-b2c9-bc91136daab2 for this chassis.
Jan 23 04:51:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:54Z|00260|binding|INFO|0190cadd-3cd8-481a-b2c9-bc91136daab2: Claiming fa:16:3e:fa:e8:95 10.100.0.4
Jan 23 04:51:54 np0005593233 nova_compute[222017]: 2026-01-23 09:51:54.633 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:54Z|00261|binding|INFO|Setting lport 0190cadd-3cd8-481a-b2c9-bc91136daab2 ovn-installed in OVS
Jan 23 04:51:54 np0005593233 nova_compute[222017]: 2026-01-23 09:51:54.649 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593233 nova_compute[222017]: 2026-01-23 09:51:54.652 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593233 systemd-udevd[252127]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:51:54 np0005593233 NetworkManager[48871]: <info>  [1769161914.6799] device (tap0190cadd-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:51:54 np0005593233 NetworkManager[48871]: <info>  [1769161914.6818] device (tap0190cadd-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:51:54 np0005593233 systemd-machined[190954]: New machine qemu-38-instance-0000004c.
Jan 23 04:51:54 np0005593233 systemd[1]: Started Virtual Machine qemu-38-instance-0000004c.
Jan 23 04:51:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:54Z|00262|binding|INFO|Setting lport 0190cadd-3cd8-481a-b2c9-bc91136daab2 up in Southbound
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.793 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:e8:95 10.100.0.4'], port_security=['fa:16:3e:fa:e8:95 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2821a66b-54cd-4ffc-9b8f-317909716a0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=0190cadd-3cd8-481a-b2c9-bc91136daab2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.794 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 0190cadd-3cd8-481a-b2c9-bc91136daab2 in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 bound to our chassis#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.796 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3788149-efcd-4940-8a8f-e21af0a56a06#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.809 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1fd999-685a-4137-ab2c-016bdc4e484c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.810 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3788149-e1 in ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.812 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3788149-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.812 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb21b95-6e3b-44c3-8ea0-2e5beda74b18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.813 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ab430b40-a4c6-4203-8945-b5282fd82d75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.827 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc17590-2179-4216-88a5-28dcea969915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.852 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5b9f3e19-838a-4ecc-9936-2f86e9a1cc6c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.889 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d83e3132-d2d4-4d2a-a956-053a8edd80af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 NetworkManager[48871]: <info>  [1769161914.9003] manager: (tapa3788149-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/132)
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.899 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ef8afb-7370-4d0e-a23b-68c4f0d27a0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 systemd-udevd[252131]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.952 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[11c9c3be-4fc8-4370-9ed8-894b964ef520]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.955 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[30e5a17f-b125-420b-8256-afd5bb1c9a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 NetworkManager[48871]: <info>  [1769161914.9864] device (tapa3788149-e0): carrier: link connected
Jan 23 04:51:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:54.991 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[378113a3-1be7-4aa0-9651-481fadbe44d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593233 nova_compute[222017]: 2026-01-23 09:51:54.993 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.012 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[12eed073-c1d2-48a8-a216-a87ab132950c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582761, 'reachable_time': 38696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252163, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.031 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ff55f61a-bc37-4468-ae5d-e0fc100e6495]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:ddff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 582761, 'tstamp': 582761}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252164, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.049 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[60aaed97-c710-47ae-ba1d-b55ae0ea5ac7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582761, 'reachable_time': 38696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252165, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.094 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2cea75-1b15-49e8-97b4-fc425ea9b0b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.169 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[512377d4-4214-4d59-a138-0b22ac34fa08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.171 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.171 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.171 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3788149-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.173 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:55 np0005593233 NetworkManager[48871]: <info>  [1769161915.1738] manager: (tapa3788149-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Jan 23 04:51:55 np0005593233 kernel: tapa3788149-e0: entered promiscuous mode
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.181 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.182 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3788149-e0, col_values=(('external_ids', {'iface-id': 'd6ce7fd1-128d-488f-94e6-68332f7a8a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.183 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:55 np0005593233 ovn_controller[130653]: 2026-01-23T09:51:55Z|00263|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.200 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.201 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.202 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dc43891b-1885-4901-9831-6cb32e20b97c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.203 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:51:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:55.204 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'env', 'PROCESS_TAG=haproxy-a3788149-efcd-4940-8a8f-e21af0a56a06', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3788149-efcd-4940-8a8f-e21af0a56a06.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:51:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:51:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:51:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:51:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.546 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161915.5460963, 2821a66b-54cd-4ffc-9b8f-317909716a0c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.546 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] VM Started (Lifecycle Event)#033[00m
Jan 23 04:51:55 np0005593233 podman[252238]: 2026-01-23 09:51:55.582485028 +0000 UTC m=+0.063073089 container create bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:51:55 np0005593233 systemd[1]: Started libpod-conmon-bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f.scope.
Jan 23 04:51:55 np0005593233 podman[252238]: 2026-01-23 09:51:55.548002051 +0000 UTC m=+0.028590142 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:51:55 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:51:55 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e886930c0fd0055dea06c2951e09dc17ba63a4b24082b9d53bdc2392d76619a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.716 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.722 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161915.5490024, 2821a66b-54cd-4ffc-9b8f-317909716a0c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.722 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.746 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.750 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:51:55 np0005593233 nova_compute[222017]: 2026-01-23 09:51:55.800 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:51:55 np0005593233 podman[252238]: 2026-01-23 09:51:55.869497373 +0000 UTC m=+0.350085454 container init bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 04:51:55 np0005593233 podman[252238]: 2026-01-23 09:51:55.879221455 +0000 UTC m=+0.359809536 container start bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:51:55 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[252253]: [NOTICE]   (252257) : New worker (252259) forked
Jan 23 04:51:55 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[252253]: [NOTICE]   (252257) : Loading success.
Jan 23 04:51:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:51:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:51:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:56.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:56.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:51:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.0 total, 600.0 interval#012Cumulative writes: 8174 writes, 42K keys, 8174 commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.03 MB/s#012Cumulative WAL: 8174 writes, 8174 syncs, 1.00 writes per sync, written: 0.08 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1678 writes, 8498 keys, 1678 commit groups, 1.0 writes per commit group, ingest: 16.88 MB, 0.03 MB/s#012Interval WAL: 1678 writes, 1678 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     59.2      0.88              0.23        24    0.037       0      0       0.0       0.0#012  L6      1/0    8.60 MB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   3.9     90.1     74.3      2.75              0.70        23    0.120    125K    13K       0.0       0.0#012 Sum      1/0    8.60 MB   0.0      0.2     0.1      0.2       0.3      0.1       0.0   4.9     68.3     70.7      3.63              0.93        47    0.077    125K    13K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.5     61.2     60.1      1.12              0.25        12    0.093     40K   3070       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   0.0     90.1     74.3      2.75              0.70        23    0.120    125K    13K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     59.3      0.88              0.23        23    0.038       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3000.0 total, 600.0 interval#012Flush(GB): cumulative 0.051, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.25 GB write, 0.09 MB/s write, 0.24 GB read, 0.08 MB/s read, 3.6 seconds#012Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 1.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 27.68 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.00018 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1601,26.70 MB,8.78438%) FilterBlock(47,359.36 KB,0.11544%) IndexBlock(47,640.44 KB,0.205733%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:51:56 np0005593233 nova_compute[222017]: 2026-01-23 09:51:56.773 222021 DEBUG oslo_concurrency.lockutils [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:56 np0005593233 nova_compute[222017]: 2026-01-23 09:51:56.774 222021 DEBUG oslo_concurrency.lockutils [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:56 np0005593233 nova_compute[222017]: 2026-01-23 09:51:56.774 222021 DEBUG nova.network.neutron [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.583 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.922 222021 DEBUG nova.compute.manager [req-dc74facd-1220-47e8-a2cb-7b387774d8f8 req-0f39f34f-ff15-4921-a166-d608d30ba35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Received event network-vif-plugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.923 222021 DEBUG oslo_concurrency.lockutils [req-dc74facd-1220-47e8-a2cb-7b387774d8f8 req-0f39f34f-ff15-4921-a166-d608d30ba35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.923 222021 DEBUG oslo_concurrency.lockutils [req-dc74facd-1220-47e8-a2cb-7b387774d8f8 req-0f39f34f-ff15-4921-a166-d608d30ba35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.924 222021 DEBUG oslo_concurrency.lockutils [req-dc74facd-1220-47e8-a2cb-7b387774d8f8 req-0f39f34f-ff15-4921-a166-d608d30ba35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.924 222021 DEBUG nova.compute.manager [req-dc74facd-1220-47e8-a2cb-7b387774d8f8 req-0f39f34f-ff15-4921-a166-d608d30ba35b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Processing event network-vif-plugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.925 222021 DEBUG nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.929 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161917.9292417, 2821a66b-54cd-4ffc-9b8f-317909716a0c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.929 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.932 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.936 222021 INFO nova.virt.libvirt.driver [-] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Instance spawned successfully.#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.936 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.964 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.970 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.973 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.974 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.974 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.975 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.975 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:57 np0005593233 nova_compute[222017]: 2026-01-23 09:51:57.976 222021 DEBUG nova.virt.libvirt.driver [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:58 np0005593233 nova_compute[222017]: 2026-01-23 09:51:58.032 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:51:58 np0005593233 nova_compute[222017]: 2026-01-23 09:51:58.077 222021 INFO nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Took 11.55 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:51:58 np0005593233 nova_compute[222017]: 2026-01-23 09:51:58.077 222021 DEBUG nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:58 np0005593233 podman[252268]: 2026-01-23 09:51:58.097873696 +0000 UTC m=+0.105491298 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 23 04:51:58 np0005593233 nova_compute[222017]: 2026-01-23 09:51:58.152 222021 INFO nova.compute.manager [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Took 12.95 seconds to build instance.#033[00m
Jan 23 04:51:58 np0005593233 nova_compute[222017]: 2026-01-23 09:51:58.171 222021 DEBUG oslo_concurrency.lockutils [None req-7f620cff-8874-468b-a66a-80b5cec664af 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:51:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:58.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:51:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:51:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:58.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:58 np0005593233 nova_compute[222017]: 2026-01-23 09:51:58.774 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:58.773 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:58.775 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:51:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:51:59.777 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:59 np0005593233 nova_compute[222017]: 2026-01-23 09:51:59.995 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:00.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:00.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:00 np0005593233 nova_compute[222017]: 2026-01-23 09:52:00.708 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.258 222021 DEBUG nova.compute.manager [req-a1956635-57f2-4df2-85e5-b9a7eceac83c req-7daa07a5-45f2-4c99-8045-e3206c82c777 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Received event network-vif-plugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.258 222021 DEBUG oslo_concurrency.lockutils [req-a1956635-57f2-4df2-85e5-b9a7eceac83c req-7daa07a5-45f2-4c99-8045-e3206c82c777 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.258 222021 DEBUG oslo_concurrency.lockutils [req-a1956635-57f2-4df2-85e5-b9a7eceac83c req-7daa07a5-45f2-4c99-8045-e3206c82c777 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.258 222021 DEBUG oslo_concurrency.lockutils [req-a1956635-57f2-4df2-85e5-b9a7eceac83c req-7daa07a5-45f2-4c99-8045-e3206c82c777 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.259 222021 DEBUG nova.compute.manager [req-a1956635-57f2-4df2-85e5-b9a7eceac83c req-7daa07a5-45f2-4c99-8045-e3206c82c777 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] No waiting events found dispatching network-vif-plugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.259 222021 WARNING nova.compute.manager [req-a1956635-57f2-4df2-85e5-b9a7eceac83c req-7daa07a5-45f2-4c99-8045-e3206c82c777 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Received unexpected event network-vif-plugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.565 222021 INFO nova.network.neutron [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Port af80eab2-c3b9-439d-baae-ee0d90b6cdda from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.570 222021 DEBUG nova.network.neutron [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.649 222021 DEBUG oslo_concurrency.lockutils [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:52:01 np0005593233 nova_compute[222017]: 2026-01-23 09:52:01.678 222021 DEBUG oslo_concurrency.lockutils [None req-4a0439d4-2e9a-4531-ae61-923063b87c4a 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-d0bb0470-cc5c-4b6f-be0d-20839267c340-af80eab2-c3b9-439d-baae-ee0d90b6cdda" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 9.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:52:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:52:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:02.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:02.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:02 np0005593233 nova_compute[222017]: 2026-01-23 09:52:02.585 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:03 np0005593233 nova_compute[222017]: 2026-01-23 09:52:03.363 222021 DEBUG oslo_concurrency.lockutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "2821a66b-54cd-4ffc-9b8f-317909716a0c" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:03 np0005593233 nova_compute[222017]: 2026-01-23 09:52:03.363 222021 DEBUG oslo_concurrency.lockutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:03 np0005593233 nova_compute[222017]: 2026-01-23 09:52:03.364 222021 INFO nova.compute.manager [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Shelving#033[00m
Jan 23 04:52:03 np0005593233 nova_compute[222017]: 2026-01-23 09:52:03.399 222021 DEBUG nova.virt.libvirt.driver [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:52:03 np0005593233 nova_compute[222017]: 2026-01-23 09:52:03.689 222021 DEBUG nova.compute.manager [req-5cc393d4-3d13-410a-a3e5-62b1c930e191 req-5428b171-8b16-4de7-90ae-38ab73c765ff 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:52:03 np0005593233 nova_compute[222017]: 2026-01-23 09:52:03.689 222021 DEBUG nova.compute.manager [req-5cc393d4-3d13-410a-a3e5-62b1c930e191 req-5428b171-8b16-4de7-90ae-38ab73c765ff 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing instance network info cache due to event network-changed-e137f0ac-1409-48af-9c44-4d589d8b9bf9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:52:03 np0005593233 nova_compute[222017]: 2026-01-23 09:52:03.689 222021 DEBUG oslo_concurrency.lockutils [req-5cc393d4-3d13-410a-a3e5-62b1c930e191 req-5428b171-8b16-4de7-90ae-38ab73c765ff 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:52:03 np0005593233 nova_compute[222017]: 2026-01-23 09:52:03.690 222021 DEBUG oslo_concurrency.lockutils [req-5cc393d4-3d13-410a-a3e5-62b1c930e191 req-5428b171-8b16-4de7-90ae-38ab73c765ff 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:52:03 np0005593233 nova_compute[222017]: 2026-01-23 09:52:03.690 222021 DEBUG nova.network.neutron [req-5cc393d4-3d13-410a-a3e5-62b1c930e191 req-5428b171-8b16-4de7-90ae-38ab73c765ff 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Refreshing network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:52:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:04.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:04.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:04 np0005593233 nova_compute[222017]: 2026-01-23 09:52:04.997 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:06.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:06.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:07 np0005593233 nova_compute[222017]: 2026-01-23 09:52:07.334 222021 DEBUG nova.network.neutron [req-5cc393d4-3d13-410a-a3e5-62b1c930e191 req-5428b171-8b16-4de7-90ae-38ab73c765ff 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updated VIF entry in instance network info cache for port e137f0ac-1409-48af-9c44-4d589d8b9bf9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:52:07 np0005593233 nova_compute[222017]: 2026-01-23 09:52:07.335 222021 DEBUG nova.network.neutron [req-5cc393d4-3d13-410a-a3e5-62b1c930e191 req-5428b171-8b16-4de7-90ae-38ab73c765ff 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:52:07 np0005593233 nova_compute[222017]: 2026-01-23 09:52:07.531 222021 DEBUG oslo_concurrency.lockutils [req-5cc393d4-3d13-410a-a3e5-62b1c930e191 req-5428b171-8b16-4de7-90ae-38ab73c765ff 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:52:07 np0005593233 nova_compute[222017]: 2026-01-23 09:52:07.587 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:08.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:08.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:10 np0005593233 nova_compute[222017]: 2026-01-23 09:52:10.000 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:10.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:10.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:11 np0005593233 podman[252345]: 2026-01-23 09:52:11.073092522 +0000 UTC m=+0.077193794 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 23 04:52:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:52:12Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fa:e8:95 10.100.0.4
Jan 23 04:52:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:52:12Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fa:e8:95 10.100.0.4
Jan 23 04:52:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:12.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:12.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:12 np0005593233 nova_compute[222017]: 2026-01-23 09:52:12.590 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:13 np0005593233 nova_compute[222017]: 2026-01-23 09:52:13.452 222021 DEBUG nova.virt.libvirt.driver [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:52:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:14.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:14.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:15 np0005593233 nova_compute[222017]: 2026-01-23 09:52:15.003 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.311 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:16 np0005593233 kernel: tap0190cadd-3c (unregistering): left promiscuous mode
Jan 23 04:52:16 np0005593233 NetworkManager[48871]: <info>  [1769161936.3998] device (tap0190cadd-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.411 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:16 np0005593233 ovn_controller[130653]: 2026-01-23T09:52:16Z|00264|binding|INFO|Releasing lport 0190cadd-3cd8-481a-b2c9-bc91136daab2 from this chassis (sb_readonly=0)
Jan 23 04:52:16 np0005593233 ovn_controller[130653]: 2026-01-23T09:52:16Z|00265|binding|INFO|Setting lport 0190cadd-3cd8-481a-b2c9-bc91136daab2 down in Southbound
Jan 23 04:52:16 np0005593233 ovn_controller[130653]: 2026-01-23T09:52:16Z|00266|binding|INFO|Removing iface tap0190cadd-3c ovn-installed in OVS
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.433 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:16 np0005593233 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Jan 23 04:52:16 np0005593233 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004c.scope: Consumed 14.635s CPU time.
Jan 23 04:52:16 np0005593233 systemd-machined[190954]: Machine qemu-38-instance-0000004c terminated.
Jan 23 04:52:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:16.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:16.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.673 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.681 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.687 222021 INFO nova.virt.libvirt.driver [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.694 222021 INFO nova.virt.libvirt.driver [-] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Instance destroyed successfully.#033[00m
Jan 23 04:52:16 np0005593233 nova_compute[222017]: 2026-01-23 09:52:16.694 222021 DEBUG nova.objects.instance [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'numa_topology' on Instance uuid 2821a66b-54cd-4ffc-9b8f-317909716a0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:52:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:16.710 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:e8:95 10.100.0.4'], port_security=['fa:16:3e:fa:e8:95 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2821a66b-54cd-4ffc-9b8f-317909716a0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=0190cadd-3cd8-481a-b2c9-bc91136daab2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:52:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:16.711 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 0190cadd-3cd8-481a-b2c9-bc91136daab2 in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis#033[00m
Jan 23 04:52:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:16.713 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:52:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:16.715 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9d49857d-1636-43e1-ace5-c2abef6b5c25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:16.715 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace which is not needed anymore#033[00m
Jan 23 04:52:16 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[252253]: [NOTICE]   (252257) : haproxy version is 2.8.14-c23fe91
Jan 23 04:52:16 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[252253]: [NOTICE]   (252257) : path to executable is /usr/sbin/haproxy
Jan 23 04:52:16 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[252253]: [WARNING]  (252257) : Exiting Master process...
Jan 23 04:52:16 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[252253]: [ALERT]    (252257) : Current worker (252259) exited with code 143 (Terminated)
Jan 23 04:52:16 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[252253]: [WARNING]  (252257) : All workers exited. Exiting... (0)
Jan 23 04:52:16 np0005593233 systemd[1]: libpod-bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f.scope: Deactivated successfully.
Jan 23 04:52:16 np0005593233 podman[252395]: 2026-01-23 09:52:16.895131069 +0000 UTC m=+0.065531228 container died bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:52:16 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f-userdata-shm.mount: Deactivated successfully.
Jan 23 04:52:16 np0005593233 systemd[1]: var-lib-containers-storage-overlay-e886930c0fd0055dea06c2951e09dc17ba63a4b24082b9d53bdc2392d76619a7-merged.mount: Deactivated successfully.
Jan 23 04:52:17 np0005593233 podman[252395]: 2026-01-23 09:52:17.001888591 +0000 UTC m=+0.172288730 container cleanup bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:52:17 np0005593233 systemd[1]: libpod-conmon-bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f.scope: Deactivated successfully.
Jan 23 04:52:17 np0005593233 podman[252424]: 2026-01-23 09:52:17.093306824 +0000 UTC m=+0.062320888 container remove bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:52:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:17.101 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e71b1f2a-f3d2-4e6c-be10-7562f098b3e8]: (4, ('Fri Jan 23 09:52:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f)\nbd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f\nFri Jan 23 09:52:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (bd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f)\nbd221a8d897dc5cc98484269eba434e5c9e8aa205c9e89dedf9b16af5c55890f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:17.103 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6cdfc445-8b3f-4b4a-bf2a-a30b2b7ef043]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:17.105 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:52:17 np0005593233 nova_compute[222017]: 2026-01-23 09:52:17.108 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:17 np0005593233 kernel: tapa3788149-e0: left promiscuous mode
Jan 23 04:52:17 np0005593233 nova_compute[222017]: 2026-01-23 09:52:17.128 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:17.134 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f3809ed3-f0bf-4469-a899-93add1951833]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:17.159 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3706a8-8d74-44da-a5b3-89a226c471cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:17.162 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d854de8a-627a-402b-abc7-2ba45db63afe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:17.185 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2010b4f3-9cd6-4393-97e3-245bb67fa2e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582751, 'reachable_time': 20162, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252443, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:17.189 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:52:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:17.189 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0613b2-c17e-435b-b43d-d492642a5c13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:17 np0005593233 systemd[1]: run-netns-ovnmeta\x2da3788149\x2defcd\x2d4940\x2d8a8f\x2de21af0a56a06.mount: Deactivated successfully.
Jan 23 04:52:17 np0005593233 nova_compute[222017]: 2026-01-23 09:52:17.592 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:18 np0005593233 nova_compute[222017]: 2026-01-23 09:52:18.165 222021 INFO nova.virt.libvirt.driver [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Beginning cold snapshot process#033[00m
Jan 23 04:52:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:18.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:18.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:18 np0005593233 nova_compute[222017]: 2026-01-23 09:52:18.655 222021 DEBUG nova.virt.libvirt.imagebackend [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 04:52:19 np0005593233 nova_compute[222017]: 2026-01-23 09:52:19.228 222021 DEBUG nova.storage.rbd_utils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] creating snapshot(ad9d822aad3148d49006c8f40cf52255) on rbd image(2821a66b-54cd-4ffc-9b8f-317909716a0c_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:52:20 np0005593233 nova_compute[222017]: 2026-01-23 09:52:20.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e248 e248: 3 total, 3 up, 3 in
Jan 23 04:52:20 np0005593233 nova_compute[222017]: 2026-01-23 09:52:20.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:20.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:20.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:20 np0005593233 nova_compute[222017]: 2026-01-23 09:52:20.675 222021 DEBUG nova.storage.rbd_utils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] cloning vms/2821a66b-54cd-4ffc-9b8f-317909716a0c_disk@ad9d822aad3148d49006c8f40cf52255 to images/1f42e74d-2b40-42bc-b3a7-8f4aab6f6bdd clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.038 222021 DEBUG nova.compute.manager [req-d08e7c35-8b49-4f81-ac5f-ce45f2f20056 req-4ce60529-71ea-46e1-b9fc-afbd88a21c6f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Received event network-vif-unplugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.039 222021 DEBUG oslo_concurrency.lockutils [req-d08e7c35-8b49-4f81-ac5f-ce45f2f20056 req-4ce60529-71ea-46e1-b9fc-afbd88a21c6f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.039 222021 DEBUG oslo_concurrency.lockutils [req-d08e7c35-8b49-4f81-ac5f-ce45f2f20056 req-4ce60529-71ea-46e1-b9fc-afbd88a21c6f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.039 222021 DEBUG oslo_concurrency.lockutils [req-d08e7c35-8b49-4f81-ac5f-ce45f2f20056 req-4ce60529-71ea-46e1-b9fc-afbd88a21c6f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.040 222021 DEBUG nova.compute.manager [req-d08e7c35-8b49-4f81-ac5f-ce45f2f20056 req-4ce60529-71ea-46e1-b9fc-afbd88a21c6f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] No waiting events found dispatching network-vif-unplugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.040 222021 WARNING nova.compute.manager [req-d08e7c35-8b49-4f81-ac5f-ce45f2f20056 req-4ce60529-71ea-46e1-b9fc-afbd88a21c6f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Received unexpected event network-vif-unplugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.105 222021 DEBUG nova.storage.rbd_utils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] flattening images/1f42e74d-2b40-42bc-b3a7-8f4aab6f6bdd flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:21 np0005593233 nova_compute[222017]: 2026-01-23 09:52:21.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:52:22 np0005593233 nova_compute[222017]: 2026-01-23 09:52:22.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:22 np0005593233 nova_compute[222017]: 2026-01-23 09:52:22.422 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:22 np0005593233 nova_compute[222017]: 2026-01-23 09:52:22.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:22 np0005593233 nova_compute[222017]: 2026-01-23 09:52:22.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:22 np0005593233 nova_compute[222017]: 2026-01-23 09:52:22.423 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:52:22 np0005593233 nova_compute[222017]: 2026-01-23 09:52:22.424 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:52:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:22.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:22.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:22 np0005593233 nova_compute[222017]: 2026-01-23 09:52:22.635 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:22 np0005593233 nova_compute[222017]: 2026-01-23 09:52:22.654 222021 DEBUG nova.storage.rbd_utils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] removing snapshot(ad9d822aad3148d49006c8f40cf52255) on rbd image(2821a66b-54cd-4ffc-9b8f-317909716a0c_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:52:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:52:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1097509369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:52:22 np0005593233 nova_compute[222017]: 2026-01-23 09:52:22.919 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:52:23 np0005593233 nova_compute[222017]: 2026-01-23 09:52:23.872 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000048 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:52:23 np0005593233 nova_compute[222017]: 2026-01-23 09:52:23.872 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000048 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:52:23 np0005593233 nova_compute[222017]: 2026-01-23 09:52:23.875 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000004c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:52:23 np0005593233 nova_compute[222017]: 2026-01-23 09:52:23.876 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000004c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.024 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.025 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4431MB free_disk=20.830829620361328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.025 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.025 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e249 e249: 3 total, 3 up, 3 in
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.220 222021 DEBUG nova.compute.manager [req-b878993e-7ea1-4460-97d4-ae4c679a1207 req-74cd2e6d-ffff-46fb-9759-0c305a32fc85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Received event network-vif-plugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.221 222021 DEBUG oslo_concurrency.lockutils [req-b878993e-7ea1-4460-97d4-ae4c679a1207 req-74cd2e6d-ffff-46fb-9759-0c305a32fc85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.221 222021 DEBUG oslo_concurrency.lockutils [req-b878993e-7ea1-4460-97d4-ae4c679a1207 req-74cd2e6d-ffff-46fb-9759-0c305a32fc85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.222 222021 DEBUG oslo_concurrency.lockutils [req-b878993e-7ea1-4460-97d4-ae4c679a1207 req-74cd2e6d-ffff-46fb-9759-0c305a32fc85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.222 222021 DEBUG nova.compute.manager [req-b878993e-7ea1-4460-97d4-ae4c679a1207 req-74cd2e6d-ffff-46fb-9759-0c305a32fc85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] No waiting events found dispatching network-vif-plugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.222 222021 WARNING nova.compute.manager [req-b878993e-7ea1-4460-97d4-ae4c679a1207 req-74cd2e6d-ffff-46fb-9759-0c305a32fc85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Received unexpected event network-vif-plugged-0190cadd-3cd8-481a-b2c9-bc91136daab2 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.235 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance d0bb0470-cc5c-4b6f-be0d-20839267c340 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.235 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 2821a66b-54cd-4ffc-9b8f-317909716a0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.236 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.236 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.424 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:52:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:24.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.562 222021 DEBUG nova.storage.rbd_utils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] creating snapshot(snap) on rbd image(1f42e74d-2b40-42bc-b3a7-8f4aab6f6bdd) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:52:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:24.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:52:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3378454074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.922 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.929 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:52:24 np0005593233 nova_compute[222017]: 2026-01-23 09:52:24.960 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:52:25 np0005593233 nova_compute[222017]: 2026-01-23 09:52:25.002 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:52:25 np0005593233 nova_compute[222017]: 2026-01-23 09:52:25.002 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:25 np0005593233 nova_compute[222017]: 2026-01-23 09:52:25.009 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e250 e250: 3 total, 3 up, 3 in
Jan 23 04:52:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:26 np0005593233 nova_compute[222017]: 2026-01-23 09:52:26.002 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:26 np0005593233 nova_compute[222017]: 2026-01-23 09:52:26.002 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:52:26 np0005593233 nova_compute[222017]: 2026-01-23 09:52:26.003 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:52:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:26.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:26.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:27 np0005593233 nova_compute[222017]: 2026-01-23 09:52:27.637 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:28 np0005593233 nova_compute[222017]: 2026-01-23 09:52:28.007 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:52:28 np0005593233 nova_compute[222017]: 2026-01-23 09:52:28.007 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:52:28 np0005593233 nova_compute[222017]: 2026-01-23 09:52:28.007 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:52:28 np0005593233 nova_compute[222017]: 2026-01-23 09:52:28.008 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d0bb0470-cc5c-4b6f-be0d-20839267c340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:52:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:28.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:28.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:29 np0005593233 podman[252630]: 2026-01-23 09:52:29.136457381 +0000 UTC m=+0.136730413 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:52:29 np0005593233 nova_compute[222017]: 2026-01-23 09:52:29.285 222021 INFO nova.virt.libvirt.driver [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Snapshot image upload complete#033[00m
Jan 23 04:52:29 np0005593233 nova_compute[222017]: 2026-01-23 09:52:29.286 222021 DEBUG nova.compute.manager [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:52:29 np0005593233 nova_compute[222017]: 2026-01-23 09:52:29.393 222021 INFO nova.compute.manager [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Shelve offloading#033[00m
Jan 23 04:52:29 np0005593233 nova_compute[222017]: 2026-01-23 09:52:29.401 222021 INFO nova.virt.libvirt.driver [-] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Instance destroyed successfully.#033[00m
Jan 23 04:52:29 np0005593233 nova_compute[222017]: 2026-01-23 09:52:29.402 222021 DEBUG nova.compute.manager [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:52:29 np0005593233 nova_compute[222017]: 2026-01-23 09:52:29.404 222021 DEBUG oslo_concurrency.lockutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:52:29 np0005593233 nova_compute[222017]: 2026-01-23 09:52:29.404 222021 DEBUG oslo_concurrency.lockutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:52:29 np0005593233 nova_compute[222017]: 2026-01-23 09:52:29.404 222021 DEBUG nova.network.neutron [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:52:30 np0005593233 nova_compute[222017]: 2026-01-23 09:52:30.011 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e251 e251: 3 total, 3 up, 3 in
Jan 23 04:52:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:30.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:30.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:31 np0005593233 nova_compute[222017]: 2026-01-23 09:52:31.687 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161936.686127, 2821a66b-54cd-4ffc-9b8f-317909716a0c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:52:31 np0005593233 nova_compute[222017]: 2026-01-23 09:52:31.688 222021 INFO nova.compute.manager [-] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:52:31 np0005593233 nova_compute[222017]: 2026-01-23 09:52:31.739 222021 DEBUG nova.compute.manager [None req-94289acc-359c-44db-90d2-38d8a385fedc - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:52:31 np0005593233 nova_compute[222017]: 2026-01-23 09:52:31.743 222021 DEBUG nova.compute.manager [None req-94289acc-359c-44db-90d2-38d8a385fedc - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:52:31 np0005593233 nova_compute[222017]: 2026-01-23 09:52:31.779 222021 INFO nova.compute.manager [None req-94289acc-359c-44db-90d2-38d8a385fedc - - - - - -] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Jan 23 04:52:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:32.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:32.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:32 np0005593233 nova_compute[222017]: 2026-01-23 09:52:32.639 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:32 np0005593233 nova_compute[222017]: 2026-01-23 09:52:32.756 222021 DEBUG nova.network.neutron [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Updating instance_info_cache with network_info: [{"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:52:32 np0005593233 nova_compute[222017]: 2026-01-23 09:52:32.787 222021 DEBUG oslo_concurrency.lockutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:52:33 np0005593233 nova_compute[222017]: 2026-01-23 09:52:33.506 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:33 np0005593233 nova_compute[222017]: 2026-01-23 09:52:33.982 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [{"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:52:34 np0005593233 nova_compute[222017]: 2026-01-23 09:52:34.004 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-d0bb0470-cc5c-4b6f-be0d-20839267c340" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:52:34 np0005593233 nova_compute[222017]: 2026-01-23 09:52:34.005 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:52:34 np0005593233 nova_compute[222017]: 2026-01-23 09:52:34.005 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:34 np0005593233 nova_compute[222017]: 2026-01-23 09:52:34.005 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:34.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:34.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:35 np0005593233 nova_compute[222017]: 2026-01-23 09:52:35.013 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.222672) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955222883, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 1520, "num_deletes": 256, "total_data_size": 3322248, "memory_usage": 3376232, "flush_reason": "Manual Compaction"}
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955450729, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 2171668, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41938, "largest_seqno": 43453, "table_properties": {"data_size": 2165256, "index_size": 3547, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14311, "raw_average_key_size": 20, "raw_value_size": 2151977, "raw_average_value_size": 3022, "num_data_blocks": 156, "num_entries": 712, "num_filter_entries": 712, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 1769161840, "file_creation_time": 1769161955, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 228102 microseconds, and 10564 cpu microseconds.
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.450787) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 2171668 bytes OK
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.450811) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.462977) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.463040) EVENT_LOG_v1 {"time_micros": 1769161955463027, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.463068) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 3315000, prev total WAL file size 3315000, number of live WAL files 2.
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.464471) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031323534' seq:72057594037927935, type:22 .. '6C6F676D0031353035' seq:0, type:0; will stop at (end)
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(2120KB)], [81(8810KB)]
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955464517, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 11193169, "oldest_snapshot_seqno": -1}
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6689 keys, 11052549 bytes, temperature: kUnknown
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955692501, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 11052549, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11006962, "index_size": 27747, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16773, "raw_key_size": 172003, "raw_average_key_size": 25, "raw_value_size": 10886281, "raw_average_value_size": 1627, "num_data_blocks": 1110, "num_entries": 6689, "num_filter_entries": 6689, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161955, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.692831) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 11052549 bytes
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.694764) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 49.1 rd, 48.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 8.6 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(10.2) write-amplify(5.1) OK, records in: 7220, records dropped: 531 output_compression: NoCompression
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.694795) EVENT_LOG_v1 {"time_micros": 1769161955694785, "job": 50, "event": "compaction_finished", "compaction_time_micros": 228092, "compaction_time_cpu_micros": 28590, "output_level": 6, "num_output_files": 1, "total_output_size": 11052549, "num_input_records": 7220, "num_output_records": 6689, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955695331, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955697051, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.464363) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.697292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.697298) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.697300) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.697302) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:52:35.697303) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.288 222021 INFO nova.virt.libvirt.driver [-] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Instance destroyed successfully.#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.289 222021 DEBUG nova.objects.instance [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'resources' on Instance uuid 2821a66b-54cd-4ffc-9b8f-317909716a0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.422 222021 DEBUG nova.virt.libvirt.vif [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:51:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-840112235',display_name='tempest-DeleteServersTestJSON-server-840112235',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-840112235',id=76,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:51:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-l10ffb6i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member',shelved_at='2026-01-23T09:52:29.285963',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='1f42e74d-2b40-42bc-b3a7-8f4aab6f6bdd'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:52:18Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=2821a66b-54cd-4ffc-9b8f-317909716a0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.422 222021 DEBUG nova.network.os_vif_util [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0190cadd-3c", "ovs_interfaceid": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.423 222021 DEBUG nova.network.os_vif_util [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:e8:95,bridge_name='br-int',has_traffic_filtering=True,id=0190cadd-3cd8-481a-b2c9-bc91136daab2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0190cadd-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.424 222021 DEBUG os_vif [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:e8:95,bridge_name='br-int',has_traffic_filtering=True,id=0190cadd-3cd8-481a-b2c9-bc91136daab2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0190cadd-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.426 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.427 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0190cadd-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.430 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.432 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.435 222021 INFO os_vif [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:e8:95,bridge_name='br-int',has_traffic_filtering=True,id=0190cadd-3cd8-481a-b2c9-bc91136daab2,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0190cadd-3c')#033[00m
Jan 23 04:52:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:36.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:36.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.901 222021 DEBUG nova.compute.manager [req-6b5d91b5-426e-432a-966e-8e41d507e7f7 req-d0be0675-f0ea-4d70-869c-40b86f7511a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Received event network-changed-0190cadd-3cd8-481a-b2c9-bc91136daab2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.901 222021 DEBUG nova.compute.manager [req-6b5d91b5-426e-432a-966e-8e41d507e7f7 req-d0be0675-f0ea-4d70-869c-40b86f7511a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Refreshing instance network info cache due to event network-changed-0190cadd-3cd8-481a-b2c9-bc91136daab2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.902 222021 DEBUG oslo_concurrency.lockutils [req-6b5d91b5-426e-432a-966e-8e41d507e7f7 req-d0be0675-f0ea-4d70-869c-40b86f7511a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.902 222021 DEBUG oslo_concurrency.lockutils [req-6b5d91b5-426e-432a-966e-8e41d507e7f7 req-d0be0675-f0ea-4d70-869c-40b86f7511a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:52:36 np0005593233 nova_compute[222017]: 2026-01-23 09:52:36.902 222021 DEBUG nova.network.neutron [req-6b5d91b5-426e-432a-966e-8e41d507e7f7 req-d0be0675-f0ea-4d70-869c-40b86f7511a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Refreshing network info cache for port 0190cadd-3cd8-481a-b2c9-bc91136daab2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:52:37 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #47. Immutable memtables: 4.
Jan 23 04:52:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:38.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:38 np0005593233 nova_compute[222017]: 2026-01-23 09:52:38.569 222021 INFO nova.virt.libvirt.driver [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Deleting instance files /var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c_del#033[00m
Jan 23 04:52:38 np0005593233 nova_compute[222017]: 2026-01-23 09:52:38.571 222021 INFO nova.virt.libvirt.driver [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Deletion of /var/lib/nova/instances/2821a66b-54cd-4ffc-9b8f-317909716a0c_del complete#033[00m
Jan 23 04:52:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:38.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:38 np0005593233 nova_compute[222017]: 2026-01-23 09:52:38.777 222021 INFO nova.scheduler.client.report [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Deleted allocations for instance 2821a66b-54cd-4ffc-9b8f-317909716a0c#033[00m
Jan 23 04:52:38 np0005593233 nova_compute[222017]: 2026-01-23 09:52:38.918 222021 DEBUG oslo_concurrency.lockutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:38 np0005593233 nova_compute[222017]: 2026-01-23 09:52:38.919 222021 DEBUG oslo_concurrency.lockutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:39 np0005593233 nova_compute[222017]: 2026-01-23 09:52:39.005 222021 DEBUG oslo_concurrency.processutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:52:39 np0005593233 nova_compute[222017]: 2026-01-23 09:52:39.382 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:52:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3430639485' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:52:39 np0005593233 nova_compute[222017]: 2026-01-23 09:52:39.487 222021 DEBUG oslo_concurrency.processutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:52:39 np0005593233 nova_compute[222017]: 2026-01-23 09:52:39.495 222021 DEBUG nova.compute.provider_tree [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:52:39 np0005593233 nova_compute[222017]: 2026-01-23 09:52:39.533 222021 DEBUG nova.scheduler.client.report [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:52:39 np0005593233 nova_compute[222017]: 2026-01-23 09:52:39.610 222021 DEBUG oslo_concurrency.lockutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:39 np0005593233 nova_compute[222017]: 2026-01-23 09:52:39.747 222021 DEBUG oslo_concurrency.lockutils [None req-300b7d18-84b7-46eb-974c-8dd9d621b652 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "2821a66b-54cd-4ffc-9b8f-317909716a0c" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 36.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:40 np0005593233 nova_compute[222017]: 2026-01-23 09:52:40.015 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:40.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:40.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:41 np0005593233 nova_compute[222017]: 2026-01-23 09:52:41.430 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:41 np0005593233 nova_compute[222017]: 2026-01-23 09:52:41.948 222021 DEBUG nova.network.neutron [req-6b5d91b5-426e-432a-966e-8e41d507e7f7 req-d0be0675-f0ea-4d70-869c-40b86f7511a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Updated VIF entry in instance network info cache for port 0190cadd-3cd8-481a-b2c9-bc91136daab2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:52:41 np0005593233 nova_compute[222017]: 2026-01-23 09:52:41.949 222021 DEBUG nova.network.neutron [req-6b5d91b5-426e-432a-966e-8e41d507e7f7 req-d0be0675-f0ea-4d70-869c-40b86f7511a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2821a66b-54cd-4ffc-9b8f-317909716a0c] Updating instance_info_cache with network_info: [{"id": "0190cadd-3cd8-481a-b2c9-bc91136daab2", "address": "fa:16:3e:fa:e8:95", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": null, "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap0190cadd-3c", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:52:41 np0005593233 nova_compute[222017]: 2026-01-23 09:52:41.989 222021 DEBUG oslo_concurrency.lockutils [req-6b5d91b5-426e-432a-966e-8e41d507e7f7 req-d0be0675-f0ea-4d70-869c-40b86f7511a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-2821a66b-54cd-4ffc-9b8f-317909716a0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:52:42 np0005593233 podman[252698]: 2026-01-23 09:52:42.058033562 +0000 UTC m=+0.066697871 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 04:52:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e252 e252: 3 total, 3 up, 3 in
Jan 23 04:52:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:42.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:42.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:42.654 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:42.655 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:42.655 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:44.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:52:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/534762731' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:52:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:52:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/534762731' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:52:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:44.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:45 np0005593233 nova_compute[222017]: 2026-01-23 09:52:45.049 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:46 np0005593233 nova_compute[222017]: 2026-01-23 09:52:46.434 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:46.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:46.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:48.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:48.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:50 np0005593233 nova_compute[222017]: 2026-01-23 09:52:50.063 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e253 e253: 3 total, 3 up, 3 in
Jan 23 04:52:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:50.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:50.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:51 np0005593233 nova_compute[222017]: 2026-01-23 09:52:51.437 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:52.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:52.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.328 222021 DEBUG oslo_concurrency.lockutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.329 222021 DEBUG oslo_concurrency.lockutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.330 222021 DEBUG oslo_concurrency.lockutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.330 222021 DEBUG oslo_concurrency.lockutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.330 222021 DEBUG oslo_concurrency.lockutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.332 222021 INFO nova.compute.manager [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Terminating instance#033[00m
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.333 222021 DEBUG nova.compute.manager [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:52:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:54.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:54.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:54 np0005593233 kernel: tape137f0ac-14 (unregistering): left promiscuous mode
Jan 23 04:52:54 np0005593233 NetworkManager[48871]: <info>  [1769161974.9324] device (tape137f0ac-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.952 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:52:54Z|00267|binding|INFO|Releasing lport e137f0ac-1409-48af-9c44-4d589d8b9bf9 from this chassis (sb_readonly=0)
Jan 23 04:52:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:52:54Z|00268|binding|INFO|Setting lport e137f0ac-1409-48af-9c44-4d589d8b9bf9 down in Southbound
Jan 23 04:52:54 np0005593233 ovn_controller[130653]: 2026-01-23T09:52:54Z|00269|binding|INFO|Removing iface tape137f0ac-14 ovn-installed in OVS
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.956 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:54.962 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:14:59 10.100.0.5'], port_security=['fa:16:3e:60:14:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd0bb0470-cc5c-4b6f-be0d-20839267c340', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0547d145-6526-47bb-a492-48772f700715', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=e137f0ac-1409-48af-9c44-4d589d8b9bf9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:54.964 140224 INFO neutron.agent.ovn.metadata.agent [-] Port e137f0ac-1409-48af-9c44-4d589d8b9bf9 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 unbound from our chassis#033[00m
Jan 23 04:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:54.966 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:54.968 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b7420f-061b-4829-8d56-c132224e34f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:54.970 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 namespace which is not needed anymore#033[00m
Jan 23 04:52:54 np0005593233 nova_compute[222017]: 2026-01-23 09:52:54.977 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:54 np0005593233 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 23 04:52:54 np0005593233 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000048.scope: Consumed 19.726s CPU time.
Jan 23 04:52:54 np0005593233 systemd-machined[190954]: Machine qemu-36-instance-00000048 terminated.
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.065 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.098 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.099 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.179 222021 INFO nova.virt.libvirt.driver [-] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Instance destroyed successfully.#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.180 222021 DEBUG nova.objects.instance [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'resources' on Instance uuid d0bb0470-cc5c-4b6f-be0d-20839267c340 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:52:55 np0005593233 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[250816]: [NOTICE]   (250820) : haproxy version is 2.8.14-c23fe91
Jan 23 04:52:55 np0005593233 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[250816]: [NOTICE]   (250820) : path to executable is /usr/sbin/haproxy
Jan 23 04:52:55 np0005593233 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[250816]: [WARNING]  (250820) : Exiting Master process...
Jan 23 04:52:55 np0005593233 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[250816]: [ALERT]    (250820) : Current worker (250822) exited with code 143 (Terminated)
Jan 23 04:52:55 np0005593233 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[250816]: [WARNING]  (250820) : All workers exited. Exiting... (0)
Jan 23 04:52:55 np0005593233 systemd[1]: libpod-cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b.scope: Deactivated successfully.
Jan 23 04:52:55 np0005593233 podman[252744]: 2026-01-23 09:52:55.211543213 +0000 UTC m=+0.133687939 container died cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 04:52:55 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b-userdata-shm.mount: Deactivated successfully.
Jan 23 04:52:55 np0005593233 systemd[1]: var-lib-containers-storage-overlay-b400941d92e4d10aafea7a67e56f5e99475e7147881026c4e913ac4c54e791ed-merged.mount: Deactivated successfully.
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.342 222021 DEBUG nova.virt.libvirt.vif [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-687494396',display_name='tempest-tempest.common.compute-instance-687494396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-687494396',id=72,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBISF6L8g87ZfxLrm8Wwm+gzemsck5aetIhd8gCsjpNrTc2Fv/no3h23xzReyi9tgvOePkWLat/BN4ukRmY5i9SKOoCvqi25H2ncCjSqcqS+cT6X1PkedlTAGxBrEwc2adg==',key_name='tempest-keypair-1775870371',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-ab0xdcxv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:50:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=d0bb0470-cc5c-4b6f-be0d-20839267c340,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.343 222021 DEBUG nova.network.os_vif_util [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "address": "fa:16:3e:60:14:59", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape137f0ac-14", "ovs_interfaceid": "e137f0ac-1409-48af-9c44-4d589d8b9bf9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.344 222021 DEBUG nova.network.os_vif_util [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:14:59,bridge_name='br-int',has_traffic_filtering=True,id=e137f0ac-1409-48af-9c44-4d589d8b9bf9,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape137f0ac-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.345 222021 DEBUG os_vif [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:14:59,bridge_name='br-int',has_traffic_filtering=True,id=e137f0ac-1409-48af-9c44-4d589d8b9bf9,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape137f0ac-14') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.347 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.347 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape137f0ac-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.349 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.353 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.356 222021 INFO os_vif [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:14:59,bridge_name='br-int',has_traffic_filtering=True,id=e137f0ac-1409-48af-9c44-4d589d8b9bf9,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape137f0ac-14')#033[00m
Jan 23 04:52:55 np0005593233 podman[252744]: 2026-01-23 09:52:55.375372175 +0000 UTC m=+0.297516901 container cleanup cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 04:52:55 np0005593233 systemd[1]: libpod-conmon-cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b.scope: Deactivated successfully.
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.486 222021 DEBUG nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:52:55 np0005593233 podman[252799]: 2026-01-23 09:52:55.630460746 +0000 UTC m=+0.222009325 container remove cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:52:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:55.639 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ad29269e-5363-49cf-80c4-691bc2d21e6a]: (4, ('Fri Jan 23 09:52:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 (cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b)\ncc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b\nFri Jan 23 09:52:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 (cc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b)\ncc73c4e7b9695caa94b0bb013b84a08287555dea9dad399b9b7309daec29d26b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:55.642 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2c77e8b8-acab-460a-b9f6-35afe5d28d68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:55.643 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:52:55 np0005593233 kernel: tap7808328e-20: left promiscuous mode
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.646 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.657 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.657 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.660 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:55.663 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0cd0b7-1c12-4f7b-887e-a96ad4a00f11]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.667 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.668 222021 INFO nova.compute.claims [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:52:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:55.681 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[706604db-9be4-44e9-a0ed-cd368968ed31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:55.683 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[61d88db3-af3f-4bad-861e-704b2642a1ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:55.704 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[93d16593-c012-4d49-8ef3-e421768c6575]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577101, 'reachable_time': 17694, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252817, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:55.707 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:52:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:52:55.707 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[60c72f1a-390a-4acd-a7a4-f83952a7dddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:52:55 np0005593233 systemd[1]: run-netns-ovnmeta\x2d7808328e\x2d22f9\x2d46df\x2dac06\x2df8c3d6ad10c4.mount: Deactivated successfully.
Jan 23 04:52:55 np0005593233 nova_compute[222017]: 2026-01-23 09:52:55.933 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:52:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:52:56 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1673121277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:52:56 np0005593233 nova_compute[222017]: 2026-01-23 09:52:56.455 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:52:56 np0005593233 nova_compute[222017]: 2026-01-23 09:52:56.463 222021 DEBUG nova.compute.provider_tree [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:52:56 np0005593233 nova_compute[222017]: 2026-01-23 09:52:56.509 222021 DEBUG nova.scheduler.client.report [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:52:56 np0005593233 nova_compute[222017]: 2026-01-23 09:52:56.554 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:56 np0005593233 nova_compute[222017]: 2026-01-23 09:52:56.555 222021 DEBUG nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:52:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:56.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:56 np0005593233 nova_compute[222017]: 2026-01-23 09:52:56.640 222021 DEBUG nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:52:56 np0005593233 nova_compute[222017]: 2026-01-23 09:52:56.640 222021 DEBUG nova.network.neutron [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:52:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:56.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:56 np0005593233 nova_compute[222017]: 2026-01-23 09:52:56.765 222021 INFO nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:52:56 np0005593233 nova_compute[222017]: 2026-01-23 09:52:56.820 222021 DEBUG nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.017 222021 DEBUG nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.018 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.019 222021 INFO nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Creating image(s)#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.050 222021 DEBUG nova.storage.rbd_utils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.086 222021 DEBUG nova.storage.rbd_utils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.122 222021 DEBUG nova.storage.rbd_utils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.128 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.182 222021 DEBUG nova.policy [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28a7a778c8ab486fb586e81bb84113be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61df91981c55482fa5c9a64686c79f9e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.198 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.199 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.199 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.200 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.324 222021 DEBUG nova.storage.rbd_utils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.329 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.765 222021 DEBUG nova.compute.manager [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-unplugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.765 222021 DEBUG oslo_concurrency.lockutils [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.766 222021 DEBUG oslo_concurrency.lockutils [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.766 222021 DEBUG oslo_concurrency.lockutils [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.766 222021 DEBUG nova.compute.manager [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] No waiting events found dispatching network-vif-unplugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.767 222021 DEBUG nova.compute.manager [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-unplugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.767 222021 DEBUG nova.compute.manager [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-plugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.767 222021 DEBUG oslo_concurrency.lockutils [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.767 222021 DEBUG oslo_concurrency.lockutils [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.767 222021 DEBUG oslo_concurrency.lockutils [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.768 222021 DEBUG nova.compute.manager [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] No waiting events found dispatching network-vif-plugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:52:57 np0005593233 nova_compute[222017]: 2026-01-23 09:52:57.768 222021 WARNING nova.compute.manager [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received unexpected event network-vif-plugged-e137f0ac-1409-48af-9c44-4d589d8b9bf9 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.048 222021 INFO nova.virt.libvirt.driver [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Deleting instance files /var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340_del#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.050 222021 INFO nova.virt.libvirt.driver [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Deletion of /var/lib/nova/instances/d0bb0470-cc5c-4b6f-be0d-20839267c340_del complete#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.173 222021 INFO nova.compute.manager [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Took 3.84 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.173 222021 DEBUG oslo.service.loopingcall [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.175 222021 DEBUG nova.compute.manager [-] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.175 222021 DEBUG nova.network.neutron [-] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.179 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.851s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.260 222021 DEBUG nova.storage.rbd_utils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] resizing rbd image ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.415 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:52:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:58.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.644 222021 DEBUG nova.network.neutron [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Successfully created port: 55780789-5847-4bbc-9da1-92ba78a99ce9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:52:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:52:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:58.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.654 222021 DEBUG nova.objects.instance [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'migration_context' on Instance uuid ce72ab48-cfbc-4521-8957-02ad1b5712e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.693 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.694 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Ensure instance console log exists: /var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.694 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.695 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:58 np0005593233 nova_compute[222017]: 2026-01-23 09:52:58.695 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e254 e254: 3 total, 3 up, 3 in
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.067 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:00 np0005593233 podman[253007]: 2026-01-23 09:53:00.092217813 +0000 UTC m=+0.104610044 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:53:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:00.224 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.223 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:00.225 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.226 222021 DEBUG nova.network.neutron [-] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.306 222021 INFO nova.compute.manager [-] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Took 2.13 seconds to deallocate network for instance.#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.350 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.393 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.491 222021 WARNING nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 0 instances on the hypervisor.#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.492 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid ce72ab48-cfbc-4521-8957-02ad1b5712e6 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.492 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid d0bb0470-cc5c-4b6f-be0d-20839267c340 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.493 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.493 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "d0bb0470-cc5c-4b6f-be0d-20839267c340" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.531 222021 DEBUG oslo_concurrency.lockutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.532 222021 DEBUG oslo_concurrency.lockutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:00.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.609 222021 DEBUG nova.compute.manager [req-8fd57fac-578b-473e-9254-f41231328cf2 req-97a8a0b0-2ec3-4967-9d43-bc2f43ca9713 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Received event network-vif-deleted-e137f0ac-1409-48af-9c44-4d589d8b9bf9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:00.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.695 222021 DEBUG oslo_concurrency.processutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.830 222021 DEBUG nova.network.neutron [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Successfully updated port: 55780789-5847-4bbc-9da1-92ba78a99ce9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.861 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-ce72ab48-cfbc-4521-8957-02ad1b5712e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.862 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-ce72ab48-cfbc-4521-8957-02ad1b5712e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:53:00 np0005593233 nova_compute[222017]: 2026-01-23 09:53:00.862 222021 DEBUG nova.network.neutron [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:53:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:53:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/458469576' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.207 222021 DEBUG nova.compute.manager [req-631c8bdd-4556-4f9c-8ef0-6279c258604d req-ac829e46-cf64-4570-b4e3-611d6ea5b787 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Received event network-changed-55780789-5847-4bbc-9da1-92ba78a99ce9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.208 222021 DEBUG nova.compute.manager [req-631c8bdd-4556-4f9c-8ef0-6279c258604d req-ac829e46-cf64-4570-b4e3-611d6ea5b787 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Refreshing instance network info cache due to event network-changed-55780789-5847-4bbc-9da1-92ba78a99ce9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.208 222021 DEBUG oslo_concurrency.lockutils [req-631c8bdd-4556-4f9c-8ef0-6279c258604d req-ac829e46-cf64-4570-b4e3-611d6ea5b787 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ce72ab48-cfbc-4521-8957-02ad1b5712e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.233 222021 DEBUG oslo_concurrency.processutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.241 222021 DEBUG nova.compute.provider_tree [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.272 222021 DEBUG nova.scheduler.client.report [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.338 222021 DEBUG oslo_concurrency.lockutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.435 222021 INFO nova.scheduler.client.report [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Deleted allocations for instance d0bb0470-cc5c-4b6f-be0d-20839267c340#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.510 222021 DEBUG nova.network.neutron [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.576 222021 DEBUG oslo_concurrency.lockutils [None req-0d447104-e6c2-437a-b687-8d7286d1985f 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.578 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.579 222021 INFO nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Jan 23 04:53:01 np0005593233 nova_compute[222017]: 2026-01-23 09:53:01.579 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "d0bb0470-cc5c-4b6f-be0d-20839267c340" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:02.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:02.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:53:03 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #85. Immutable memtables: 0.
Jan 23 04:53:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:03.964429) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:53:03 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 51] Flushing memtable with next log file: 85
Jan 23 04:53:03 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161983964549, "job": 51, "event": "flush_started", "num_memtables": 1, "num_entries": 578, "num_deletes": 252, "total_data_size": 851778, "memory_usage": 863696, "flush_reason": "Manual Compaction"}
Jan 23 04:53:03 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 51] Level-0 flush table #86: started
Jan 23 04:53:03 np0005593233 nova_compute[222017]: 2026-01-23 09:53:03.982 222021 DEBUG nova.network.neutron [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Updating instance_info_cache with network_info: [{"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.031 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-ce72ab48-cfbc-4521-8957-02ad1b5712e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.031 222021 DEBUG nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Instance network_info: |[{"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.031 222021 DEBUG oslo_concurrency.lockutils [req-631c8bdd-4556-4f9c-8ef0-6279c258604d req-ac829e46-cf64-4570-b4e3-611d6ea5b787 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ce72ab48-cfbc-4521-8957-02ad1b5712e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.032 222021 DEBUG nova.network.neutron [req-631c8bdd-4556-4f9c-8ef0-6279c258604d req-ac829e46-cf64-4570-b4e3-611d6ea5b787 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Refreshing network info cache for port 55780789-5847-4bbc-9da1-92ba78a99ce9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.035 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Start _get_guest_xml network_info=[{"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.042 222021 WARNING nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.047 222021 DEBUG nova.virt.libvirt.host [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.048 222021 DEBUG nova.virt.libvirt.host [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.060 222021 DEBUG nova.virt.libvirt.host [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.061 222021 DEBUG nova.virt.libvirt.host [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.062 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.062 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.063 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.063 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.063 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.064 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.064 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.064 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.064 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.064 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.065 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.065 222021 DEBUG nova.virt.hardware [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.069 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161984122800, "cf_name": "default", "job": 51, "event": "table_file_creation", "file_number": 86, "file_size": 561940, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43458, "largest_seqno": 44031, "table_properties": {"data_size": 558918, "index_size": 994, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7458, "raw_average_key_size": 19, "raw_value_size": 552672, "raw_average_value_size": 1458, "num_data_blocks": 43, "num_entries": 379, "num_filter_entries": 379, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161956, "oldest_key_time": 1769161956, "file_creation_time": 1769161983, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 86, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 51] Flush lasted 158390 microseconds, and 5776 cpu microseconds.
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.122852) [db/flush_job.cc:967] [default] [JOB 51] Level-0 flush table #86: 561940 bytes OK
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.122874) [db/memtable_list.cc:519] [default] Level-0 commit table #86 started
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.150078) [db/memtable_list.cc:722] [default] Level-0 commit table #86: memtable #1 done
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.150153) EVENT_LOG_v1 {"time_micros": 1769161984150141, "job": 51, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.150178) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 51] Try to delete WAL files size 848428, prev total WAL file size 848428, number of live WAL files 2.
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000082.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.151395) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033353134' seq:72057594037927935, type:22 .. '7061786F730033373636' seq:0, type:0; will stop at (end)
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 52] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 51 Base level 0, inputs: [86(548KB)], [84(10MB)]
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161984151471, "job": 52, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [86], "files_L6": [84], "score": -1, "input_data_size": 11614489, "oldest_snapshot_seqno": -1}
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 52] Generated table #87: 6548 keys, 9739536 bytes, temperature: kUnknown
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161984432276, "cf_name": "default", "job": 52, "event": "table_file_creation", "file_number": 87, "file_size": 9739536, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9696089, "index_size": 25991, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 169821, "raw_average_key_size": 25, "raw_value_size": 9578882, "raw_average_value_size": 1462, "num_data_blocks": 1028, "num_entries": 6548, "num_filter_entries": 6548, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769161984, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 87, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1487668298' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.438618) [db/compaction/compaction_job.cc:1663] [default] [JOB 52] Compacted 1@0 + 1@6 files to L6 => 9739536 bytes
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.542448) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 41.3 rd, 34.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 10.5 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(38.0) write-amplify(17.3) OK, records in: 7068, records dropped: 520 output_compression: NoCompression
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.542504) EVENT_LOG_v1 {"time_micros": 1769161984542485, "job": 52, "event": "compaction_finished", "compaction_time_micros": 280894, "compaction_time_cpu_micros": 26718, "output_level": 6, "num_output_files": 1, "total_output_size": 9739536, "num_input_records": 7068, "num_output_records": 6548, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000086.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161984543093, "job": 52, "event": "table_file_deletion", "file_number": 86}
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000084.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161984545294, "job": 52, "event": "table_file_deletion", "file_number": 84}
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.151249) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.545362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.545369) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.545371) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.545372) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:53:04.545373) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.564 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:04.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.594 222021 DEBUG nova.storage.rbd_utils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:53:04 np0005593233 nova_compute[222017]: 2026-01-23 09:53:04.599 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:04.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:53:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:53:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1628265169' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.049 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.051 222021 DEBUG nova.virt.libvirt.vif [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:52:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-364699491',display_name='tempest-DeleteServersTestJSON-server-364699491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-364699491',id=78,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-p4nso8qp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-9
44070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:52:56Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=ce72ab48-cfbc-4521-8957-02ad1b5712e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.052 222021 DEBUG nova.network.os_vif_util [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.052 222021 DEBUG nova.network.os_vif_util [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:6c:5e,bridge_name='br-int',has_traffic_filtering=True,id=55780789-5847-4bbc-9da1-92ba78a99ce9,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55780789-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.054 222021 DEBUG nova.objects.instance [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_devices' on Instance uuid ce72ab48-cfbc-4521-8957-02ad1b5712e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.069 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.094 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <uuid>ce72ab48-cfbc-4521-8957-02ad1b5712e6</uuid>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <name>instance-0000004e</name>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <nova:name>tempest-DeleteServersTestJSON-server-364699491</nova:name>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:53:04</nova:creationTime>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <nova:user uuid="28a7a778c8ab486fb586e81bb84113be">tempest-DeleteServersTestJSON-944070453-project-member</nova:user>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <nova:project uuid="61df91981c55482fa5c9a64686c79f9e">tempest-DeleteServersTestJSON-944070453</nova:project>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <nova:port uuid="55780789-5847-4bbc-9da1-92ba78a99ce9">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <entry name="serial">ce72ab48-cfbc-4521-8957-02ad1b5712e6</entry>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <entry name="uuid">ce72ab48-cfbc-4521-8957-02ad1b5712e6</entry>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk.config">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:ce:6c:5e"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <target dev="tap55780789-58"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6/console.log" append="off"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:53:05 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:53:05 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:53:05 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:53:05 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.095 222021 DEBUG nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Preparing to wait for external event network-vif-plugged-55780789-5847-4bbc-9da1-92ba78a99ce9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.095 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.095 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.095 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.096 222021 DEBUG nova.virt.libvirt.vif [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:52:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-364699491',display_name='tempest-DeleteServersTestJSON-server-364699491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-364699491',id=78,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-p4nso8qp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServers
TestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:52:56Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=ce72ab48-cfbc-4521-8957-02ad1b5712e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.096 222021 DEBUG nova.network.os_vif_util [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.097 222021 DEBUG nova.network.os_vif_util [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:6c:5e,bridge_name='br-int',has_traffic_filtering=True,id=55780789-5847-4bbc-9da1-92ba78a99ce9,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55780789-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.097 222021 DEBUG os_vif [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:6c:5e,bridge_name='br-int',has_traffic_filtering=True,id=55780789-5847-4bbc-9da1-92ba78a99ce9,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55780789-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.098 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.098 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.099 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.101 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.102 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55780789-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.102 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap55780789-58, col_values=(('external_ids', {'iface-id': '55780789-5847-4bbc-9da1-92ba78a99ce9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:6c:5e', 'vm-uuid': 'ce72ab48-cfbc-4521-8957-02ad1b5712e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.104 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:05 np0005593233 NetworkManager[48871]: <info>  [1769161985.1053] manager: (tap55780789-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.107 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.111 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.113 222021 INFO os_vif [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:6c:5e,bridge_name='br-int',has_traffic_filtering=True,id=55780789-5847-4bbc-9da1-92ba78a99ce9,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55780789-58')#033[00m
Jan 23 04:53:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:05.228 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.283 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.283 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.284 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No VIF found with MAC fa:16:3e:ce:6c:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.284 222021 INFO nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Using config drive#033[00m
Jan 23 04:53:05 np0005593233 nova_compute[222017]: 2026-01-23 09:53:05.316 222021 DEBUG nova.storage.rbd_utils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:53:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:06.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:06.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:07 np0005593233 nova_compute[222017]: 2026-01-23 09:53:07.027 222021 INFO nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Creating config drive at /var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6/disk.config#033[00m
Jan 23 04:53:07 np0005593233 nova_compute[222017]: 2026-01-23 09:53:07.033 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprt_bjkwv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:07 np0005593233 nova_compute[222017]: 2026-01-23 09:53:07.177 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprt_bjkwv" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:07 np0005593233 nova_compute[222017]: 2026-01-23 09:53:07.211 222021 DEBUG nova.storage.rbd_utils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:53:07 np0005593233 nova_compute[222017]: 2026-01-23 09:53:07.216 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6/disk.config ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:07 np0005593233 nova_compute[222017]: 2026-01-23 09:53:07.400 222021 DEBUG oslo_concurrency.processutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6/disk.config ce72ab48-cfbc-4521-8957-02ad1b5712e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:07 np0005593233 nova_compute[222017]: 2026-01-23 09:53:07.402 222021 INFO nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Deleting local config drive /var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6/disk.config because it was imported into RBD.#033[00m
Jan 23 04:53:07 np0005593233 kernel: tap55780789-58: entered promiscuous mode
Jan 23 04:53:07 np0005593233 NetworkManager[48871]: <info>  [1769161987.4782] manager: (tap55780789-58): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Jan 23 04:53:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:07Z|00270|binding|INFO|Claiming lport 55780789-5847-4bbc-9da1-92ba78a99ce9 for this chassis.
Jan 23 04:53:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:07Z|00271|binding|INFO|55780789-5847-4bbc-9da1-92ba78a99ce9: Claiming fa:16:3e:ce:6c:5e 10.100.0.3
Jan 23 04:53:07 np0005593233 nova_compute[222017]: 2026-01-23 09:53:07.479 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:07Z|00272|binding|INFO|Setting lport 55780789-5847-4bbc-9da1-92ba78a99ce9 ovn-installed in OVS
Jan 23 04:53:07 np0005593233 nova_compute[222017]: 2026-01-23 09:53:07.499 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:07 np0005593233 systemd-udevd[253444]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:53:07 np0005593233 systemd-machined[190954]: New machine qemu-39-instance-0000004e.
Jan 23 04:53:07 np0005593233 NetworkManager[48871]: <info>  [1769161987.5357] device (tap55780789-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:53:07 np0005593233 NetworkManager[48871]: <info>  [1769161987.5366] device (tap55780789-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:53:07 np0005593233 systemd[1]: Started Virtual Machine qemu-39-instance-0000004e.
Jan 23 04:53:08 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:08Z|00273|binding|INFO|Setting lport 55780789-5847-4bbc-9da1-92ba78a99ce9 up in Southbound
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.003 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:6c:5e 10.100.0.3'], port_security=['fa:16:3e:ce:6c:5e 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ce72ab48-cfbc-4521-8957-02ad1b5712e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=55780789-5847-4bbc-9da1-92ba78a99ce9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.005 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 55780789-5847-4bbc-9da1-92ba78a99ce9 in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 bound to our chassis#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.007 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3788149-efcd-4940-8a8f-e21af0a56a06#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.024 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[69d9bc8b-d8b6-4d79-8697-a245b67a8d9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.026 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3788149-e1 in ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.029 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3788149-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.029 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[70f578b1-aa79-46af-933f-82aaf15d31c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.030 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b7df81-c3bc-49f7-8e3f-6402b82dc7aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.045 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[e9c4a719-cd1e-4e92-83a4-e2d72f8d3a4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.074 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[24a89878-4085-4054-b3fa-2765db47cc2b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.118 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[1678fec8-bf6a-47e3-923d-aee5cfd1be5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.125 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[928391c2-3691-4dd2-8b4a-0868efe1c1ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 NetworkManager[48871]: <info>  [1769161988.1275] manager: (tapa3788149-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/136)
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.173 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[806e3ec1-dc5a-47e6-90f3-fbbd1fb16114]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.177 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ff735437-f4e5-4cf1-a21a-e1421e1d54e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 NetworkManager[48871]: <info>  [1769161988.2112] device (tapa3788149-e0): carrier: link connected
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.216 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b85a36df-b92c-4991-9894-1fab00c7d2e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.237 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f96eb4fe-5f76-45ec-9296-7ae5ba3d615d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590084, 'reachable_time': 26709, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253493, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.255 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[01f67306-f076-407e-8a8a-5a0d282fa160]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:ddff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 590084, 'tstamp': 590084}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253497, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.276 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f8833b47-962e-4009-8bff-44db66efba30]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590084, 'reachable_time': 26709, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253505, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.307 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3b52b563-d012-4931-8960-8b9582e3ef1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.378 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[34bc4bd1-5e77-4197-9bf5-a284a5fa5a45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.380 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.381 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.381 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3788149-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:08 np0005593233 NetworkManager[48871]: <info>  [1769161988.3840] manager: (tapa3788149-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Jan 23 04:53:08 np0005593233 kernel: tapa3788149-e0: entered promiscuous mode
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.384 222021 DEBUG nova.network.neutron [req-631c8bdd-4556-4f9c-8ef0-6279c258604d req-ac829e46-cf64-4570-b4e3-611d6ea5b787 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Updated VIF entry in instance network info cache for port 55780789-5847-4bbc-9da1-92ba78a99ce9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.386 222021 DEBUG nova.network.neutron [req-631c8bdd-4556-4f9c-8ef0-6279c258604d req-ac829e46-cf64-4570-b4e3-611d6ea5b787 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Updating instance_info_cache with network_info: [{"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.388 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3788149-e0, col_values=(('external_ids', {'iface-id': 'd6ce7fd1-128d-488f-94e6-68332f7a8a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.389 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:08 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:08Z|00274|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.406 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.406 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.407 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f772ce3e-7285-41fa-b154-cc516d6c36a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.409 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:53:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:08.410 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'env', 'PROCESS_TAG=haproxy-a3788149-efcd-4940-8a8f-e21af0a56a06', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3788149-efcd-4940-8a8f-e21af0a56a06.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.425 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161988.4242327, ce72ab48-cfbc-4521-8957-02ad1b5712e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.426 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] VM Started (Lifecycle Event)#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.429 222021 DEBUG oslo_concurrency.lockutils [req-631c8bdd-4556-4f9c-8ef0-6279c258604d req-ac829e46-cf64-4570-b4e3-611d6ea5b787 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ce72ab48-cfbc-4521-8957-02ad1b5712e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.480 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.487 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161988.4244635, ce72ab48-cfbc-4521-8957-02ad1b5712e6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.488 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.519 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.524 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.565 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:53:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:08.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:08.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.870 222021 DEBUG nova.compute.manager [req-759dd841-9721-4034-a32a-b90a73b329d2 req-de84075c-e759-4e99-9820-26192da37536 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Received event network-vif-plugged-55780789-5847-4bbc-9da1-92ba78a99ce9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.871 222021 DEBUG oslo_concurrency.lockutils [req-759dd841-9721-4034-a32a-b90a73b329d2 req-de84075c-e759-4e99-9820-26192da37536 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.871 222021 DEBUG oslo_concurrency.lockutils [req-759dd841-9721-4034-a32a-b90a73b329d2 req-de84075c-e759-4e99-9820-26192da37536 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.872 222021 DEBUG oslo_concurrency.lockutils [req-759dd841-9721-4034-a32a-b90a73b329d2 req-de84075c-e759-4e99-9820-26192da37536 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.872 222021 DEBUG nova.compute.manager [req-759dd841-9721-4034-a32a-b90a73b329d2 req-de84075c-e759-4e99-9820-26192da37536 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Processing event network-vif-plugged-55780789-5847-4bbc-9da1-92ba78a99ce9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.873 222021 DEBUG nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.876 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769161988.876057, ce72ab48-cfbc-4521-8957-02ad1b5712e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.876 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.879 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.882 222021 INFO nova.virt.libvirt.driver [-] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Instance spawned successfully.#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.883 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.911 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.916 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.917 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.917 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.918 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.918 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.918 222021 DEBUG nova.virt.libvirt.driver [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.924 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:53:08 np0005593233 podman[253554]: 2026-01-23 09:53:08.833157728 +0000 UTC m=+0.032893483 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:53:08 np0005593233 nova_compute[222017]: 2026-01-23 09:53:08.985 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:53:09 np0005593233 nova_compute[222017]: 2026-01-23 09:53:09.013 222021 INFO nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Took 12.00 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:53:09 np0005593233 nova_compute[222017]: 2026-01-23 09:53:09.014 222021 DEBUG nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:09 np0005593233 nova_compute[222017]: 2026-01-23 09:53:09.117 222021 INFO nova.compute.manager [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Took 13.52 seconds to build instance.#033[00m
Jan 23 04:53:09 np0005593233 nova_compute[222017]: 2026-01-23 09:53:09.143 222021 DEBUG oslo_concurrency.lockutils [None req-70f4723e-c023-4bb4-ba63-968d7a87085c 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:09 np0005593233 nova_compute[222017]: 2026-01-23 09:53:09.144 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 8.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:09 np0005593233 nova_compute[222017]: 2026-01-23 09:53:09.146 222021 INFO nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:53:09 np0005593233 nova_compute[222017]: 2026-01-23 09:53:09.147 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:09 np0005593233 podman[253554]: 2026-01-23 09:53:09.201375668 +0000 UTC m=+0.401111403 container create 177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:53:09 np0005593233 systemd[1]: Started libpod-conmon-177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936.scope.
Jan 23 04:53:09 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:53:09 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bee68951d3899fc415ff2a9c3a4781c4add8bfd39b9a37f7edd38d4bdffe8d8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:53:09 np0005593233 podman[253554]: 2026-01-23 09:53:09.318900993 +0000 UTC m=+0.518636748 container init 177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:53:09 np0005593233 podman[253554]: 2026-01-23 09:53:09.325538259 +0000 UTC m=+0.525273994 container start 177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:53:09 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[253569]: [NOTICE]   (253573) : New worker (253575) forked
Jan 23 04:53:09 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[253569]: [NOTICE]   (253573) : Loading success.
Jan 23 04:53:10 np0005593233 nova_compute[222017]: 2026-01-23 09:53:10.071 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:10 np0005593233 nova_compute[222017]: 2026-01-23 09:53:10.104 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:10 np0005593233 nova_compute[222017]: 2026-01-23 09:53:10.178 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161975.1761746, d0bb0470-cc5c-4b6f-be0d-20839267c340 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:53:10 np0005593233 nova_compute[222017]: 2026-01-23 09:53:10.178 222021 INFO nova.compute.manager [-] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:53:10 np0005593233 nova_compute[222017]: 2026-01-23 09:53:10.215 222021 DEBUG nova.compute.manager [None req-ac8f4358-48c8-4032-96e9-10dafa6f2ec9 - - - - - -] [instance: d0bb0470-cc5c-4b6f-be0d-20839267c340] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:10.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:10.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.090 222021 DEBUG nova.compute.manager [req-4dfbdb15-a2ab-4c87-87bd-e8f2f9029012 req-bfd525e1-ea89-4f04-b913-501e5698ab75 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Received event network-vif-plugged-55780789-5847-4bbc-9da1-92ba78a99ce9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.090 222021 DEBUG oslo_concurrency.lockutils [req-4dfbdb15-a2ab-4c87-87bd-e8f2f9029012 req-bfd525e1-ea89-4f04-b913-501e5698ab75 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.091 222021 DEBUG oslo_concurrency.lockutils [req-4dfbdb15-a2ab-4c87-87bd-e8f2f9029012 req-bfd525e1-ea89-4f04-b913-501e5698ab75 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.092 222021 DEBUG oslo_concurrency.lockutils [req-4dfbdb15-a2ab-4c87-87bd-e8f2f9029012 req-bfd525e1-ea89-4f04-b913-501e5698ab75 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.092 222021 DEBUG nova.compute.manager [req-4dfbdb15-a2ab-4c87-87bd-e8f2f9029012 req-bfd525e1-ea89-4f04-b913-501e5698ab75 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] No waiting events found dispatching network-vif-plugged-55780789-5847-4bbc-9da1-92ba78a99ce9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.093 222021 WARNING nova.compute.manager [req-4dfbdb15-a2ab-4c87-87bd-e8f2f9029012 req-bfd525e1-ea89-4f04-b913-501e5698ab75 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Received unexpected event network-vif-plugged-55780789-5847-4bbc-9da1-92ba78a99ce9 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.437 222021 DEBUG oslo_concurrency.lockutils [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.437 222021 DEBUG oslo_concurrency.lockutils [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.438 222021 DEBUG nova.compute.manager [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.443 222021 DEBUG nova.compute.manager [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.443 222021 DEBUG nova.objects.instance [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'flavor' on Instance uuid ce72ab48-cfbc-4521-8957-02ad1b5712e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:53:11 np0005593233 nova_compute[222017]: 2026-01-23 09:53:11.502 222021 DEBUG nova.virt.libvirt.driver [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:53:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:12.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:12.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:12 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 04:53:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e255 e255: 3 total, 3 up, 3 in
Jan 23 04:53:12 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 04:53:13 np0005593233 podman[253635]: 2026-01-23 09:53:13.058369433 +0000 UTC m=+0.067126483 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:53:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 04:53:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 23 04:53:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 23 04:53:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 23 04:53:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:14.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:14.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:15 np0005593233 nova_compute[222017]: 2026-01-23 09:53:15.074 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:15 np0005593233 nova_compute[222017]: 2026-01-23 09:53:15.106 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:16.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:16.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:17 np0005593233 nova_compute[222017]: 2026-01-23 09:53:17.486 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:17Z|00275|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:53:18 np0005593233 nova_compute[222017]: 2026-01-23 09:53:18.015 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:18Z|00276|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:53:18 np0005593233 nova_compute[222017]: 2026-01-23 09:53:18.259 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:18.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:18.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:20 np0005593233 nova_compute[222017]: 2026-01-23 09:53:20.078 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:20 np0005593233 nova_compute[222017]: 2026-01-23 09:53:20.108 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:20 np0005593233 nova_compute[222017]: 2026-01-23 09:53:20.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:20.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:20.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 e256: 3 total, 3 up, 3 in
Jan 23 04:53:21 np0005593233 nova_compute[222017]: 2026-01-23 09:53:21.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:21 np0005593233 nova_compute[222017]: 2026-01-23 09:53:21.549 222021 DEBUG nova.virt.libvirt.driver [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:53:22 np0005593233 nova_compute[222017]: 2026-01-23 09:53:22.411 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:22 np0005593233 nova_compute[222017]: 2026-01-23 09:53:22.412 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:22 np0005593233 nova_compute[222017]: 2026-01-23 09:53:22.468 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:22 np0005593233 nova_compute[222017]: 2026-01-23 09:53:22.468 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:22 np0005593233 nova_compute[222017]: 2026-01-23 09:53:22.469 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:22 np0005593233 nova_compute[222017]: 2026-01-23 09:53:22.469 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:53:22 np0005593233 nova_compute[222017]: 2026-01-23 09:53:22.469 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:22.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:22.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:53:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/947347704' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:53:22 np0005593233 nova_compute[222017]: 2026-01-23 09:53:22.954 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:23 np0005593233 nova_compute[222017]: 2026-01-23 09:53:23.090 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000004e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:53:23 np0005593233 nova_compute[222017]: 2026-01-23 09:53:23.090 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000004e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:53:23 np0005593233 nova_compute[222017]: 2026-01-23 09:53:23.274 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:53:23 np0005593233 nova_compute[222017]: 2026-01-23 09:53:23.276 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4388MB free_disk=20.921844482421875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:53:23 np0005593233 nova_compute[222017]: 2026-01-23 09:53:23.277 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:23 np0005593233 nova_compute[222017]: 2026-01-23 09:53:23.278 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:23 np0005593233 nova_compute[222017]: 2026-01-23 09:53:23.868 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance ce72ab48-cfbc-4521-8957-02ad1b5712e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:53:23 np0005593233 nova_compute[222017]: 2026-01-23 09:53:23.868 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:53:23 np0005593233 nova_compute[222017]: 2026-01-23 09:53:23.869 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:53:24 np0005593233 nova_compute[222017]: 2026-01-23 09:53:24.078 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:53:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4117691310' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:53:24 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:24Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:6c:5e 10.100.0.3
Jan 23 04:53:24 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:24Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:6c:5e 10.100.0.3
Jan 23 04:53:24 np0005593233 nova_compute[222017]: 2026-01-23 09:53:24.557 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:24 np0005593233 nova_compute[222017]: 2026-01-23 09:53:24.565 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:53:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:24.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:24.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:25 np0005593233 nova_compute[222017]: 2026-01-23 09:53:25.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:25 np0005593233 nova_compute[222017]: 2026-01-23 09:53:25.084 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:53:25 np0005593233 nova_compute[222017]: 2026-01-23 09:53:25.110 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:25 np0005593233 nova_compute[222017]: 2026-01-23 09:53:25.408 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:53:25 np0005593233 nova_compute[222017]: 2026-01-23 09:53:25.408 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:26 np0005593233 nova_compute[222017]: 2026-01-23 09:53:26.382 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:26 np0005593233 nova_compute[222017]: 2026-01-23 09:53:26.383 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:53:26 np0005593233 nova_compute[222017]: 2026-01-23 09:53:26.411 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:53:26 np0005593233 nova_compute[222017]: 2026-01-23 09:53:26.412 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:26 np0005593233 nova_compute[222017]: 2026-01-23 09:53:26.412 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:26 np0005593233 nova_compute[222017]: 2026-01-23 09:53:26.412 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:26 np0005593233 nova_compute[222017]: 2026-01-23 09:53:26.413 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:53:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:26.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:26.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:28.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:28.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:30 np0005593233 nova_compute[222017]: 2026-01-23 09:53:30.084 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:30 np0005593233 nova_compute[222017]: 2026-01-23 09:53:30.112 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:30.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:30.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:31 np0005593233 podman[253701]: 2026-01-23 09:53:31.103683087 +0000 UTC m=+0.107595007 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:53:31 np0005593233 nova_compute[222017]: 2026-01-23 09:53:31.409 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:32 np0005593233 nova_compute[222017]: 2026-01-23 09:53:32.622 222021 DEBUG nova.virt.libvirt.driver [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:53:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:32.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:32.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:34.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:34.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:34 np0005593233 kernel: tap55780789-58 (unregistering): left promiscuous mode
Jan 23 04:53:34 np0005593233 NetworkManager[48871]: <info>  [1769162014.8867] device (tap55780789-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:53:34 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:34Z|00277|binding|INFO|Releasing lport 55780789-5847-4bbc-9da1-92ba78a99ce9 from this chassis (sb_readonly=0)
Jan 23 04:53:34 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:34Z|00278|binding|INFO|Setting lport 55780789-5847-4bbc-9da1-92ba78a99ce9 down in Southbound
Jan 23 04:53:34 np0005593233 ovn_controller[130653]: 2026-01-23T09:53:34Z|00279|binding|INFO|Removing iface tap55780789-58 ovn-installed in OVS
Jan 23 04:53:34 np0005593233 nova_compute[222017]: 2026-01-23 09:53:34.894 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:34 np0005593233 nova_compute[222017]: 2026-01-23 09:53:34.896 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:34 np0005593233 nova_compute[222017]: 2026-01-23 09:53:34.913 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:34.922 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:6c:5e 10.100.0.3'], port_security=['fa:16:3e:ce:6c:5e 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ce72ab48-cfbc-4521-8957-02ad1b5712e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=55780789-5847-4bbc-9da1-92ba78a99ce9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:53:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:34.923 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 55780789-5847-4bbc-9da1-92ba78a99ce9 in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis#033[00m
Jan 23 04:53:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:34.925 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:53:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:34.926 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5696fb7a-2066-4b8f-aeb3-fbaf8b651868]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:34.928 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace which is not needed anymore#033[00m
Jan 23 04:53:34 np0005593233 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Jan 23 04:53:34 np0005593233 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004e.scope: Consumed 15.360s CPU time.
Jan 23 04:53:34 np0005593233 systemd-machined[190954]: Machine qemu-39-instance-0000004e terminated.
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.086 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:35 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[253569]: [NOTICE]   (253573) : haproxy version is 2.8.14-c23fe91
Jan 23 04:53:35 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[253569]: [NOTICE]   (253573) : path to executable is /usr/sbin/haproxy
Jan 23 04:53:35 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[253569]: [WARNING]  (253573) : Exiting Master process...
Jan 23 04:53:35 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[253569]: [WARNING]  (253573) : Exiting Master process...
Jan 23 04:53:35 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[253569]: [ALERT]    (253573) : Current worker (253575) exited with code 143 (Terminated)
Jan 23 04:53:35 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[253569]: [WARNING]  (253573) : All workers exited. Exiting... (0)
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.113 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:35 np0005593233 systemd[1]: libpod-177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936.scope: Deactivated successfully.
Jan 23 04:53:35 np0005593233 podman[253751]: 2026-01-23 09:53:35.118611459 +0000 UTC m=+0.070545128 container died 177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.123 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.128 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:35 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936-userdata-shm.mount: Deactivated successfully.
Jan 23 04:53:35 np0005593233 systemd[1]: var-lib-containers-storage-overlay-9bee68951d3899fc415ff2a9c3a4781c4add8bfd39b9a37f7edd38d4bdffe8d8-merged.mount: Deactivated successfully.
Jan 23 04:53:35 np0005593233 podman[253751]: 2026-01-23 09:53:35.288647276 +0000 UTC m=+0.240580985 container cleanup 177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 04:53:35 np0005593233 systemd[1]: libpod-conmon-177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936.scope: Deactivated successfully.
Jan 23 04:53:35 np0005593233 podman[253791]: 2026-01-23 09:53:35.388111314 +0000 UTC m=+0.075368444 container remove 177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:53:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:35.397 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0a33ecaf-74cb-40aa-a237-1931cbe8a2e8]: (4, ('Fri Jan 23 09:53:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936)\n177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936\nFri Jan 23 09:53:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936)\n177946dba5ad1374b298b0e9d69b6203889dd7abe1188c4363bcd3c26e954936\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:35.399 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b317b53c-6bf5-433b-ae26-b0c5f5eeab95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:35.401 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.403 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:35 np0005593233 kernel: tapa3788149-e0: left promiscuous mode
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:35.426 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[acd27de9-40c2-4dad-b393-4b55b4aab5d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:35.448 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5698bf41-35ab-4d7b-87c3-b8aac5f69d9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:35.450 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[301ea029-2f1a-43a4-b551-39879780e0fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:35.470 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a4db33f9-c0fc-4c64-8691-2fb60ac27d09]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 590074, 'reachable_time': 29200, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253809, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:35 np0005593233 systemd[1]: run-netns-ovnmeta\x2da3788149\x2defcd\x2d4940\x2d8a8f\x2de21af0a56a06.mount: Deactivated successfully.
Jan 23 04:53:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:35.474 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:53:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:35.474 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[fe34aee2-9b10-498a-a1c4-affd71519109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.639 222021 INFO nova.virt.libvirt.driver [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Instance shutdown successfully after 24 seconds.#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.646 222021 INFO nova.virt.libvirt.driver [-] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Instance destroyed successfully.#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.647 222021 DEBUG nova.objects.instance [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'numa_topology' on Instance uuid ce72ab48-cfbc-4521-8957-02ad1b5712e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.694 222021 DEBUG nova.compute.manager [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.785 222021 DEBUG oslo_concurrency.lockutils [None req-0a97f6e9-0eec-4391-b245-16718d738c95 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 24.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.914 222021 DEBUG nova.compute.manager [req-8d85a84f-ba03-43f0-85ce-1f471b72ab4d req-a523e5e6-abe0-4b5a-9180-150ebdb0942e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Received event network-vif-unplugged-55780789-5847-4bbc-9da1-92ba78a99ce9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.914 222021 DEBUG oslo_concurrency.lockutils [req-8d85a84f-ba03-43f0-85ce-1f471b72ab4d req-a523e5e6-abe0-4b5a-9180-150ebdb0942e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.915 222021 DEBUG oslo_concurrency.lockutils [req-8d85a84f-ba03-43f0-85ce-1f471b72ab4d req-a523e5e6-abe0-4b5a-9180-150ebdb0942e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.915 222021 DEBUG oslo_concurrency.lockutils [req-8d85a84f-ba03-43f0-85ce-1f471b72ab4d req-a523e5e6-abe0-4b5a-9180-150ebdb0942e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.915 222021 DEBUG nova.compute.manager [req-8d85a84f-ba03-43f0-85ce-1f471b72ab4d req-a523e5e6-abe0-4b5a-9180-150ebdb0942e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] No waiting events found dispatching network-vif-unplugged-55780789-5847-4bbc-9da1-92ba78a99ce9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:53:35 np0005593233 nova_compute[222017]: 2026-01-23 09:53:35.915 222021 WARNING nova.compute.manager [req-8d85a84f-ba03-43f0-85ce-1f471b72ab4d req-a523e5e6-abe0-4b5a-9180-150ebdb0942e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Received unexpected event network-vif-unplugged-55780789-5847-4bbc-9da1-92ba78a99ce9 for instance with vm_state stopped and task_state None.#033[00m
Jan 23 04:53:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:36.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:36.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:37 np0005593233 nova_compute[222017]: 2026-01-23 09:53:37.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.173 222021 DEBUG nova.compute.manager [req-508ef762-f0b1-4738-b683-c7385808329a req-e7b9ee47-c068-4794-ab7c-9f14aaac3c45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Received event network-vif-plugged-55780789-5847-4bbc-9da1-92ba78a99ce9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.173 222021 DEBUG oslo_concurrency.lockutils [req-508ef762-f0b1-4738-b683-c7385808329a req-e7b9ee47-c068-4794-ab7c-9f14aaac3c45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.174 222021 DEBUG oslo_concurrency.lockutils [req-508ef762-f0b1-4738-b683-c7385808329a req-e7b9ee47-c068-4794-ab7c-9f14aaac3c45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.174 222021 DEBUG oslo_concurrency.lockutils [req-508ef762-f0b1-4738-b683-c7385808329a req-e7b9ee47-c068-4794-ab7c-9f14aaac3c45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.174 222021 DEBUG nova.compute.manager [req-508ef762-f0b1-4738-b683-c7385808329a req-e7b9ee47-c068-4794-ab7c-9f14aaac3c45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] No waiting events found dispatching network-vif-plugged-55780789-5847-4bbc-9da1-92ba78a99ce9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.174 222021 WARNING nova.compute.manager [req-508ef762-f0b1-4738-b683-c7385808329a req-e7b9ee47-c068-4794-ab7c-9f14aaac3c45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Received unexpected event network-vif-plugged-55780789-5847-4bbc-9da1-92ba78a99ce9 for instance with vm_state stopped and task_state None.#033[00m
Jan 23 04:53:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:38.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:38.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.969 222021 DEBUG oslo_concurrency.lockutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.970 222021 DEBUG oslo_concurrency.lockutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.970 222021 DEBUG oslo_concurrency.lockutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.970 222021 DEBUG oslo_concurrency.lockutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.971 222021 DEBUG oslo_concurrency.lockutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.972 222021 INFO nova.compute.manager [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Terminating instance#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.973 222021 DEBUG nova.compute.manager [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.980 222021 INFO nova.virt.libvirt.driver [-] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Instance destroyed successfully.#033[00m
Jan 23 04:53:38 np0005593233 nova_compute[222017]: 2026-01-23 09:53:38.980 222021 DEBUG nova.objects.instance [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'resources' on Instance uuid ce72ab48-cfbc-4521-8957-02ad1b5712e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:53:39 np0005593233 nova_compute[222017]: 2026-01-23 09:53:39.006 222021 DEBUG nova.virt.libvirt.vif [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:52:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-364699491',display_name='tempest-DeleteServersTestJSON-server-364699491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-364699491',id=78,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:53:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-p4nso8qp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:53:35Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=ce72ab48-cfbc-4521-8957-02ad1b5712e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:53:39 np0005593233 nova_compute[222017]: 2026-01-23 09:53:39.007 222021 DEBUG nova.network.os_vif_util [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "55780789-5847-4bbc-9da1-92ba78a99ce9", "address": "fa:16:3e:ce:6c:5e", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55780789-58", "ovs_interfaceid": "55780789-5847-4bbc-9da1-92ba78a99ce9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:53:39 np0005593233 nova_compute[222017]: 2026-01-23 09:53:39.008 222021 DEBUG nova.network.os_vif_util [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:6c:5e,bridge_name='br-int',has_traffic_filtering=True,id=55780789-5847-4bbc-9da1-92ba78a99ce9,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55780789-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:53:39 np0005593233 nova_compute[222017]: 2026-01-23 09:53:39.008 222021 DEBUG os_vif [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:6c:5e,bridge_name='br-int',has_traffic_filtering=True,id=55780789-5847-4bbc-9da1-92ba78a99ce9,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55780789-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:53:39 np0005593233 nova_compute[222017]: 2026-01-23 09:53:39.010 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:39 np0005593233 nova_compute[222017]: 2026-01-23 09:53:39.010 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55780789-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:39 np0005593233 nova_compute[222017]: 2026-01-23 09:53:39.013 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:53:39 np0005593233 nova_compute[222017]: 2026-01-23 09:53:39.016 222021 INFO os_vif [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:6c:5e,bridge_name='br-int',has_traffic_filtering=True,id=55780789-5847-4bbc-9da1-92ba78a99ce9,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55780789-58')#033[00m
Jan 23 04:53:40 np0005593233 nova_compute[222017]: 2026-01-23 09:53:40.088 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:40.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:40.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:40 np0005593233 nova_compute[222017]: 2026-01-23 09:53:40.786 222021 INFO nova.virt.libvirt.driver [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Deleting instance files /var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6_del#033[00m
Jan 23 04:53:40 np0005593233 nova_compute[222017]: 2026-01-23 09:53:40.787 222021 INFO nova.virt.libvirt.driver [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Deletion of /var/lib/nova/instances/ce72ab48-cfbc-4521-8957-02ad1b5712e6_del complete#033[00m
Jan 23 04:53:41 np0005593233 nova_compute[222017]: 2026-01-23 09:53:41.006 222021 INFO nova.compute.manager [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Took 2.03 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:53:41 np0005593233 nova_compute[222017]: 2026-01-23 09:53:41.007 222021 DEBUG oslo.service.loopingcall [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:53:41 np0005593233 nova_compute[222017]: 2026-01-23 09:53:41.007 222021 DEBUG nova.compute.manager [-] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:53:41 np0005593233 nova_compute[222017]: 2026-01-23 09:53:41.008 222021 DEBUG nova.network.neutron [-] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:53:42 np0005593233 nova_compute[222017]: 2026-01-23 09:53:42.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:42 np0005593233 nova_compute[222017]: 2026-01-23 09:53:42.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:53:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:42.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:42.656 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:42.656 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:42.656 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:42.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:43 np0005593233 nova_compute[222017]: 2026-01-23 09:53:43.552 222021 DEBUG nova.network.neutron [-] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:53:43 np0005593233 nova_compute[222017]: 2026-01-23 09:53:43.593 222021 INFO nova.compute.manager [-] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Took 2.59 seconds to deallocate network for instance.#033[00m
Jan 23 04:53:43 np0005593233 nova_compute[222017]: 2026-01-23 09:53:43.689 222021 DEBUG oslo_concurrency.lockutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:43 np0005593233 nova_compute[222017]: 2026-01-23 09:53:43.690 222021 DEBUG oslo_concurrency.lockutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:43 np0005593233 nova_compute[222017]: 2026-01-23 09:53:43.808 222021 DEBUG nova.compute.manager [req-56fd9035-623e-49c3-ae72-c50dd9c28d95 req-1c0fb918-e258-415c-85d6-b9d116e5d8aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Received event network-vif-deleted-55780789-5847-4bbc-9da1-92ba78a99ce9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:43 np0005593233 nova_compute[222017]: 2026-01-23 09:53:43.991 222021 DEBUG oslo_concurrency.processutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:44 np0005593233 podman[253829]: 2026-01-23 09:53:44.051833925 +0000 UTC m=+0.059778346 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 23 04:53:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:53:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3250209934' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.429 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.429 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.439 222021 DEBUG oslo_concurrency.processutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.445 222021 DEBUG nova.compute.provider_tree [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.463 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.475 222021 DEBUG nova.scheduler.client.report [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.517 222021 DEBUG oslo_concurrency.lockutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:53:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2023274575' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.604 222021 INFO nova.scheduler.client.report [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Deleted allocations for instance ce72ab48-cfbc-4521-8957-02ad1b5712e6#033[00m
Jan 23 04:53:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:44.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:44.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:53:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2023274575' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:53:44 np0005593233 nova_compute[222017]: 2026-01-23 09:53:44.770 222021 DEBUG oslo_concurrency.lockutils [None req-a1644aee-fe3f-4bae-9553-7d8667a031dd 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "ce72ab48-cfbc-4521-8957-02ad1b5712e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:45 np0005593233 nova_compute[222017]: 2026-01-23 09:53:45.092 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:46.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:46.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:48.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:48.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:49 np0005593233 nova_compute[222017]: 2026-01-23 09:53:49.029 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:50 np0005593233 nova_compute[222017]: 2026-01-23 09:53:50.094 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:50 np0005593233 nova_compute[222017]: 2026-01-23 09:53:50.137 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162015.1360373, ce72ab48-cfbc-4521-8957-02ad1b5712e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:53:50 np0005593233 nova_compute[222017]: 2026-01-23 09:53:50.138 222021 INFO nova.compute.manager [-] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:53:50 np0005593233 nova_compute[222017]: 2026-01-23 09:53:50.195 222021 DEBUG nova.compute.manager [None req-32908654-511b-4b5b-8a73-6f605b9ecbbb - - - - - -] [instance: ce72ab48-cfbc-4521-8957-02ad1b5712e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:50.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:50.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:52.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:52.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:53.213 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:53:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:53.214 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:53:53 np0005593233 nova_compute[222017]: 2026-01-23 09:53:53.214 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:54 np0005593233 nova_compute[222017]: 2026-01-23 09:53:54.031 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:54.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:54.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:55 np0005593233 nova_compute[222017]: 2026-01-23 09:53:55.096 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:56.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:53:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:56.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:53:57 np0005593233 nova_compute[222017]: 2026-01-23 09:53:57.290 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:57 np0005593233 nova_compute[222017]: 2026-01-23 09:53:57.291 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:57 np0005593233 nova_compute[222017]: 2026-01-23 09:53:57.351 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:53:57 np0005593233 nova_compute[222017]: 2026-01-23 09:53:57.483 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:57 np0005593233 nova_compute[222017]: 2026-01-23 09:53:57.484 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:57 np0005593233 nova_compute[222017]: 2026-01-23 09:53:57.505 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:53:57 np0005593233 nova_compute[222017]: 2026-01-23 09:53:57.505 222021 INFO nova.compute.claims [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:53:57 np0005593233 nova_compute[222017]: 2026-01-23 09:53:57.783 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:53:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1685794325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.248 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.256 222021 DEBUG nova.compute.provider_tree [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.316 222021 DEBUG nova.scheduler.client.report [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.370 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.887s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.371 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.474 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.475 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.510 222021 INFO nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.615 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:53:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:58.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:58 np0005593233 nova_compute[222017]: 2026-01-23 09:53:58.687 222021 INFO nova.virt.block_device [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Booting with volume 98b96f14-bdc2-414c-aaaf-7d1dfaafbfd2 at /dev/vda#033[00m
Jan 23 04:53:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:53:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:58.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.035 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:53:59.217 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.395 222021 DEBUG os_brick.utils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.398 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.417 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.417 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e6b453-148a-4b64-a646-f5f54ab81958]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.418 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.430 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.430 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[a19573e9-4155-4eaa-94a1-29bd750dac82]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.431 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.445 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.446 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[9174fc00-1c64-4468-9d2d-ceccb4751ad4]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.448 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[5c68ea19-f5d0-4ed1-9819-04362d514aa3]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.451 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.481 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CMD "nvme version" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.485 222021 DEBUG os_brick.initiator.connectors.lightos [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.485 222021 DEBUG os_brick.initiator.connectors.lightos [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.485 222021 DEBUG os_brick.initiator.connectors.lightos [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.486 222021 DEBUG os_brick.utils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] <== get_connector_properties: return (90ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:53:59 np0005593233 nova_compute[222017]: 2026-01-23 09:53:59.486 222021 DEBUG nova.virt.block_device [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating existing volume attachment record: c1e3e9d5-4d6c-4f1c-8c02-ec0d787ca865 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:54:00 np0005593233 nova_compute[222017]: 2026-01-23 09:54:00.098 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:00 np0005593233 nova_compute[222017]: 2026-01-23 09:54:00.105 222021 DEBUG nova.policy [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd512838ce2b44554b0566fdbb3c702b4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:54:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:00.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:00.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:01 np0005593233 nova_compute[222017]: 2026-01-23 09:54:01.925 222021 INFO nova.virt.block_device [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Booting with volume a5f913c6-cfb1-486f-bcd3-8b5237feb9b1 at /dev/vdb#033[00m
Jan 23 04:54:02 np0005593233 podman[253900]: 2026-01-23 09:54:02.095662464 +0000 UTC m=+0.107058232 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.409 222021 DEBUG os_brick.utils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.410 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.419 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.419 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[8678c63f-b6ed-4f7b-9f60-516fe2d4be94]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.421 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.427 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.427 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[52e54801-88d2-4e44-85c1-105b85ff578e]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.429 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.437 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.438 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[771ff938-8c19-41b7-838b-5f59d18b9a09]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.439 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c926fca6-4521-4d78-a64d-3d89fbf99678]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.440 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.482 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CMD "nvme version" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.489 222021 DEBUG os_brick.initiator.connectors.lightos [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.490 222021 DEBUG os_brick.initiator.connectors.lightos [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.490 222021 DEBUG os_brick.initiator.connectors.lightos [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.492 222021 DEBUG os_brick.utils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] <== get_connector_properties: return (82ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.492 222021 DEBUG nova.virt.block_device [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating existing volume attachment record: e5a6dd94-0880-4555-a334-033e3156a797 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:54:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:02.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:02.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.841 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully created port: 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:54:02 np0005593233 nova_compute[222017]: 2026-01-23 09:54:02.859 222021 DEBUG nova.compute.manager [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.002 222021 DEBUG oslo_concurrency.lockutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.003 222021 DEBUG oslo_concurrency.lockutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.067 222021 DEBUG nova.objects.instance [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_requests' on Instance uuid fffef24b-bb5b-41c6-a049-c1c4ba8f02fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.099 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.100 222021 INFO nova.compute.claims [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.100 222021 DEBUG nova.objects.instance [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'resources' on Instance uuid fffef24b-bb5b-41c6-a049-c1c4ba8f02fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.122 222021 DEBUG nova.objects.instance [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_devices' on Instance uuid fffef24b-bb5b-41c6-a049-c1c4ba8f02fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.226 222021 INFO nova.compute.resource_tracker [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Updating resource usage from migration 9a6f94e7-d356-40f3-896e-69daee93b7ad#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.227 222021 DEBUG nova.compute.resource_tracker [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Starting to track incoming migration 9a6f94e7-d356-40f3-896e-69daee93b7ad with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 04:54:03 np0005593233 nova_compute[222017]: 2026-01-23 09:54:03.627 222021 DEBUG oslo_concurrency.processutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.039 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3772394774' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.102 222021 DEBUG oslo_concurrency.processutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.110 222021 DEBUG nova.compute.provider_tree [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.146 222021 DEBUG nova.scheduler.client.report [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.184 222021 DEBUG oslo_concurrency.lockutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.185 222021 INFO nova.compute.manager [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Migrating#033[00m
Jan 23 04:54:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:04.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.680 222021 INFO nova.virt.block_device [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Booting with volume 0c57b335-8fff-4dae-ab58-3a6c060df2a2 at /dev/vdc#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.768 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully created port: 8f671859-24dc-4140-915c-bbad6f16e0d8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:54:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:04.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.931 222021 DEBUG os_brick.utils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.933 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.952 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.952 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[661dd6c1-b16d-42d4-b177-b1fb42d5c9bb]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.954 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.964 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.964 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[9deef973-60fd-4b51-a9f8-35737655c169]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.966 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.976 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.977 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[ea307efc-d274-410a-92a6-7b43403854b3]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.979 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[541456b7-20bb-4450-b130-0f25217db2ca]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:04 np0005593233 nova_compute[222017]: 2026-01-23 09:54:04.979 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:05 np0005593233 nova_compute[222017]: 2026-01-23 09:54:05.008 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:05 np0005593233 nova_compute[222017]: 2026-01-23 09:54:05.012 222021 DEBUG os_brick.initiator.connectors.lightos [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:54:05 np0005593233 nova_compute[222017]: 2026-01-23 09:54:05.013 222021 DEBUG os_brick.initiator.connectors.lightos [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:54:05 np0005593233 nova_compute[222017]: 2026-01-23 09:54:05.014 222021 DEBUG os_brick.initiator.connectors.lightos [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:54:05 np0005593233 nova_compute[222017]: 2026-01-23 09:54:05.014 222021 DEBUG os_brick.utils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] <== get_connector_properties: return (82ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:54:05 np0005593233 nova_compute[222017]: 2026-01-23 09:54:05.015 222021 DEBUG nova.virt.block_device [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating existing volume attachment record: dc578d3b-e60d-48b6-bc26-fe33ec917aaf _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:54:05 np0005593233 nova_compute[222017]: 2026-01-23 09:54:05.100 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:06 np0005593233 nova_compute[222017]: 2026-01-23 09:54:06.379 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully created port: 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:54:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:06.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:06.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:07 np0005593233 nova_compute[222017]: 2026-01-23 09:54:07.685 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:54:07 np0005593233 nova_compute[222017]: 2026-01-23 09:54:07.688 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:54:07 np0005593233 nova_compute[222017]: 2026-01-23 09:54:07.688 222021 INFO nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Creating image(s)#033[00m
Jan 23 04:54:07 np0005593233 nova_compute[222017]: 2026-01-23 09:54:07.690 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 04:54:07 np0005593233 nova_compute[222017]: 2026-01-23 09:54:07.691 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Ensure instance console log exists: /var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:54:07 np0005593233 nova_compute[222017]: 2026-01-23 09:54:07.692 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:07 np0005593233 nova_compute[222017]: 2026-01-23 09:54:07.692 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:07 np0005593233 nova_compute[222017]: 2026-01-23 09:54:07.692 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:08.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:08.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:09 np0005593233 nova_compute[222017]: 2026-01-23 09:54:09.042 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:09 np0005593233 nova_compute[222017]: 2026-01-23 09:54:09.480 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully created port: 1e83b219-c51b-488a-8fc7-8240efb384c0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:54:10 np0005593233 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 04:54:10 np0005593233 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 04:54:10 np0005593233 systemd-logind[804]: New session 51 of user nova.
Jan 23 04:54:10 np0005593233 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 04:54:10 np0005593233 systemd[1]: Starting User Manager for UID 42436...
Jan 23 04:54:10 np0005593233 nova_compute[222017]: 2026-01-23 09:54:10.102 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:10 np0005593233 systemd[253969]: Queued start job for default target Main User Target.
Jan 23 04:54:10 np0005593233 systemd[253969]: Created slice User Application Slice.
Jan 23 04:54:10 np0005593233 systemd[253969]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:54:10 np0005593233 systemd[253969]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:54:10 np0005593233 systemd[253969]: Reached target Paths.
Jan 23 04:54:10 np0005593233 systemd[253969]: Reached target Timers.
Jan 23 04:54:10 np0005593233 systemd[253969]: Starting D-Bus User Message Bus Socket...
Jan 23 04:54:10 np0005593233 systemd[253969]: Starting Create User's Volatile Files and Directories...
Jan 23 04:54:10 np0005593233 systemd[253969]: Finished Create User's Volatile Files and Directories.
Jan 23 04:54:10 np0005593233 systemd[253969]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:54:10 np0005593233 systemd[253969]: Reached target Sockets.
Jan 23 04:54:10 np0005593233 systemd[253969]: Reached target Basic System.
Jan 23 04:54:10 np0005593233 systemd[253969]: Reached target Main User Target.
Jan 23 04:54:10 np0005593233 systemd[253969]: Startup finished in 149ms.
Jan 23 04:54:10 np0005593233 systemd[1]: Started User Manager for UID 42436.
Jan 23 04:54:10 np0005593233 systemd[1]: Started Session 51 of User nova.
Jan 23 04:54:10 np0005593233 systemd-logind[804]: Session 51 logged out. Waiting for processes to exit.
Jan 23 04:54:10 np0005593233 systemd[1]: session-51.scope: Deactivated successfully.
Jan 23 04:54:10 np0005593233 systemd-logind[804]: Removed session 51.
Jan 23 04:54:10 np0005593233 systemd-logind[804]: New session 53 of user nova.
Jan 23 04:54:10 np0005593233 systemd[1]: Started Session 53 of User nova.
Jan 23 04:54:10 np0005593233 systemd[1]: session-53.scope: Deactivated successfully.
Jan 23 04:54:10 np0005593233 systemd-logind[804]: Session 53 logged out. Waiting for processes to exit.
Jan 23 04:54:10 np0005593233 systemd-logind[804]: Removed session 53.
Jan 23 04:54:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:10.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:10.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:12 np0005593233 nova_compute[222017]: 2026-01-23 09:54:12.434 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully created port: 2358fe4c-654b-4b88-9e08-2e85688cb00e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:54:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:12.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:12.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:54:14 np0005593233 nova_compute[222017]: 2026-01-23 09:54:14.046 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:54:14 np0005593233 nova_compute[222017]: 2026-01-23 09:54:14.415 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully updated port: 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:54:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:14.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:14 np0005593233 nova_compute[222017]: 2026-01-23 09:54:14.757 222021 DEBUG nova.compute.manager [req-4bfbdebd-615b-4fd4-9629-8c810e25dde5 req-caac0aef-894e-4dfc-8d25-c500276c1269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-changed-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:14 np0005593233 nova_compute[222017]: 2026-01-23 09:54:14.757 222021 DEBUG nova.compute.manager [req-4bfbdebd-615b-4fd4-9629-8c810e25dde5 req-caac0aef-894e-4dfc-8d25-c500276c1269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing instance network info cache due to event network-changed-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:14 np0005593233 nova_compute[222017]: 2026-01-23 09:54:14.757 222021 DEBUG oslo_concurrency.lockutils [req-4bfbdebd-615b-4fd4-9629-8c810e25dde5 req-caac0aef-894e-4dfc-8d25-c500276c1269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:14 np0005593233 nova_compute[222017]: 2026-01-23 09:54:14.758 222021 DEBUG oslo_concurrency.lockutils [req-4bfbdebd-615b-4fd4-9629-8c810e25dde5 req-caac0aef-894e-4dfc-8d25-c500276c1269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:14 np0005593233 nova_compute[222017]: 2026-01-23 09:54:14.758 222021 DEBUG nova.network.neutron [req-4bfbdebd-615b-4fd4-9629-8c810e25dde5 req-caac0aef-894e-4dfc-8d25-c500276c1269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing network info cache for port 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:54:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:14.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:15 np0005593233 podman[254123]: 2026-01-23 09:54:15.04594145 +0000 UTC m=+0.060664501 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 04:54:15 np0005593233 nova_compute[222017]: 2026-01-23 09:54:15.048 222021 INFO nova.network.neutron [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Updating port 10fe80c9-2f99-4371-a60e-b8b226c250aa with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 04:54:15 np0005593233 nova_compute[222017]: 2026-01-23 09:54:15.105 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:15 np0005593233 nova_compute[222017]: 2026-01-23 09:54:15.116 222021 DEBUG nova.network.neutron [req-4bfbdebd-615b-4fd4-9629-8c810e25dde5 req-caac0aef-894e-4dfc-8d25-c500276c1269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:54:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:16 np0005593233 nova_compute[222017]: 2026-01-23 09:54:16.393 222021 DEBUG nova.network.neutron [req-4bfbdebd-615b-4fd4-9629-8c810e25dde5 req-caac0aef-894e-4dfc-8d25-c500276c1269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:16 np0005593233 nova_compute[222017]: 2026-01-23 09:54:16.437 222021 DEBUG oslo_concurrency.lockutils [req-4bfbdebd-615b-4fd4-9629-8c810e25dde5 req-caac0aef-894e-4dfc-8d25-c500276c1269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:16.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:16.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:17 np0005593233 nova_compute[222017]: 2026-01-23 09:54:17.420 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:17 np0005593233 nova_compute[222017]: 2026-01-23 09:54:17.478 222021 DEBUG nova.compute.manager [req-6aad24c0-e8bc-4a76-a464-b847fa3592f8 req-682054d2-1375-44a3-9df9-32a1b3a1c035 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received event network-vif-unplugged-10fe80c9-2f99-4371-a60e-b8b226c250aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:17 np0005593233 nova_compute[222017]: 2026-01-23 09:54:17.479 222021 DEBUG oslo_concurrency.lockutils [req-6aad24c0-e8bc-4a76-a464-b847fa3592f8 req-682054d2-1375-44a3-9df9-32a1b3a1c035 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:17 np0005593233 nova_compute[222017]: 2026-01-23 09:54:17.479 222021 DEBUG oslo_concurrency.lockutils [req-6aad24c0-e8bc-4a76-a464-b847fa3592f8 req-682054d2-1375-44a3-9df9-32a1b3a1c035 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:17 np0005593233 nova_compute[222017]: 2026-01-23 09:54:17.480 222021 DEBUG oslo_concurrency.lockutils [req-6aad24c0-e8bc-4a76-a464-b847fa3592f8 req-682054d2-1375-44a3-9df9-32a1b3a1c035 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:17 np0005593233 nova_compute[222017]: 2026-01-23 09:54:17.480 222021 DEBUG nova.compute.manager [req-6aad24c0-e8bc-4a76-a464-b847fa3592f8 req-682054d2-1375-44a3-9df9-32a1b3a1c035 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] No waiting events found dispatching network-vif-unplugged-10fe80c9-2f99-4371-a60e-b8b226c250aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:17 np0005593233 nova_compute[222017]: 2026-01-23 09:54:17.480 222021 WARNING nova.compute.manager [req-6aad24c0-e8bc-4a76-a464-b847fa3592f8 req-682054d2-1375-44a3-9df9-32a1b3a1c035 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received unexpected event network-vif-unplugged-10fe80c9-2f99-4371-a60e-b8b226c250aa for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 04:54:18 np0005593233 nova_compute[222017]: 2026-01-23 09:54:18.010 222021 DEBUG oslo_concurrency.lockutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "refresh_cache-fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:18 np0005593233 nova_compute[222017]: 2026-01-23 09:54:18.011 222021 DEBUG oslo_concurrency.lockutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquired lock "refresh_cache-fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:18 np0005593233 nova_compute[222017]: 2026-01-23 09:54:18.012 222021 DEBUG nova.network.neutron [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:54:18 np0005593233 nova_compute[222017]: 2026-01-23 09:54:18.218 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully updated port: 08ec741b-b592-417f-9a64-1ee4d2e4e006 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:54:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:18.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:18.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:19 np0005593233 nova_compute[222017]: 2026-01-23 09:54:19.049 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:19 np0005593233 nova_compute[222017]: 2026-01-23 09:54:19.680 222021 DEBUG nova.compute.manager [req-b16a9e95-5e4e-40e0-8209-893eae362738 req-5765c90d-f68a-43d7-bb05-588bea02a63e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received event network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:19 np0005593233 nova_compute[222017]: 2026-01-23 09:54:19.680 222021 DEBUG oslo_concurrency.lockutils [req-b16a9e95-5e4e-40e0-8209-893eae362738 req-5765c90d-f68a-43d7-bb05-588bea02a63e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:19 np0005593233 nova_compute[222017]: 2026-01-23 09:54:19.681 222021 DEBUG oslo_concurrency.lockutils [req-b16a9e95-5e4e-40e0-8209-893eae362738 req-5765c90d-f68a-43d7-bb05-588bea02a63e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:19 np0005593233 nova_compute[222017]: 2026-01-23 09:54:19.681 222021 DEBUG oslo_concurrency.lockutils [req-b16a9e95-5e4e-40e0-8209-893eae362738 req-5765c90d-f68a-43d7-bb05-588bea02a63e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:19 np0005593233 nova_compute[222017]: 2026-01-23 09:54:19.681 222021 DEBUG nova.compute.manager [req-b16a9e95-5e4e-40e0-8209-893eae362738 req-5765c90d-f68a-43d7-bb05-588bea02a63e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] No waiting events found dispatching network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:19 np0005593233 nova_compute[222017]: 2026-01-23 09:54:19.681 222021 WARNING nova.compute.manager [req-b16a9e95-5e4e-40e0-8209-893eae362738 req-5765c90d-f68a-43d7-bb05-588bea02a63e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received unexpected event network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 04:54:20 np0005593233 nova_compute[222017]: 2026-01-23 09:54:20.105 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:20 np0005593233 nova_compute[222017]: 2026-01-23 09:54:20.341 222021 DEBUG nova.compute.manager [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received event network-changed-10fe80c9-2f99-4371-a60e-b8b226c250aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:20 np0005593233 nova_compute[222017]: 2026-01-23 09:54:20.341 222021 DEBUG nova.compute.manager [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Refreshing instance network info cache due to event network-changed-10fe80c9-2f99-4371-a60e-b8b226c250aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:20 np0005593233 nova_compute[222017]: 2026-01-23 09:54:20.341 222021 DEBUG oslo_concurrency.lockutils [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:20.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:20 np0005593233 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 04:54:20 np0005593233 systemd[253969]: Activating special unit Exit the Session...
Jan 23 04:54:20 np0005593233 systemd[253969]: Stopped target Main User Target.
Jan 23 04:54:20 np0005593233 systemd[253969]: Stopped target Basic System.
Jan 23 04:54:20 np0005593233 systemd[253969]: Stopped target Paths.
Jan 23 04:54:20 np0005593233 systemd[253969]: Stopped target Sockets.
Jan 23 04:54:20 np0005593233 systemd[253969]: Stopped target Timers.
Jan 23 04:54:20 np0005593233 systemd[253969]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:54:20 np0005593233 systemd[253969]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:54:20 np0005593233 systemd[253969]: Closed D-Bus User Message Bus Socket.
Jan 23 04:54:20 np0005593233 systemd[253969]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:54:20 np0005593233 systemd[253969]: Removed slice User Application Slice.
Jan 23 04:54:20 np0005593233 systemd[253969]: Reached target Shutdown.
Jan 23 04:54:20 np0005593233 systemd[253969]: Finished Exit the Session.
Jan 23 04:54:20 np0005593233 systemd[253969]: Reached target Exit the Session.
Jan 23 04:54:20 np0005593233 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 04:54:20 np0005593233 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 04:54:20 np0005593233 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 04:54:20 np0005593233 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 04:54:20 np0005593233 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 04:54:20 np0005593233 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 04:54:20 np0005593233 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 04:54:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:20.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:22 np0005593233 nova_compute[222017]: 2026-01-23 09:54:22.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:22 np0005593233 nova_compute[222017]: 2026-01-23 09:54:22.594 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully updated port: c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:54:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:22.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:22.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:22 np0005593233 nova_compute[222017]: 2026-01-23 09:54:22.862 222021 DEBUG nova.compute.manager [req-53185e7d-a97c-4fcd-8101-b74cd9169067 req-274a5803-8288-4c80-a6bb-ca2610a0d4e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-changed-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:22 np0005593233 nova_compute[222017]: 2026-01-23 09:54:22.863 222021 DEBUG nova.compute.manager [req-53185e7d-a97c-4fcd-8101-b74cd9169067 req-274a5803-8288-4c80-a6bb-ca2610a0d4e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing instance network info cache due to event network-changed-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:22 np0005593233 nova_compute[222017]: 2026-01-23 09:54:22.864 222021 DEBUG oslo_concurrency.lockutils [req-53185e7d-a97c-4fcd-8101-b74cd9169067 req-274a5803-8288-4c80-a6bb-ca2610a0d4e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:22 np0005593233 nova_compute[222017]: 2026-01-23 09:54:22.864 222021 DEBUG oslo_concurrency.lockutils [req-53185e7d-a97c-4fcd-8101-b74cd9169067 req-274a5803-8288-4c80-a6bb-ca2610a0d4e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:22 np0005593233 nova_compute[222017]: 2026-01-23 09:54:22.865 222021 DEBUG nova.network.neutron [req-53185e7d-a97c-4fcd-8101-b74cd9169067 req-274a5803-8288-4c80-a6bb-ca2610a0d4e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing network info cache for port c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:54:23 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:23 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:23 np0005593233 nova_compute[222017]: 2026-01-23 09:54:23.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:24 np0005593233 nova_compute[222017]: 2026-01-23 09:54:24.047 222021 DEBUG nova.network.neutron [req-53185e7d-a97c-4fcd-8101-b74cd9169067 req-274a5803-8288-4c80-a6bb-ca2610a0d4e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:54:24 np0005593233 nova_compute[222017]: 2026-01-23 09:54:24.051 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:24 np0005593233 nova_compute[222017]: 2026-01-23 09:54:24.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:24 np0005593233 nova_compute[222017]: 2026-01-23 09:54:24.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:24.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:24.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:25 np0005593233 nova_compute[222017]: 2026-01-23 09:54:25.108 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:25 np0005593233 nova_compute[222017]: 2026-01-23 09:54:25.674 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:25 np0005593233 nova_compute[222017]: 2026-01-23 09:54:25.674 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:25 np0005593233 nova_compute[222017]: 2026-01-23 09:54:25.675 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:25 np0005593233 nova_compute[222017]: 2026-01-23 09:54:25.675 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:54:25 np0005593233 nova_compute[222017]: 2026-01-23 09:54:25.675 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2187426068' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.201 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.365 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.366 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4630MB free_disk=20.922027587890625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.366 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.367 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.472 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Applying migration context for instance fffef24b-bb5b-41c6-a049-c1c4ba8f02fb as it has an incoming, in-progress migration 9a6f94e7-d356-40f3-896e-69daee93b7ad. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.474 222021 INFO nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Updating resource usage from migration 9a6f94e7-d356-40f3-896e-69daee93b7ad#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.571 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance c4100b68-be14-4cd7-8243-2c9a793caa5f actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.571 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance fffef24b-bb5b-41c6-a049-c1c4ba8f02fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.572 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.572 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.633 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.669 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.669 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:54:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:26.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.724 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.798 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:54:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:26.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:26 np0005593233 nova_compute[222017]: 2026-01-23 09:54:26.950 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2145352304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:27 np0005593233 nova_compute[222017]: 2026-01-23 09:54:27.426 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:27 np0005593233 nova_compute[222017]: 2026-01-23 09:54:27.434 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:54:27 np0005593233 nova_compute[222017]: 2026-01-23 09:54:27.670 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:54:27 np0005593233 nova_compute[222017]: 2026-01-23 09:54:27.716 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:54:27 np0005593233 nova_compute[222017]: 2026-01-23 09:54:27.717 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:27 np0005593233 nova_compute[222017]: 2026-01-23 09:54:27.775 222021 DEBUG nova.network.neutron [req-53185e7d-a97c-4fcd-8101-b74cd9169067 req-274a5803-8288-4c80-a6bb-ca2610a0d4e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:27 np0005593233 nova_compute[222017]: 2026-01-23 09:54:27.818 222021 DEBUG oslo_concurrency.lockutils [req-53185e7d-a97c-4fcd-8101-b74cd9169067 req-274a5803-8288-4c80-a6bb-ca2610a0d4e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:27 np0005593233 nova_compute[222017]: 2026-01-23 09:54:27.987 222021 DEBUG nova.network.neutron [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Updating instance_info_cache with network_info: [{"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.014 222021 DEBUG oslo_concurrency.lockutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Releasing lock "refresh_cache-fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.018 222021 DEBUG oslo_concurrency.lockutils [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.018 222021 DEBUG nova.network.neutron [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Refreshing network info cache for port 10fe80c9-2f99-4371-a60e-b8b226c250aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.183 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.186 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.187 222021 INFO nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Creating image(s)#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.248 222021 DEBUG nova.storage.rbd_utils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] creating snapshot(nova-resize) on rbd image(fffef24b-bb5b-41c6-a049-c1c4ba8f02fb_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:54:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:28.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.717 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.718 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.718 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.742 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:54:28 np0005593233 nova_compute[222017]: 2026-01-23 09:54:28.743 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:28.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e257 e257: 3 total, 3 up, 3 in
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.054 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.287 222021 DEBUG nova.objects.instance [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fffef24b-bb5b-41c6-a049-c1c4ba8f02fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.429 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.430 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Ensure instance console log exists: /var/lib/nova/instances/fffef24b-bb5b-41c6-a049-c1c4ba8f02fb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.431 222021 DEBUG oslo_concurrency.lockutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.431 222021 DEBUG oslo_concurrency.lockutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.431 222021 DEBUG oslo_concurrency.lockutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.434 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Start _get_guest_xml network_info=[{"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:45:b2:d4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.440 222021 WARNING nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.445 222021 DEBUG nova.virt.libvirt.host [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.446 222021 DEBUG nova.virt.libvirt.host [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.449 222021 DEBUG nova.virt.libvirt.host [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.450 222021 DEBUG nova.virt.libvirt.host [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.451 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.452 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.452 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.452 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.453 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.453 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.453 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.454 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.454 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.454 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.454 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.455 222021 DEBUG nova.virt.hardware [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.455 222021 DEBUG nova.objects.instance [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fffef24b-bb5b-41c6-a049-c1c4ba8f02fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:29 np0005593233 nova_compute[222017]: 2026-01-23 09:54:29.475 222021 DEBUG oslo_concurrency.processutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:54:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1020189206' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.017 222021 DEBUG oslo_concurrency.processutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.060 222021 DEBUG oslo_concurrency.processutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.111 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.229 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully updated port: 8f671859-24dc-4140-915c-bbad6f16e0d8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:54:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:54:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1477358881' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.523 222021 DEBUG oslo_concurrency.processutils [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.525 222021 DEBUG nova.virt.libvirt.vif [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1974132906',display_name='tempest-ServerDiskConfigTestJSON-server-1974132906',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1974132906',id=79,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:53:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-09i9yxy4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio'
,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:54:14Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=fffef24b-bb5b-41c6-a049-c1c4ba8f02fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:45:b2:d4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.526 222021 DEBUG nova.network.os_vif_util [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:45:b2:d4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.527 222021 DEBUG nova.network.os_vif_util [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:b2:d4,bridge_name='br-int',has_traffic_filtering=True,id=10fe80c9-2f99-4371-a60e-b8b226c250aa,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10fe80c9-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.530 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <uuid>fffef24b-bb5b-41c6-a049-c1c4ba8f02fb</uuid>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <name>instance-0000004f</name>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <memory>196608</memory>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1974132906</nova:name>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:54:29</nova:creationTime>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.micro">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <nova:memory>192</nova:memory>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <nova:user uuid="0cfac2191989448ead77e75ca3910ac4">tempest-ServerDiskConfigTestJSON-211417238-project-member</nova:user>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <nova:project uuid="86d938c8e2bb41a79012befd500d1088">tempest-ServerDiskConfigTestJSON-211417238</nova:project>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <nova:port uuid="10fe80c9-2f99-4371-a60e-b8b226c250aa">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <entry name="serial">fffef24b-bb5b-41c6-a049-c1c4ba8f02fb</entry>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <entry name="uuid">fffef24b-bb5b-41c6-a049-c1c4ba8f02fb</entry>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/fffef24b-bb5b-41c6-a049-c1c4ba8f02fb_disk">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/fffef24b-bb5b-41c6-a049-c1c4ba8f02fb_disk.config">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:45:b2:d4"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <target dev="tap10fe80c9-2f"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/fffef24b-bb5b-41c6-a049-c1c4ba8f02fb/console.log" append="off"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:54:30 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:54:30 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:54:30 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:54:30 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.533 222021 DEBUG nova.virt.libvirt.vif [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1974132906',display_name='tempest-ServerDiskConfigTestJSON-server-1974132906',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1974132906',id=79,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:53:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-09i9yxy4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio'
,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:54:14Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=fffef24b-bb5b-41c6-a049-c1c4ba8f02fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:45:b2:d4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.533 222021 DEBUG nova.network.os_vif_util [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:45:b2:d4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.534 222021 DEBUG nova.network.os_vif_util [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:b2:d4,bridge_name='br-int',has_traffic_filtering=True,id=10fe80c9-2f99-4371-a60e-b8b226c250aa,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10fe80c9-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.535 222021 DEBUG os_vif [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:b2:d4,bridge_name='br-int',has_traffic_filtering=True,id=10fe80c9-2f99-4371-a60e-b8b226c250aa,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10fe80c9-2f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.535 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.536 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.536 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.540 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.541 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap10fe80c9-2f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.541 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap10fe80c9-2f, col_values=(('external_ids', {'iface-id': '10fe80c9-2f99-4371-a60e-b8b226c250aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:b2:d4', 'vm-uuid': 'fffef24b-bb5b-41c6-a049-c1c4ba8f02fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.543 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:30 np0005593233 NetworkManager[48871]: <info>  [1769162070.5440] manager: (tap10fe80c9-2f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.546 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.549 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:30 np0005593233 nova_compute[222017]: 2026-01-23 09:54:30.550 222021 INFO os_vif [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:b2:d4,bridge_name='br-int',has_traffic_filtering=True,id=10fe80c9-2f99-4371-a60e-b8b226c250aa,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10fe80c9-2f')#033[00m
Jan 23 04:54:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:30.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:30.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.262 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.263 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.263 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No VIF found with MAC fa:16:3e:45:b2:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.264 222021 INFO nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Using config drive#033[00m
Jan 23 04:54:31 np0005593233 kernel: tap10fe80c9-2f: entered promiscuous mode
Jan 23 04:54:31 np0005593233 NetworkManager[48871]: <info>  [1769162071.3768] manager: (tap10fe80c9-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Jan 23 04:54:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:31Z|00280|binding|INFO|Claiming lport 10fe80c9-2f99-4371-a60e-b8b226c250aa for this chassis.
Jan 23 04:54:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:31Z|00281|binding|INFO|10fe80c9-2f99-4371-a60e-b8b226c250aa: Claiming fa:16:3e:45:b2:d4 10.100.0.12
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.380 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.383 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.389 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:b2:d4 10.100.0.12'], port_security=['fa:16:3e:45:b2:d4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fffef24b-bb5b-41c6-a049-c1c4ba8f02fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=10fe80c9-2f99-4371-a60e-b8b226c250aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.392 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 10fe80c9-2f99-4371-a60e-b8b226c250aa in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 bound to our chassis#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.395 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6d2cdc4c-47a0-475b-8e71-39465d365de3#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.414 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[23f3153a-c87a-4c12-8c01-83ef9a3da45e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.415 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6d2cdc4c-41 in ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.418 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6d2cdc4c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.418 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d11fbc8d-6da7-4ee5-88a8-c8a46fa550fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.419 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2e7c1810-1d58-420f-9cc3-2fcad4a42038]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 systemd-machined[190954]: New machine qemu-40-instance-0000004f.
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.432 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[09d331a5-990c-44a6-a48a-241721921346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 systemd[1]: Started Virtual Machine qemu-40-instance-0000004f.
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.444 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:31Z|00282|binding|INFO|Setting lport 10fe80c9-2f99-4371-a60e-b8b226c250aa ovn-installed in OVS
Jan 23 04:54:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:31Z|00283|binding|INFO|Setting lport 10fe80c9-2f99-4371-a60e-b8b226c250aa up in Southbound
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.448 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:31 np0005593233 systemd-udevd[254407]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.455 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8321f56d-a6f1-4158-ba45-bc721143a203]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 NetworkManager[48871]: <info>  [1769162071.4726] device (tap10fe80c9-2f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:54:31 np0005593233 NetworkManager[48871]: <info>  [1769162071.4732] device (tap10fe80c9-2f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.493 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[1db7ef20-d456-43b6-a296-7e726a8e5434]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.500 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c19a4888-18df-4559-b289-897c1518f210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 NetworkManager[48871]: <info>  [1769162071.5015] manager: (tap6d2cdc4c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/140)
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.541 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6948e7f2-8635-4722-9288-ec17cafc532e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.545 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8ceefac7-6505-475d-88e5-cc7e727733cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 NetworkManager[48871]: <info>  [1769162071.5750] device (tap6d2cdc4c-40): carrier: link connected
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.583 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[86167fc0-b07a-409d-8171-b49a916c7816]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.603 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1da4fc7e-fcd1-4427-a0b0-165cac23b075]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 87], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598420, 'reachable_time': 42625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254437, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.621 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a622c389-fd62-4d19-9993-37eb90b6b627]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:5a26'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 598420, 'tstamp': 598420}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254438, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.642 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[35a73b42-7d7c-4953-aae1-54b03dbb4d79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 87], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598420, 'reachable_time': 42625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254439, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.687 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5114d0-81fa-4b27-9a98-395e278d14db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.768 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9bde171f-2624-46e4-ae45-e4295f2b0b52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.770 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.770 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.771 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d2cdc4c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:31 np0005593233 NetworkManager[48871]: <info>  [1769162071.8138] manager: (tap6d2cdc4c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Jan 23 04:54:31 np0005593233 kernel: tap6d2cdc4c-40: entered promiscuous mode
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.813 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.815 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.816 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6d2cdc4c-40, col_values=(('external_ids', {'iface-id': '04f6c0b6-99ee-4958-bc01-68fa310042f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.817 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:31Z|00284|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:54:31 np0005593233 nova_compute[222017]: 2026-01-23 09:54:31.831 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.832 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.833 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[51b16166-fbcb-4531-869a-ca1df6392b1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.833 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:54:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:31.834 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'env', 'PROCESS_TAG=haproxy-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6d2cdc4c-47a0-475b-8e71-39465d365de3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.042 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162072.0419676, fffef24b-bb5b-41c6-a049-c1c4ba8f02fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.043 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.045 222021 DEBUG nova.compute.manager [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.049 222021 INFO nova.virt.libvirt.driver [-] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Instance running successfully.#033[00m
Jan 23 04:54:32 np0005593233 virtqemud[221325]: argument unsupported: QEMU guest agent is not configured
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.051 222021 DEBUG nova.virt.libvirt.guest [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.052 222021 DEBUG nova.virt.libvirt.driver [None req-0e3d80b6-54ee-4047-ae8f-401c4c3f54e7 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.090 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.103 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.236 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.236 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162072.0436141, fffef24b-bb5b-41c6-a049-c1c4ba8f02fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.237 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] VM Started (Lifecycle Event)#033[00m
Jan 23 04:54:32 np0005593233 podman[254512]: 2026-01-23 09:54:32.253605674 +0000 UTC m=+0.059056597 container create d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 04:54:32 np0005593233 systemd[1]: Started libpod-conmon-d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd.scope.
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.284 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:32 np0005593233 nova_compute[222017]: 2026-01-23 09:54:32.290 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:54:32 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:54:32 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb2f3fc93c4b90026f6807844d9c2b0f3f48312ae61da93aa8b1192c909b8051/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:32 np0005593233 podman[254512]: 2026-01-23 09:54:32.225986289 +0000 UTC m=+0.031437242 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:54:32 np0005593233 podman[254512]: 2026-01-23 09:54:32.330368275 +0000 UTC m=+0.135819198 container init d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:54:32 np0005593233 podman[254512]: 2026-01-23 09:54:32.337288269 +0000 UTC m=+0.142739192 container start d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:54:32 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[254528]: [NOTICE]   (254542) : New worker (254551) forked
Jan 23 04:54:32 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[254528]: [NOTICE]   (254542) : Loading success.
Jan 23 04:54:32 np0005593233 podman[254525]: 2026-01-23 09:54:32.404170544 +0000 UTC m=+0.109888531 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 04:54:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:32.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:32.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.198 222021 DEBUG nova.compute.manager [req-f09ab27a-b4e4-4a27-88a9-369171e94e74 req-58629189-88a1-4857-bffd-dab0ae2cee58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-changed-8f671859-24dc-4140-915c-bbad6f16e0d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.200 222021 DEBUG nova.compute.manager [req-f09ab27a-b4e4-4a27-88a9-369171e94e74 req-58629189-88a1-4857-bffd-dab0ae2cee58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing instance network info cache due to event network-changed-8f671859-24dc-4140-915c-bbad6f16e0d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.200 222021 DEBUG oslo_concurrency.lockutils [req-f09ab27a-b4e4-4a27-88a9-369171e94e74 req-58629189-88a1-4857-bffd-dab0ae2cee58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.201 222021 DEBUG oslo_concurrency.lockutils [req-f09ab27a-b4e4-4a27-88a9-369171e94e74 req-58629189-88a1-4857-bffd-dab0ae2cee58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.201 222021 DEBUG nova.network.neutron [req-f09ab27a-b4e4-4a27-88a9-369171e94e74 req-58629189-88a1-4857-bffd-dab0ae2cee58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing network info cache for port 8f671859-24dc-4140-915c-bbad6f16e0d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.467 222021 DEBUG nova.compute.manager [req-e854574f-6a79-43c9-b5dd-5c6fe9d2e868 req-e0ec29a7-5a2f-43e1-9a64-d7af874824e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received event network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.467 222021 DEBUG oslo_concurrency.lockutils [req-e854574f-6a79-43c9-b5dd-5c6fe9d2e868 req-e0ec29a7-5a2f-43e1-9a64-d7af874824e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.467 222021 DEBUG oslo_concurrency.lockutils [req-e854574f-6a79-43c9-b5dd-5c6fe9d2e868 req-e0ec29a7-5a2f-43e1-9a64-d7af874824e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.468 222021 DEBUG oslo_concurrency.lockutils [req-e854574f-6a79-43c9-b5dd-5c6fe9d2e868 req-e0ec29a7-5a2f-43e1-9a64-d7af874824e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.468 222021 DEBUG nova.compute.manager [req-e854574f-6a79-43c9-b5dd-5c6fe9d2e868 req-e0ec29a7-5a2f-43e1-9a64-d7af874824e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] No waiting events found dispatching network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.468 222021 WARNING nova.compute.manager [req-e854574f-6a79-43c9-b5dd-5c6fe9d2e868 req-e0ec29a7-5a2f-43e1-9a64-d7af874824e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received unexpected event network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa for instance with vm_state resized and task_state None.#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.740 222021 DEBUG nova.network.neutron [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Updated VIF entry in instance network info cache for port 10fe80c9-2f99-4371-a60e-b8b226c250aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.741 222021 DEBUG nova.network.neutron [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Updating instance_info_cache with network_info: [{"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.831 222021 DEBUG nova.network.neutron [req-f09ab27a-b4e4-4a27-88a9-369171e94e74 req-58629189-88a1-4857-bffd-dab0ae2cee58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.917 222021 DEBUG oslo_concurrency.lockutils [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.918 222021 DEBUG nova.compute.manager [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-changed-08ec741b-b592-417f-9a64-1ee4d2e4e006 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.918 222021 DEBUG nova.compute.manager [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing instance network info cache due to event network-changed-08ec741b-b592-417f-9a64-1ee4d2e4e006. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.918 222021 DEBUG oslo_concurrency.lockutils [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.919 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.919 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:54:33 np0005593233 nova_compute[222017]: 2026-01-23 09:54:33.920 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fffef24b-bb5b-41c6-a049-c1c4ba8f02fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:34.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:34.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.140 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.544 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.660 222021 DEBUG nova.compute.manager [req-197e86f0-d823-4e36-98de-1bc05f257296 req-d5b4edd7-4c72-4e5e-8d1a-59530d5bf903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received event network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.661 222021 DEBUG oslo_concurrency.lockutils [req-197e86f0-d823-4e36-98de-1bc05f257296 req-d5b4edd7-4c72-4e5e-8d1a-59530d5bf903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.662 222021 DEBUG oslo_concurrency.lockutils [req-197e86f0-d823-4e36-98de-1bc05f257296 req-d5b4edd7-4c72-4e5e-8d1a-59530d5bf903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.662 222021 DEBUG oslo_concurrency.lockutils [req-197e86f0-d823-4e36-98de-1bc05f257296 req-d5b4edd7-4c72-4e5e-8d1a-59530d5bf903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.662 222021 DEBUG nova.compute.manager [req-197e86f0-d823-4e36-98de-1bc05f257296 req-d5b4edd7-4c72-4e5e-8d1a-59530d5bf903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] No waiting events found dispatching network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.663 222021 WARNING nova.compute.manager [req-197e86f0-d823-4e36-98de-1bc05f257296 req-d5b4edd7-4c72-4e5e-8d1a-59530d5bf903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received unexpected event network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa for instance with vm_state resized and task_state None.#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.707 222021 DEBUG nova.network.neutron [req-f09ab27a-b4e4-4a27-88a9-369171e94e74 req-58629189-88a1-4857-bffd-dab0ae2cee58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.759 222021 DEBUG oslo_concurrency.lockutils [req-f09ab27a-b4e4-4a27-88a9-369171e94e74 req-58629189-88a1-4857-bffd-dab0ae2cee58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.761 222021 DEBUG oslo_concurrency.lockutils [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:35 np0005593233 nova_compute[222017]: 2026-01-23 09:54:35.761 222021 DEBUG nova.network.neutron [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing network info cache for port 08ec741b-b592-417f-9a64-1ee4d2e4e006 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:54:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:36 np0005593233 nova_compute[222017]: 2026-01-23 09:54:36.190 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully updated port: 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:54:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:36.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:36.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:37 np0005593233 nova_compute[222017]: 2026-01-23 09:54:37.671 222021 DEBUG nova.network.neutron [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:54:37 np0005593233 nova_compute[222017]: 2026-01-23 09:54:37.744 222021 DEBUG nova.compute.manager [req-9ed3e978-0bfb-444e-9e6d-174abc71b197 req-771a0103-b1b9-451f-9251-5826ceb7c83a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-changed-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:37 np0005593233 nova_compute[222017]: 2026-01-23 09:54:37.745 222021 DEBUG nova.compute.manager [req-9ed3e978-0bfb-444e-9e6d-174abc71b197 req-771a0103-b1b9-451f-9251-5826ceb7c83a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing instance network info cache due to event network-changed-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:37 np0005593233 nova_compute[222017]: 2026-01-23 09:54:37.746 222021 DEBUG oslo_concurrency.lockutils [req-9ed3e978-0bfb-444e-9e6d-174abc71b197 req-771a0103-b1b9-451f-9251-5826ceb7c83a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:38.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:38.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:39 np0005593233 nova_compute[222017]: 2026-01-23 09:54:39.163 222021 DEBUG nova.network.neutron [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:39 np0005593233 nova_compute[222017]: 2026-01-23 09:54:39.183 222021 DEBUG oslo_concurrency.lockutils [req-beb77c0d-7fe0-4f20-9ad1-c7c5d45bc557 req-77cdfd7f-dd57-4059-84e2-b19eac6aedc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:39 np0005593233 nova_compute[222017]: 2026-01-23 09:54:39.185 222021 DEBUG oslo_concurrency.lockutils [req-9ed3e978-0bfb-444e-9e6d-174abc71b197 req-771a0103-b1b9-451f-9251-5826ceb7c83a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:39 np0005593233 nova_compute[222017]: 2026-01-23 09:54:39.185 222021 DEBUG nova.network.neutron [req-9ed3e978-0bfb-444e-9e6d-174abc71b197 req-771a0103-b1b9-451f-9251-5826ceb7c83a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing network info cache for port 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:54:39 np0005593233 nova_compute[222017]: 2026-01-23 09:54:39.728 222021 DEBUG nova.network.neutron [req-9ed3e978-0bfb-444e-9e6d-174abc71b197 req-771a0103-b1b9-451f-9251-5826ceb7c83a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:54:40 np0005593233 nova_compute[222017]: 2026-01-23 09:54:40.143 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:40 np0005593233 nova_compute[222017]: 2026-01-23 09:54:40.171 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Updating instance_info_cache with network_info: [{"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:40 np0005593233 nova_compute[222017]: 2026-01-23 09:54:40.527 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:40 np0005593233 nova_compute[222017]: 2026-01-23 09:54:40.528 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:54:40 np0005593233 nova_compute[222017]: 2026-01-23 09:54:40.528 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:40 np0005593233 nova_compute[222017]: 2026-01-23 09:54:40.528 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:40 np0005593233 nova_compute[222017]: 2026-01-23 09:54:40.529 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:54:40 np0005593233 nova_compute[222017]: 2026-01-23 09:54:40.546 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:40.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:40.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:41 np0005593233 nova_compute[222017]: 2026-01-23 09:54:41.120 222021 DEBUG nova.network.neutron [req-9ed3e978-0bfb-444e-9e6d-174abc71b197 req-771a0103-b1b9-451f-9251-5826ceb7c83a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:41 np0005593233 nova_compute[222017]: 2026-01-23 09:54:41.162 222021 DEBUG oslo_concurrency.lockutils [req-9ed3e978-0bfb-444e-9e6d-174abc71b197 req-771a0103-b1b9-451f-9251-5826ceb7c83a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:42 np0005593233 nova_compute[222017]: 2026-01-23 09:54:42.215 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully updated port: 1e83b219-c51b-488a-8fc7-8240efb384c0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:54:42 np0005593233 nova_compute[222017]: 2026-01-23 09:54:42.498 222021 DEBUG nova.compute.manager [req-0ead1941-54a4-4cb2-b5e2-3a6009efcb0f req-7c8a59e6-9d53-4245-abf5-9ac113cc5e1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-changed-1e83b219-c51b-488a-8fc7-8240efb384c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:42 np0005593233 nova_compute[222017]: 2026-01-23 09:54:42.499 222021 DEBUG nova.compute.manager [req-0ead1941-54a4-4cb2-b5e2-3a6009efcb0f req-7c8a59e6-9d53-4245-abf5-9ac113cc5e1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing instance network info cache due to event network-changed-1e83b219-c51b-488a-8fc7-8240efb384c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:42 np0005593233 nova_compute[222017]: 2026-01-23 09:54:42.499 222021 DEBUG oslo_concurrency.lockutils [req-0ead1941-54a4-4cb2-b5e2-3a6009efcb0f req-7c8a59e6-9d53-4245-abf5-9ac113cc5e1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:42 np0005593233 nova_compute[222017]: 2026-01-23 09:54:42.499 222021 DEBUG oslo_concurrency.lockutils [req-0ead1941-54a4-4cb2-b5e2-3a6009efcb0f req-7c8a59e6-9d53-4245-abf5-9ac113cc5e1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:42 np0005593233 nova_compute[222017]: 2026-01-23 09:54:42.500 222021 DEBUG nova.network.neutron [req-0ead1941-54a4-4cb2-b5e2-3a6009efcb0f req-7c8a59e6-9d53-4245-abf5-9ac113cc5e1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing network info cache for port 1e83b219-c51b-488a-8fc7-8240efb384c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:54:42 np0005593233 nova_compute[222017]: 2026-01-23 09:54:42.647 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:42.646 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:54:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:42.648 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:54:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:42.656 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:42.656 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:42.657 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:42.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:42.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:43 np0005593233 nova_compute[222017]: 2026-01-23 09:54:43.153 222021 DEBUG nova.network.neutron [req-0ead1941-54a4-4cb2-b5e2-3a6009efcb0f req-7c8a59e6-9d53-4245-abf5-9ac113cc5e1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:54:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:44.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e258 e258: 3 total, 3 up, 3 in
Jan 23 04:54:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:44.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:44 np0005593233 nova_compute[222017]: 2026-01-23 09:54:44.911 222021 DEBUG nova.network.neutron [req-0ead1941-54a4-4cb2-b5e2-3a6009efcb0f req-7c8a59e6-9d53-4245-abf5-9ac113cc5e1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:44 np0005593233 nova_compute[222017]: 2026-01-23 09:54:44.946 222021 DEBUG oslo_concurrency.lockutils [req-0ead1941-54a4-4cb2-b5e2-3a6009efcb0f req-7c8a59e6-9d53-4245-abf5-9ac113cc5e1d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:45 np0005593233 nova_compute[222017]: 2026-01-23 09:54:45.183 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:45 np0005593233 nova_compute[222017]: 2026-01-23 09:54:45.190 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/973634706' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:45 np0005593233 nova_compute[222017]: 2026-01-23 09:54:45.548 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:45 np0005593233 nova_compute[222017]: 2026-01-23 09:54:45.893 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Successfully updated port: 2358fe4c-654b-4b88-9e08-2e85688cb00e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:54:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:45 np0005593233 nova_compute[222017]: 2026-01-23 09:54:45.924 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:45 np0005593233 nova_compute[222017]: 2026-01-23 09:54:45.925 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:45 np0005593233 nova_compute[222017]: 2026-01-23 09:54:45.925 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:54:46 np0005593233 podman[254567]: 2026-01-23 09:54:46.051043176 +0000 UTC m=+0.055704092 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:54:46 np0005593233 nova_compute[222017]: 2026-01-23 09:54:46.547 222021 DEBUG nova.compute.manager [req-8b0f038c-2141-49c8-9a98-6be0034ee5af req-877fb7a1-688c-45a9-a664-f7628b11c3cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-changed-2358fe4c-654b-4b88-9e08-2e85688cb00e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:46 np0005593233 nova_compute[222017]: 2026-01-23 09:54:46.547 222021 DEBUG nova.compute.manager [req-8b0f038c-2141-49c8-9a98-6be0034ee5af req-877fb7a1-688c-45a9-a664-f7628b11c3cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing instance network info cache due to event network-changed-2358fe4c-654b-4b88-9e08-2e85688cb00e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:46 np0005593233 nova_compute[222017]: 2026-01-23 09:54:46.548 222021 DEBUG oslo_concurrency.lockutils [req-8b0f038c-2141-49c8-9a98-6be0034ee5af req-877fb7a1-688c-45a9-a664-f7628b11c3cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:46 np0005593233 nova_compute[222017]: 2026-01-23 09:54:46.711 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:54:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:46.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:46.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:47Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:b2:d4 10.100.0.12
Jan 23 04:54:47 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:47Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:b2:d4 10.100.0.12
Jan 23 04:54:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:48.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:48.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:50 np0005593233 nova_compute[222017]: 2026-01-23 09:54:50.185 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:50 np0005593233 nova_compute[222017]: 2026-01-23 09:54:50.550 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:50.651 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:50.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:50.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e259 e259: 3 total, 3 up, 3 in
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.141 222021 DEBUG oslo_concurrency.lockutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.142 222021 DEBUG oslo_concurrency.lockutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.142 222021 DEBUG oslo_concurrency.lockutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.142 222021 DEBUG oslo_concurrency.lockutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.143 222021 DEBUG oslo_concurrency.lockutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.144 222021 INFO nova.compute.manager [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Terminating instance#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.145 222021 DEBUG nova.compute.manager [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:54:51 np0005593233 kernel: tap10fe80c9-2f (unregistering): left promiscuous mode
Jan 23 04:54:51 np0005593233 NetworkManager[48871]: <info>  [1769162091.2543] device (tap10fe80c9-2f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:54:51 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:51Z|00285|binding|INFO|Releasing lport 10fe80c9-2f99-4371-a60e-b8b226c250aa from this chassis (sb_readonly=0)
Jan 23 04:54:51 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:51Z|00286|binding|INFO|Setting lport 10fe80c9-2f99-4371-a60e-b8b226c250aa down in Southbound
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.259 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593233 ovn_controller[130653]: 2026-01-23T09:54:51Z|00287|binding|INFO|Removing iface tap10fe80c9-2f ovn-installed in OVS
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.261 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.271 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:b2:d4 10.100.0.12'], port_security=['fa:16:3e:45:b2:d4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'fffef24b-bb5b-41c6-a049-c1c4ba8f02fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '7', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=10fe80c9-2f99-4371-a60e-b8b226c250aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.273 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 10fe80c9-2f99-4371-a60e-b8b226c250aa in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 unbound from our chassis#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.275 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d2cdc4c-47a0-475b-8e71-39465d365de3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.277 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2f20c5a4-719e-4a18-a96b-25b7aa8fffe8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.278 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace which is not needed anymore#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.283 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593233 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Jan 23 04:54:51 np0005593233 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000004f.scope: Consumed 14.380s CPU time.
Jan 23 04:54:51 np0005593233 systemd-machined[190954]: Machine qemu-40-instance-0000004f terminated.
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.388 222021 INFO nova.virt.libvirt.driver [-] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Instance destroyed successfully.#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.389 222021 DEBUG nova.objects.instance [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'resources' on Instance uuid fffef24b-bb5b-41c6-a049-c1c4ba8f02fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:51 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[254528]: [NOTICE]   (254542) : haproxy version is 2.8.14-c23fe91
Jan 23 04:54:51 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[254528]: [NOTICE]   (254542) : path to executable is /usr/sbin/haproxy
Jan 23 04:54:51 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[254528]: [WARNING]  (254542) : Exiting Master process...
Jan 23 04:54:51 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[254528]: [ALERT]    (254542) : Current worker (254551) exited with code 143 (Terminated)
Jan 23 04:54:51 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[254528]: [WARNING]  (254542) : All workers exited. Exiting... (0)
Jan 23 04:54:51 np0005593233 systemd[1]: libpod-d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd.scope: Deactivated successfully.
Jan 23 04:54:51 np0005593233 podman[254613]: 2026-01-23 09:54:51.458593605 +0000 UTC m=+0.061119494 container died d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.462 222021 DEBUG nova.virt.libvirt.vif [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1974132906',display_name='tempest-ServerDiskConfigTestJSON-server-1974132906',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1974132906',id=79,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:54:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-09i9yxy4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:54:45Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=fffef24b-bb5b-41c6-a049-c1c4ba8f02fb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.463 222021 DEBUG nova.network.os_vif_util [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "address": "fa:16:3e:45:b2:d4", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10fe80c9-2f", "ovs_interfaceid": "10fe80c9-2f99-4371-a60e-b8b226c250aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.464 222021 DEBUG nova.network.os_vif_util [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:b2:d4,bridge_name='br-int',has_traffic_filtering=True,id=10fe80c9-2f99-4371-a60e-b8b226c250aa,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10fe80c9-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.464 222021 DEBUG os_vif [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:b2:d4,bridge_name='br-int',has_traffic_filtering=True,id=10fe80c9-2f99-4371-a60e-b8b226c250aa,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10fe80c9-2f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.466 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.467 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap10fe80c9-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.469 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.471 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.475 222021 INFO os_vif [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:b2:d4,bridge_name='br-int',has_traffic_filtering=True,id=10fe80c9-2f99-4371-a60e-b8b226c250aa,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10fe80c9-2f')#033[00m
Jan 23 04:54:51 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd-userdata-shm.mount: Deactivated successfully.
Jan 23 04:54:51 np0005593233 systemd[1]: var-lib-containers-storage-overlay-eb2f3fc93c4b90026f6807844d9c2b0f3f48312ae61da93aa8b1192c909b8051-merged.mount: Deactivated successfully.
Jan 23 04:54:51 np0005593233 podman[254613]: 2026-01-23 09:54:51.579463684 +0000 UTC m=+0.181989553 container cleanup d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:54:51 np0005593233 systemd[1]: libpod-conmon-d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd.scope: Deactivated successfully.
Jan 23 04:54:51 np0005593233 podman[254664]: 2026-01-23 09:54:51.696864234 +0000 UTC m=+0.085504217 container remove d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.706 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[005f12a7-ecea-4b08-b3cd-39da01b43063]: (4, ('Fri Jan 23 09:54:51 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd)\nd9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd\nFri Jan 23 09:54:51 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (d9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd)\nd9921b7fa79ac67fd603eabb447d5afd79566f60af9718584de9ee2074bd56fd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.709 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bc68a85a-8666-4994-b00c-5c3f98a3eb69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.711 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.713 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593233 kernel: tap6d2cdc4c-40: left promiscuous mode
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.716 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.719 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0c5dab30-7be5-436c-aa29-3f01439fc338]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:51 np0005593233 nova_compute[222017]: 2026-01-23 09:54:51.730 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.736 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9719ee-2a94-464b-b11f-adbd21777da4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.738 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c39813-ce93-4053-834c-bac0a9a04647]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.756 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0eddb069-b7e1-4b1e-91c9-f306b858e9fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598411, 'reachable_time': 17050, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254679, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:51 np0005593233 systemd[1]: run-netns-ovnmeta\x2d6d2cdc4c\x2d47a0\x2d475b\x2d8e71\x2d39465d365de3.mount: Deactivated successfully.
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.760 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:54:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:54:51.760 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[665c189c-c8b2-4bfe-a829-fafaed12e852]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.247 222021 DEBUG nova.compute.manager [req-ea976231-d6e6-4733-a841-1152ed8ea11b req-423ffba4-8a0d-405a-bfd6-2df79b25648a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received event network-vif-unplugged-10fe80c9-2f99-4371-a60e-b8b226c250aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.248 222021 DEBUG oslo_concurrency.lockutils [req-ea976231-d6e6-4733-a841-1152ed8ea11b req-423ffba4-8a0d-405a-bfd6-2df79b25648a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.248 222021 DEBUG oslo_concurrency.lockutils [req-ea976231-d6e6-4733-a841-1152ed8ea11b req-423ffba4-8a0d-405a-bfd6-2df79b25648a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.248 222021 DEBUG oslo_concurrency.lockutils [req-ea976231-d6e6-4733-a841-1152ed8ea11b req-423ffba4-8a0d-405a-bfd6-2df79b25648a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.248 222021 DEBUG nova.compute.manager [req-ea976231-d6e6-4733-a841-1152ed8ea11b req-423ffba4-8a0d-405a-bfd6-2df79b25648a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] No waiting events found dispatching network-vif-unplugged-10fe80c9-2f99-4371-a60e-b8b226c250aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.249 222021 DEBUG nova.compute.manager [req-ea976231-d6e6-4733-a841-1152ed8ea11b req-423ffba4-8a0d-405a-bfd6-2df79b25648a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received event network-vif-unplugged-10fe80c9-2f99-4371-a60e-b8b226c250aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:54:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:52.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.798 222021 INFO nova.virt.libvirt.driver [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Deleting instance files /var/lib/nova/instances/fffef24b-bb5b-41c6-a049-c1c4ba8f02fb_del#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.800 222021 INFO nova.virt.libvirt.driver [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Deletion of /var/lib/nova/instances/fffef24b-bb5b-41c6-a049-c1c4ba8f02fb_del complete#033[00m
Jan 23 04:54:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:52.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.905 222021 INFO nova.compute.manager [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Took 1.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.907 222021 DEBUG oslo.service.loopingcall [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.907 222021 DEBUG nova.compute.manager [-] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:54:52 np0005593233 nova_compute[222017]: 2026-01-23 09:54:52.907 222021 DEBUG nova.network.neutron [-] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.549 222021 DEBUG nova.compute.manager [req-f0ba74ca-8a92-47ab-b859-3c8873fb2041 req-f0995696-ecb6-4bb2-9161-4de8f3e24e91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received event network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.550 222021 DEBUG oslo_concurrency.lockutils [req-f0ba74ca-8a92-47ab-b859-3c8873fb2041 req-f0995696-ecb6-4bb2-9161-4de8f3e24e91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.550 222021 DEBUG oslo_concurrency.lockutils [req-f0ba74ca-8a92-47ab-b859-3c8873fb2041 req-f0995696-ecb6-4bb2-9161-4de8f3e24e91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.550 222021 DEBUG oslo_concurrency.lockutils [req-f0ba74ca-8a92-47ab-b859-3c8873fb2041 req-f0995696-ecb6-4bb2-9161-4de8f3e24e91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.550 222021 DEBUG nova.compute.manager [req-f0ba74ca-8a92-47ab-b859-3c8873fb2041 req-f0995696-ecb6-4bb2-9161-4de8f3e24e91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] No waiting events found dispatching network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.551 222021 WARNING nova.compute.manager [req-f0ba74ca-8a92-47ab-b859-3c8873fb2041 req-f0995696-ecb6-4bb2-9161-4de8f3e24e91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received unexpected event network-vif-plugged-10fe80c9-2f99-4371-a60e-b8b226c250aa for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:54:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.831 222021 DEBUG nova.network.neutron [-] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:54.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.882 222021 INFO nova.compute.manager [-] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Took 1.97 seconds to deallocate network for instance.#033[00m
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.957 222021 DEBUG oslo_concurrency.lockutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.958 222021 DEBUG oslo_concurrency.lockutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:54 np0005593233 nova_compute[222017]: 2026-01-23 09:54:54.977 222021 DEBUG nova.compute.manager [req-57447de4-8597-4afe-8c58-036e7e20eedb req-a0f0334b-7a70-4ad0-8fd3-2e3b889c59c4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Received event network-vif-deleted-10fe80c9-2f99-4371-a60e-b8b226c250aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:55 np0005593233 nova_compute[222017]: 2026-01-23 09:54:55.067 222021 DEBUG oslo_concurrency.processutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:55 np0005593233 nova_compute[222017]: 2026-01-23 09:54:55.188 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1067011283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:55 np0005593233 nova_compute[222017]: 2026-01-23 09:54:55.543 222021 DEBUG oslo_concurrency.processutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:55 np0005593233 nova_compute[222017]: 2026-01-23 09:54:55.550 222021 DEBUG nova.compute.provider_tree [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:54:55 np0005593233 nova_compute[222017]: 2026-01-23 09:54:55.579 222021 DEBUG nova.scheduler.client.report [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:54:55 np0005593233 nova_compute[222017]: 2026-01-23 09:54:55.617 222021 DEBUG oslo_concurrency.lockutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:55 np0005593233 nova_compute[222017]: 2026-01-23 09:54:55.681 222021 INFO nova.scheduler.client.report [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Deleted allocations for instance fffef24b-bb5b-41c6-a049-c1c4ba8f02fb#033[00m
Jan 23 04:54:55 np0005593233 nova_compute[222017]: 2026-01-23 09:54:55.803 222021 DEBUG oslo_concurrency.lockutils [None req-10458039-48fc-42c4-8544-27ffaeec35be 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "fffef24b-bb5b-41c6-a049-c1c4ba8f02fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:56 np0005593233 nova_compute[222017]: 2026-01-23 09:54:56.470 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:56.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:54:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:56.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:54:57 np0005593233 nova_compute[222017]: 2026-01-23 09:54:57.045 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:57 np0005593233 nova_compute[222017]: 2026-01-23 09:54:57.046 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:57 np0005593233 nova_compute[222017]: 2026-01-23 09:54:57.076 222021 DEBUG nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:54:57 np0005593233 nova_compute[222017]: 2026-01-23 09:54:57.242 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:57 np0005593233 nova_compute[222017]: 2026-01-23 09:54:57.243 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:57 np0005593233 nova_compute[222017]: 2026-01-23 09:54:57.248 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:54:57 np0005593233 nova_compute[222017]: 2026-01-23 09:54:57.249 222021 INFO nova.compute.claims [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:54:57 np0005593233 nova_compute[222017]: 2026-01-23 09:54:57.519 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2532078772' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.009 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.017 222021 DEBUG nova.compute.provider_tree [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.050 222021 DEBUG nova.scheduler.client.report [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.093 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.094 222021 DEBUG nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.185 222021 DEBUG nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.185 222021 DEBUG nova.network.neutron [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.214 222021 INFO nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.257 222021 DEBUG nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.339 222021 DEBUG nova.compute.manager [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.574 222021 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.574 222021 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.587 222021 DEBUG nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.588 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.589 222021 INFO nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Creating image(s)#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.622 222021 DEBUG nova.storage.rbd_utils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image ec678068-aa1c-4926-abee-e4852fd8f1fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.660 222021 DEBUG nova.storage.rbd_utils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image ec678068-aa1c-4926-abee-e4852fd8f1fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.690 222021 DEBUG nova.storage.rbd_utils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image ec678068-aa1c-4926-abee-e4852fd8f1fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.693 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.729 222021 DEBUG nova.policy [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0cfac2191989448ead77e75ca3910ac4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '86d938c8e2bb41a79012befd500d1088', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.735 222021 DEBUG nova.objects.instance [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_requests' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.760 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.760 222021 INFO nova.compute.claims [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.761 222021 DEBUG nova.objects.instance [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'resources' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:58.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.771 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.772 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.772 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.772 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.796 222021 DEBUG nova.storage.rbd_utils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image ec678068-aa1c-4926-abee-e4852fd8f1fd_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.799 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ec678068-aa1c-4926-abee-e4852fd8f1fd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.826 222021 DEBUG nova.objects.instance [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_devices' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:54:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:58.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.912 222021 INFO nova.compute.resource_tracker [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating resource usage from migration e333aff9-bf2d-4dc5-ab0a-0044cabcbb7e#033[00m
Jan 23 04:54:58 np0005593233 nova_compute[222017]: 2026-01-23 09:54:58.912 222021 DEBUG nova.compute.resource_tracker [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Starting to track incoming migration e333aff9-bf2d-4dc5-ab0a-0044cabcbb7e with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.073 222021 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.110 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ec678068-aa1c-4926-abee-e4852fd8f1fd_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.312s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.199 222021 DEBUG nova.storage.rbd_utils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] resizing rbd image ec678068-aa1c-4926-abee-e4852fd8f1fd_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.332 222021 DEBUG nova.objects.instance [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'migration_context' on Instance uuid ec678068-aa1c-4926-abee-e4852fd8f1fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.380 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.381 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Ensure instance console log exists: /var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.381 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.381 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.382 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/45800569' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.527 222021 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.532 222021 DEBUG nova.compute.provider_tree [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.563 222021 DEBUG nova.scheduler.client.report [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.612 222021 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:59 np0005593233 nova_compute[222017]: 2026-01-23 09:54:59.612 222021 INFO nova.compute.manager [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Migrating#033[00m
Jan 23 04:55:00 np0005593233 nova_compute[222017]: 2026-01-23 09:55:00.192 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:00.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:00.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:01 np0005593233 nova_compute[222017]: 2026-01-23 09:55:01.473 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:02 np0005593233 nova_compute[222017]: 2026-01-23 09:55:02.412 222021 DEBUG nova.network.neutron [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Successfully created port: 4d7d3d07-d9ea-465d-b091-0ad0246436b2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:55:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:02.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:02.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:03 np0005593233 podman[254913]: 2026-01-23 09:55:03.107343338 +0000 UTC m=+0.099699156 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.071 222021 DEBUG nova.network.neutron [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [{"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.116 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.116 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance network_info: |[{"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.117 222021 DEBUG oslo_concurrency.lockutils [req-8b0f038c-2141-49c8-9a98-6be0034ee5af req-877fb7a1-688c-45a9-a664-f7628b11c3cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.118 222021 DEBUG nova.network.neutron [req-8b0f038c-2141-49c8-9a98-6be0034ee5af req-877fb7a1-688c-45a9-a664-f7628b11c3cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing network info cache for port 2358fe4c-654b-4b88-9e08-2e85688cb00e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.128 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Start _get_guest_xml network_info=[{"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk', 'boot_index': '2'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk', 'boot_index': '3'}, 
'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:
Jan 23 04:55:04 np0005593233 nova_compute[222017]: _ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-98b96f14-bdc2-414c-aaaf-7d1dfaafbfd2', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '98b96f14-bdc2-414c-aaaf-7d1dfaafbfd2', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'attached_at': '', 'detached_at': '', 'volume_id': '98b96f14-bdc2-414c-aaaf-7d1dfaafbfd2', 'serial': '98b96f14-bdc2-414c-aaaf-7d1dfaafbfd2'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'c1e3e9d5-4d6c-4f1c-8c02-ec0d787ca865', 'volume_type': None}, {'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-a5f913c6-cfb1-486f-bcd3-8b5237feb9b1', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'a5f913c6-cfb1-486f-bcd3-8b5237feb9b1', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'attached_at': '', 'detached_at': '', 'volume_id': 'a5f913c6-cfb1-486f-bcd3-8b5237feb9b1', 
'serial': 'a5f913c6-cfb1-486f-bcd3-8b5237feb9b1'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vdb', 'boot_index': 1, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'e5a6dd94-0880-4555-a334-033e3156a797', 'volume_type': None}, {'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-0c57b335-8fff-4dae-ab58-3a6c060df2a2', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '0c57b335-8fff-4dae-ab58-3a6c060df2a2', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'attached_at': '', 'detached_at': '', 'volume_id': '0c57b335-8fff-4dae-ab58-3a6c060df2a2', 'serial': '0c57b335-8fff-4dae-ab58-3a6c060df2a2'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vdc', 'boot_index': 2, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'dc578d3b-e60d-48b6-bc26-fe33ec917aaf', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.133 222021 WARNING nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.142 222021 DEBUG nova.virt.libvirt.host [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.143 222021 DEBUG nova.virt.libvirt.host [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.148 222021 DEBUG nova.virt.libvirt.host [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.148 222021 DEBUG nova.virt.libvirt.host [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.150 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.150 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.151 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.151 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.151 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.152 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.152 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.152 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.153 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.153 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.153 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.153 222021 DEBUG nova.virt.hardware [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.185 222021 DEBUG nova.storage.rbd_utils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] rbd image c4100b68-be14-4cd7-8243-2c9a793caa5f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:55:04 np0005593233 rsyslogd[1009]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:55:04.128 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.191 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:55:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2306050176' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.643 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.768 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.769 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.770 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:32:35,bridge_name='br-int',has_traffic_filtering=True,id=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408,network=Network(6a690804-4ecf-4c63-9b31-acfe5ecf9a3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6216c8-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:04.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.771 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.771 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.772 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:6d:39,bridge_name='br-int',has_traffic_filtering=True,id=08ec741b-b592-417f-9a64-1ee4d2e4e006,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08ec741b-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.773 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.773 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.774 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:a4:39,bridge_name='br-int',has_traffic_filtering=True,id=c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc7e6d8c9-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.774 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.775 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.775 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:55:dc,bridge_name='br-int',has_traffic_filtering=True,id=8f671859-24dc-4140-915c-bbad6f16e0d8,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f671859-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.776 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.776 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.777 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:af:d4,bridge_name='br-int',has_traffic_filtering=True,id=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8adc9155-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.778 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.779 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.779 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=1e83b219-c51b-488a-8fc7-8240efb384c0,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e83b219-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.780 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.780 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.781 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:da:72,bridge_name='br-int',has_traffic_filtering=True,id=2358fe4c-654b-4b88-9e08-2e85688cb00e,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2358fe4c-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.782 222021 DEBUG nova.objects.instance [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lazy-loading 'pci_devices' on Instance uuid c4100b68-be14-4cd7-8243-2c9a793caa5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.801 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <uuid>c4100b68-be14-4cd7-8243-2c9a793caa5f</uuid>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <name>instance-00000051</name>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <nova:name>tempest-device-tagging-server-451535713</nova:name>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:55:04</nova:creationTime>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:user uuid="d512838ce2b44554b0566fdbb3c702b4">tempest-TaggedBootDevicesTest-1431047957-project-member</nova:user>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:project uuid="9bc2d47d48c446c7ae1fc44cd9c32878">tempest-TaggedBootDevicesTest-1431047957</nova:project>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:port uuid="0c6216c8-fb3f-49e9-a5c5-6eb26d39b408">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:port uuid="08ec741b-b592-417f-9a64-1ee4d2e4e006">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.1.1.86" ipVersion="4"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:port uuid="c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.1.1.35" ipVersion="4"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:port uuid="8f671859-24dc-4140-915c-bbad6f16e0d8">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.1.1.82" ipVersion="4"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:port uuid="8adc9155-eb6d-41ea-ae59-7de12ff3fc5a">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.1.1.246" ipVersion="4"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:port uuid="1e83b219-c51b-488a-8fc7-8240efb384c0">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.2.2.100" ipVersion="4"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <nova:port uuid="2358fe4c-654b-4b88-9e08-2e85688cb00e">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.2.2.200" ipVersion="4"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <entry name="serial">c4100b68-be14-4cd7-8243-2c9a793caa5f</entry>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <entry name="uuid">c4100b68-be14-4cd7-8243-2c9a793caa5f</entry>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/c4100b68-be14-4cd7-8243-2c9a793caa5f_disk.config">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-98b96f14-bdc2-414c-aaaf-7d1dfaafbfd2">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <serial>98b96f14-bdc2-414c-aaaf-7d1dfaafbfd2</serial>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-a5f913c6-cfb1-486f-bcd3-8b5237feb9b1">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="vdb" bus="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <serial>a5f913c6-cfb1-486f-bcd3-8b5237feb9b1</serial>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-0c57b335-8fff-4dae-ab58-3a6c060df2a2">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="vdc" bus="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <serial>0c57b335-8fff-4dae-ab58-3a6c060df2a2</serial>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:bf:32:35"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="tap0c6216c8-fb"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:53:6d:39"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="tap08ec741b-b5"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:12:a4:39"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="tapc7e6d8c9-43"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:7c:55:dc"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="tap8f671859-24"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:e3:af:d4"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="tap8adc9155-eb"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:6d:d6:55"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="tap1e83b219-c5"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:6a:da:72"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <target dev="tap2358fe4c-65"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f/console.log" append="off"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:55:04 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:55:04 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:55:04 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:55:04 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.803 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Preparing to wait for external event network-vif-plugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.804 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.805 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.805 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.805 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Preparing to wait for external event network-vif-plugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.806 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.806 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.806 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.806 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Preparing to wait for external event network-vif-plugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.806 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.806 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.807 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.807 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Preparing to wait for external event network-vif-plugged-8f671859-24dc-4140-915c-bbad6f16e0d8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.807 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.807 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.807 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.807 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Preparing to wait for external event network-vif-plugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.808 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.808 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.808 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.808 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Preparing to wait for external event network-vif-plugged-1e83b219-c51b-488a-8fc7-8240efb384c0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.808 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.809 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.809 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.809 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Preparing to wait for external event network-vif-plugged-2358fe4c-654b-4b88-9e08-2e85688cb00e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.809 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.809 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.809 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.810 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.811 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.811 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:32:35,bridge_name='br-int',has_traffic_filtering=True,id=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408,network=Network(6a690804-4ecf-4c63-9b31-acfe5ecf9a3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6216c8-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.812 222021 DEBUG os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:32:35,bridge_name='br-int',has_traffic_filtering=True,id=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408,network=Network(6a690804-4ecf-4c63-9b31-acfe5ecf9a3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6216c8-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.813 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.813 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.814 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.818 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.819 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c6216c8-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.819 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0c6216c8-fb, col_values=(('external_ids', {'iface-id': '0c6216c8-fb3f-49e9-a5c5-6eb26d39b408', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:32:35', 'vm-uuid': 'c4100b68-be14-4cd7-8243-2c9a793caa5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.863 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 NetworkManager[48871]: <info>  [1769162104.8661] manager: (tap0c6216c8-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.867 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:04.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.873 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.874 222021 INFO os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:32:35,bridge_name='br-int',has_traffic_filtering=True,id=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408,network=Network(6a690804-4ecf-4c63-9b31-acfe5ecf9a3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6216c8-fb')#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.875 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.875 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.876 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:6d:39,bridge_name='br-int',has_traffic_filtering=True,id=08ec741b-b592-417f-9a64-1ee4d2e4e006,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08ec741b-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.876 222021 DEBUG os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:6d:39,bridge_name='br-int',has_traffic_filtering=True,id=08ec741b-b592-417f-9a64-1ee4d2e4e006,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08ec741b-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.877 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.877 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.877 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.879 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.879 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08ec741b-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.880 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap08ec741b-b5, col_values=(('external_ids', {'iface-id': '08ec741b-b592-417f-9a64-1ee4d2e4e006', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:6d:39', 'vm-uuid': 'c4100b68-be14-4cd7-8243-2c9a793caa5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 NetworkManager[48871]: <info>  [1769162104.8820] manager: (tap08ec741b-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.881 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.889 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.890 222021 INFO os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:6d:39,bridge_name='br-int',has_traffic_filtering=True,id=08ec741b-b592-417f-9a64-1ee4d2e4e006,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08ec741b-b5')#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.891 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.891 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.892 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:a4:39,bridge_name='br-int',has_traffic_filtering=True,id=c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc7e6d8c9-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.892 222021 DEBUG os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:a4:39,bridge_name='br-int',has_traffic_filtering=True,id=c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc7e6d8c9-43') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.893 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.893 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.893 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.895 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.896 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc7e6d8c9-43, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.896 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc7e6d8c9-43, col_values=(('external_ids', {'iface-id': 'c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:12:a4:39', 'vm-uuid': 'c4100b68-be14-4cd7-8243-2c9a793caa5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.897 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 NetworkManager[48871]: <info>  [1769162104.8991] manager: (tapc7e6d8c9-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.899 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.908 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.909 222021 INFO os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:a4:39,bridge_name='br-int',has_traffic_filtering=True,id=c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc7e6d8c9-43')#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.910 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.910 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.911 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:55:dc,bridge_name='br-int',has_traffic_filtering=True,id=8f671859-24dc-4140-915c-bbad6f16e0d8,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f671859-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.911 222021 DEBUG os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:55:dc,bridge_name='br-int',has_traffic_filtering=True,id=8f671859-24dc-4140-915c-bbad6f16e0d8,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f671859-24') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.911 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.911 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.912 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.914 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.914 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f671859-24, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.915 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8f671859-24, col_values=(('external_ids', {'iface-id': '8f671859-24dc-4140-915c-bbad6f16e0d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:55:dc', 'vm-uuid': 'c4100b68-be14-4cd7-8243-2c9a793caa5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.916 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 NetworkManager[48871]: <info>  [1769162104.9172] manager: (tap8f671859-24): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.918 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:04 np0005593233 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.928 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 systemd-logind[804]: New session 54 of user nova.
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.929 222021 INFO os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:55:dc,bridge_name='br-int',has_traffic_filtering=True,id=8f671859-24dc-4140-915c-bbad6f16e0d8,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f671859-24')#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.930 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.930 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.931 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:af:d4,bridge_name='br-int',has_traffic_filtering=True,id=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8adc9155-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.931 222021 DEBUG os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:af:d4,bridge_name='br-int',has_traffic_filtering=True,id=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8adc9155-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.932 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.932 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.932 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.934 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.934 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8adc9155-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.934 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8adc9155-eb, col_values=(('external_ids', {'iface-id': '8adc9155-eb6d-41ea-ae59-7de12ff3fc5a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e3:af:d4', 'vm-uuid': 'c4100b68-be14-4cd7-8243-2c9a793caa5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.936 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:04 np0005593233 NetworkManager[48871]: <info>  [1769162104.9370] manager: (tap8adc9155-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.938 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:04 np0005593233 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.951 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.952 222021 INFO os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:af:d4,bridge_name='br-int',has_traffic_filtering=True,id=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8adc9155-eb')
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.952 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.953 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.953 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=1e83b219-c51b-488a-8fc7-8240efb384c0,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e83b219-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.953 222021 DEBUG os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=1e83b219-c51b-488a-8fc7-8240efb384c0,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e83b219-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.954 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.954 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.954 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.956 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.956 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e83b219-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.956 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1e83b219-c5, col_values=(('external_ids', {'iface-id': '1e83b219-c51b-488a-8fc7-8240efb384c0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:d6:55', 'vm-uuid': 'c4100b68-be14-4cd7-8243-2c9a793caa5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.957 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:04 np0005593233 NetworkManager[48871]: <info>  [1769162104.9582] manager: (tap1e83b219-c5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Jan 23 04:55:04 np0005593233 systemd[1]: Starting User Manager for UID 42436...
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.973 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.975 222021 INFO os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=1e83b219-c51b-488a-8fc7-8240efb384c0,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e83b219-c5')
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.975 222021 DEBUG nova.virt.libvirt.vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.976 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.976 222021 DEBUG nova.network.os_vif_util [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:da:72,bridge_name='br-int',has_traffic_filtering=True,id=2358fe4c-654b-4b88-9e08-2e85688cb00e,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2358fe4c-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.977 222021 DEBUG os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:da:72,bridge_name='br-int',has_traffic_filtering=True,id=2358fe4c-654b-4b88-9e08-2e85688cb00e,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2358fe4c-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.977 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.977 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.977 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.979 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.979 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2358fe4c-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.979 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2358fe4c-65, col_values=(('external_ids', {'iface-id': '2358fe4c-654b-4b88-9e08-2e85688cb00e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:da:72', 'vm-uuid': 'c4100b68-be14-4cd7-8243-2c9a793caa5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.981 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:04 np0005593233 NetworkManager[48871]: <info>  [1769162104.9819] manager: (tap2358fe4c-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Jan 23 04:55:04 np0005593233 nova_compute[222017]: 2026-01-23 09:55:04.983 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.000 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.001 222021 INFO os_vif [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:da:72,bridge_name='br-int',has_traffic_filtering=True,id=2358fe4c-654b-4b88-9e08-2e85688cb00e,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2358fe4c-65')
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.089 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.091 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.091 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] No VIF found with MAC fa:16:3e:bf:32:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.091 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] No VIF found with MAC fa:16:3e:e3:af:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.092 222021 INFO nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Using config drive
Jan 23 04:55:05 np0005593233 systemd[254999]: Queued start job for default target Main User Target.
Jan 23 04:55:05 np0005593233 systemd[254999]: Created slice User Application Slice.
Jan 23 04:55:05 np0005593233 systemd[254999]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:55:05 np0005593233 systemd[254999]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:55:05 np0005593233 systemd[254999]: Reached target Paths.
Jan 23 04:55:05 np0005593233 systemd[254999]: Reached target Timers.
Jan 23 04:55:05 np0005593233 systemd[254999]: Starting D-Bus User Message Bus Socket...
Jan 23 04:55:05 np0005593233 systemd[254999]: Starting Create User's Volatile Files and Directories...
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.123 222021 DEBUG nova.storage.rbd_utils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] rbd image c4100b68-be14-4cd7-8243-2c9a793caa5f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:55:05 np0005593233 systemd[254999]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:55:05 np0005593233 systemd[254999]: Reached target Sockets.
Jan 23 04:55:05 np0005593233 systemd[254999]: Finished Create User's Volatile Files and Directories.
Jan 23 04:55:05 np0005593233 systemd[254999]: Reached target Basic System.
Jan 23 04:55:05 np0005593233 systemd[254999]: Reached target Main User Target.
Jan 23 04:55:05 np0005593233 systemd[254999]: Startup finished in 140ms.
Jan 23 04:55:05 np0005593233 systemd[1]: Started User Manager for UID 42436.
Jan 23 04:55:05 np0005593233 systemd[1]: Started Session 54 of User nova.
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.193 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:05 np0005593233 systemd[1]: session-54.scope: Deactivated successfully.
Jan 23 04:55:05 np0005593233 systemd-logind[804]: Session 54 logged out. Waiting for processes to exit.
Jan 23 04:55:05 np0005593233 systemd-logind[804]: Removed session 54.
Jan 23 04:55:05 np0005593233 systemd-logind[804]: New session 56 of user nova.
Jan 23 04:55:05 np0005593233 systemd[1]: Started Session 56 of User nova.
Jan 23 04:55:05 np0005593233 systemd[1]: session-56.scope: Deactivated successfully.
Jan 23 04:55:05 np0005593233 systemd-logind[804]: Session 56 logged out. Waiting for processes to exit.
Jan 23 04:55:05 np0005593233 systemd-logind[804]: Removed session 56.
Jan 23 04:55:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.930 222021 DEBUG nova.network.neutron [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Successfully updated port: 4d7d3d07-d9ea-465d-b091-0ad0246436b2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.976 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "refresh_cache-ec678068-aa1c-4926-abee-e4852fd8f1fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.976 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquired lock "refresh_cache-ec678068-aa1c-4926-abee-e4852fd8f1fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:55:05 np0005593233 nova_compute[222017]: 2026-01-23 09:55:05.977 222021 DEBUG nova.network.neutron [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.138 222021 INFO nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Creating config drive at /var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f/disk.config
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.146 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmn54vnw9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.297 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmn54vnw9" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.336 222021 DEBUG nova.storage.rbd_utils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] rbd image c4100b68-be14-4cd7-8243-2c9a793caa5f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.341 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f/disk.config c4100b68-be14-4cd7-8243-2c9a793caa5f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.386 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162091.3842428, fffef24b-bb5b-41c6-a049-c1c4ba8f02fb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.387 222021 INFO nova.compute.manager [-] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] VM Stopped (Lifecycle Event)
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.470 222021 DEBUG nova.compute.manager [None req-6fb8a8de-a45f-46a9-a633-c4336a0c8cca - - - - - -] [instance: fffef24b-bb5b-41c6-a049-c1c4ba8f02fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.506 222021 DEBUG nova.network.neutron [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.531 222021 DEBUG oslo_concurrency.processutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f/disk.config c4100b68-be14-4cd7-8243-2c9a793caa5f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.190s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.533 222021 INFO nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Deleting local config drive /var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f/disk.config because it was imported into RBD.
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.6179] manager: (tap0c6216c8-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Jan 23 04:55:06 np0005593233 kernel: tap0c6216c8-fb: entered promiscuous mode
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00288|binding|INFO|Claiming lport 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 for this chassis.
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.636 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00289|binding|INFO|0c6216c8-fb3f-49e9-a5c5-6eb26d39b408: Claiming fa:16:3e:bf:32:35 10.100.0.10
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.639 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.6410] manager: (tap08ec741b-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/150)
Jan 23 04:55:06 np0005593233 kernel: tap08ec741b-b5: entered promiscuous mode
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.6576] manager: (tapc7e6d8c9-43): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Jan 23 04:55:06 np0005593233 systemd-udevd[255121]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:55:06 np0005593233 systemd-udevd[255123]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:55:06 np0005593233 systemd-udevd[255124]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.6742] manager: (tap8f671859-24): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Jan 23 04:55:06 np0005593233 systemd-udevd[255129]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.6849] device (tap08ec741b-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.6854] device (tap0c6216c8-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.6861] device (tap08ec741b-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.6870] device (tap0c6216c8-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.6935] manager: (tap8adc9155-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/153)
Jan 23 04:55:06 np0005593233 systemd-udevd[255133]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7100] manager: (tap1e83b219-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.715 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00290|if_status|INFO|Not updating pb chassis for 08ec741b-b592-417f-9a64-1ee4d2e4e006 now as sb is readonly
Jan 23 04:55:06 np0005593233 kernel: tap8adc9155-eb: entered promiscuous mode
Jan 23 04:55:06 np0005593233 kernel: tapc7e6d8c9-43: entered promiscuous mode
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7194] device (tap8adc9155-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:55:06 np0005593233 kernel: tap8f671859-24: entered promiscuous mode
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7205] device (tapc7e6d8c9-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7218] device (tap8f671859-24): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7226] device (tap8adc9155-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7230] device (tapc7e6d8c9-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7236] device (tap8f671859-24): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:55:06 np0005593233 kernel: tap1e83b219-c5: entered promiscuous mode
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7265] device (tap1e83b219-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7284] manager: (tap2358fe4c-65): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7294] device (tap1e83b219-c5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.728 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:06 np0005593233 kernel: tap2358fe4c-65: entered promiscuous mode
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.741 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7453] device (tap2358fe4c-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.7459] device (tap2358fe4c-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:55:06 np0005593233 systemd-machined[190954]: New machine qemu-41-instance-00000051.
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00291|binding|INFO|Claiming lport 8f671859-24dc-4140-915c-bbad6f16e0d8 for this chassis.
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00292|binding|INFO|8f671859-24dc-4140-915c-bbad6f16e0d8: Claiming fa:16:3e:7c:55:dc 10.1.1.82
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00293|binding|INFO|Claiming lport 2358fe4c-654b-4b88-9e08-2e85688cb00e for this chassis.
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00294|binding|INFO|2358fe4c-654b-4b88-9e08-2e85688cb00e: Claiming fa:16:3e:6a:da:72 10.2.2.200
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00295|binding|INFO|Claiming lport 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a for this chassis.
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00296|binding|INFO|8adc9155-eb6d-41ea-ae59-7de12ff3fc5a: Claiming fa:16:3e:e3:af:d4 10.1.1.246
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00297|binding|INFO|Claiming lport c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b for this chassis.
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00298|binding|INFO|c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b: Claiming fa:16:3e:12:a4:39 10.1.1.35
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00299|binding|INFO|Claiming lport 1e83b219-c51b-488a-8fc7-8240efb384c0 for this chassis.
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00300|binding|INFO|1e83b219-c51b-488a-8fc7-8240efb384c0: Claiming fa:16:3e:6d:d6:55 10.2.2.100
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00301|binding|INFO|Claiming lport 08ec741b-b592-417f-9a64-1ee4d2e4e006 for this chassis.
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00302|binding|INFO|08ec741b-b592-417f-9a64-1ee4d2e4e006: Claiming fa:16:3e:53:6d:39 10.1.1.86
Jan 23 04:55:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.774 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:32:35 10.100.0.10'], port_security=['fa:16:3e:bf:32:35 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56f8805d-8a62-4352-aa44-760b906565e7, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:06.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.775 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 in datapath 6a690804-4ecf-4c63-9b31-acfe5ecf9a3a bound to our chassis#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.778 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a690804-4ecf-4c63-9b31-acfe5ecf9a3a#033[00m
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00303|binding|INFO|Setting lport 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 ovn-installed in OVS
Jan 23 04:55:06 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:06Z|00304|binding|INFO|Setting lport 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 up in Southbound
Jan 23 04:55:06 np0005593233 nova_compute[222017]: 2026-01-23 09:55:06.785 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:06 np0005593233 systemd[1]: Started Virtual Machine qemu-41-instance-00000051.
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.795 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6b9a52-ae7a-4531-8603-35a47a4f5c66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.796 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a690804-41 in ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.799 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a690804-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.800 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f9302566-20f7-4935-b11f-cf7f99e24dc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.801 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[30c2d5ce-725f-4181-817d-845b43f5fb58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.816 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[ff927295-41fb-4ade-b159-55ad0fd1ca93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.849 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[54d475d6-03a7-4657-86fe-a49a4e805474]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:06.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.890 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:af:d4 10.1.1.246'], port_security=['fa:16:3e:e3:af:d4 10.1.1.246'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.246/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a36eed-1bfe-4d29-9766-88a06a6a5247, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.892 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:55:dc 10.1.1.82'], port_security=['fa:16:3e:7c:55:dc 10.1.1.82'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.82/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a36eed-1bfe-4d29-9766-88a06a6a5247, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8f671859-24dc-4140-915c-bbad6f16e0d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.893 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:d6:55 10.2.2.100'], port_security=['fa:16:3e:6d:d6:55 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d2e206f-a131-4bd2-8f90-67b8b0bc9e3d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1e83b219-c51b-488a-8fc7-8240efb384c0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.894 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:a4:39 10.1.1.35'], port_security=['fa:16:3e:12:a4:39 10.1.1.35'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-1279021389', 'neutron:cidrs': '10.1.1.35/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-1279021389', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2aa5e406-7157-4dbd-9a15-73e3a671533e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a36eed-1bfe-4d29-9766-88a06a6a5247, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.896 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:da:72 10.2.2.200'], port_security=['fa:16:3e:6a:da:72 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d2e206f-a131-4bd2-8f90-67b8b0bc9e3d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=2358fe4c-654b-4b88-9e08-2e85688cb00e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.897 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:6d:39 10.1.1.86'], port_security=['fa:16:3e:53:6d:39 10.1.1.86'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-289949651', 'neutron:cidrs': '10.1.1.86/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-289949651', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2aa5e406-7157-4dbd-9a15-73e3a671533e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a36eed-1bfe-4d29-9766-88a06a6a5247, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=08ec741b-b592-417f-9a64-1ee4d2e4e006) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.893 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[da285fc2-700c-439e-9182-3c16a21037f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.910 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab66aaa-ef68-4470-bbd4-e8b678377670]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.9114] manager: (tap6a690804-40): new Veth device (/org/freedesktop/NetworkManager/Devices/156)
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.948 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[37a4e347-e753-4a71-a063-9ac8d788caa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.951 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b90266ab-cb92-42f8-babd-a7156dc32608]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:06 np0005593233 NetworkManager[48871]: <info>  [1769162106.9810] device (tap6a690804-40): carrier: link connected
Jan 23 04:55:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:06.988 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b29a0845-9787-4eff-95d2-d7ce982b57c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00305|binding|INFO|Setting lport 08ec741b-b592-417f-9a64-1ee4d2e4e006 ovn-installed in OVS
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00306|binding|INFO|Setting lport 08ec741b-b592-417f-9a64-1ee4d2e4e006 up in Southbound
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00307|binding|INFO|Setting lport c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b ovn-installed in OVS
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00308|binding|INFO|Setting lport c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b up in Southbound
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00309|binding|INFO|Setting lport 1e83b219-c51b-488a-8fc7-8240efb384c0 ovn-installed in OVS
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00310|binding|INFO|Setting lport 1e83b219-c51b-488a-8fc7-8240efb384c0 up in Southbound
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00311|binding|INFO|Setting lport 2358fe4c-654b-4b88-9e08-2e85688cb00e ovn-installed in OVS
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00312|binding|INFO|Setting lport 2358fe4c-654b-4b88-9e08-2e85688cb00e up in Southbound
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00313|binding|INFO|Setting lport 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a ovn-installed in OVS
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00314|binding|INFO|Setting lport 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a up in Southbound
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00315|binding|INFO|Setting lport 8f671859-24dc-4140-915c-bbad6f16e0d8 ovn-installed in OVS
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00316|binding|INFO|Setting lport 8f671859-24dc-4140-915c-bbad6f16e0d8 up in Southbound
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.012 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3a2fb79a-7109-4441-9f31-7527b33107b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a690804-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:ab:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601961, 'reachable_time': 27161, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255186, 'error': None, 'target': 'ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.037 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[857fa8d8-6666-4319-b96d-04b5a2b09485]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9e:ab2e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 601961, 'tstamp': 601961}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255195, 'error': None, 'target': 'ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.059 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c121063e-3187-46cd-a392-a0508f366ba8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a690804-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:ab:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601961, 'reachable_time': 27161, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255199, 'error': None, 'target': 'ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.089 222021 DEBUG nova.compute.manager [req-4ebbf9a9-2448-4158-bc7d-a759b3ee4b0b req-9125908e-829d-43f2-8a79-64fbac87ac98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-changed-4d7d3d07-d9ea-465d-b091-0ad0246436b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.090 222021 DEBUG nova.compute.manager [req-4ebbf9a9-2448-4158-bc7d-a759b3ee4b0b req-9125908e-829d-43f2-8a79-64fbac87ac98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Refreshing instance network info cache due to event network-changed-4d7d3d07-d9ea-465d-b091-0ad0246436b2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.090 222021 DEBUG oslo_concurrency.lockutils [req-4ebbf9a9-2448-4158-bc7d-a759b3ee4b0b req-9125908e-829d-43f2-8a79-64fbac87ac98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ec678068-aa1c-4926-abee-e4852fd8f1fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.102 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d467c898-1cfa-40e9-ba32-ea1eea8c964f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.182 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c6eeaa4f-538f-4b30-8b75-25cdbf348e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.184 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a690804-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.184 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.184 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a690804-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.205 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593233 kernel: tap6a690804-40: entered promiscuous mode
Jan 23 04:55:07 np0005593233 NetworkManager[48871]: <info>  [1769162107.2087] manager: (tap6a690804-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.209 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.209 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a690804-40, col_values=(('external_ids', {'iface-id': '05764538-5c1e-43c4-889d-c55248036dc2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:07 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:07Z|00317|binding|INFO|Releasing lport 05764538-5c1e-43c4-889d-c55248036dc2 from this chassis (sb_readonly=0)
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.227 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.228 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a690804-4ecf-4c63-9b31-acfe5ecf9a3a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a690804-4ecf-4c63-9b31-acfe5ecf9a3a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.229 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[99ea871d-de44-4b20-9fb3-ffd94453e882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.230 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/6a690804-4ecf-4c63-9b31-acfe5ecf9a3a.pid.haproxy
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 6a690804-4ecf-4c63-9b31-acfe5ecf9a3a
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.231 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a', 'env', 'PROCESS_TAG=haproxy-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a690804-4ecf-4c63-9b31-acfe5ecf9a3a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.410 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162107.4096212, c4100b68-be14-4cd7-8243-2c9a793caa5f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.411 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] VM Started (Lifecycle Event)#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.462 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.468 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162107.409851, c4100b68-be14-4cd7-8243-2c9a793caa5f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.469 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.492 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.497 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.546 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:55:07 np0005593233 podman[255297]: 2026-01-23 09:55:07.654450155 +0000 UTC m=+0.054965131 container create e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:55:07 np0005593233 systemd[1]: Started libpod-conmon-e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a.scope.
Jan 23 04:55:07 np0005593233 podman[255297]: 2026-01-23 09:55:07.626457861 +0000 UTC m=+0.026972857 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:55:07 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:55:07 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ba305a3110896dfe732f7f2da08e1d39b78bd5ead29d76a4f851ea4e6ae471d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:55:07 np0005593233 podman[255297]: 2026-01-23 09:55:07.755298412 +0000 UTC m=+0.155813408 container init e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:55:07 np0005593233 podman[255297]: 2026-01-23 09:55:07.761103205 +0000 UTC m=+0.161618181 container start e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 04:55:07 np0005593233 neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a[255312]: [NOTICE]   (255316) : New worker (255318) forked
Jan 23 04:55:07 np0005593233 neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a[255312]: [NOTICE]   (255316) : Loading success.
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.797 222021 DEBUG nova.compute.manager [req-97b05866-3e5c-41a0-807d-232da252a635 req-15b213e7-2d46-4430-acda-bc74e49fdcb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.798 222021 DEBUG oslo_concurrency.lockutils [req-97b05866-3e5c-41a0-807d-232da252a635 req-15b213e7-2d46-4430-acda-bc74e49fdcb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.798 222021 DEBUG oslo_concurrency.lockutils [req-97b05866-3e5c-41a0-807d-232da252a635 req-15b213e7-2d46-4430-acda-bc74e49fdcb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.799 222021 DEBUG oslo_concurrency.lockutils [req-97b05866-3e5c-41a0-807d-232da252a635 req-15b213e7-2d46-4430-acda-bc74e49fdcb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.799 222021 DEBUG nova.compute.manager [req-97b05866-3e5c-41a0-807d-232da252a635 req-15b213e7-2d46-4430-acda-bc74e49fdcb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Processing event network-vif-plugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.851 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a in datapath 9a939889-e790-45a5-a577-4dfb1df5c2f4 unbound from our chassis#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.854 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9a939889-e790-45a5-a577-4dfb1df5c2f4#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.869 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce0fab7-584e-4f28-83f7-c84e7eb76f9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.871 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9a939889-e1 in ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.874 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9a939889-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.874 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[989fe485-8f7f-4d2b-ba9e-34ded08f6b9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.876 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[50a6fb27-210e-4c23-bb1c-64b69678b810]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.895 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[451e3761-b93a-4446-ae5c-7652bf78331e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.925 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[412b7f95-dde4-4be9-b6e3-5d102a65565e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.966 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7999c1f9-4880-45c1-84ef-61734733a978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 NetworkManager[48871]: <info>  [1769162107.9747] manager: (tap9a939889-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/158)
Jan 23 04:55:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:07.974 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c8fd8753-6ca0-4c3e-8102-6b4345747703]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593233 systemd-udevd[255169]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.989 222021 DEBUG nova.compute.manager [req-96228f6c-4361-49d8-8066-9a3792ca965b req-c744ccdd-1db8-495c-8d83-7713cb055d1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.990 222021 DEBUG oslo_concurrency.lockutils [req-96228f6c-4361-49d8-8066-9a3792ca965b req-c744ccdd-1db8-495c-8d83-7713cb055d1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.990 222021 DEBUG oslo_concurrency.lockutils [req-96228f6c-4361-49d8-8066-9a3792ca965b req-c744ccdd-1db8-495c-8d83-7713cb055d1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.990 222021 DEBUG oslo_concurrency.lockutils [req-96228f6c-4361-49d8-8066-9a3792ca965b req-c744ccdd-1db8-495c-8d83-7713cb055d1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:07 np0005593233 nova_compute[222017]: 2026-01-23 09:55:07.991 222021 DEBUG nova.compute.manager [req-96228f6c-4361-49d8-8066-9a3792ca965b req-c744ccdd-1db8-495c-8d83-7713cb055d1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Processing event network-vif-plugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.021 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9d2d14f6-5df7-40ce-b41e-1cd870a23ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.024 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[97d7f8ae-02bb-4ce2-92a0-303a3a645f54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 NetworkManager[48871]: <info>  [1769162108.0512] device (tap9a939889-e0): carrier: link connected
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.055 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[767c133c-6dd6-4c26-8d95-57d0511029ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.075 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ab807ede-1eb6-404f-b7ec-1e139d9e309f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9a939889-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:61:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602068, 'reachable_time': 39912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255337, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.099 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[59aba877-1aa6-47d7-83b7-92fd7aa393f7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:6162'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602068, 'tstamp': 602068}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255338, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.119 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[68bda53a-7de4-4725-9798-cf19f17a7b91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9a939889-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:61:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602068, 'reachable_time': 39912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255339, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.156 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2b4399-a109-487a-9eee-5cf63e47c343]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.239 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a6165467-c010-429f-a94a-96af86fdcd26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.242 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a939889-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.243 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.244 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a939889-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.286 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:08 np0005593233 NetworkManager[48871]: <info>  [1769162108.2870] manager: (tap9a939889-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Jan 23 04:55:08 np0005593233 kernel: tap9a939889-e0: entered promiscuous mode
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.291 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.293 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9a939889-e0, col_values=(('external_ids', {'iface-id': '586a15e6-31c5-4047-b8d7-e35dbfe6ae30'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.295 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:08 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:08Z|00318|binding|INFO|Releasing lport 586a15e6-31c5-4047-b8d7-e35dbfe6ae30 from this chassis (sb_readonly=0)
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.309 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.311 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.312 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9a939889-e790-45a5-a577-4dfb1df5c2f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9a939889-e790-45a5-a577-4dfb1df5c2f4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.314 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[474fe337-21f5-4e98-89ef-d2663326ce8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.315 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-9a939889-e790-45a5-a577-4dfb1df5c2f4
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/9a939889-e790-45a5-a577-4dfb1df5c2f4.pid.haproxy
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 9a939889-e790-45a5-a577-4dfb1df5c2f4
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.316 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'env', 'PROCESS_TAG=haproxy-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9a939889-e790-45a5-a577-4dfb1df5c2f4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.646 222021 DEBUG nova.compute.manager [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.648 222021 DEBUG oslo_concurrency.lockutils [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.649 222021 DEBUG oslo_concurrency.lockutils [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.649 222021 DEBUG oslo_concurrency.lockutils [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.650 222021 DEBUG nova.compute.manager [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Processing event network-vif-plugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.650 222021 DEBUG nova.compute.manager [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.651 222021 DEBUG oslo_concurrency.lockutils [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.651 222021 DEBUG oslo_concurrency.lockutils [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.651 222021 DEBUG oslo_concurrency.lockutils [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.652 222021 DEBUG nova.compute.manager [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No event matching network-vif-plugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b in dict_keys([('network-vif-plugged', '08ec741b-b592-417f-9a64-1ee4d2e4e006'), ('network-vif-plugged', '8f671859-24dc-4140-915c-bbad6f16e0d8'), ('network-vif-plugged', '1e83b219-c51b-488a-8fc7-8240efb384c0'), ('network-vif-plugged', '2358fe4c-654b-4b88-9e08-2e85688cb00e')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.652 222021 WARNING nova.compute.manager [req-46c8cc6c-5af2-47ba-abb0-7f62fed1a565 req-9b68d39d-2f45-4b0a-876f-968bfe000cef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:55:08 np0005593233 podman[255371]: 2026-01-23 09:55:08.74910945 +0000 UTC m=+0.059453228 container create 75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 04:55:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:08.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:08 np0005593233 systemd[1]: Started libpod-conmon-75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d.scope.
Jan 23 04:55:08 np0005593233 podman[255371]: 2026-01-23 09:55:08.718448911 +0000 UTC m=+0.028792709 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:55:08 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:55:08 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df700b3b6c63d59e5971822633cf8d43cb6a4f5146c486deb7dd94e7beebf82f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:55:08 np0005593233 podman[255371]: 2026-01-23 09:55:08.849874295 +0000 UTC m=+0.160218063 container init 75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:55:08 np0005593233 podman[255371]: 2026-01-23 09:55:08.857354634 +0000 UTC m=+0.167698402 container start 75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:55:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:08.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:08 np0005593233 neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4[255387]: [NOTICE]   (255391) : New worker (255393) forked
Jan 23 04:55:08 np0005593233 neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4[255387]: [NOTICE]   (255391) : Loading success.
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.929 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8f671859-24dc-4140-915c-bbad6f16e0d8 in datapath 9a939889-e790-45a5-a577-4dfb1df5c2f4 unbound from our chassis#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.931 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9a939889-e790-45a5-a577-4dfb1df5c2f4#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.954 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0832fe71-2d37-4f31-bd85-88462f976d40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593233 nova_compute[222017]: 2026-01-23 09:55:08.992 222021 DEBUG nova.network.neutron [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Updating instance_info_cache with network_info: [{"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:08.997 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[86c6a82b-67e1-407c-b5c9-aa64afcdb662]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.002 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf57c89-21f2-443e-ae74-63ece09aa42a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.041 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3daf64be-300e-46dd-a06d-73a946b17cba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.065 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2cfc53fc-b7db-4a57-9c76-a6097b7d9975]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9a939889-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:61:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 266, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 266, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602068, 'reachable_time': 39912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255407, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.093 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Releasing lock "refresh_cache-ec678068-aa1c-4926-abee-e4852fd8f1fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.094 222021 DEBUG nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Instance network_info: |[{"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.093 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d1dff9d7-1b2a-4c72-9eff-0777427acb19]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tap9a939889-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602082, 'tstamp': 602082}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255408, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9a939889-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602086, 'tstamp': 602086}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255408, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.095 222021 DEBUG oslo_concurrency.lockutils [req-4ebbf9a9-2448-4158-bc7d-a759b3ee4b0b req-9125908e-829d-43f2-8a79-64fbac87ac98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ec678068-aa1c-4926-abee-e4852fd8f1fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.095 222021 DEBUG nova.network.neutron [req-4ebbf9a9-2448-4158-bc7d-a759b3ee4b0b req-9125908e-829d-43f2-8a79-64fbac87ac98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Refreshing network info cache for port 4d7d3d07-d9ea-465d-b091-0ad0246436b2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.096 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a939889-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.099 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Start _get_guest_xml network_info=[{"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.100 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.101 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a939889-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.101 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.101 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9a939889-e0, col_values=(('external_ids', {'iface-id': '586a15e6-31c5-4047-b8d7-e35dbfe6ae30'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.102 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.103 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1e83b219-c51b-488a-8fc7-8240efb384c0 in datapath 334ca2e2-bd22-482d-8aad-3c18e88d90a5 unbound from our chassis#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.105 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 334ca2e2-bd22-482d-8aad-3c18e88d90a5#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.107 222021 WARNING nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.116 222021 DEBUG nova.virt.libvirt.host [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.117 222021 DEBUG nova.virt.libvirt.host [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.123 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7916e352-3018-492e-b74c-1d1aabbe49d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.124 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap334ca2e2-b1 in ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.126 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap334ca2e2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.126 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[43971518-a452-4e4a-8ebc-d62a3980ec81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.127 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a21e99-c505-45d8-88e9-c505c2e8bb58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.137 222021 DEBUG nova.virt.libvirt.host [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.139 222021 DEBUG nova.virt.libvirt.host [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.140 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.141 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.141 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.142 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.142 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.141 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[05a61063-5196-4303-aee6-b1143d449094]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.142 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.143 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.143 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.143 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.143 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.144 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.144 222021 DEBUG nova.virt.hardware [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.147 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.171 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1adb362d-b68a-45bc-8687-2cc4082742a8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.217 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a52e24fd-cee4-40ee-914e-61621c91a817]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.225 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[92e5f33c-5e93-41a6-9257-d03eb67ab146]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 NetworkManager[48871]: <info>  [1769162109.2263] manager: (tap334ca2e2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/160)
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.269 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8a99f454-c0f2-4895-813b-65ef5e548690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.273 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[98b0e2a5-0bda-4cbd-841f-cf263eeb8e07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 NetworkManager[48871]: <info>  [1769162109.3135] device (tap334ca2e2-b0): carrier: link connected
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.322 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e9851274-b3a0-4acc-ba52-3f7b0ceb9250]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.351 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fa39c16a-d5ea-4790-8038-09da598d3081]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap334ca2e2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:43:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602194, 'reachable_time': 15411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255439, 'error': None, 'target': 'ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.367 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[38ac241d-6b75-4c78-8c58-76656bfcc63a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4b:431b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602194, 'tstamp': 602194}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255440, 'error': None, 'target': 'ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.395 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9d130445-7ada-4054-a6c1-7683277fcf5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap334ca2e2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:43:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602194, 'reachable_time': 15411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255441, 'error': None, 'target': 'ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.405 222021 DEBUG nova.network.neutron [req-8b0f038c-2141-49c8-9a98-6be0034ee5af req-877fb7a1-688c-45a9-a664-f7628b11c3cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updated VIF entry in instance network info cache for port 2358fe4c-654b-4b88-9e08-2e85688cb00e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.407 222021 DEBUG nova.network.neutron [req-8b0f038c-2141-49c8-9a98-6be0034ee5af req-877fb7a1-688c-45a9-a664-f7628b11c3cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [{"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.439 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce33ebbd-e42e-44b2-8ec1-fad1530d8012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.530 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c22835b2-ac6f-43da-868d-d30c4d278308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.532 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap334ca2e2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.533 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.533 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap334ca2e2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:09 np0005593233 NetworkManager[48871]: <info>  [1769162109.5360] manager: (tap334ca2e2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Jan 23 04:55:09 np0005593233 kernel: tap334ca2e2-b0: entered promiscuous mode
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.537 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.540 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap334ca2e2-b0, col_values=(('external_ids', {'iface-id': '25d922f4-f2af-4379-86c5-92c7874dd4ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:09 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:09Z|00319|binding|INFO|Releasing lport 25d922f4-f2af-4379-86c5-92c7874dd4ee from this chassis (sb_readonly=0)
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.557 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.558 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/334ca2e2-bd22-482d-8aad-3c18e88d90a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/334ca2e2-bd22-482d-8aad-3c18e88d90a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.559 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[168526de-afa3-4a5f-b3ce-55ea22a9c845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.560 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-334ca2e2-bd22-482d-8aad-3c18e88d90a5
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/334ca2e2-bd22-482d-8aad-3c18e88d90a5.pid.haproxy
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 334ca2e2-bd22-482d-8aad-3c18e88d90a5
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:55:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:09.561 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'env', 'PROCESS_TAG=haproxy-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/334ca2e2-bd22-482d-8aad-3c18e88d90a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.562 222021 DEBUG oslo_concurrency.lockutils [req-8b0f038c-2141-49c8-9a98-6be0034ee5af req-877fb7a1-688c-45a9-a664-f7628b11c3cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:55:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2836683925' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.661 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.690 222021 DEBUG nova.storage.rbd_utils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image ec678068-aa1c-4926-abee-e4852fd8f1fd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.695 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:09 np0005593233 nova_compute[222017]: 2026-01-23 09:55:09.981 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:09 np0005593233 podman[255511]: 2026-01-23 09:55:09.989009916 +0000 UTC m=+0.066159155 container create 912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.020 222021 DEBUG nova.compute.manager [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.021 222021 DEBUG oslo_concurrency.lockutils [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.022 222021 DEBUG oslo_concurrency.lockutils [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.022 222021 DEBUG oslo_concurrency.lockutils [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.023 222021 DEBUG nova.compute.manager [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No event matching network-vif-plugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 in dict_keys([('network-vif-plugged', '08ec741b-b592-417f-9a64-1ee4d2e4e006'), ('network-vif-plugged', '8f671859-24dc-4140-915c-bbad6f16e0d8'), ('network-vif-plugged', '1e83b219-c51b-488a-8fc7-8240efb384c0'), ('network-vif-plugged', '2358fe4c-654b-4b88-9e08-2e85688cb00e')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.023 222021 WARNING nova.compute.manager [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.024 222021 DEBUG nova.compute.manager [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-2358fe4c-654b-4b88-9e08-2e85688cb00e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.024 222021 DEBUG oslo_concurrency.lockutils [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.024 222021 DEBUG oslo_concurrency.lockutils [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.025 222021 DEBUG oslo_concurrency.lockutils [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.025 222021 DEBUG nova.compute.manager [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Processing event network-vif-plugged-2358fe4c-654b-4b88-9e08-2e85688cb00e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.025 222021 DEBUG nova.compute.manager [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-2358fe4c-654b-4b88-9e08-2e85688cb00e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.026 222021 DEBUG oslo_concurrency.lockutils [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.026 222021 DEBUG oslo_concurrency.lockutils [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.027 222021 DEBUG oslo_concurrency.lockutils [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.027 222021 DEBUG nova.compute.manager [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No event matching network-vif-plugged-2358fe4c-654b-4b88-9e08-2e85688cb00e in dict_keys([('network-vif-plugged', '08ec741b-b592-417f-9a64-1ee4d2e4e006'), ('network-vif-plugged', '8f671859-24dc-4140-915c-bbad6f16e0d8'), ('network-vif-plugged', '1e83b219-c51b-488a-8fc7-8240efb384c0')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.027 222021 WARNING nova.compute.manager [req-93688019-0082-43e3-9a6b-ecb8c2317073 req-9986a8e1-fd51-4e7f-80ce-8d36ad16a5d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-2358fe4c-654b-4b88-9e08-2e85688cb00e for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:55:10 np0005593233 systemd[1]: Started libpod-conmon-912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65.scope.
Jan 23 04:55:10 np0005593233 podman[255511]: 2026-01-23 09:55:09.957343139 +0000 UTC m=+0.034492408 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:55:10 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:55:10 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8857f9bb3afc8e0dbbb993a2827bc52f90d3fe9a1e6d1369814c90aa5a2c3b06/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:55:10 np0005593233 podman[255511]: 2026-01-23 09:55:10.076820368 +0000 UTC m=+0.153969627 container init 912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 04:55:10 np0005593233 podman[255511]: 2026-01-23 09:55:10.08333022 +0000 UTC m=+0.160479459 container start 912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:55:10 np0005593233 neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5[255526]: [NOTICE]   (255530) : New worker (255532) forked
Jan 23 04:55:10 np0005593233 neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5[255526]: [NOTICE]   (255530) : Loading success.
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.159 140224 INFO neutron.agent.ovn.metadata.agent [-] Port c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b in datapath 9a939889-e790-45a5-a577-4dfb1df5c2f4 unbound from our chassis#033[00m
Jan 23 04:55:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:55:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3768755403' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.163 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9a939889-e790-45a5-a577-4dfb1df5c2f4#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.183 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4183fa-c48c-428e-9682-88cadbbf2e56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.190 222021 DEBUG nova.compute.manager [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-1e83b219-c51b-488a-8fc7-8240efb384c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.190 222021 DEBUG oslo_concurrency.lockutils [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.191 222021 DEBUG oslo_concurrency.lockutils [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.191 222021 DEBUG oslo_concurrency.lockutils [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.191 222021 DEBUG nova.compute.manager [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Processing event network-vif-plugged-1e83b219-c51b-488a-8fc7-8240efb384c0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.191 222021 DEBUG nova.compute.manager [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-1e83b219-c51b-488a-8fc7-8240efb384c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.192 222021 DEBUG oslo_concurrency.lockutils [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.192 222021 DEBUG oslo_concurrency.lockutils [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.192 222021 DEBUG oslo_concurrency.lockutils [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.192 222021 DEBUG nova.compute.manager [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No event matching network-vif-plugged-1e83b219-c51b-488a-8fc7-8240efb384c0 in dict_keys([('network-vif-plugged', '08ec741b-b592-417f-9a64-1ee4d2e4e006'), ('network-vif-plugged', '8f671859-24dc-4140-915c-bbad6f16e0d8')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.193 222021 WARNING nova.compute.manager [req-276b41d5-dcbb-46eb-ad6b-6ea5769c3648 req-a3968788-284c-4274-b527-1d4d84694aa5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-1e83b219-c51b-488a-8fc7-8240efb384c0 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.193 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.196 222021 DEBUG nova.virt.libvirt.vif [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-822107035',display_name='tempest-ServerDiskConfigTestJSON-server-822107035',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-822107035',id=84,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-k0qehsmr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:54:58Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=ec678068-aa1c-4926-abee-e4852fd8f1fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.197 222021 DEBUG nova.network.os_vif_util [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.199 222021 DEBUG nova.network.os_vif_util [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:a4:09,bridge_name='br-int',has_traffic_filtering=True,id=4d7d3d07-d9ea-465d-b091-0ad0246436b2,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7d3d07-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.201 222021 DEBUG nova.objects.instance [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_devices' on Instance uuid ec678068-aa1c-4926-abee-e4852fd8f1fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.202 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.229 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <uuid>ec678068-aa1c-4926-abee-e4852fd8f1fd</uuid>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <name>instance-00000054</name>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-822107035</nova:name>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:55:09</nova:creationTime>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <nova:user uuid="0cfac2191989448ead77e75ca3910ac4">tempest-ServerDiskConfigTestJSON-211417238-project-member</nova:user>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <nova:project uuid="86d938c8e2bb41a79012befd500d1088">tempest-ServerDiskConfigTestJSON-211417238</nova:project>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <nova:port uuid="4d7d3d07-d9ea-465d-b091-0ad0246436b2">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <entry name="serial">ec678068-aa1c-4926-abee-e4852fd8f1fd</entry>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <entry name="uuid">ec678068-aa1c-4926-abee-e4852fd8f1fd</entry>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ec678068-aa1c-4926-abee-e4852fd8f1fd_disk">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ec678068-aa1c-4926-abee-e4852fd8f1fd_disk.config">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:c1:a4:09"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <target dev="tap4d7d3d07-d9"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd/console.log" append="off"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:55:10 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:55:10 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:55:10 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:55:10 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.230 222021 DEBUG nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Preparing to wait for external event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.230 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.231 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.231 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.230 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2632955d-ca22-44b3-b352-324898916827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.232 222021 DEBUG nova.virt.libvirt.vif [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-822107035',display_name='tempest-ServerDiskConfigTestJSON-server-822107035',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-822107035',id=84,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-k0qehsmr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:54:58Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=ec678068-aa1c-4926-abee-e4852fd8f1fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.232 222021 DEBUG nova.network.os_vif_util [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.233 222021 DEBUG nova.network.os_vif_util [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:a4:09,bridge_name='br-int',has_traffic_filtering=True,id=4d7d3d07-d9ea-465d-b091-0ad0246436b2,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7d3d07-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.233 222021 DEBUG os_vif [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:a4:09,bridge_name='br-int',has_traffic_filtering=True,id=4d7d3d07-d9ea-465d-b091-0ad0246436b2,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7d3d07-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.234 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b01e11-b617-4431-b501-31ba761d9066]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.235 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.236 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.236 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.239 222021 DEBUG nova.compute.manager [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.239 222021 DEBUG oslo_concurrency.lockutils [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.239 222021 DEBUG oslo_concurrency.lockutils [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.240 222021 DEBUG oslo_concurrency.lockutils [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.240 222021 DEBUG nova.compute.manager [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No event matching network-vif-plugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a in dict_keys([('network-vif-plugged', '08ec741b-b592-417f-9a64-1ee4d2e4e006'), ('network-vif-plugged', '8f671859-24dc-4140-915c-bbad6f16e0d8')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.240 222021 WARNING nova.compute.manager [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.240 222021 DEBUG nova.compute.manager [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-unplugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.240 222021 DEBUG oslo_concurrency.lockutils [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.241 222021 DEBUG oslo_concurrency.lockutils [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.241 222021 DEBUG oslo_concurrency.lockutils [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.241 222021 DEBUG nova.compute.manager [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-unplugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.241 222021 WARNING nova.compute.manager [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-unplugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.245 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.246 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d7d3d07-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.246 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4d7d3d07-d9, col_values=(('external_ids', {'iface-id': '4d7d3d07-d9ea-465d-b091-0ad0246436b2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:a4:09', 'vm-uuid': 'ec678068-aa1c-4926-abee-e4852fd8f1fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.248 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.249 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:10 np0005593233 NetworkManager[48871]: <info>  [1769162110.2500] manager: (tap4d7d3d07-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.252 222021 INFO nova.network.neutron [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating port 21920b88-3779-4c29-b3a9-7591691e880a with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.257 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.260 222021 INFO os_vif [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:a4:09,bridge_name='br-int',has_traffic_filtering=True,id=4d7d3d07-d9ea-465d-b091-0ad0246436b2,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7d3d07-d9')#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.275 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ef58d824-66fb-476f-87d3-d62b6d5cb797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.298 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[34723fa2-4862-4d24-bf16-cefb6fd20666]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9a939889-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:61:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 532, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 8, 'rx_bytes': 532, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602068, 'reachable_time': 39912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255550, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.323 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[27bc835c-1df9-4ec9-b82d-16472ba42369]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tap9a939889-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602082, 'tstamp': 602082}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255551, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9a939889-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602086, 'tstamp': 602086}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255551, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.325 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a939889-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.327 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.333 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.333 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a939889-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.334 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.334 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9a939889-e0, col_values=(('external_ids', {'iface-id': '586a15e6-31c5-4047-b8d7-e35dbfe6ae30'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.335 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.337 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 2358fe4c-654b-4b88-9e08-2e85688cb00e in datapath 334ca2e2-bd22-482d-8aad-3c18e88d90a5 unbound from our chassis#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.339 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 334ca2e2-bd22-482d-8aad-3c18e88d90a5#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.356 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c8b166-f230-45fa-9372-bd7f9e9d41c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.378 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.379 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.379 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No VIF found with MAC fa:16:3e:c1:a4:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.380 222021 INFO nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Using config drive#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.402 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[05d09b81-1ca5-490a-a72b-84353b839181]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.406 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba52563-51cd-44b6-8c56-97e4787b52e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.412 222021 DEBUG nova.storage.rbd_utils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image ec678068-aa1c-4926-abee-e4852fd8f1fd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.448 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[db6b5bae-fa02-4675-8a1e-a5a78f204b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.472 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6b25b7-d43d-4f94-9be0-68f332ad3863]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap334ca2e2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:43:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 266, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 6, 'rx_bytes': 266, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602194, 'reachable_time': 15411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255576, 'error': None, 'target': 'ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.495 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2eeef5-abfd-4f14-b0e4-ffb6680272cd]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap334ca2e2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602211, 'tstamp': 602211}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255577, 'error': None, 'target': 'ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.2.2.2'], ['IFA_LOCAL', '10.2.2.2'], ['IFA_BROADCAST', '10.2.2.255'], ['IFA_LABEL', 'tap334ca2e2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602215, 'tstamp': 602215}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255577, 'error': None, 'target': 'ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.497 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap334ca2e2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.498 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.503 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.504 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap334ca2e2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.505 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.505 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap334ca2e2-b0, col_values=(('external_ids', {'iface-id': '25d922f4-f2af-4379-86c5-92c7874dd4ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.506 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.507 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 08ec741b-b592-417f-9a64-1ee4d2e4e006 in datapath 9a939889-e790-45a5-a577-4dfb1df5c2f4 unbound from our chassis#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.509 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9a939889-e790-45a5-a577-4dfb1df5c2f4#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.526 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2fef44-6ec1-4f8f-820b-b11509e84085]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.566 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[07b394b7-36fa-46b3-b0b8-205d4f9c364a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.571 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7aba4ef1-4375-4d31-aa43-73187d4f0ed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.619 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6330d6-1d17-433d-83ea-77dbc1f94d07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.645 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dc70f1f0-280a-4d1c-a427-a791b8dc67c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9a939889-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:61:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 10, 'rx_bytes': 532, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 10, 'rx_bytes': 532, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602068, 'reachable_time': 39912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255584, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.669 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b1564cbb-f9a4-4711-ab95-5151c86dba06]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tap9a939889-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602082, 'tstamp': 602082}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255585, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9a939889-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602086, 'tstamp': 602086}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255585, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.671 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a939889-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 nova_compute[222017]: 2026-01-23 09:55:10.680 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.681 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a939889-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.681 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.682 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9a939889-e0, col_values=(('external_ids', {'iface-id': '586a15e6-31c5-4047-b8d7-e35dbfe6ae30'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:10.682 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:10.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:10.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.399 222021 INFO nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Creating config drive at /var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd/disk.config#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.407 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjvb7p1n4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.551 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjvb7p1n4" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.589 222021 DEBUG nova.storage.rbd_utils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image ec678068-aa1c-4926-abee-e4852fd8f1fd_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.595 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd/disk.config ec678068-aa1c-4926-abee-e4852fd8f1fd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.632 222021 DEBUG nova.compute.manager [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.633 222021 DEBUG oslo_concurrency.lockutils [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.633 222021 DEBUG oslo_concurrency.lockutils [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.634 222021 DEBUG oslo_concurrency.lockutils [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.634 222021 DEBUG nova.compute.manager [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Processing event network-vif-plugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.634 222021 DEBUG nova.compute.manager [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.634 222021 DEBUG oslo_concurrency.lockutils [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.634 222021 DEBUG oslo_concurrency.lockutils [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.635 222021 DEBUG oslo_concurrency.lockutils [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.635 222021 DEBUG nova.compute.manager [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No event matching network-vif-plugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 in dict_keys([('network-vif-plugged', '8f671859-24dc-4140-915c-bbad6f16e0d8')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.635 222021 WARNING nova.compute.manager [req-85729312-c58e-464a-b1cb-a0326aab2246 req-cc6deea6-d82d-437b-bae7-fe52833539d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.801 222021 DEBUG oslo_concurrency.processutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd/disk.config ec678068-aa1c-4926-abee-e4852fd8f1fd_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.802 222021 INFO nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Deleting local config drive /var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd/disk.config because it was imported into RBD.#033[00m
Jan 23 04:55:11 np0005593233 kernel: tap4d7d3d07-d9: entered promiscuous mode
Jan 23 04:55:11 np0005593233 NetworkManager[48871]: <info>  [1769162111.8713] manager: (tap4d7d3d07-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/163)
Jan 23 04:55:11 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:11Z|00320|binding|INFO|Claiming lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 for this chassis.
Jan 23 04:55:11 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:11Z|00321|binding|INFO|4d7d3d07-d9ea-465d-b091-0ad0246436b2: Claiming fa:16:3e:c1:a4:09 10.100.0.10
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.874 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.887 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:a4:09 10.100.0.10'], port_security=['fa:16:3e:c1:a4:09 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ec678068-aa1c-4926-abee-e4852fd8f1fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=4d7d3d07-d9ea-465d-b091-0ad0246436b2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.888 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 4d7d3d07-d9ea-465d-b091-0ad0246436b2 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 bound to our chassis#033[00m
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.890 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6d2cdc4c-47a0-475b-8e71-39465d365de3#033[00m
Jan 23 04:55:11 np0005593233 systemd-udevd[255640]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.911 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5c44ed-0bb7-41cb-af8f-d89bd1710490]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.912 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6d2cdc4c-41 in ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.914 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6d2cdc4c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.915 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4847905b-f3d0-43f9-9bfa-52449398e25b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.919 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[83de8e62-5359-4cc5-a798-69b34af0ca3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:11 np0005593233 systemd-machined[190954]: New machine qemu-42-instance-00000054.
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.932 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[ad3dc19f-e6a0-450f-86d2-a0a465e79bec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:11 np0005593233 NetworkManager[48871]: <info>  [1769162111.9408] device (tap4d7d3d07-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:55:11 np0005593233 NetworkManager[48871]: <info>  [1769162111.9418] device (tap4d7d3d07-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:55:11 np0005593233 systemd[1]: Started Virtual Machine qemu-42-instance-00000054.
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.951 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:11 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:11Z|00322|binding|INFO|Setting lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 ovn-installed in OVS
Jan 23 04:55:11 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:11Z|00323|binding|INFO|Setting lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 up in Southbound
Jan 23 04:55:11 np0005593233 nova_compute[222017]: 2026-01-23 09:55:11.957 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.960 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fca05e45-f1ae-4093-ae72-3d87f08b8cc5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:11.998 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[126fd063-a26a-4c60-b059-e8617820dcc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.005 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e8656ac8-0a1b-4511-a5ed-9cf7ead52438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 NetworkManager[48871]: <info>  [1769162112.0072] manager: (tap6d2cdc4c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/164)
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.053 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[61f1f96e-112f-4414-be0d-58fde6e35e30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.057 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5da1ec-b535-4f69-897c-7ddcccba5264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 NetworkManager[48871]: <info>  [1769162112.0853] device (tap6d2cdc4c-40): carrier: link connected
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.095 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8bb7c6-be70-4fb2-b212-d21dca0000cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.120 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f788b894-46c4-49d3-917c-48c60067ea62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602471, 'reachable_time': 35431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255673, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.141 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc7ce50-5193-47a8-b2a2-9ae44eaf5165]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:5a26'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602471, 'tstamp': 602471}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255674, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.168 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[97acfec1-0504-4ecc-874f-a1435cd40de9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602471, 'reachable_time': 35431, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255682, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.212 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[86de4649-7c5a-4c0c-b1bc-15d2242bf842]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.291 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ca2badb1-490e-4b82-9cad-8dbd0108e6db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.293 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.294 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.294 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d2cdc4c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:12 np0005593233 NetworkManager[48871]: <info>  [1769162112.2976] manager: (tap6d2cdc4c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 23 04:55:12 np0005593233 kernel: tap6d2cdc4c-40: entered promiscuous mode
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.300 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.300 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6d2cdc4c-40, col_values=(('external_ids', {'iface-id': '04f6c0b6-99ee-4958-bc01-68fa310042f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:12 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:12Z|00324|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.317 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.319 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.322 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5dba38ef-1e64-4ff6-893d-58410907c2f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.323 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:55:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:12.324 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'env', 'PROCESS_TAG=haproxy-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6d2cdc4c-47a0-475b-8e71-39465d365de3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.373 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162112.372837, ec678068-aa1c-4926-abee-e4852fd8f1fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.374 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] VM Started (Lifecycle Event)#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.433 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.437 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162112.3732502, ec678068-aa1c-4926-abee-e4852fd8f1fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.438 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.480 222021 DEBUG nova.network.neutron [req-4ebbf9a9-2448-4158-bc7d-a759b3ee4b0b req-9125908e-829d-43f2-8a79-64fbac87ac98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Updated VIF entry in instance network info cache for port 4d7d3d07-d9ea-465d-b091-0ad0246436b2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.481 222021 DEBUG nova.network.neutron [req-4ebbf9a9-2448-4158-bc7d-a759b3ee4b0b req-9125908e-829d-43f2-8a79-64fbac87ac98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Updating instance_info_cache with network_info: [{"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.487 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.491 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.557 222021 DEBUG oslo_concurrency.lockutils [req-4ebbf9a9-2448-4158-bc7d-a759b3ee4b0b req-9125908e-829d-43f2-8a79-64fbac87ac98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ec678068-aa1c-4926-abee-e4852fd8f1fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.558 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.578 222021 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.579 222021 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:55:12 np0005593233 nova_compute[222017]: 2026-01-23 09:55:12.579 222021 DEBUG nova.network.neutron [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:55:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:12.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:12 np0005593233 podman[255749]: 2026-01-23 09:55:12.87523005 +0000 UTC m=+0.087991538 container create ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 04:55:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:12.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:12 np0005593233 podman[255749]: 2026-01-23 09:55:12.8160261 +0000 UTC m=+0.028787608 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:55:12 np0005593233 systemd[1]: Started libpod-conmon-ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2.scope.
Jan 23 04:55:12 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:55:12 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3049011eaa7e8a9586e8d6fd8b679b8e1e51b1de36326bbb57c9beccfdc069aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:55:13 np0005593233 podman[255749]: 2026-01-23 09:55:13.021765197 +0000 UTC m=+0.234526735 container init ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:55:13 np0005593233 podman[255749]: 2026-01-23 09:55:13.031042007 +0000 UTC m=+0.243803505 container start ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 04:55:13 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[255764]: [NOTICE]   (255768) : New worker (255770) forked
Jan 23 04:55:13 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[255764]: [NOTICE]   (255768) : Loading success.
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.133 222021 DEBUG nova.compute.manager [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-8f671859-24dc-4140-915c-bbad6f16e0d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.134 222021 DEBUG oslo_concurrency.lockutils [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.134 222021 DEBUG oslo_concurrency.lockutils [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.135 222021 DEBUG oslo_concurrency.lockutils [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.135 222021 DEBUG nova.compute.manager [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Processing event network-vif-plugged-8f671859-24dc-4140-915c-bbad6f16e0d8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.135 222021 DEBUG nova.compute.manager [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-8f671859-24dc-4140-915c-bbad6f16e0d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.136 222021 DEBUG oslo_concurrency.lockutils [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.136 222021 DEBUG oslo_concurrency.lockutils [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.136 222021 DEBUG oslo_concurrency.lockutils [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.137 222021 DEBUG nova.compute.manager [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-plugged-8f671859-24dc-4140-915c-bbad6f16e0d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.137 222021 WARNING nova.compute.manager [req-af29be59-7225-4029-8436-a57da6045d47 req-eba023d2-251a-4cdc-9c27-47bb77deb0e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-8f671859-24dc-4140-915c-bbad6f16e0d8 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.138 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance event wait completed in 5 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.143 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162113.143711, c4100b68-be14-4cd7-8243-2c9a793caa5f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.144 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.147 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.151 222021 INFO nova.virt.libvirt.driver [-] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance spawned successfully.#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.152 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.240 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.250 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.251 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.251 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.252 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.252 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.253 222021 DEBUG nova.virt.libvirt.driver [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.259 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.418 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.566 222021 INFO nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Took 65.88 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.567 222021 DEBUG nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.621 222021 DEBUG nova.compute.manager [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.621 222021 DEBUG oslo_concurrency.lockutils [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.622 222021 DEBUG oslo_concurrency.lockutils [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.622 222021 DEBUG oslo_concurrency.lockutils [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.623 222021 DEBUG nova.compute.manager [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.623 222021 WARNING nova.compute.manager [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.675 222021 INFO nova.compute.manager [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Took 76.25 seconds to build instance.#033[00m
Jan 23 04:55:13 np0005593233 nova_compute[222017]: 2026-01-23 09:55:13.716 222021 DEBUG oslo_concurrency.lockutils [None req-09133e25-8cef-4cfa-8d0a-9ea70c68daf9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 76.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:14.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:14.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:15 np0005593233 nova_compute[222017]: 2026-01-23 09:55:15.199 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:15 np0005593233 nova_compute[222017]: 2026-01-23 09:55:15.249 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:15 np0005593233 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 04:55:15 np0005593233 systemd[254999]: Activating special unit Exit the Session...
Jan 23 04:55:15 np0005593233 systemd[254999]: Stopped target Main User Target.
Jan 23 04:55:15 np0005593233 systemd[254999]: Stopped target Basic System.
Jan 23 04:55:15 np0005593233 systemd[254999]: Stopped target Paths.
Jan 23 04:55:15 np0005593233 systemd[254999]: Stopped target Sockets.
Jan 23 04:55:15 np0005593233 systemd[254999]: Stopped target Timers.
Jan 23 04:55:15 np0005593233 systemd[254999]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:55:15 np0005593233 systemd[254999]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:55:15 np0005593233 systemd[254999]: Closed D-Bus User Message Bus Socket.
Jan 23 04:55:15 np0005593233 systemd[254999]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:55:15 np0005593233 systemd[254999]: Removed slice User Application Slice.
Jan 23 04:55:15 np0005593233 systemd[254999]: Reached target Shutdown.
Jan 23 04:55:15 np0005593233 systemd[254999]: Finished Exit the Session.
Jan 23 04:55:15 np0005593233 systemd[254999]: Reached target Exit the Session.
Jan 23 04:55:15 np0005593233 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 04:55:15 np0005593233 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 04:55:15 np0005593233 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 04:55:15 np0005593233 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 04:55:15 np0005593233 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 04:55:15 np0005593233 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 04:55:15 np0005593233 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 04:55:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.279 222021 DEBUG nova.network.neutron [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating instance_info_cache with network_info: [{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.327 222021 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.523 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.525 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.526 222021 INFO nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Creating image(s)#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.570 222021 DEBUG nova.storage.rbd_utils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] creating snapshot(nova-resize) on rbd image(dc5e2bb3-0d73-4538-a181-9380a1d67934_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.639 222021 DEBUG nova.compute.manager [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.640 222021 DEBUG oslo_concurrency.lockutils [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.640 222021 DEBUG oslo_concurrency.lockutils [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.640 222021 DEBUG oslo_concurrency.lockutils [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.641 222021 DEBUG nova.compute.manager [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Processing event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.641 222021 DEBUG nova.compute.manager [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.645 222021 DEBUG oslo_concurrency.lockutils [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.645 222021 DEBUG oslo_concurrency.lockutils [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.645 222021 DEBUG oslo_concurrency.lockutils [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.645 222021 DEBUG nova.compute.manager [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] No waiting events found dispatching network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.645 222021 WARNING nova.compute.manager [req-70f71811-e0f6-4095-8f6a-4f336bce80d3 req-b74ff5c3-d56d-4761-8e6c-436dc7aaf1d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received unexpected event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.646 222021 DEBUG nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.650 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.651 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162116.650316, ec678068-aa1c-4926-abee-e4852fd8f1fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.651 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.658 222021 INFO nova.virt.libvirt.driver [-] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Instance spawned successfully.#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.658 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.707 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.711 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.734 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.734 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.735 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.735 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.736 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.736 222021 DEBUG nova.virt.libvirt.driver [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.768 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:55:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e260 e260: 3 total, 3 up, 3 in
Jan 23 04:55:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:16.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.844 222021 DEBUG nova.objects.instance [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'trusted_certs' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.852 222021 INFO nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Took 18.26 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.852 222021 DEBUG nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.858 222021 DEBUG nova.compute.manager [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-changed-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.859 222021 DEBUG nova.compute.manager [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Refreshing instance network info cache due to event network-changed-21920b88-3779-4c29-b3a9-7591691e880a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.859 222021 DEBUG oslo_concurrency.lockutils [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.860 222021 DEBUG oslo_concurrency.lockutils [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:55:16 np0005593233 nova_compute[222017]: 2026-01-23 09:55:16.860 222021 DEBUG nova.network.neutron [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Refreshing network info cache for port 21920b88-3779-4c29-b3a9-7591691e880a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:55:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:16.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.066 222021 INFO nova.compute.manager [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Took 19.85 seconds to build instance.#033[00m
Jan 23 04:55:17 np0005593233 podman[255832]: 2026-01-23 09:55:17.085478896 +0000 UTC m=+0.093577844 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.113 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.113 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Ensure instance console log exists: /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.114 222021 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.114 222021 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.114 222021 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.117 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Start _get_guest_xml network_info=[{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-570088325-network", "vif_mac": "fa:16:3e:46:2e:b4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.118 222021 DEBUG oslo_concurrency.lockutils [None req-938c71ef-e7e1-46b8-b5d3-465a87679302 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.122 222021 WARNING nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.129 222021 DEBUG nova.virt.libvirt.host [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.130 222021 DEBUG nova.virt.libvirt.host [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.138 222021 DEBUG nova.virt.libvirt.host [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.139 222021 DEBUG nova.virt.libvirt.host [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.141 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.141 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.142 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.142 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.142 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.142 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.143 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.143 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.143 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.143 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.144 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.144 222021 DEBUG nova.virt.hardware [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.144 222021 DEBUG nova.objects.instance [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'vcpu_model' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.179 222021 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:55:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4282727135' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.659 222021 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:17 np0005593233 nova_compute[222017]: 2026-01-23 09:55:17.709 222021 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:55:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/455046457' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.208 222021 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.211 222021 DEBUG nova.virt.libvirt.vif [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1189076101',display_name='tempest-DeleteServersTestJSON-server-1189076101',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1189076101',id=83,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:54:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-t070zg10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_h
w_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:55:09Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=dc5e2bb3-0d73-4538-a181-9380a1d67934,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-570088325-network", "vif_mac": "fa:16:3e:46:2e:b4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.212 222021 DEBUG nova.network.os_vif_util [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-570088325-network", "vif_mac": "fa:16:3e:46:2e:b4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.213 222021 DEBUG nova.network.os_vif_util [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.217 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <uuid>dc5e2bb3-0d73-4538-a181-9380a1d67934</uuid>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <name>instance-00000053</name>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <memory>196608</memory>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <nova:name>tempest-DeleteServersTestJSON-server-1189076101</nova:name>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:55:17</nova:creationTime>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.micro">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <nova:memory>192</nova:memory>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <nova:user uuid="28a7a778c8ab486fb586e81bb84113be">tempest-DeleteServersTestJSON-944070453-project-member</nova:user>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <nova:project uuid="61df91981c55482fa5c9a64686c79f9e">tempest-DeleteServersTestJSON-944070453</nova:project>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <nova:port uuid="21920b88-3779-4c29-b3a9-7591691e880a">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <entry name="serial">dc5e2bb3-0d73-4538-a181-9380a1d67934</entry>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <entry name="uuid">dc5e2bb3-0d73-4538-a181-9380a1d67934</entry>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/dc5e2bb3-0d73-4538-a181-9380a1d67934_disk">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/dc5e2bb3-0d73-4538-a181-9380a1d67934_disk.config">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:46:2e:b4"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <target dev="tap21920b88-37"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/console.log" append="off"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:55:18 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:55:18 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:55:18 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:55:18 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.223 222021 DEBUG nova.virt.libvirt.vif [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1189076101',display_name='tempest-DeleteServersTestJSON-server-1189076101',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1189076101',id=83,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:54:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-t070zg10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_h
w_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:55:09Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=dc5e2bb3-0d73-4538-a181-9380a1d67934,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-570088325-network", "vif_mac": "fa:16:3e:46:2e:b4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.224 222021 DEBUG nova.network.os_vif_util [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-570088325-network", "vif_mac": "fa:16:3e:46:2e:b4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.224 222021 DEBUG nova.network.os_vif_util [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.225 222021 DEBUG os_vif [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.226 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.226 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.227 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.230 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.230 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap21920b88-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.231 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap21920b88-37, col_values=(('external_ids', {'iface-id': '21920b88-3779-4c29-b3a9-7591691e880a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:2e:b4', 'vm-uuid': 'dc5e2bb3-0d73-4538-a181-9380a1d67934'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.233 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 NetworkManager[48871]: <info>  [1769162118.2341] manager: (tap21920b88-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.239 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.242 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.243 222021 INFO os_vif [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37')#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.376 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.376 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.377 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No VIF found with MAC fa:16:3e:46:2e:b4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.377 222021 INFO nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Using config drive#033[00m
Jan 23 04:55:18 np0005593233 kernel: tap21920b88-37: entered promiscuous mode
Jan 23 04:55:18 np0005593233 NetworkManager[48871]: <info>  [1769162118.4765] manager: (tap21920b88-37): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.480 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:18Z|00325|binding|INFO|Claiming lport 21920b88-3779-4c29-b3a9-7591691e880a for this chassis.
Jan 23 04:55:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:18Z|00326|binding|INFO|21920b88-3779-4c29-b3a9-7591691e880a: Claiming fa:16:3e:46:2e:b4 10.100.0.14
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.501 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:2e:b4 10.100.0.14'], port_security=['fa:16:3e:46:2e:b4 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dc5e2bb3-0d73-4538-a181-9380a1d67934', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=21920b88-3779-4c29-b3a9-7591691e880a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.505 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 21920b88-3779-4c29-b3a9-7591691e880a in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 bound to our chassis#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.507 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3788149-efcd-4940-8a8f-e21af0a56a06#033[00m
Jan 23 04:55:18 np0005593233 systemd-udevd[255963]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.524 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3a4b0073-2abe-4d54-b628-f6b4c60fd46a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.525 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3788149-e1 in ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:55:18 np0005593233 NetworkManager[48871]: <info>  [1769162118.5328] device (tap21920b88-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:55:18 np0005593233 NetworkManager[48871]: <info>  [1769162118.5335] device (tap21920b88-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.532 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3788149-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.533 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[890a51f7-749f-4347-ab9c-a462fbe9ab70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.536 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[670b5246-b30e-4753-8594-08e4e9b111be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 systemd-machined[190954]: New machine qemu-43-instance-00000053.
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.560 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[966b84e4-aadb-4f58-a800-2dfc8d0dfa5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.562 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 systemd[1]: Started Virtual Machine qemu-43-instance-00000053.
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.572 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:18Z|00327|binding|INFO|Setting lport 21920b88-3779-4c29-b3a9-7591691e880a ovn-installed in OVS
Jan 23 04:55:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:18Z|00328|binding|INFO|Setting lport 21920b88-3779-4c29-b3a9-7591691e880a up in Southbound
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.576 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.585 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7320a1fe-7369-4681-8211-2f8ebc754653]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.632 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7b84e224-6ba3-4235-8c6f-3534ef5de249]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 NetworkManager[48871]: <info>  [1769162118.6406] manager: (tapa3788149-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/168)
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.642 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a28478-0482-4648-b07e-617f8db848fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.685 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[bac4a019-638c-414e-9027-79fa02c057a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.689 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[1f436dd2-8fcf-42cc-b10d-286a229998f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 NetworkManager[48871]: <info>  [1769162118.7190] device (tapa3788149-e0): carrier: link connected
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.729 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc25afa-b3ee-402a-89d9-031d0c049168]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.751 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5b77bc86-8887-47ee-b1f4-04abe020eebc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 603135, 'reachable_time': 23156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255999, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.772 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b9373585-31a4-441c-9d0a-162df3446c9b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:ddff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 603135, 'tstamp': 603135}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256000, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:18.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.801 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[23e3f35a-e638-4adb-8a91-796d116fdf2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 603135, 'reachable_time': 23156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256001, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.850 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5b1cfa2b-e960-4bd9-a91d-2f2c386eb3a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:18.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.935 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[09ee2cd0-bb9a-4257-adb9-f3e07782bb18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.936 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.937 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.937 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3788149-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.939 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 NetworkManager[48871]: <info>  [1769162118.9402] manager: (tapa3788149-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Jan 23 04:55:18 np0005593233 kernel: tapa3788149-e0: entered promiscuous mode
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.943 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3788149-e0, col_values=(('external_ids', {'iface-id': 'd6ce7fd1-128d-488f-94e6-68332f7a8a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:18 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:18Z|00329|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.960 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:55:18 np0005593233 nova_compute[222017]: 2026-01-23 09:55:18.961 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.961 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d06f395e-fd17-40d3-af8e-0480f18d8046]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.962 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:55:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:18.963 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'env', 'PROCESS_TAG=haproxy-a3788149-efcd-4940-8a8f-e21af0a56a06', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3788149-efcd-4940-8a8f-e21af0a56a06.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.383 222021 DEBUG nova.compute.manager [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.384 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162119.3828337, dc5e2bb3-0d73-4538-a181-9380a1d67934 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.385 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:55:19 np0005593233 podman[256073]: 2026-01-23 09:55:19.388798899 +0000 UTC m=+0.066339190 container create dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.390 222021 INFO nova.virt.libvirt.driver [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance running successfully.#033[00m
Jan 23 04:55:19 np0005593233 virtqemud[221325]: argument unsupported: QEMU guest agent is not configured
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.394 222021 DEBUG nova.virt.libvirt.guest [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.394 222021 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.433 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.442 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:55:19 np0005593233 podman[256073]: 2026-01-23 09:55:19.348479869 +0000 UTC m=+0.026020180 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:55:19 np0005593233 systemd[1]: Started libpod-conmon-dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c.scope.
Jan 23 04:55:19 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:55:19 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60d112a96d8c214ca02fae7889fd7fac148dda7826bf6ce73e0a89ed16e2871f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:55:19 np0005593233 podman[256073]: 2026-01-23 09:55:19.503625158 +0000 UTC m=+0.181165469 container init dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:55:19 np0005593233 podman[256073]: 2026-01-23 09:55:19.51014225 +0000 UTC m=+0.187682541 container start dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.512 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.514 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162119.3832688, dc5e2bb3-0d73-4538-a181-9380a1d67934 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.514 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] VM Started (Lifecycle Event)#033[00m
Jan 23 04:55:19 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[256089]: [NOTICE]   (256093) : New worker (256095) forked
Jan 23 04:55:19 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[256089]: [NOTICE]   (256093) : Loading success.
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.604 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.611 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.821 222021 DEBUG nova.compute.manager [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.822 222021 DEBUG oslo_concurrency.lockutils [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.822 222021 DEBUG oslo_concurrency.lockutils [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.822 222021 DEBUG oslo_concurrency.lockutils [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.823 222021 DEBUG nova.compute.manager [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:19 np0005593233 nova_compute[222017]: 2026-01-23 09:55:19.823 222021 WARNING nova.compute.manager [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state resized and task_state None.#033[00m
Jan 23 04:55:20 np0005593233 nova_compute[222017]: 2026-01-23 09:55:20.204 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:20.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:20.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:21 np0005593233 nova_compute[222017]: 2026-01-23 09:55:21.999 222021 DEBUG nova.compute.manager [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:21.999 222021 DEBUG oslo_concurrency.lockutils [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.000 222021 DEBUG oslo_concurrency.lockutils [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.000 222021 DEBUG oslo_concurrency.lockutils [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.000 222021 DEBUG nova.compute.manager [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.000 222021 WARNING nova.compute.manager [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state resized and task_state deleting.#033[00m
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.117 222021 DEBUG nova.network.neutron [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updated VIF entry in instance network info cache for port 21920b88-3779-4c29-b3a9-7591691e880a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.118 222021 DEBUG nova.network.neutron [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating instance_info_cache with network_info: [{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.140 222021 DEBUG oslo_concurrency.lockutils [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.210 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:22 np0005593233 NetworkManager[48871]: <info>  [1769162122.2112] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Jan 23 04:55:22 np0005593233 NetworkManager[48871]: <info>  [1769162122.2126] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.346 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:22Z|00330|binding|INFO|Releasing lport 05764538-5c1e-43c4-889d-c55248036dc2 from this chassis (sb_readonly=0)
Jan 23 04:55:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:22Z|00331|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:55:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:22Z|00332|binding|INFO|Releasing lport 586a15e6-31c5-4047-b8d7-e35dbfe6ae30 from this chassis (sb_readonly=0)
Jan 23 04:55:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:22Z|00333|binding|INFO|Releasing lport 25d922f4-f2af-4379-86c5-92c7874dd4ee from this chassis (sb_readonly=0)
Jan 23 04:55:22 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:22Z|00334|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:55:22 np0005593233 nova_compute[222017]: 2026-01-23 09:55:22.387 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:22.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:22.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:23 np0005593233 nova_compute[222017]: 2026-01-23 09:55:23.234 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:23 np0005593233 nova_compute[222017]: 2026-01-23 09:55:23.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:23 np0005593233 nova_compute[222017]: 2026-01-23 09:55:23.514 222021 DEBUG nova.compute.manager [req-014eb773-c95d-4c5f-ae38-4ce4542c8cfd req-fb2ca24f-87cf-4dbf-b89b-fd8dca5fcfbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-changed-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:23 np0005593233 nova_compute[222017]: 2026-01-23 09:55:23.514 222021 DEBUG nova.compute.manager [req-014eb773-c95d-4c5f-ae38-4ce4542c8cfd req-fb2ca24f-87cf-4dbf-b89b-fd8dca5fcfbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing instance network info cache due to event network-changed-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:55:23 np0005593233 nova_compute[222017]: 2026-01-23 09:55:23.515 222021 DEBUG oslo_concurrency.lockutils [req-014eb773-c95d-4c5f-ae38-4ce4542c8cfd req-fb2ca24f-87cf-4dbf-b89b-fd8dca5fcfbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:55:23 np0005593233 nova_compute[222017]: 2026-01-23 09:55:23.515 222021 DEBUG oslo_concurrency.lockutils [req-014eb773-c95d-4c5f-ae38-4ce4542c8cfd req-fb2ca24f-87cf-4dbf-b89b-fd8dca5fcfbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:55:23 np0005593233 nova_compute[222017]: 2026-01-23 09:55:23.516 222021 DEBUG nova.network.neutron [req-014eb773-c95d-4c5f-ae38-4ce4542c8cfd req-fb2ca24f-87cf-4dbf-b89b-fd8dca5fcfbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Refreshing network info cache for port 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:55:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:55:24 np0005593233 nova_compute[222017]: 2026-01-23 09:55:24.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:24 np0005593233 nova_compute[222017]: 2026-01-23 09:55:24.390 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:24 np0005593233 nova_compute[222017]: 2026-01-23 09:55:24.437 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:24 np0005593233 nova_compute[222017]: 2026-01-23 09:55:24.437 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:24 np0005593233 nova_compute[222017]: 2026-01-23 09:55:24.438 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:24 np0005593233 nova_compute[222017]: 2026-01-23 09:55:24.438 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:55:24 np0005593233 nova_compute[222017]: 2026-01-23 09:55:24.438 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:24.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:55:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2581804733' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:55:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:24.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:24 np0005593233 nova_compute[222017]: 2026-01-23 09:55:24.972 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:55:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:55:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.109 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.111 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.111 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.112 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000051 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.117 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000054 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.117 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000054 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.121 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.121 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.206 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.420 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:25.423 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:25.425 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.547 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.549 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4091MB free_disk=20.876117706298828GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.550 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.550 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.636 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Applying migration context for instance dc5e2bb3-0d73-4538-a181-9380a1d67934 as it has an incoming, in-progress migration e333aff9-bf2d-4dc5-ab0a-0044cabcbb7e. Migration status is finished _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.638 222021 INFO nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating resource usage from migration e333aff9-bf2d-4dc5-ab0a-0044cabcbb7e#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.691 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance c4100b68-be14-4cd7-8243-2c9a793caa5f actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.693 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance ec678068-aa1c-4926-abee-e4852fd8f1fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.693 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance dc5e2bb3-0d73-4538-a181-9380a1d67934 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.694 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.694 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:55:25 np0005593233 nova_compute[222017]: 2026-01-23 09:55:25.795 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:55:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1846089029' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:55:26 np0005593233 nova_compute[222017]: 2026-01-23 09:55:26.362 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.567s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:26 np0005593233 nova_compute[222017]: 2026-01-23 09:55:26.370 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:55:26 np0005593233 nova_compute[222017]: 2026-01-23 09:55:26.416 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:55:26 np0005593233 nova_compute[222017]: 2026-01-23 09:55:26.462 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:55:26 np0005593233 nova_compute[222017]: 2026-01-23 09:55:26.464 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.914s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:26.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:26.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.315 222021 DEBUG nova.network.neutron [req-014eb773-c95d-4c5f-ae38-4ce4542c8cfd req-fb2ca24f-87cf-4dbf-b89b-fd8dca5fcfbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updated VIF entry in instance network info cache for port 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.316 222021 DEBUG nova.network.neutron [req-014eb773-c95d-4c5f-ae38-4ce4542c8cfd req-fb2ca24f-87cf-4dbf-b89b-fd8dca5fcfbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [{"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.369 222021 DEBUG oslo_concurrency.lockutils [req-014eb773-c95d-4c5f-ae38-4ce4542c8cfd req-fb2ca24f-87cf-4dbf-b89b-fd8dca5fcfbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.813 222021 DEBUG oslo_concurrency.lockutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.814 222021 DEBUG oslo_concurrency.lockutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.815 222021 DEBUG oslo_concurrency.lockutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.815 222021 DEBUG oslo_concurrency.lockutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.816 222021 DEBUG oslo_concurrency.lockutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.817 222021 INFO nova.compute.manager [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Terminating instance#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.818 222021 DEBUG nova.compute.manager [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:55:27 np0005593233 kernel: tap4d7d3d07-d9 (unregistering): left promiscuous mode
Jan 23 04:55:27 np0005593233 NetworkManager[48871]: <info>  [1769162127.8682] device (tap4d7d3d07-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.882 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:27 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:27Z|00335|binding|INFO|Releasing lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 from this chassis (sb_readonly=0)
Jan 23 04:55:27 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:27Z|00336|binding|INFO|Setting lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 down in Southbound
Jan 23 04:55:27 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:27Z|00337|binding|INFO|Removing iface tap4d7d3d07-d9 ovn-installed in OVS
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.889 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:27.893 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:a4:09 10.100.0.10'], port_security=['fa:16:3e:c1:a4:09 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ec678068-aa1c-4926-abee-e4852fd8f1fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=4d7d3d07-d9ea-465d-b091-0ad0246436b2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:27.895 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 4d7d3d07-d9ea-465d-b091-0ad0246436b2 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 unbound from our chassis#033[00m
Jan 23 04:55:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:27.897 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d2cdc4c-47a0-475b-8e71-39465d365de3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:27.900 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[95978182-f9ef-4f67-9b28-369cc24fc6ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:27.901 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace which is not needed anymore#033[00m
Jan 23 04:55:27 np0005593233 nova_compute[222017]: 2026-01-23 09:55:27.916 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:27 np0005593233 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000054.scope: Deactivated successfully.
Jan 23 04:55:27 np0005593233 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000054.scope: Consumed 11.752s CPU time.
Jan 23 04:55:27 np0005593233 systemd-machined[190954]: Machine qemu-42-instance-00000054 terminated.
Jan 23 04:55:28 np0005593233 kernel: tap4d7d3d07-d9: entered promiscuous mode
Jan 23 04:55:28 np0005593233 NetworkManager[48871]: <info>  [1769162128.0452] manager: (tap4d7d3d07-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Jan 23 04:55:28 np0005593233 systemd-udevd[256284]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.047 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593233 kernel: tap4d7d3d07-d9 (unregistering): left promiscuous mode
Jan 23 04:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:28Z|00338|binding|INFO|Claiming lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 for this chassis.
Jan 23 04:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:28Z|00339|binding|INFO|4d7d3d07-d9ea-465d-b091-0ad0246436b2: Claiming fa:16:3e:c1:a4:09 10.100.0.10
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.059 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:a4:09 10.100.0.10'], port_security=['fa:16:3e:c1:a4:09 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ec678068-aa1c-4926-abee-e4852fd8f1fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=4d7d3d07-d9ea-465d-b091-0ad0246436b2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:28Z|00340|binding|INFO|Setting lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 ovn-installed in OVS
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.088 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:28Z|00341|binding|INFO|Setting lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 up in Southbound
Jan 23 04:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:28Z|00342|binding|INFO|Releasing lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 from this chassis (sb_readonly=1)
Jan 23 04:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:28Z|00343|if_status|INFO|Not setting lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 down as sb is readonly
Jan 23 04:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:28Z|00344|binding|INFO|Removing iface tap4d7d3d07-d9 ovn-installed in OVS
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.092 222021 INFO nova.virt.libvirt.driver [-] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Instance destroyed successfully.#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.092 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:28Z|00345|binding|INFO|Releasing lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 from this chassis (sb_readonly=0)
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.093 222021 DEBUG nova.objects.instance [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'resources' on Instance uuid ec678068-aa1c-4926-abee-e4852fd8f1fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:28Z|00346|binding|INFO|Setting lport 4d7d3d07-d9ea-465d-b091-0ad0246436b2 down in Southbound
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.101 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:a4:09 10.100.0.10'], port_security=['fa:16:3e:c1:a4:09 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ec678068-aa1c-4926-abee-e4852fd8f1fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=4d7d3d07-d9ea-465d-b091-0ad0246436b2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.105 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.111 222021 DEBUG nova.virt.libvirt.vif [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-822107035',display_name='tempest-ServerDiskConfigTestJSON-server-822107035',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-822107035',id=84,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-k0qehsmr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:24Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=ec678068-aa1c-4926-abee-e4852fd8f1fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.112 222021 DEBUG nova.network.os_vif_util [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "address": "fa:16:3e:c1:a4:09", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7d3d07-d9", "ovs_interfaceid": "4d7d3d07-d9ea-465d-b091-0ad0246436b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.113 222021 DEBUG nova.network.os_vif_util [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:a4:09,bridge_name='br-int',has_traffic_filtering=True,id=4d7d3d07-d9ea-465d-b091-0ad0246436b2,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7d3d07-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.113 222021 DEBUG os_vif [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:a4:09,bridge_name='br-int',has_traffic_filtering=True,id=4d7d3d07-d9ea-465d-b091-0ad0246436b2,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7d3d07-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.117 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.117 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d7d3d07-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.119 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.121 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[255764]: [NOTICE]   (255768) : haproxy version is 2.8.14-c23fe91
Jan 23 04:55:28 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[255764]: [NOTICE]   (255768) : path to executable is /usr/sbin/haproxy
Jan 23 04:55:28 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[255764]: [WARNING]  (255768) : Exiting Master process...
Jan 23 04:55:28 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[255764]: [WARNING]  (255768) : Exiting Master process...
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.124 222021 INFO os_vif [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:a4:09,bridge_name='br-int',has_traffic_filtering=True,id=4d7d3d07-d9ea-465d-b091-0ad0246436b2,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7d3d07-d9')#033[00m
Jan 23 04:55:28 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[255764]: [ALERT]    (255768) : Current worker (255770) exited with code 143 (Terminated)
Jan 23 04:55:28 np0005593233 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[255764]: [WARNING]  (255768) : All workers exited. Exiting... (0)
Jan 23 04:55:28 np0005593233 systemd[1]: libpod-ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2.scope: Deactivated successfully.
Jan 23 04:55:28 np0005593233 podman[256306]: 2026-01-23 09:55:28.133351895 +0000 UTC m=+0.069997803 container died ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:55:28 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2-userdata-shm.mount: Deactivated successfully.
Jan 23 04:55:28 np0005593233 systemd[1]: var-lib-containers-storage-overlay-3049011eaa7e8a9586e8d6fd8b679b8e1e51b1de36326bbb57c9beccfdc069aa-merged.mount: Deactivated successfully.
Jan 23 04:55:28 np0005593233 podman[256306]: 2026-01-23 09:55:28.207795162 +0000 UTC m=+0.144441060 container cleanup ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:55:28 np0005593233 systemd[1]: libpod-conmon-ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2.scope: Deactivated successfully.
Jan 23 04:55:28 np0005593233 podman[256356]: 2026-01-23 09:55:28.298213957 +0000 UTC m=+0.055653511 container remove ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.304 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[33369a0d-b56d-4ae1-a1e6-9c74e1dde346]: (4, ('Fri Jan 23 09:55:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2)\nac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2\nFri Jan 23 09:55:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (ac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2)\nac6e55e359c35712ad322397b5c5f13ecf114dfe4e478232eb27b2e00370c3a2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.306 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5317618c-bc29-4f1f-9610-09412ef3b8dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.308 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.310 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593233 kernel: tap6d2cdc4c-40: left promiscuous mode
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.331 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.335 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eb688045-3c3d-4cc4-bbce-aa35e8d7f839]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.352 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1eee29cc-32b2-47a7-b4e3-efd7b430e190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.355 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3df224-c8f3-4480-8dfa-608c6fe1c055]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.375 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[86b4380a-ea04-4994-8e0a-7e687623fc94]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602462, 'reachable_time': 19722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256372, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.379 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.379 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[5100b24e-fadd-4b8a-99be-53bcd89424f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.380 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 4d7d3d07-d9ea-465d-b091-0ad0246436b2 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 unbound from our chassis#033[00m
Jan 23 04:55:28 np0005593233 systemd[1]: run-netns-ovnmeta\x2d6d2cdc4c\x2d47a0\x2d475b\x2d8e71\x2d39465d365de3.mount: Deactivated successfully.
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.382 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d2cdc4c-47a0-475b-8e71-39465d365de3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.383 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[62298fdb-d7a2-48fd-8c73-a3b99497969b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.384 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 4d7d3d07-d9ea-465d-b091-0ad0246436b2 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 unbound from our chassis#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.386 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d2cdc4c-47a0-475b-8e71-39465d365de3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:28.389 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[782614b2-4c8e-4591-b6f3-f5ca33feafff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.465 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.466 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.466 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.466 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:55:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e261 e261: 3 total, 3 up, 3 in
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.599 222021 DEBUG nova.compute.manager [req-4ee0dcc1-6c37-4518-ac37-9ad2a1cd3f38 req-2deff290-8d99-4d6f-85fd-6b6f788ddc49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-vif-unplugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.600 222021 DEBUG oslo_concurrency.lockutils [req-4ee0dcc1-6c37-4518-ac37-9ad2a1cd3f38 req-2deff290-8d99-4d6f-85fd-6b6f788ddc49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.600 222021 DEBUG oslo_concurrency.lockutils [req-4ee0dcc1-6c37-4518-ac37-9ad2a1cd3f38 req-2deff290-8d99-4d6f-85fd-6b6f788ddc49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.601 222021 DEBUG oslo_concurrency.lockutils [req-4ee0dcc1-6c37-4518-ac37-9ad2a1cd3f38 req-2deff290-8d99-4d6f-85fd-6b6f788ddc49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.602 222021 DEBUG nova.compute.manager [req-4ee0dcc1-6c37-4518-ac37-9ad2a1cd3f38 req-2deff290-8d99-4d6f-85fd-6b6f788ddc49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] No waiting events found dispatching network-vif-unplugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.602 222021 DEBUG nova.compute.manager [req-4ee0dcc1-6c37-4518-ac37-9ad2a1cd3f38 req-2deff290-8d99-4d6f-85fd-6b6f788ddc49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-vif-unplugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.705 222021 INFO nova.virt.libvirt.driver [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Deleting instance files /var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd_del#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.706 222021 INFO nova.virt.libvirt.driver [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Deletion of /var/lib/nova/instances/ec678068-aa1c-4926-abee-e4852fd8f1fd_del complete#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.796 222021 INFO nova.compute.manager [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Took 0.98 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.797 222021 DEBUG oslo.service.loopingcall [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.797 222021 DEBUG nova.compute.manager [-] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:55:28 np0005593233 nova_compute[222017]: 2026-01-23 09:55:28.798 222021 DEBUG nova.network.neutron [-] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:55:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:28.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:28.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.419 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.420 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.727 222021 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.728 222021 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.729 222021 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.729 222021 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.730 222021 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.731 222021 INFO nova.compute.manager [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Terminating instance
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.732 222021 DEBUG nova.compute.manager [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 04:55:29 np0005593233 kernel: tap21920b88-37 (unregistering): left promiscuous mode
Jan 23 04:55:29 np0005593233 NetworkManager[48871]: <info>  [1769162129.7814] device (tap21920b88-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:29 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:29Z|00347|binding|INFO|Releasing lport 21920b88-3779-4c29-b3a9-7591691e880a from this chassis (sb_readonly=0)
Jan 23 04:55:29 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:29Z|00348|binding|INFO|Setting lport 21920b88-3779-4c29-b3a9-7591691e880a down in Southbound
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.793 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:29 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:29Z|00349|binding|INFO|Removing iface tap21920b88-37 ovn-installed in OVS
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.797 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:29.803 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:2e:b4 10.100.0.14'], port_security=['fa:16:3e:46:2e:b4 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dc5e2bb3-0d73-4538-a181-9380a1d67934', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=21920b88-3779-4c29-b3a9-7591691e880a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:29.805 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 21920b88-3779-4c29-b3a9-7591691e880a in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.806 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.806 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.806 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.806 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c4100b68-be14-4cd7-8243-2c9a793caa5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:29.807 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 04:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:29.808 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3c51f108-e0d9-4ad9-a3db-25edb39d0dcc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:29.808 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace which is not needed anymore
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.812 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:29 np0005593233 nova_compute[222017]: 2026-01-23 09:55:29.821 222021 DEBUG nova.network.neutron [-] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:55:29 np0005593233 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 23 04:55:29 np0005593233 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000053.scope: Consumed 11.173s CPU time.
Jan 23 04:55:29 np0005593233 systemd-machined[190954]: Machine qemu-43-instance-00000053 terminated.
Jan 23 04:55:29 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[256089]: [NOTICE]   (256093) : haproxy version is 2.8.14-c23fe91
Jan 23 04:55:29 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[256089]: [NOTICE]   (256093) : path to executable is /usr/sbin/haproxy
Jan 23 04:55:29 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[256089]: [WARNING]  (256093) : Exiting Master process...
Jan 23 04:55:29 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[256089]: [WARNING]  (256093) : Exiting Master process...
Jan 23 04:55:29 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[256089]: [ALERT]    (256093) : Current worker (256095) exited with code 143 (Terminated)
Jan 23 04:55:29 np0005593233 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[256089]: [WARNING]  (256093) : All workers exited. Exiting... (0)
Jan 23 04:55:29 np0005593233 systemd[1]: libpod-dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c.scope: Deactivated successfully.
Jan 23 04:55:29 np0005593233 podman[256393]: 2026-01-23 09:55:29.965508782 +0000 UTC m=+0.058976385 container died dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.015 222021 INFO nova.virt.libvirt.driver [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance destroyed successfully.
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.016 222021 DEBUG nova.objects.instance [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'resources' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.106 222021 DEBUG nova.virt.libvirt.vif [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1189076101',display_name='tempest-DeleteServersTestJSON-server-1189076101',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1189076101',id=83,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-t070zg10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:19Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=dc5e2bb3-0d73-4538-a181-9380a1d67934,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.107 222021 DEBUG nova.network.os_vif_util [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.108 222021 DEBUG nova.network.os_vif_util [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.108 222021 DEBUG os_vif [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.111 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.111 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21920b88-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.113 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.115 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.119 222021 INFO os_vif [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37')
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.207 222021 INFO nova.compute.manager [-] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Took 1.41 seconds to deallocate network for instance.
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.208 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.272 222021 DEBUG nova.compute.manager [req-68e3bcd8-4a33-40bd-8978-982f8f22b502 req-22008015-4606-487e-a47d-55455beac669 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-vif-deleted-4d7d3d07-d9ea-465d-b091-0ad0246436b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:55:30 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c-userdata-shm.mount: Deactivated successfully.
Jan 23 04:55:30 np0005593233 systemd[1]: var-lib-containers-storage-overlay-60d112a96d8c214ca02fae7889fd7fac148dda7826bf6ce73e0a89ed16e2871f-merged.mount: Deactivated successfully.
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.338 222021 DEBUG oslo_concurrency.lockutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.339 222021 DEBUG oslo_concurrency.lockutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:55:30 np0005593233 podman[256393]: 2026-01-23 09:55:30.41352046 +0000 UTC m=+0.506988063 container cleanup dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 04:55:30 np0005593233 systemd[1]: libpod-conmon-dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c.scope: Deactivated successfully.
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.428 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.528 222021 DEBUG oslo_concurrency.processutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:55:30 np0005593233 podman[256452]: 2026-01-23 09:55:30.678008074 +0000 UTC m=+0.229485034 container remove dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.692 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[de51b699-d54a-4ab3-862b-95ed6f40893d]: (4, ('Fri Jan 23 09:55:29 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c)\ndc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c\nFri Jan 23 09:55:30 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (dc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c)\ndc78ca6e3700bdb6d46ba3d946986b1ee4599cbf8facb657c084009dda02e16c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.694 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[44ba1392-ebc4-42b5-a1c6-924b530cfcf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.696 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.697 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:30 np0005593233 kernel: tapa3788149-e0: left promiscuous mode
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.701 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.706 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[84bef052-30e6-4c5e-a905-4d03858e2f91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:55:30 np0005593233 nova_compute[222017]: 2026-01-23 09:55:30.718 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.721 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2ad5ed7f-0dd9-45d2-a56d-b232825b9088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.722 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8ebf02-ed52-4459-aba8-7132282a07a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:55:30 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:30Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:12:a4:39 10.1.1.35
Jan 23 04:55:30 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:30Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:12:a4:39 10.1.1.35
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.745 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[92e9ac06-72f9-4825-a1c2-63840921ae07]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 603125, 'reachable_time': 30681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256487, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:55:30 np0005593233 systemd[1]: run-netns-ovnmeta\x2da3788149\x2defcd\x2d4940\x2d8a8f\x2de21af0a56a06.mount: Deactivated successfully.
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.752 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:55:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:30.752 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[14356d31-0316-42ba-8af8-12f9d4bdf143]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:30.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:30.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:55:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3736538725' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:53:6d:39 10.1.1.86
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.053 222021 DEBUG oslo_concurrency.processutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:6d:39 10.1.1.86
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.059 222021 DEBUG nova.compute.provider_tree [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.252 222021 INFO nova.virt.libvirt.driver [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Deleting instance files /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934_del#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.253 222021 INFO nova.virt.libvirt.driver [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Deletion of /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934_del complete#033[00m
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:32:35 10.100.0.10
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:32:35 10.100.0.10
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.279 222021 DEBUG nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.280 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.280 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.280 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.280 222021 DEBUG nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] No waiting events found dispatching network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.281 222021 WARNING nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received unexpected event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.281 222021 DEBUG nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.281 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.281 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.281 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.282 222021 DEBUG nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] No waiting events found dispatching network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.282 222021 WARNING nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received unexpected event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.282 222021 DEBUG nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.282 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.283 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.283 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.283 222021 DEBUG nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] No waiting events found dispatching network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.283 222021 WARNING nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received unexpected event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.283 222021 DEBUG nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.284 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.284 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.284 222021 DEBUG oslo_concurrency.lockutils [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.284 222021 DEBUG nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] No waiting events found dispatching network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.285 222021 WARNING nova.compute.manager [req-a7d4fea4-f47e-4865-8f37-9858fcf4e20f req-ddcf2c99-0834-41c7-a531-7c275a3d062c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Received unexpected event network-vif-plugged-4d7d3d07-d9ea-465d-b091-0ad0246436b2 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.356 222021 DEBUG nova.scheduler.client.report [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.410 222021 DEBUG oslo_concurrency.lockutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:da:72 10.2.2.200
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:da:72 10.2.2.200
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.484 222021 INFO nova.scheduler.client.report [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Deleted allocations for instance ec678068-aa1c-4926-abee-e4852fd8f1fd#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.485 222021 INFO nova.compute.manager [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Took 1.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.486 222021 DEBUG oslo.service.loopingcall [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.486 222021 DEBUG nova.compute.manager [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.486 222021 DEBUG nova.network.neutron [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:55:31 np0005593233 nova_compute[222017]: 2026-01-23 09:55:31.586 222021 DEBUG oslo_concurrency.lockutils [None req-9ea8995e-bb86-4235-b8a8-b01ea3a8af6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "ec678068-aa1c-4926-abee-e4852fd8f1fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6d:d6:55 10.2.2.100
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:d6:55 10.2.2.100
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:55:dc 10.1.1.82
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:55:dc 10.1.1.82
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e3:af:d4 10.1.1.246
Jan 23 04:55:31 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:31Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e3:af:d4 10.1.1.246
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.473 222021 DEBUG nova.compute.manager [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-unplugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.473 222021 DEBUG oslo_concurrency.lockutils [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.473 222021 DEBUG oslo_concurrency.lockutils [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.473 222021 DEBUG oslo_concurrency.lockutils [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.474 222021 DEBUG nova.compute.manager [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-unplugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.474 222021 WARNING nova.compute.manager [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-unplugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state active and task_state None.#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.474 222021 DEBUG nova.compute.manager [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.474 222021 DEBUG oslo_concurrency.lockutils [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.475 222021 DEBUG oslo_concurrency.lockutils [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.475 222021 DEBUG oslo_concurrency.lockutils [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.475 222021 DEBUG nova.compute.manager [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.475 222021 WARNING nova.compute.manager [req-b980d676-170f-4da1-93c5-f172f88a9d3d req-752454f7-c932-404d-950c-bd3e108e4747 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state active and task_state None.#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.701 222021 DEBUG nova.network.neutron [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.739 222021 INFO nova.compute.manager [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Took 1.25 seconds to deallocate network for instance.#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.812 222021 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.812 222021 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:32.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:55:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:55:32 np0005593233 nova_compute[222017]: 2026-01-23 09:55:32.959 222021 DEBUG oslo_concurrency.processutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:32.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:55:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3548008359' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:55:33 np0005593233 nova_compute[222017]: 2026-01-23 09:55:33.463 222021 DEBUG oslo_concurrency.processutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:33 np0005593233 nova_compute[222017]: 2026-01-23 09:55:33.477 222021 DEBUG nova.compute.provider_tree [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:55:33 np0005593233 nova_compute[222017]: 2026-01-23 09:55:33.524 222021 DEBUG nova.scheduler.client.report [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:55:33 np0005593233 nova_compute[222017]: 2026-01-23 09:55:33.569 222021 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:33 np0005593233 nova_compute[222017]: 2026-01-23 09:55:33.620 222021 INFO nova.scheduler.client.report [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Deleted allocations for instance dc5e2bb3-0d73-4538-a181-9380a1d67934#033[00m
Jan 23 04:55:33 np0005593233 nova_compute[222017]: 2026-01-23 09:55:33.734 222021 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:34 np0005593233 podman[256563]: 2026-01-23 09:55:34.087501075 +0000 UTC m=+0.095933801 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 04:55:34 np0005593233 nova_compute[222017]: 2026-01-23 09:55:34.670 222021 DEBUG nova.compute.manager [req-77667ed8-a552-4326-8750-8c438a65038e req-acfa152b-07b9-4db8-9642-0879c10f1e1b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-deleted-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:34.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:34.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:35 np0005593233 nova_compute[222017]: 2026-01-23 09:55:35.112 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:35 np0005593233 nova_compute[222017]: 2026-01-23 09:55:35.213 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 e262: 3 total, 3 up, 3 in
Jan 23 04:55:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:36.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:36.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:37Z|00350|binding|INFO|Releasing lport 05764538-5c1e-43c4-889d-c55248036dc2 from this chassis (sb_readonly=0)
Jan 23 04:55:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:37Z|00351|binding|INFO|Releasing lport 586a15e6-31c5-4047-b8d7-e35dbfe6ae30 from this chassis (sb_readonly=0)
Jan 23 04:55:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:37Z|00352|binding|INFO|Releasing lport 25d922f4-f2af-4379-86c5-92c7874dd4ee from this chassis (sb_readonly=0)
Jan 23 04:55:37 np0005593233 nova_compute[222017]: 2026-01-23 09:55:37.968 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:38.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:38.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:40 np0005593233 nova_compute[222017]: 2026-01-23 09:55:40.116 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:40 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:40Z|00353|binding|INFO|Releasing lport 05764538-5c1e-43c4-889d-c55248036dc2 from this chassis (sb_readonly=0)
Jan 23 04:55:40 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:40Z|00354|binding|INFO|Releasing lport 586a15e6-31c5-4047-b8d7-e35dbfe6ae30 from this chassis (sb_readonly=0)
Jan 23 04:55:40 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:40Z|00355|binding|INFO|Releasing lport 25d922f4-f2af-4379-86c5-92c7874dd4ee from this chassis (sb_readonly=0)
Jan 23 04:55:40 np0005593233 nova_compute[222017]: 2026-01-23 09:55:40.241 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:40.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:40.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:41.566 140481 DEBUG eventlet.wsgi.server [-] (140481) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 23 04:55:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:41.569 140481 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Jan 23 04:55:41 np0005593233 ovn_metadata_agent[140219]: Accept: */*#015
Jan 23 04:55:41 np0005593233 ovn_metadata_agent[140219]: Connection: close#015
Jan 23 04:55:41 np0005593233 ovn_metadata_agent[140219]: Content-Type: text/plain#015
Jan 23 04:55:41 np0005593233 ovn_metadata_agent[140219]: Host: 169.254.169.254#015
Jan 23 04:55:41 np0005593233 ovn_metadata_agent[140219]: User-Agent: curl/7.84.0#015
Jan 23 04:55:41 np0005593233 ovn_metadata_agent[140219]: X-Forwarded-For: 10.100.0.10#015
Jan 23 04:55:41 np0005593233 ovn_metadata_agent[140219]: X-Ovn-Network-Id: 6a690804-4ecf-4c63-9b31-acfe5ecf9a3a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 23 04:55:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:42.657 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:42.657 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:42.658 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:42.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:42.835 140481 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 23 04:55:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:42.836 140481 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 2548 time: 1.2673075#033[00m
Jan 23 04:55:42 np0005593233 haproxy-metadata-proxy-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a[255318]: 10.100.0.10:49054 [23/Jan/2026:09:55:41.565] listener listener/metadata 0/0/0/1270/1270 200 2532 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Jan 23 04:55:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:42.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:43 np0005593233 nova_compute[222017]: 2026-01-23 09:55:43.080 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162128.0790083, ec678068-aa1c-4926-abee-e4852fd8f1fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:43 np0005593233 nova_compute[222017]: 2026-01-23 09:55:43.082 222021 INFO nova.compute.manager [-] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:55:43 np0005593233 nova_compute[222017]: 2026-01-23 09:55:43.130 222021 DEBUG nova.compute.manager [None req-42001696-2931-4c83-9bb7-42be83eb161f - - - - - -] [instance: ec678068-aa1c-4926-abee-e4852fd8f1fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.132 222021 DEBUG oslo_concurrency.lockutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.133 222021 DEBUG oslo_concurrency.lockutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.133 222021 DEBUG oslo_concurrency.lockutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.133 222021 DEBUG oslo_concurrency.lockutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.134 222021 DEBUG oslo_concurrency.lockutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.135 222021 INFO nova.compute.manager [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Terminating instance#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.136 222021 DEBUG nova.compute.manager [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:55:44 np0005593233 kernel: tap0c6216c8-fb (unregistering): left promiscuous mode
Jan 23 04:55:44 np0005593233 NetworkManager[48871]: <info>  [1769162144.3377] device (tap0c6216c8-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00356|binding|INFO|Releasing lport 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 from this chassis (sb_readonly=0)
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00357|binding|INFO|Setting lport 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 down in Southbound
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.354 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00358|binding|INFO|Removing iface tap0c6216c8-fb ovn-installed in OVS
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.569 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:32:35 10.100.0.10'], port_security=['fa:16:3e:bf:32:35 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.209'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56f8805d-8a62-4352-aa44-760b906565e7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.571 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 in datapath 6a690804-4ecf-4c63-9b31-acfe5ecf9a3a unbound from our chassis#033[00m
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.573 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a690804-4ecf-4c63-9b31-acfe5ecf9a3a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.574 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[441ea989-83ca-4abc-8e52-a2dd75a37430]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.575 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a namespace which is not needed anymore#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.657 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 kernel: tap08ec741b-b5 (unregistering): left promiscuous mode
Jan 23 04:55:44 np0005593233 NetworkManager[48871]: <info>  [1769162144.7455] device (tap08ec741b-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00359|binding|INFO|Releasing lport 08ec741b-b592-417f-9a64-1ee4d2e4e006 from this chassis (sb_readonly=0)
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00360|binding|INFO|Setting lport 08ec741b-b592-417f-9a64-1ee4d2e4e006 down in Southbound
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.760 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00361|binding|INFO|Removing iface tap08ec741b-b5 ovn-installed in OVS
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.762 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 kernel: tapc7e6d8c9-43 (unregistering): left promiscuous mode
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.792 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:6d:39 10.1.1.86'], port_security=['fa:16:3e:53:6d:39 10.1.1.86'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-289949651', 'neutron:cidrs': '10.1.1.86/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-289949651', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2aa5e406-7157-4dbd-9a15-73e3a671533e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a36eed-1bfe-4d29-9766-88a06a6a5247, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=08ec741b-b592-417f-9a64-1ee4d2e4e006) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:44 np0005593233 NetworkManager[48871]: <info>  [1769162144.8025] device (tapc7e6d8c9-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.804 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00362|binding|INFO|Releasing lport c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b from this chassis (sb_readonly=0)
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00363|binding|INFO|Setting lport c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b down in Southbound
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.816 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00364|binding|INFO|Removing iface tapc7e6d8c9-43 ovn-installed in OVS
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.818 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.823 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:a4:39 10.1.1.35'], port_security=['fa:16:3e:12:a4:39 10.1.1.35'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest-1279021389', 'neutron:cidrs': '10.1.1.35/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest-1279021389', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2aa5e406-7157-4dbd-9a15-73e3a671533e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a36eed-1bfe-4d29-9766-88a06a6a5247, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:44 np0005593233 kernel: tap8f671859-24 (unregistering): left promiscuous mode
Jan 23 04:55:44 np0005593233 NetworkManager[48871]: <info>  [1769162144.8336] device (tap8f671859-24): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.836 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:44.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00365|binding|INFO|Releasing lport 8f671859-24dc-4140-915c-bbad6f16e0d8 from this chassis (sb_readonly=0)
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00366|binding|INFO|Setting lport 8f671859-24dc-4140-915c-bbad6f16e0d8 down in Southbound
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.848 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00367|binding|INFO|Removing iface tap8f671859-24 ovn-installed in OVS
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.851 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 kernel: tap8adc9155-eb (unregistering): left promiscuous mode
Jan 23 04:55:44 np0005593233 NetworkManager[48871]: <info>  [1769162144.8655] device (tap8adc9155-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.868 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.870 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:55:dc 10.1.1.82'], port_security=['fa:16:3e:7c:55:dc 10.1.1.82'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.82/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a36eed-1bfe-4d29-9766-88a06a6a5247, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8f671859-24dc-4140-915c-bbad6f16e0d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00368|binding|INFO|Releasing lport 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a from this chassis (sb_readonly=0)
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00369|binding|INFO|Setting lport 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a down in Southbound
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.881 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00370|binding|INFO|Removing iface tap8adc9155-eb ovn-installed in OVS
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.889 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:af:d4 10.1.1.246'], port_security=['fa:16:3e:e3:af:d4 10.1.1.246'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.246/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82a36eed-1bfe-4d29-9766-88a06a6a5247, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.896 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 kernel: tap1e83b219-c5 (unregistering): left promiscuous mode
Jan 23 04:55:44 np0005593233 NetworkManager[48871]: <info>  [1769162144.9104] device (tap1e83b219-c5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:44 np0005593233 kernel: tap2358fe4c-65 (unregistering): left promiscuous mode
Jan 23 04:55:44 np0005593233 NetworkManager[48871]: <info>  [1769162144.9392] device (tap2358fe4c-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00371|binding|INFO|Releasing lport 1e83b219-c51b-488a-8fc7-8240efb384c0 from this chassis (sb_readonly=0)
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00372|binding|INFO|Setting lport 1e83b219-c51b-488a-8fc7-8240efb384c0 down in Southbound
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.945 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00373|binding|INFO|Removing iface tap1e83b219-c5 ovn-installed in OVS
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.947 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.956 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:d6:55 10.2.2.100'], port_security=['fa:16:3e:6d:d6:55 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d2e206f-a131-4bd2-8f90-67b8b0bc9e3d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1e83b219-c51b-488a-8fc7-8240efb384c0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00374|binding|INFO|Releasing lport 2358fe4c-654b-4b88-9e08-2e85688cb00e from this chassis (sb_readonly=0)
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00375|binding|INFO|Setting lport 2358fe4c-654b-4b88-9e08-2e85688cb00e down in Southbound
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.980 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:55:44Z|00376|binding|INFO|Removing iface tap2358fe4c-65 ovn-installed in OVS
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.982 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:44.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:44.988 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:da:72 10.2.2.200'], port_security=['fa:16:3e:6a:da:72 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': 'c4100b68-be14-4cd7-8243-2c9a793caa5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bc2d47d48c446c7ae1fc44cd9c32878', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c7242574-a840-4a11-bf79-364aff0a64d9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d2e206f-a131-4bd2-8f90-67b8b0bc9e3d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=2358fe4c-654b-4b88-9e08-2e85688cb00e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:44 np0005593233 nova_compute[222017]: 2026-01-23 09:55:44.995 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000051.scope: Deactivated successfully.
Jan 23 04:55:45 np0005593233 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000051.scope: Consumed 18.922s CPU time.
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.014 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162130.0124812, dc5e2bb3-0d73-4538-a181-9380a1d67934 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.014 222021 INFO nova.compute.manager [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:55:45 np0005593233 systemd-machined[190954]: Machine qemu-41-instance-00000051 terminated.
Jan 23 04:55:45 np0005593233 neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a[255312]: [NOTICE]   (255316) : haproxy version is 2.8.14-c23fe91
Jan 23 04:55:45 np0005593233 neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a[255312]: [NOTICE]   (255316) : path to executable is /usr/sbin/haproxy
Jan 23 04:55:45 np0005593233 neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a[255312]: [WARNING]  (255316) : Exiting Master process...
Jan 23 04:55:45 np0005593233 neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a[255312]: [WARNING]  (255316) : Exiting Master process...
Jan 23 04:55:45 np0005593233 neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a[255312]: [ALERT]    (255316) : Current worker (255318) exited with code 143 (Terminated)
Jan 23 04:55:45 np0005593233 neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a[255312]: [WARNING]  (255316) : All workers exited. Exiting... (0)
Jan 23 04:55:45 np0005593233 systemd[1]: libpod-e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a.scope: Deactivated successfully.
Jan 23 04:55:45 np0005593233 podman[256606]: 2026-01-23 09:55:45.051632958 +0000 UTC m=+0.370063294 container died e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.057 222021 DEBUG nova.compute.manager [None req-4a32b6f7-3250-4b2a-bfd7-e62bd1aa64f6 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.118 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 NetworkManager[48871]: <info>  [1769162145.1780] manager: (tap0c6216c8-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/173)
Jan 23 04:55:45 np0005593233 NetworkManager[48871]: <info>  [1769162145.1876] manager: (tap08ec741b-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/174)
Jan 23 04:55:45 np0005593233 NetworkManager[48871]: <info>  [1769162145.2032] manager: (tapc7e6d8c9-43): new Tun device (/org/freedesktop/NetworkManager/Devices/175)
Jan 23 04:55:45 np0005593233 NetworkManager[48871]: <info>  [1769162145.2149] manager: (tap8f671859-24): new Tun device (/org/freedesktop/NetworkManager/Devices/176)
Jan 23 04:55:45 np0005593233 NetworkManager[48871]: <info>  [1769162145.2274] manager: (tap8adc9155-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Jan 23 04:55:45 np0005593233 NetworkManager[48871]: <info>  [1769162145.2508] manager: (tap2358fe4c-65): new Tun device (/org/freedesktop/NetworkManager/Devices/178)
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.271 222021 INFO nova.virt.libvirt.driver [-] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Instance destroyed successfully.#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.273 222021 DEBUG nova.objects.instance [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lazy-loading 'resources' on Instance uuid c4100b68-be14-4cd7-8243-2c9a793caa5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.275 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.302 222021 DEBUG nova.compute.manager [req-37516c41-8d18-4ea3-8d80-7d16a355caec req-188d91c2-7c5d-4d9c-bdfa-0a0daa17c718 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.303 222021 DEBUG oslo_concurrency.lockutils [req-37516c41-8d18-4ea3-8d80-7d16a355caec req-188d91c2-7c5d-4d9c-bdfa-0a0daa17c718 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.303 222021 DEBUG oslo_concurrency.lockutils [req-37516c41-8d18-4ea3-8d80-7d16a355caec req-188d91c2-7c5d-4d9c-bdfa-0a0daa17c718 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.303 222021 DEBUG oslo_concurrency.lockutils [req-37516c41-8d18-4ea3-8d80-7d16a355caec req-188d91c2-7c5d-4d9c-bdfa-0a0daa17c718 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.303 222021 DEBUG nova.compute.manager [req-37516c41-8d18-4ea3-8d80-7d16a355caec req-188d91c2-7c5d-4d9c-bdfa-0a0daa17c718 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-unplugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.303 222021 DEBUG nova.compute.manager [req-37516c41-8d18-4ea3-8d80-7d16a355caec req-188d91c2-7c5d-4d9c-bdfa-0a0daa17c718 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:55:45 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a-userdata-shm.mount: Deactivated successfully.
Jan 23 04:55:45 np0005593233 systemd[1]: var-lib-containers-storage-overlay-0ba305a3110896dfe732f7f2da08e1d39b78bd5ead29d76a4f851ea4e6ae471d-merged.mount: Deactivated successfully.
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.319 222021 DEBUG nova.virt.libvirt.vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.319 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.320 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:32:35,bridge_name='br-int',has_traffic_filtering=True,id=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408,network=Network(6a690804-4ecf-4c63-9b31-acfe5ecf9a3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6216c8-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.320 222021 DEBUG os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:32:35,bridge_name='br-int',has_traffic_filtering=True,id=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408,network=Network(6a690804-4ecf-4c63-9b31-acfe5ecf9a3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6216c8-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.322 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.322 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c6216c8-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.323 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.328 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.341 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.343 222021 INFO os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:32:35,bridge_name='br-int',has_traffic_filtering=True,id=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408,network=Network(6a690804-4ecf-4c63-9b31-acfe5ecf9a3a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6216c8-fb')#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.344 222021 DEBUG nova.virt.libvirt.vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.344 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.345 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:6d:39,bridge_name='br-int',has_traffic_filtering=True,id=08ec741b-b592-417f-9a64-1ee4d2e4e006,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08ec741b-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.345 222021 DEBUG os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:6d:39,bridge_name='br-int',has_traffic_filtering=True,id=08ec741b-b592-417f-9a64-1ee4d2e4e006,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08ec741b-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.347 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.347 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08ec741b-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.348 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.351 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.362 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.363 222021 INFO os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:6d:39,bridge_name='br-int',has_traffic_filtering=True,id=08ec741b-b592-417f-9a64-1ee4d2e4e006,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08ec741b-b5')#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.364 222021 DEBUG nova.virt.libvirt.vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.364 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.365 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:a4:39,bridge_name='br-int',has_traffic_filtering=True,id=c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc7e6d8c9-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.365 222021 DEBUG os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:a4:39,bridge_name='br-int',has_traffic_filtering=True,id=c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc7e6d8c9-43') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.366 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.366 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc7e6d8c9-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.367 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.369 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.380 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.382 222021 INFO os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:a4:39,bridge_name='br-int',has_traffic_filtering=True,id=c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc7e6d8c9-43')#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.383 222021 DEBUG nova.virt.libvirt.vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.383 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.384 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:55:dc,bridge_name='br-int',has_traffic_filtering=True,id=8f671859-24dc-4140-915c-bbad6f16e0d8,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f671859-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.384 222021 DEBUG os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:55:dc,bridge_name='br-int',has_traffic_filtering=True,id=8f671859-24dc-4140-915c-bbad6f16e0d8,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f671859-24') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.385 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.386 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f671859-24, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.387 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.388 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.396 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.397 222021 INFO os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:55:dc,bridge_name='br-int',has_traffic_filtering=True,id=8f671859-24dc-4140-915c-bbad6f16e0d8,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f671859-24')#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.398 222021 DEBUG nova.virt.libvirt.vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.398 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.399 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:af:d4,bridge_name='br-int',has_traffic_filtering=True,id=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8adc9155-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.399 222021 DEBUG os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:af:d4,bridge_name='br-int',has_traffic_filtering=True,id=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8adc9155-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.400 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.400 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8adc9155-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.401 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.403 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.410 222021 INFO os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:af:d4,bridge_name='br-int',has_traffic_filtering=True,id=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a,network=Network(9a939889-e790-45a5-a577-4dfb1df5c2f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8adc9155-eb')#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.411 222021 DEBUG nova.virt.libvirt.vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.412 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.412 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=1e83b219-c51b-488a-8fc7-8240efb384c0,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e83b219-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.413 222021 DEBUG os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=1e83b219-c51b-488a-8fc7-8240efb384c0,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e83b219-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.414 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e83b219-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.418 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.419 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.421 222021 INFO os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=1e83b219-c51b-488a-8fc7-8240efb384c0,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e83b219-c5')#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.422 222021 DEBUG nova.virt.libvirt.vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-451535713',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-451535713',id=81,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+RHDEXfb6O0Kf7YJngxrkjRr4hUXN6KkbMes9eGfUfnCeg9IQpGvDSAEchNvOkaW8b8diB2nsny4T7Qwy6SBdKjw80eRo7rW56xwErviEqm+8clUI9Y0ihCp5EmmCJkA==',key_name='tempest-keypair-317332663',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9bc2d47d48c446c7ae1fc44cd9c32878',ramdisk_id='',reservation_id='r-91jkan0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest-1431047957',owner_user_name='tempest-TaggedBootDevicesTest-1431047957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d512838ce2b44554b0566fdbb3c702b4',uuid=c4100b68-be14-4cd7-8243-2c9a793caa5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.422 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converting VIF {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.423 222021 DEBUG nova.network.os_vif_util [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:da:72,bridge_name='br-int',has_traffic_filtering=True,id=2358fe4c-654b-4b88-9e08-2e85688cb00e,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2358fe4c-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.423 222021 DEBUG os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:da:72,bridge_name='br-int',has_traffic_filtering=True,id=2358fe4c-654b-4b88-9e08-2e85688cb00e,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2358fe4c-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.425 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.425 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2358fe4c-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.426 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.427 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.430 222021 INFO os_vif [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:da:72,bridge_name='br-int',has_traffic_filtering=True,id=2358fe4c-654b-4b88-9e08-2e85688cb00e,network=Network(334ca2e2-bd22-482d-8aad-3c18e88d90a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2358fe4c-65')#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.530 222021 DEBUG nova.compute.manager [req-bae255a6-527e-4e57-b4be-98582ae1875d req-ece670d0-4ca4-47bf-b845-2b822fe5f248 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.531 222021 DEBUG oslo_concurrency.lockutils [req-bae255a6-527e-4e57-b4be-98582ae1875d req-ece670d0-4ca4-47bf-b845-2b822fe5f248 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.531 222021 DEBUG oslo_concurrency.lockutils [req-bae255a6-527e-4e57-b4be-98582ae1875d req-ece670d0-4ca4-47bf-b845-2b822fe5f248 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.532 222021 DEBUG oslo_concurrency.lockutils [req-bae255a6-527e-4e57-b4be-98582ae1875d req-ece670d0-4ca4-47bf-b845-2b822fe5f248 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.532 222021 DEBUG nova.compute.manager [req-bae255a6-527e-4e57-b4be-98582ae1875d req-ece670d0-4ca4-47bf-b845-2b822fe5f248 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-unplugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:45 np0005593233 podman[256606]: 2026-01-23 09:55:45.533500386 +0000 UTC m=+0.851930732 container cleanup e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.533 222021 DEBUG nova.compute.manager [req-bae255a6-527e-4e57-b4be-98582ae1875d req-ece670d0-4ca4-47bf-b845-2b822fe5f248 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:55:45 np0005593233 systemd[1]: libpod-conmon-e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a.scope: Deactivated successfully.
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.704 222021 DEBUG nova.compute.manager [req-8c424d29-4b2c-4b85-bf39-60a5c295540e req-d9e61a0f-d203-4ac7-9365-ae8d82e256d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.705 222021 DEBUG oslo_concurrency.lockutils [req-8c424d29-4b2c-4b85-bf39-60a5c295540e req-d9e61a0f-d203-4ac7-9365-ae8d82e256d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.705 222021 DEBUG oslo_concurrency.lockutils [req-8c424d29-4b2c-4b85-bf39-60a5c295540e req-d9e61a0f-d203-4ac7-9365-ae8d82e256d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.705 222021 DEBUG oslo_concurrency.lockutils [req-8c424d29-4b2c-4b85-bf39-60a5c295540e req-d9e61a0f-d203-4ac7-9365-ae8d82e256d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.707 222021 DEBUG nova.compute.manager [req-8c424d29-4b2c-4b85-bf39-60a5c295540e req-d9e61a0f-d203-4ac7-9365-ae8d82e256d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-unplugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.707 222021 DEBUG nova.compute.manager [req-8c424d29-4b2c-4b85-bf39-60a5c295540e req-d9e61a0f-d203-4ac7-9365-ae8d82e256d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:55:45 np0005593233 podman[256806]: 2026-01-23 09:55:45.788410071 +0000 UTC m=+0.084746446 container remove e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.845 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bcf53709-55df-45c9-ba09-24df55792bdd]: (4, ('Fri Jan 23 09:55:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a (e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a)\ne6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a\nFri Jan 23 09:55:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a (e6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a)\ne6ae264be97a6c6c5c00b30b375196ddcaac3f7ed23fdaa73fc21476562d4f2a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.847 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[21b0497a-1ec1-46bc-a107-1cc4a30c0d08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.849 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a690804-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.850 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 kernel: tap6a690804-40: left promiscuous mode
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.865 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 nova_compute[222017]: 2026-01-23 09:55:45.865 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.869 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[472f5001-dd82-4c96-9ba0-c7a9f447cb1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.882 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[77975e59-dd0b-413d-9aaa-843b665abe96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.883 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe576ab-5a29-46d2-b7b3-082f1e2268f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.903 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[620320ec-2dd0-4351-bf98-2daf742867d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 601951, 'reachable_time': 15292, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256821, 'error': None, 'target': 'ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:45 np0005593233 systemd[1]: run-netns-ovnmeta\x2d6a690804\x2d4ecf\x2d4c63\x2d9b31\x2dacfe5ecf9a3a.mount: Deactivated successfully.
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.908 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a690804-4ecf-4c63-9b31-acfe5ecf9a3a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.908 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[d047bfcb-a232-4a57-8a35-7ea3e5a1e98e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.910 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 08ec741b-b592-417f-9a64-1ee4d2e4e006 in datapath 9a939889-e790-45a5-a577-4dfb1df5c2f4 unbound from our chassis#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.912 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a939889-e790-45a5-a577-4dfb1df5c2f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.913 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dd55716a-fd7c-4a3b-888f-bc99e72e761c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:45.913 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4 namespace which is not needed anymore#033[00m
Jan 23 04:55:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4[255387]: [NOTICE]   (255391) : haproxy version is 2.8.14-c23fe91
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4[255387]: [NOTICE]   (255391) : path to executable is /usr/sbin/haproxy
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4[255387]: [WARNING]  (255391) : Exiting Master process...
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4[255387]: [ALERT]    (255391) : Current worker (255393) exited with code 143 (Terminated)
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4[255387]: [WARNING]  (255391) : All workers exited. Exiting... (0)
Jan 23 04:55:46 np0005593233 systemd[1]: libpod-75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d.scope: Deactivated successfully.
Jan 23 04:55:46 np0005593233 podman[256837]: 2026-01-23 09:55:46.13941705 +0000 UTC m=+0.114564442 container died 75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.142 222021 INFO nova.virt.libvirt.driver [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Deleting instance files /var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f_del#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.143 222021 INFO nova.virt.libvirt.driver [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Deletion of /var/lib/nova/instances/c4100b68-be14-4cd7-8243-2c9a793caa5f_del complete#033[00m
Jan 23 04:55:46 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d-userdata-shm.mount: Deactivated successfully.
Jan 23 04:55:46 np0005593233 systemd[1]: var-lib-containers-storage-overlay-df700b3b6c63d59e5971822633cf8d43cb6a4f5146c486deb7dd94e7beebf82f-merged.mount: Deactivated successfully.
Jan 23 04:55:46 np0005593233 podman[256837]: 2026-01-23 09:55:46.18006204 +0000 UTC m=+0.155209402 container cleanup 75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:55:46 np0005593233 systemd[1]: libpod-conmon-75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d.scope: Deactivated successfully.
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.236 222021 INFO nova.compute.manager [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Took 2.10 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.237 222021 DEBUG oslo.service.loopingcall [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.238 222021 DEBUG nova.compute.manager [-] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.239 222021 DEBUG nova.network.neutron [-] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:55:46 np0005593233 podman[256866]: 2026-01-23 09:55:46.403666187 +0000 UTC m=+0.187997970 container remove 75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.412 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa3e448-1084-4635-b387-62da9a4425fa]: (4, ('Fri Jan 23 09:55:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4 (75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d)\n75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d\nFri Jan 23 09:55:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4 (75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d)\n75309400ab2c553f3dbcaf3e6b77776cb74fff58cc80bf2b71a7956fa1eedf8d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.416 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[34167445-cd28-44c9-b271-ba36ab021bd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.417 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a939889-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.419 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:46 np0005593233 kernel: tap9a939889-e0: left promiscuous mode
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.435 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.437 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.439 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[74de68c1-639f-4f04-befb-98b2e2868370]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.461 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4b5533c3-ca00-4836-8c32-a4b2fc06f41a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.463 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[06ed7d9c-207b-432d-b540-1829dd3ec7b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.486 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e7d0c6-e6e5-499b-ac74-6ddd817a1cf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602058, 'reachable_time': 38195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256881, 'error': None, 'target': 'ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.490 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9a939889-e790-45a5-a577-4dfb1df5c2f4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.490 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e14681-d326-4260-92a5-f72116ab688f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 systemd[1]: run-netns-ovnmeta\x2d9a939889\x2de790\x2d45a5\x2da577\x2d4dfb1df5c2f4.mount: Deactivated successfully.
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.491 140224 INFO neutron.agent.ovn.metadata.agent [-] Port c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b in datapath 9a939889-e790-45a5-a577-4dfb1df5c2f4 unbound from our chassis#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.493 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a939889-e790-45a5-a577-4dfb1df5c2f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.494 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8fe70998-4583-4992-8bef-9398bc746344]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.495 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8f671859-24dc-4140-915c-bbad6f16e0d8 in datapath 9a939889-e790-45a5-a577-4dfb1df5c2f4 unbound from our chassis#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.496 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a939889-e790-45a5-a577-4dfb1df5c2f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.496 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[550ebaa4-282f-422e-85bf-511255583b61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.497 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a in datapath 9a939889-e790-45a5-a577-4dfb1df5c2f4 unbound from our chassis#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.498 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a939889-e790-45a5-a577-4dfb1df5c2f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.499 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9835bbb2-6da1-406f-8a21-d7997ca7abab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.499 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1e83b219-c51b-488a-8fc7-8240efb384c0 in datapath 334ca2e2-bd22-482d-8aad-3c18e88d90a5 unbound from our chassis#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.501 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 334ca2e2-bd22-482d-8aad-3c18e88d90a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.501 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9dd123-84f5-415f-867d-a18d0e7f0dd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:46.501 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5 namespace which is not needed anymore#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.561 222021 DEBUG nova.compute.manager [req-45939cef-5c4f-4619-8ade-06fbe35b2d74 req-150fba05-ffe1-4a6a-963e-b2dca344f61c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-1e83b219-c51b-488a-8fc7-8240efb384c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.562 222021 DEBUG oslo_concurrency.lockutils [req-45939cef-5c4f-4619-8ade-06fbe35b2d74 req-150fba05-ffe1-4a6a-963e-b2dca344f61c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.563 222021 DEBUG oslo_concurrency.lockutils [req-45939cef-5c4f-4619-8ade-06fbe35b2d74 req-150fba05-ffe1-4a6a-963e-b2dca344f61c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.563 222021 DEBUG oslo_concurrency.lockutils [req-45939cef-5c4f-4619-8ade-06fbe35b2d74 req-150fba05-ffe1-4a6a-963e-b2dca344f61c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.563 222021 DEBUG nova.compute.manager [req-45939cef-5c4f-4619-8ade-06fbe35b2d74 req-150fba05-ffe1-4a6a-963e-b2dca344f61c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-unplugged-1e83b219-c51b-488a-8fc7-8240efb384c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:46 np0005593233 nova_compute[222017]: 2026-01-23 09:55:46.563 222021 DEBUG nova.compute.manager [req-45939cef-5c4f-4619-8ade-06fbe35b2d74 req-150fba05-ffe1-4a6a-963e-b2dca344f61c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-1e83b219-c51b-488a-8fc7-8240efb384c0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5[255526]: [NOTICE]   (255530) : haproxy version is 2.8.14-c23fe91
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5[255526]: [NOTICE]   (255530) : path to executable is /usr/sbin/haproxy
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5[255526]: [WARNING]  (255530) : Exiting Master process...
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5[255526]: [ALERT]    (255530) : Current worker (255532) exited with code 143 (Terminated)
Jan 23 04:55:46 np0005593233 neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5[255526]: [WARNING]  (255530) : All workers exited. Exiting... (0)
Jan 23 04:55:46 np0005593233 systemd[1]: libpod-912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65.scope: Deactivated successfully.
Jan 23 04:55:46 np0005593233 podman[256900]: 2026-01-23 09:55:46.65064766 +0000 UTC m=+0.049966402 container died 912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:55:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:46.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:46 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65-userdata-shm.mount: Deactivated successfully.
Jan 23 04:55:46 np0005593233 systemd[1]: var-lib-containers-storage-overlay-8857f9bb3afc8e0dbbb993a2827bc52f90d3fe9a1e6d1369814c90aa5a2c3b06-merged.mount: Deactivated successfully.
Jan 23 04:55:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:46.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:47 np0005593233 podman[256900]: 2026-01-23 09:55:47.289552109 +0000 UTC m=+0.688870851 container cleanup 912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:55:47 np0005593233 systemd[1]: libpod-conmon-912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65.scope: Deactivated successfully.
Jan 23 04:55:47 np0005593233 podman[256930]: 2026-01-23 09:55:47.356490126 +0000 UTC m=+0.041778993 container remove 912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.362 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5a70b1-78a5-4656-b67c-f6ef775ced4d]: (4, ('Fri Jan 23 09:55:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5 (912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65)\n912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65\nFri Jan 23 09:55:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5 (912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65)\n912a9b8b33328e482472b547fa2d5c2e18b355a49cd7f04e1aa58959d16e9a65\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.364 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[562066a1-40ef-4f47-8b40-b0ee01a7629d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.364 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap334ca2e2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:47 np0005593233 kernel: tap334ca2e2-b0: left promiscuous mode
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.367 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.380 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.384 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b268eff7-e704-4a56-b3c3-79936934fc84]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.400 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[88ed57dd-e7cc-439f-b4b8-b7a4d6e66b20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.401 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f91b5a-b61c-439f-91d1-769d33019fd8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.419 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9046f39e-76b0-4c38-a0a6-1dfd6017c5e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602183, 'reachable_time': 25075, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256960, 'error': None, 'target': 'ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.421 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-334ca2e2-bd22-482d-8aad-3c18e88d90a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.421 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[6fb6560f-1a8b-46cd-8025-b298b58444ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:47 np0005593233 systemd[1]: run-netns-ovnmeta\x2d334ca2e2\x2dbd22\x2d482d\x2d8aad\x2d3c18e88d90a5.mount: Deactivated successfully.
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.423 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 2358fe4c-654b-4b88-9e08-2e85688cb00e in datapath 334ca2e2-bd22-482d-8aad-3c18e88d90a5 unbound from our chassis#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.424 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 334ca2e2-bd22-482d-8aad-3c18e88d90a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:55:47.425 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2cc6072e-8e98-4284-b4ab-1eb4e60434f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:47 np0005593233 podman[256931]: 2026-01-23 09:55:47.428493354 +0000 UTC m=+0.094586213 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.552 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.553 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.553 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.553 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.553 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-plugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.553 222021 WARNING nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.554 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-8f671859-24dc-4140-915c-bbad6f16e0d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.554 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.554 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.554 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.554 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-unplugged-8f671859-24dc-4140-915c-bbad6f16e0d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.554 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-8f671859-24dc-4140-915c-bbad6f16e0d8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.554 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-8f671859-24dc-4140-915c-bbad6f16e0d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.555 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.555 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.555 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.555 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-plugged-8f671859-24dc-4140-915c-bbad6f16e0d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.555 222021 WARNING nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-8f671859-24dc-4140-915c-bbad6f16e0d8 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.555 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-2358fe4c-654b-4b88-9e08-2e85688cb00e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.556 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.556 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.556 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.556 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-unplugged-2358fe4c-654b-4b88-9e08-2e85688cb00e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.556 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-2358fe4c-654b-4b88-9e08-2e85688cb00e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.556 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-2358fe4c-654b-4b88-9e08-2e85688cb00e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.557 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.557 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.557 222021 DEBUG oslo_concurrency.lockutils [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.557 222021 DEBUG nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-plugged-2358fe4c-654b-4b88-9e08-2e85688cb00e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.557 222021 WARNING nova.compute.manager [req-21f357fd-5f17-406a-82e5-77ac71ec4ecb req-989711cf-2119-489e-b77a-2cac422892e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-2358fe4c-654b-4b88-9e08-2e85688cb00e for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.735 222021 DEBUG nova.compute.manager [req-8c036207-83f1-4c46-8833-d7ce796f4906 req-bcd7d6e5-8d7f-44fb-8b3a-81d81c81153c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.735 222021 DEBUG oslo_concurrency.lockutils [req-8c036207-83f1-4c46-8833-d7ce796f4906 req-bcd7d6e5-8d7f-44fb-8b3a-81d81c81153c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.735 222021 DEBUG oslo_concurrency.lockutils [req-8c036207-83f1-4c46-8833-d7ce796f4906 req-bcd7d6e5-8d7f-44fb-8b3a-81d81c81153c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.735 222021 DEBUG oslo_concurrency.lockutils [req-8c036207-83f1-4c46-8833-d7ce796f4906 req-bcd7d6e5-8d7f-44fb-8b3a-81d81c81153c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.735 222021 DEBUG nova.compute.manager [req-8c036207-83f1-4c46-8833-d7ce796f4906 req-bcd7d6e5-8d7f-44fb-8b3a-81d81c81153c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-plugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.736 222021 WARNING nova.compute.manager [req-8c036207-83f1-4c46-8833-d7ce796f4906 req-bcd7d6e5-8d7f-44fb-8b3a-81d81c81153c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.768 222021 DEBUG nova.compute.manager [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.768 222021 DEBUG oslo_concurrency.lockutils [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.769 222021 DEBUG oslo_concurrency.lockutils [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.769 222021 DEBUG oslo_concurrency.lockutils [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.769 222021 DEBUG nova.compute.manager [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-plugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.769 222021 WARNING nova.compute.manager [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-08ec741b-b592-417f-9a64-1ee4d2e4e006 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.769 222021 DEBUG nova.compute.manager [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.770 222021 DEBUG oslo_concurrency.lockutils [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.770 222021 DEBUG oslo_concurrency.lockutils [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.770 222021 DEBUG oslo_concurrency.lockutils [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.770 222021 DEBUG nova.compute.manager [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-unplugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.770 222021 DEBUG nova.compute.manager [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-unplugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.770 222021 DEBUG nova.compute.manager [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.771 222021 DEBUG oslo_concurrency.lockutils [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.771 222021 DEBUG oslo_concurrency.lockutils [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.771 222021 DEBUG oslo_concurrency.lockutils [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.771 222021 DEBUG nova.compute.manager [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-plugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:47 np0005593233 nova_compute[222017]: 2026-01-23 09:55:47.771 222021 WARNING nova.compute.manager [req-76c15966-9f52-4527-9369-a45bf0da7772 req-2e9f428b-feab-4c49-91ce-0056f15e0cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:55:48 np0005593233 nova_compute[222017]: 2026-01-23 09:55:48.119 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:48 np0005593233 nova_compute[222017]: 2026-01-23 09:55:48.338 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:48 np0005593233 nova_compute[222017]: 2026-01-23 09:55:48.731 222021 DEBUG nova.compute.manager [req-cec61995-042c-4db5-ab6e-22b751dfe225 req-574fba45-bbac-475e-a80b-c00ed1eddc51 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-plugged-1e83b219-c51b-488a-8fc7-8240efb384c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:48 np0005593233 nova_compute[222017]: 2026-01-23 09:55:48.731 222021 DEBUG oslo_concurrency.lockutils [req-cec61995-042c-4db5-ab6e-22b751dfe225 req-574fba45-bbac-475e-a80b-c00ed1eddc51 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:48 np0005593233 nova_compute[222017]: 2026-01-23 09:55:48.732 222021 DEBUG oslo_concurrency.lockutils [req-cec61995-042c-4db5-ab6e-22b751dfe225 req-574fba45-bbac-475e-a80b-c00ed1eddc51 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:48 np0005593233 nova_compute[222017]: 2026-01-23 09:55:48.732 222021 DEBUG oslo_concurrency.lockutils [req-cec61995-042c-4db5-ab6e-22b751dfe225 req-574fba45-bbac-475e-a80b-c00ed1eddc51 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:48 np0005593233 nova_compute[222017]: 2026-01-23 09:55:48.732 222021 DEBUG nova.compute.manager [req-cec61995-042c-4db5-ab6e-22b751dfe225 req-574fba45-bbac-475e-a80b-c00ed1eddc51 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] No waiting events found dispatching network-vif-plugged-1e83b219-c51b-488a-8fc7-8240efb384c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:48 np0005593233 nova_compute[222017]: 2026-01-23 09:55:48.732 222021 WARNING nova.compute.manager [req-cec61995-042c-4db5-ab6e-22b751dfe225 req-574fba45-bbac-475e-a80b-c00ed1eddc51 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received unexpected event network-vif-plugged-1e83b219-c51b-488a-8fc7-8240efb384c0 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:55:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:48.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:48.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:50 np0005593233 nova_compute[222017]: 2026-01-23 09:55:50.105 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [{"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "8f671859-24dc-4140-915c-bbad6f16e0d8", "address": "fa:16:3e:7c:55:dc", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f671859-24", "ovs_interfaceid": "8f671859-24dc-4140-915c-bbad6f16e0d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:50 np0005593233 nova_compute[222017]: 2026-01-23 09:55:50.162 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-c4100b68-be14-4cd7-8243-2c9a793caa5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:50 np0005593233 nova_compute[222017]: 2026-01-23 09:55:50.163 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:55:50 np0005593233 nova_compute[222017]: 2026-01-23 09:55:50.299 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:50 np0005593233 nova_compute[222017]: 2026-01-23 09:55:50.427 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:50.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:50.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:52.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:52.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:54 np0005593233 nova_compute[222017]: 2026-01-23 09:55:54.157 222021 DEBUG nova.compute.manager [req-720ebd1b-e3ce-46a3-9cda-821558fd2b78 req-5a7ce1e3-6228-4014-8e3a-2a1198acc38b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-deleted-8f671859-24dc-4140-915c-bbad6f16e0d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:54 np0005593233 nova_compute[222017]: 2026-01-23 09:55:54.158 222021 INFO nova.compute.manager [req-720ebd1b-e3ce-46a3-9cda-821558fd2b78 req-5a7ce1e3-6228-4014-8e3a-2a1198acc38b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Neutron deleted interface 8f671859-24dc-4140-915c-bbad6f16e0d8; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:55:54 np0005593233 nova_compute[222017]: 2026-01-23 09:55:54.158 222021 DEBUG nova.network.neutron [req-720ebd1b-e3ce-46a3-9cda-821558fd2b78 req-5a7ce1e3-6228-4014-8e3a-2a1198acc38b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [{"id": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "address": "fa:16:3e:bf:32:35", "network": {"id": "6a690804-4ecf-4c63-9b31-acfe5ecf9a3a", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest-1804406247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6216c8-fb", "ovs_interfaceid": "0c6216c8-fb3f-49e9-a5c5-6eb26d39b408", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:54 np0005593233 nova_compute[222017]: 2026-01-23 09:55:54.230 222021 DEBUG nova.compute.manager [req-720ebd1b-e3ce-46a3-9cda-821558fd2b78 req-5a7ce1e3-6228-4014-8e3a-2a1198acc38b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Detach interface failed, port_id=8f671859-24dc-4140-915c-bbad6f16e0d8, reason: Instance c4100b68-be14-4cd7-8243-2c9a793caa5f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:55:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:54.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:54.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:55 np0005593233 nova_compute[222017]: 2026-01-23 09:55:55.158 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:55 np0005593233 nova_compute[222017]: 2026-01-23 09:55:55.158 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:55 np0005593233 nova_compute[222017]: 2026-01-23 09:55:55.331 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:55 np0005593233 nova_compute[222017]: 2026-01-23 09:55:55.428 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:56 np0005593233 nova_compute[222017]: 2026-01-23 09:55:56.395 222021 DEBUG nova.compute.manager [req-11b4b0a4-423f-466f-88fa-b34c8ed7e30f req-0e255abd-15bb-4996-9e2f-d26e92936e9f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-deleted-0c6216c8-fb3f-49e9-a5c5-6eb26d39b408 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:56 np0005593233 nova_compute[222017]: 2026-01-23 09:55:56.396 222021 INFO nova.compute.manager [req-11b4b0a4-423f-466f-88fa-b34c8ed7e30f req-0e255abd-15bb-4996-9e2f-d26e92936e9f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Neutron deleted interface 0c6216c8-fb3f-49e9-a5c5-6eb26d39b408; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:55:56 np0005593233 nova_compute[222017]: 2026-01-23 09:55:56.396 222021 DEBUG nova.network.neutron [req-11b4b0a4-423f-466f-88fa-b34c8ed7e30f req-0e255abd-15bb-4996-9e2f-d26e92936e9f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [{"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "address": "fa:16:3e:6a:da:72", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2358fe4c-65", "ovs_interfaceid": "2358fe4c-654b-4b88-9e08-2e85688cb00e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:56 np0005593233 nova_compute[222017]: 2026-01-23 09:55:56.434 222021 DEBUG nova.compute.manager [req-11b4b0a4-423f-466f-88fa-b34c8ed7e30f req-0e255abd-15bb-4996-9e2f-d26e92936e9f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Detach interface failed, port_id=0c6216c8-fb3f-49e9-a5c5-6eb26d39b408, reason: Instance c4100b68-be14-4cd7-8243-2c9a793caa5f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:55:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:56.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:55:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:57.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:55:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:58.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:55:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:59.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.050 222021 DEBUG nova.compute.manager [req-da8ed382-520d-4748-b4ae-a80e1d00fbd4 req-a0eb4d58-b815-4172-9e43-763d1134a184 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-deleted-2358fe4c-654b-4b88-9e08-2e85688cb00e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.050 222021 INFO nova.compute.manager [req-da8ed382-520d-4748-b4ae-a80e1d00fbd4 req-a0eb4d58-b815-4172-9e43-763d1134a184 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Neutron deleted interface 2358fe4c-654b-4b88-9e08-2e85688cb00e; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.051 222021 DEBUG nova.network.neutron [req-da8ed382-520d-4748-b4ae-a80e1d00fbd4 req-a0eb4d58-b815-4172-9e43-763d1134a184 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [{"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "address": "fa:16:3e:e3:af:d4", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.246", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8adc9155-eb", "ovs_interfaceid": "8adc9155-eb6d-41ea-ae59-7de12ff3fc5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.075 222021 DEBUG nova.compute.manager [req-da8ed382-520d-4748-b4ae-a80e1d00fbd4 req-a0eb4d58-b815-4172-9e43-763d1134a184 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Detach interface failed, port_id=2358fe4c-654b-4b88-9e08-2e85688cb00e, reason: Instance c4100b68-be14-4cd7-8243-2c9a793caa5f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.076 222021 DEBUG nova.compute.manager [req-da8ed382-520d-4748-b4ae-a80e1d00fbd4 req-a0eb4d58-b815-4172-9e43-763d1134a184 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-deleted-8adc9155-eb6d-41ea-ae59-7de12ff3fc5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.076 222021 INFO nova.compute.manager [req-da8ed382-520d-4748-b4ae-a80e1d00fbd4 req-a0eb4d58-b815-4172-9e43-763d1134a184 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Neutron deleted interface 8adc9155-eb6d-41ea-ae59-7de12ff3fc5a; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.077 222021 DEBUG nova.network.neutron [req-da8ed382-520d-4748-b4ae-a80e1d00fbd4 req-a0eb4d58-b815-4172-9e43-763d1134a184 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [{"id": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "address": "fa:16:3e:53:6d:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.86", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08ec741b-b5", "ovs_interfaceid": "08ec741b-b592-417f-9a64-1ee4d2e4e006", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "address": "fa:16:3e:12:a4:39", "network": {"id": "9a939889-e790-45a5-a577-4dfb1df5c2f4", "bridge": "br-int", "label": "tempest-device-tagging-net1-454868639", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7e6d8c9-43", "ovs_interfaceid": "c7e6d8c9-43b5-49f5-a3ef-18ccd56a8f5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "1e83b219-c51b-488a-8fc7-8240efb384c0", "address": "fa:16:3e:6d:d6:55", "network": {"id": "334ca2e2-bd22-482d-8aad-3c18e88d90a5", "bridge": "br-int", "label": "tempest-device-tagging-net2-853859176", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9bc2d47d48c446c7ae1fc44cd9c32878", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e83b219-c5", "ovs_interfaceid": "1e83b219-c51b-488a-8fc7-8240efb384c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.104 222021 DEBUG nova.compute.manager [req-da8ed382-520d-4748-b4ae-a80e1d00fbd4 req-a0eb4d58-b815-4172-9e43-763d1134a184 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Detach interface failed, port_id=8adc9155-eb6d-41ea-ae59-7de12ff3fc5a, reason: Instance c4100b68-be14-4cd7-8243-2c9a793caa5f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.445 222021 DEBUG nova.network.neutron [-] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:59 np0005593233 nova_compute[222017]: 2026-01-23 09:55:59.467 222021 INFO nova.compute.manager [-] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Took 13.23 seconds to deallocate network for instance.#033[00m
Jan 23 04:56:00 np0005593233 nova_compute[222017]: 2026-01-23 09:56:00.269 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162145.2676234, c4100b68-be14-4cd7-8243-2c9a793caa5f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:56:00 np0005593233 nova_compute[222017]: 2026-01-23 09:56:00.270 222021 INFO nova.compute.manager [-] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:56:00 np0005593233 nova_compute[222017]: 2026-01-23 09:56:00.308 222021 DEBUG nova.compute.manager [None req-6eeea5d0-72a6-47fe-94e4-97ffcab98569 - - - - - -] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:56:00 np0005593233 nova_compute[222017]: 2026-01-23 09:56:00.333 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:00 np0005593233 nova_compute[222017]: 2026-01-23 09:56:00.417 222021 INFO nova.compute.manager [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Took 0.95 seconds to detach 3 volumes for instance.#033[00m
Jan 23 04:56:00 np0005593233 nova_compute[222017]: 2026-01-23 09:56:00.429 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:00 np0005593233 nova_compute[222017]: 2026-01-23 09:56:00.481 222021 DEBUG oslo_concurrency.lockutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:00 np0005593233 nova_compute[222017]: 2026-01-23 09:56:00.482 222021 DEBUG oslo_concurrency.lockutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:00 np0005593233 nova_compute[222017]: 2026-01-23 09:56:00.558 222021 DEBUG oslo_concurrency.processutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:00.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:01.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:56:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2217861235' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:56:01 np0005593233 nova_compute[222017]: 2026-01-23 09:56:01.058 222021 DEBUG oslo_concurrency.processutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:01 np0005593233 nova_compute[222017]: 2026-01-23 09:56:01.069 222021 DEBUG nova.compute.provider_tree [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:56:01 np0005593233 nova_compute[222017]: 2026-01-23 09:56:01.094 222021 DEBUG nova.scheduler.client.report [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:56:01 np0005593233 nova_compute[222017]: 2026-01-23 09:56:01.137 222021 DEBUG oslo_concurrency.lockutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:01 np0005593233 nova_compute[222017]: 2026-01-23 09:56:01.204 222021 INFO nova.scheduler.client.report [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Deleted allocations for instance c4100b68-be14-4cd7-8243-2c9a793caa5f#033[00m
Jan 23 04:56:01 np0005593233 nova_compute[222017]: 2026-01-23 09:56:01.222 222021 DEBUG nova.compute.manager [req-b65a6929-dd51-4cd1-825d-8bf149a71062 req-d00f4128-4b54-4a0f-a2b5-a9bed32d2973 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c4100b68-be14-4cd7-8243-2c9a793caa5f] Received event network-vif-deleted-1e83b219-c51b-488a-8fc7-8240efb384c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:56:01 np0005593233 nova_compute[222017]: 2026-01-23 09:56:01.331 222021 DEBUG oslo_concurrency.lockutils [None req-36e979b3-6cfc-4948-aa72-f75fc72a3da9 d512838ce2b44554b0566fdbb3c702b4 9bc2d47d48c446c7ae1fc44cd9c32878 - - default default] Lock "c4100b68-be14-4cd7-8243-2c9a793caa5f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 17.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:02.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:03.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:04.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:05.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:05 np0005593233 podman[256987]: 2026-01-23 09:56:05.129266351 +0000 UTC m=+0.138372200 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:56:05 np0005593233 nova_compute[222017]: 2026-01-23 09:56:05.335 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:05 np0005593233 nova_compute[222017]: 2026-01-23 09:56:05.431 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:06.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:07.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:08.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:09.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:10 np0005593233 nova_compute[222017]: 2026-01-23 09:56:10.338 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:10 np0005593233 nova_compute[222017]: 2026-01-23 09:56:10.433 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:10.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:11.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:12.080 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:56:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:12.081 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:56:12 np0005593233 nova_compute[222017]: 2026-01-23 09:56:12.081 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:12.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:13.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:14.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:15.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:15 np0005593233 nova_compute[222017]: 2026-01-23 09:56:15.375 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:15 np0005593233 nova_compute[222017]: 2026-01-23 09:56:15.435 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:16.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:17.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:18 np0005593233 podman[257016]: 2026-01-23 09:56:18.051341545 +0000 UTC m=+0.055769114 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 23 04:56:18 np0005593233 nova_compute[222017]: 2026-01-23 09:56:18.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:18.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:19.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:20 np0005593233 nova_compute[222017]: 2026-01-23 09:56:20.379 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:20 np0005593233 nova_compute[222017]: 2026-01-23 09:56:20.436 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:20.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:21.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:21.083 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:56:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:22.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:23.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:24 np0005593233 nova_compute[222017]: 2026-01-23 09:56:24.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:24.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:25.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:25 np0005593233 nova_compute[222017]: 2026-01-23 09:56:25.382 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:25 np0005593233 nova_compute[222017]: 2026-01-23 09:56:25.438 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:25 np0005593233 nova_compute[222017]: 2026-01-23 09:56:25.913 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquiring lock "6f55b474-788d-4957-9995-50c03af64105" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:25 np0005593233 nova_compute[222017]: 2026-01-23 09:56:25.914 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:25 np0005593233 nova_compute[222017]: 2026-01-23 09:56:25.945 222021 DEBUG nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:56:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.110 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.111 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.121 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.122 222021 INFO nova.compute.claims [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.327 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.450 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:56:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3162798154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.809 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.814 222021 DEBUG nova.compute.provider_tree [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:56:26 np0005593233 nova_compute[222017]: 2026-01-23 09:56:26.834 222021 DEBUG nova.scheduler.client.report [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:56:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:26.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:27.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.067 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.956s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.068 222021 DEBUG nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.071 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.071 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.071 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.072 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:56:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4218284091' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.526 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.729 222021 DEBUG nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.730 222021 DEBUG nova.network.neutron [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.754 222021 INFO nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.788 222021 DEBUG nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.799 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.799 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4643MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.800 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.800 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.927 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 6f55b474-788d-4957-9995-50c03af64105 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.928 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.928 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:56:27 np0005593233 nova_compute[222017]: 2026-01-23 09:56:27.980 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.081 222021 DEBUG nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.083 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.084 222021 INFO nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Creating image(s)#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.115 222021 DEBUG nova.storage.rbd_utils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] rbd image 6f55b474-788d-4957-9995-50c03af64105_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.144 222021 DEBUG nova.storage.rbd_utils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] rbd image 6f55b474-788d-4957-9995-50c03af64105_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.179 222021 DEBUG nova.storage.rbd_utils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] rbd image 6f55b474-788d-4957-9995-50c03af64105_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.183 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.279 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.281 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.282 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.283 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.331 222021 DEBUG nova.storage.rbd_utils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] rbd image 6f55b474-788d-4957-9995-50c03af64105_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.337 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 6f55b474-788d-4957-9995-50c03af64105_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.425 222021 DEBUG nova.policy [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e3fbf576a49e4d37ac2a826cce5ae7c8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '18d7959ce7f646d6a690e5976ef97cb3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:56:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:56:28 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1660414020' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.470 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.479 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.696 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 6f55b474-788d-4957-9995-50c03af64105_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.359s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.779 222021 DEBUG nova.storage.rbd_utils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] resizing rbd image 6f55b474-788d-4957-9995-50c03af64105_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.906 222021 DEBUG nova.objects.instance [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lazy-loading 'migration_context' on Instance uuid 6f55b474-788d-4957-9995-50c03af64105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:56:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:28.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:28 np0005593233 nova_compute[222017]: 2026-01-23 09:56:28.970 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:56:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:29.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:29 np0005593233 nova_compute[222017]: 2026-01-23 09:56:29.055 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:56:29 np0005593233 nova_compute[222017]: 2026-01-23 09:56:29.055 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Ensure instance console log exists: /var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:56:29 np0005593233 nova_compute[222017]: 2026-01-23 09:56:29.056 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:29 np0005593233 nova_compute[222017]: 2026-01-23 09:56:29.056 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:29 np0005593233 nova_compute[222017]: 2026-01-23 09:56:29.057 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #88. Immutable memtables: 0.
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:29.906961) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 53] Flushing memtable with next log file: 88
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162189907114, "job": 53, "event": "flush_started", "num_memtables": 1, "num_entries": 2401, "num_deletes": 254, "total_data_size": 5656606, "memory_usage": 5754992, "flush_reason": "Manual Compaction"}
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 53] Level-0 flush table #89: started
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162189962191, "cf_name": "default", "job": 53, "event": "table_file_creation", "file_number": 89, "file_size": 3712948, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44036, "largest_seqno": 46432, "table_properties": {"data_size": 3703079, "index_size": 6235, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20882, "raw_average_key_size": 20, "raw_value_size": 3683228, "raw_average_value_size": 3653, "num_data_blocks": 271, "num_entries": 1008, "num_filter_entries": 1008, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161984, "oldest_key_time": 1769161984, "file_creation_time": 1769162189, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 89, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 53] Flush lasted 55217 microseconds, and 11037 cpu microseconds.
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:29.962258) [db/flush_job.cc:967] [default] [JOB 53] Level-0 flush table #89: 3712948 bytes OK
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:29.962294) [db/memtable_list.cc:519] [default] Level-0 commit table #89 started
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:29.965892) [db/memtable_list.cc:722] [default] Level-0 commit table #89: memtable #1 done
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:29.965927) EVENT_LOG_v1 {"time_micros": 1769162189965904, "job": 53, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:29.965952) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 53] Try to delete WAL files size 5645785, prev total WAL file size 5645785, number of live WAL files 2.
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000085.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:29.967880) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033373635' seq:72057594037927935, type:22 .. '7061786F730034303137' seq:0, type:0; will stop at (end)
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 54] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 53 Base level 0, inputs: [89(3625KB)], [87(9511KB)]
Jan 23 04:56:29 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162189968092, "job": 54, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [89], "files_L6": [87], "score": -1, "input_data_size": 13452484, "oldest_snapshot_seqno": -1}
Jan 23 04:56:30 np0005593233 nova_compute[222017]: 2026-01-23 09:56:30.050 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:56:30 np0005593233 nova_compute[222017]: 2026-01-23 09:56:30.051 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 54] Generated table #90: 7031 keys, 11522532 bytes, temperature: kUnknown
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162190073623, "cf_name": "default", "job": 54, "event": "table_file_creation", "file_number": 90, "file_size": 11522532, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11474444, "index_size": 29416, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17605, "raw_key_size": 180837, "raw_average_key_size": 25, "raw_value_size": 11347467, "raw_average_value_size": 1613, "num_data_blocks": 1169, "num_entries": 7031, "num_filter_entries": 7031, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769162189, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 90, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:30.074231) [db/compaction/compaction_job.cc:1663] [default] [JOB 54] Compacted 1@0 + 1@6 files to L6 => 11522532 bytes
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:30.075703) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 127.0 rd, 108.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 9.3 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(6.7) write-amplify(3.1) OK, records in: 7556, records dropped: 525 output_compression: NoCompression
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:30.075723) EVENT_LOG_v1 {"time_micros": 1769162190075713, "job": 54, "event": "compaction_finished", "compaction_time_micros": 105895, "compaction_time_cpu_micros": 39943, "output_level": 6, "num_output_files": 1, "total_output_size": 11522532, "num_input_records": 7556, "num_output_records": 7031, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000089.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162190077073, "job": 54, "event": "table_file_deletion", "file_number": 89}
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000087.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162190079407, "job": 54, "event": "table_file_deletion", "file_number": 87}
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:29.967621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:30.079692) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:30.079709) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:30.079715) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:30.079718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:56:30.079723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593233 nova_compute[222017]: 2026-01-23 09:56:30.384 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:30 np0005593233 nova_compute[222017]: 2026-01-23 09:56:30.441 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:30.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.051 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.052 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.052 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.052 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:56:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:31.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.064 222021 DEBUG nova.network.neutron [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Successfully created port: 9d36ace4-3322-4523-86d1-eac3c616f01e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.427 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:56:31 np0005593233 nova_compute[222017]: 2026-01-23 09:56:31.428 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:56:32 np0005593233 nova_compute[222017]: 2026-01-23 09:56:32.651 222021 DEBUG nova.network.neutron [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Successfully updated port: 9d36ace4-3322-4523-86d1-eac3c616f01e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:56:32 np0005593233 nova_compute[222017]: 2026-01-23 09:56:32.689 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquiring lock "refresh_cache-6f55b474-788d-4957-9995-50c03af64105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:56:32 np0005593233 nova_compute[222017]: 2026-01-23 09:56:32.690 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquired lock "refresh_cache-6f55b474-788d-4957-9995-50c03af64105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:56:32 np0005593233 nova_compute[222017]: 2026-01-23 09:56:32.690 222021 DEBUG nova.network.neutron [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:56:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:32.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:32 np0005593233 nova_compute[222017]: 2026-01-23 09:56:32.940 222021 DEBUG nova.compute.manager [req-d59abf6a-e0b6-4e08-9eb1-3c61762d1572 req-27984dd1-3950-42b7-84c3-eede0f08468e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received event network-changed-9d36ace4-3322-4523-86d1-eac3c616f01e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:56:32 np0005593233 nova_compute[222017]: 2026-01-23 09:56:32.941 222021 DEBUG nova.compute.manager [req-d59abf6a-e0b6-4e08-9eb1-3c61762d1572 req-27984dd1-3950-42b7-84c3-eede0f08468e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Refreshing instance network info cache due to event network-changed-9d36ace4-3322-4523-86d1-eac3c616f01e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:56:32 np0005593233 nova_compute[222017]: 2026-01-23 09:56:32.941 222021 DEBUG oslo_concurrency.lockutils [req-d59abf6a-e0b6-4e08-9eb1-3c61762d1572 req-27984dd1-3950-42b7-84c3-eede0f08468e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6f55b474-788d-4957-9995-50c03af64105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:56:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:33.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:33 np0005593233 nova_compute[222017]: 2026-01-23 09:56:33.115 222021 DEBUG nova.network.neutron [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:56:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:56:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:56:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:56:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:34.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:35.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.386 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.422 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.443 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.711 222021 DEBUG nova.network.neutron [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Updating instance_info_cache with network_info: [{"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.742 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Releasing lock "refresh_cache-6f55b474-788d-4957-9995-50c03af64105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.742 222021 DEBUG nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Instance network_info: |[{"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.743 222021 DEBUG oslo_concurrency.lockutils [req-d59abf6a-e0b6-4e08-9eb1-3c61762d1572 req-27984dd1-3950-42b7-84c3-eede0f08468e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6f55b474-788d-4957-9995-50c03af64105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.743 222021 DEBUG nova.network.neutron [req-d59abf6a-e0b6-4e08-9eb1-3c61762d1572 req-27984dd1-3950-42b7-84c3-eede0f08468e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Refreshing network info cache for port 9d36ace4-3322-4523-86d1-eac3c616f01e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.747 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Start _get_guest_xml network_info=[{"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.752 222021 WARNING nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.758 222021 DEBUG nova.virt.libvirt.host [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.759 222021 DEBUG nova.virt.libvirt.host [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.764 222021 DEBUG nova.virt.libvirt.host [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.764 222021 DEBUG nova.virt.libvirt.host [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.766 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.766 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.766 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.766 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.767 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.767 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.767 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.767 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.767 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.767 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.767 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.768 222021 DEBUG nova.virt.hardware [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:56:35 np0005593233 nova_compute[222017]: 2026-01-23 09:56:35.770 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:36 np0005593233 podman[257419]: 2026-01-23 09:56:36.103589142 +0000 UTC m=+0.111269510 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 23 04:56:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:56:36 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3691020289' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.269 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.300 222021 DEBUG nova.storage.rbd_utils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] rbd image 6f55b474-788d-4957-9995-50c03af64105_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.306 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:56:36 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2851571903' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.783 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.787 222021 DEBUG nova.virt.libvirt.vif [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:56:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-308505705',display_name='tempest-ServersTestManualDisk-server-308505705',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-308505705',id=86,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOU8eQR+8Y9RC/S/pQ59ooN7JT/YeHcnzHqPJxmt+RjQc8s2N+KyYphVd5+7IEv+C/C3rcyOoMwEU2u2hGPpN/qjblNtMeO8XAXAacoj0c+QTX1b6zq+yD+gEoxZKMYh3g==',key_name='tempest-keypair-1962091146',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='18d7959ce7f646d6a690e5976ef97cb3',ramdisk_id='',reservation_id='r-tet96c7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1015917862',owner_user_name='tempest-ServersTestManualDisk-1015917862-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:56:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e3fbf576a49e4d37ac2a826cce5ae7c8',uuid=6f55b474-788d-4957-9995-50c03af64105,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.788 222021 DEBUG nova.network.os_vif_util [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Converting VIF {"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.790 222021 DEBUG nova.network.os_vif_util [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:f6:e6,bridge_name='br-int',has_traffic_filtering=True,id=9d36ace4-3322-4523-86d1-eac3c616f01e,network=Network(e5f1d16a-8f59-44b1-a9ed-dc65f415ceac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d36ace4-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.793 222021 DEBUG nova.objects.instance [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6f55b474-788d-4957-9995-50c03af64105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.814 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <uuid>6f55b474-788d-4957-9995-50c03af64105</uuid>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <name>instance-00000056</name>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersTestManualDisk-server-308505705</nova:name>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:56:35</nova:creationTime>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <nova:user uuid="e3fbf576a49e4d37ac2a826cce5ae7c8">tempest-ServersTestManualDisk-1015917862-project-member</nova:user>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <nova:project uuid="18d7959ce7f646d6a690e5976ef97cb3">tempest-ServersTestManualDisk-1015917862</nova:project>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <nova:port uuid="9d36ace4-3322-4523-86d1-eac3c616f01e">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <entry name="serial">6f55b474-788d-4957-9995-50c03af64105</entry>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <entry name="uuid">6f55b474-788d-4957-9995-50c03af64105</entry>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/6f55b474-788d-4957-9995-50c03af64105_disk">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/6f55b474-788d-4957-9995-50c03af64105_disk.config">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:f1:f6:e6"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <target dev="tap9d36ace4-33"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105/console.log" append="off"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:56:36 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:56:36 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:56:36 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:56:36 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.815 222021 DEBUG nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Preparing to wait for external event network-vif-plugged-9d36ace4-3322-4523-86d1-eac3c616f01e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.816 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquiring lock "6f55b474-788d-4957-9995-50c03af64105-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.816 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.816 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.818 222021 DEBUG nova.virt.libvirt.vif [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:56:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-308505705',display_name='tempest-ServersTestManualDisk-server-308505705',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-308505705',id=86,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOU8eQR+8Y9RC/S/pQ59ooN7JT/YeHcnzHqPJxmt+RjQc8s2N+KyYphVd5+7IEv+C/C3rcyOoMwEU2u2hGPpN/qjblNtMeO8XAXAacoj0c+QTX1b6zq+yD+gEoxZKMYh3g==',key_name='tempest-keypair-1962091146',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='18d7959ce7f646d6a690e5976ef97cb3',ramdisk_id='',reservation_id='r-tet96c7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1015917862',owner_user_name='tempest-ServersTestManualDisk-1015917862-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:56:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e3fbf576a49e4d37ac2a826cce5ae7c8',uuid=6f55b474-788d-4957-9995-50c03af64105,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.818 222021 DEBUG nova.network.os_vif_util [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Converting VIF {"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.819 222021 DEBUG nova.network.os_vif_util [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:f6:e6,bridge_name='br-int',has_traffic_filtering=True,id=9d36ace4-3322-4523-86d1-eac3c616f01e,network=Network(e5f1d16a-8f59-44b1-a9ed-dc65f415ceac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d36ace4-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.819 222021 DEBUG os_vif [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:f6:e6,bridge_name='br-int',has_traffic_filtering=True,id=9d36ace4-3322-4523-86d1-eac3c616f01e,network=Network(e5f1d16a-8f59-44b1-a9ed-dc65f415ceac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d36ace4-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.820 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.821 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.822 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.827 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.828 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d36ace4-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.828 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9d36ace4-33, col_values=(('external_ids', {'iface-id': '9d36ace4-3322-4523-86d1-eac3c616f01e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:f6:e6', 'vm-uuid': '6f55b474-788d-4957-9995-50c03af64105'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:56:36 np0005593233 NetworkManager[48871]: <info>  [1769162196.8323] manager: (tap9d36ace4-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.835 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.839 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:36 np0005593233 nova_compute[222017]: 2026-01-23 09:56:36.840 222021 INFO os_vif [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:f6:e6,bridge_name='br-int',has_traffic_filtering=True,id=9d36ace4-3322-4523-86d1-eac3c616f01e,network=Network(e5f1d16a-8f59-44b1-a9ed-dc65f415ceac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d36ace4-33')#033[00m
Jan 23 04:56:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:36.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.054 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.055 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.055 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] No VIF found with MAC fa:16:3e:f1:f6:e6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.056 222021 INFO nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Using config drive#033[00m
Jan 23 04:56:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:37.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.088 222021 DEBUG nova.storage.rbd_utils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] rbd image 6f55b474-788d-4957-9995-50c03af64105_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.746 222021 INFO nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Creating config drive at /var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105/disk.config#033[00m
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.753 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4d2d4yk5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.892 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4d2d4yk5" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.931 222021 DEBUG nova.storage.rbd_utils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] rbd image 6f55b474-788d-4957-9995-50c03af64105_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:56:37 np0005593233 nova_compute[222017]: 2026-01-23 09:56:37.936 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105/disk.config 6f55b474-788d-4957-9995-50c03af64105_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.114 222021 DEBUG oslo_concurrency.processutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105/disk.config 6f55b474-788d-4957-9995-50c03af64105_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.179s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.116 222021 INFO nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Deleting local config drive /var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105/disk.config because it was imported into RBD.#033[00m
Jan 23 04:56:38 np0005593233 kernel: tap9d36ace4-33: entered promiscuous mode
Jan 23 04:56:38 np0005593233 NetworkManager[48871]: <info>  [1769162198.1927] manager: (tap9d36ace4-33): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Jan 23 04:56:38 np0005593233 ovn_controller[130653]: 2026-01-23T09:56:38Z|00377|binding|INFO|Claiming lport 9d36ace4-3322-4523-86d1-eac3c616f01e for this chassis.
Jan 23 04:56:38 np0005593233 ovn_controller[130653]: 2026-01-23T09:56:38Z|00378|binding|INFO|9d36ace4-3322-4523-86d1-eac3c616f01e: Claiming fa:16:3e:f1:f6:e6 10.100.0.10
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.193 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.207 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:f6:e6 10.100.0.10'], port_security=['fa:16:3e:f1:f6:e6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6f55b474-788d-4957-9995-50c03af64105', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '18d7959ce7f646d6a690e5976ef97cb3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cc6ff8d1-bdd6-4a8d-a84a-7fcf0da73fd8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=154f7cd7-0259-4e5f-957a-4a973b009bfe, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=9d36ace4-3322-4523-86d1-eac3c616f01e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.209 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 9d36ace4-3322-4523-86d1-eac3c616f01e in datapath e5f1d16a-8f59-44b1-a9ed-dc65f415ceac bound to our chassis#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.210 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e5f1d16a-8f59-44b1-a9ed-dc65f415ceac#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.213 222021 DEBUG nova.network.neutron [req-d59abf6a-e0b6-4e08-9eb1-3c61762d1572 req-27984dd1-3950-42b7-84c3-eede0f08468e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Updated VIF entry in instance network info cache for port 9d36ace4-3322-4523-86d1-eac3c616f01e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.213 222021 DEBUG nova.network.neutron [req-d59abf6a-e0b6-4e08-9eb1-3c61762d1572 req-27984dd1-3950-42b7-84c3-eede0f08468e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Updating instance_info_cache with network_info: [{"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.222 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6978cf-852f-421b-b1fb-3de192a85dc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.223 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape5f1d16a-81 in ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:56:38 np0005593233 systemd-udevd[257560]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.225 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape5f1d16a-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.226 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4441869f-fb7e-4f3e-82c5-56ace5a2e9f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.227 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[04a4e693-8acb-4eb8-a8ff-1333eeee9a16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 systemd-machined[190954]: New machine qemu-44-instance-00000056.
Jan 23 04:56:38 np0005593233 NetworkManager[48871]: <info>  [1769162198.2433] device (tap9d36ace4-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:56:38 np0005593233 NetworkManager[48871]: <info>  [1769162198.2446] device (tap9d36ace4-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.240 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[c16477b6-df11-4946-811b-da032515d30f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.259 222021 DEBUG oslo_concurrency.lockutils [req-d59abf6a-e0b6-4e08-9eb1-3c61762d1572 req-27984dd1-3950-42b7-84c3-eede0f08468e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6f55b474-788d-4957-9995-50c03af64105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.271 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[62e8640c-e2ab-4fe7-88f2-9244b9fa9a65]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 systemd[1]: Started Virtual Machine qemu-44-instance-00000056.
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.279 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:38 np0005593233 ovn_controller[130653]: 2026-01-23T09:56:38Z|00379|binding|INFO|Setting lport 9d36ace4-3322-4523-86d1-eac3c616f01e ovn-installed in OVS
Jan 23 04:56:38 np0005593233 ovn_controller[130653]: 2026-01-23T09:56:38Z|00380|binding|INFO|Setting lport 9d36ace4-3322-4523-86d1-eac3c616f01e up in Southbound
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.290 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.310 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5a48729e-7a15-4c54-88c0-69e65262f0d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 NetworkManager[48871]: <info>  [1769162198.3182] manager: (tape5f1d16a-80): new Veth device (/org/freedesktop/NetworkManager/Devices/181)
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.317 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7a64191b-97a1-4ed5-ac8f-9a342f1a3e1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.355 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[76e4cd43-5f96-4e1d-aedf-1c171fbafce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.359 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[798ec84f-b486-4542-a1c1-40d9412fdb84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 NetworkManager[48871]: <info>  [1769162198.3808] device (tape5f1d16a-80): carrier: link connected
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.385 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e87a02-229a-49e0-85c6-b299996875dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.403 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ef3bf7cf-c873-4237-9bb2-b6a62b6d3809]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5f1d16a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:9f:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 611101, 'reachable_time': 34334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257593, 'error': None, 'target': 'ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.421 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f5217c47-a9d1-4a8e-851a-32821b97fef0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:9f14'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 611101, 'tstamp': 611101}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257594, 'error': None, 'target': 'ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.447 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d6687ae3-81e1-4c34-8682-f234188c6ab4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5f1d16a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:9f:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 611101, 'reachable_time': 34334, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257595, 'error': None, 'target': 'ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.486 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b209b431-0a83-4644-9a01-50d384002e10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.562 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[72aad017-4c09-45a7-8c13-7ac2cae81b14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.564 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5f1d16a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.565 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.565 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5f1d16a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.567 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:38 np0005593233 kernel: tape5f1d16a-80: entered promiscuous mode
Jan 23 04:56:38 np0005593233 NetworkManager[48871]: <info>  [1769162198.5696] manager: (tape5f1d16a-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.570 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape5f1d16a-80, col_values=(('external_ids', {'iface-id': 'd639003c-0ddd-4298-9a67-1b2dc8b5dc4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.571 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:38 np0005593233 ovn_controller[130653]: 2026-01-23T09:56:38Z|00381|binding|INFO|Releasing lport d639003c-0ddd-4298-9a67-1b2dc8b5dc4c from this chassis (sb_readonly=0)
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.573 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.574 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e5f1d16a-8f59-44b1-a9ed-dc65f415ceac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e5f1d16a-8f59-44b1-a9ed-dc65f415ceac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.575 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ba4867-4a28-4e8f-9e29-dadcf8928e67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.576 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/e5f1d16a-8f59-44b1-a9ed-dc65f415ceac.pid.haproxy
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID e5f1d16a-8f59-44b1-a9ed-dc65f415ceac
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:56:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:38.577 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac', 'env', 'PROCESS_TAG=haproxy-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e5f1d16a-8f59-44b1-a9ed-dc65f415ceac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.586 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.669 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162198.6680198, 6f55b474-788d-4957-9995-50c03af64105 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.670 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] VM Started (Lifecycle Event)#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.706 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.712 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162198.6683052, 6f55b474-788d-4957-9995-50c03af64105 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.713 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.744 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.749 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.775 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:56:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:38.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.946 222021 DEBUG nova.compute.manager [req-dde2c14a-b38a-40a2-a19b-5892cd1a4e9e req-70e3386b-c9d7-4cd8-941a-b2fc7bd0996b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received event network-vif-plugged-9d36ace4-3322-4523-86d1-eac3c616f01e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.948 222021 DEBUG oslo_concurrency.lockutils [req-dde2c14a-b38a-40a2-a19b-5892cd1a4e9e req-70e3386b-c9d7-4cd8-941a-b2fc7bd0996b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6f55b474-788d-4957-9995-50c03af64105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.948 222021 DEBUG oslo_concurrency.lockutils [req-dde2c14a-b38a-40a2-a19b-5892cd1a4e9e req-70e3386b-c9d7-4cd8-941a-b2fc7bd0996b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.949 222021 DEBUG oslo_concurrency.lockutils [req-dde2c14a-b38a-40a2-a19b-5892cd1a4e9e req-70e3386b-c9d7-4cd8-941a-b2fc7bd0996b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.949 222021 DEBUG nova.compute.manager [req-dde2c14a-b38a-40a2-a19b-5892cd1a4e9e req-70e3386b-c9d7-4cd8-941a-b2fc7bd0996b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Processing event network-vif-plugged-9d36ace4-3322-4523-86d1-eac3c616f01e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.950 222021 DEBUG nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.953 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162198.9536386, 6f55b474-788d-4957-9995-50c03af64105 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.954 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.956 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.960 222021 INFO nova.virt.libvirt.driver [-] [instance: 6f55b474-788d-4957-9995-50c03af64105] Instance spawned successfully.#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.960 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.977 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.981 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.991 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.991 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.992 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.992 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.992 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:56:38 np0005593233 nova_compute[222017]: 2026-01-23 09:56:38.993 222021 DEBUG nova.virt.libvirt.driver [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:56:39 np0005593233 nova_compute[222017]: 2026-01-23 09:56:39.004 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:56:39 np0005593233 podman[257669]: 2026-01-23 09:56:39.009459116 +0000 UTC m=+0.066721721 container create 4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:56:39 np0005593233 systemd[1]: Started libpod-conmon-4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4.scope.
Jan 23 04:56:39 np0005593233 podman[257669]: 2026-01-23 09:56:38.974671251 +0000 UTC m=+0.031933896 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:56:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:39.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:39 np0005593233 nova_compute[222017]: 2026-01-23 09:56:39.071 222021 INFO nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Took 10.99 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:56:39 np0005593233 nova_compute[222017]: 2026-01-23 09:56:39.072 222021 DEBUG nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:56:39 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:56:39 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b8b615d40805ba2cdcc50c2a0ca34b1cbc799e129c1dc6dd89de08d886f9465/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:56:39 np0005593233 podman[257669]: 2026-01-23 09:56:39.108594705 +0000 UTC m=+0.165857320 container init 4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 04:56:39 np0005593233 podman[257669]: 2026-01-23 09:56:39.114644455 +0000 UTC m=+0.171907050 container start 4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 04:56:39 np0005593233 neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac[257685]: [NOTICE]   (257689) : New worker (257691) forked
Jan 23 04:56:39 np0005593233 neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac[257685]: [NOTICE]   (257689) : Loading success.
Jan 23 04:56:39 np0005593233 nova_compute[222017]: 2026-01-23 09:56:39.174 222021 INFO nova.compute.manager [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Took 13.11 seconds to build instance.#033[00m
Jan 23 04:56:39 np0005593233 nova_compute[222017]: 2026-01-23 09:56:39.205 222021 DEBUG oslo_concurrency.lockutils [None req-9fc36da8-9068-433e-a8ad-02d4816c8879 e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:40 np0005593233 nova_compute[222017]: 2026-01-23 09:56:40.389 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:40.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:41.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:56:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:56:41 np0005593233 nova_compute[222017]: 2026-01-23 09:56:41.575 222021 DEBUG nova.compute.manager [req-b92ee1d9-f748-43bd-bf97-4bb9d1291d90 req-6e74776a-6a91-45e0-a2ff-9bf276b8916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received event network-vif-plugged-9d36ace4-3322-4523-86d1-eac3c616f01e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:56:41 np0005593233 nova_compute[222017]: 2026-01-23 09:56:41.576 222021 DEBUG oslo_concurrency.lockutils [req-b92ee1d9-f748-43bd-bf97-4bb9d1291d90 req-6e74776a-6a91-45e0-a2ff-9bf276b8916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6f55b474-788d-4957-9995-50c03af64105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:41 np0005593233 nova_compute[222017]: 2026-01-23 09:56:41.577 222021 DEBUG oslo_concurrency.lockutils [req-b92ee1d9-f748-43bd-bf97-4bb9d1291d90 req-6e74776a-6a91-45e0-a2ff-9bf276b8916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:41 np0005593233 nova_compute[222017]: 2026-01-23 09:56:41.577 222021 DEBUG oslo_concurrency.lockutils [req-b92ee1d9-f748-43bd-bf97-4bb9d1291d90 req-6e74776a-6a91-45e0-a2ff-9bf276b8916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:41 np0005593233 nova_compute[222017]: 2026-01-23 09:56:41.577 222021 DEBUG nova.compute.manager [req-b92ee1d9-f748-43bd-bf97-4bb9d1291d90 req-6e74776a-6a91-45e0-a2ff-9bf276b8916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] No waiting events found dispatching network-vif-plugged-9d36ace4-3322-4523-86d1-eac3c616f01e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:56:41 np0005593233 nova_compute[222017]: 2026-01-23 09:56:41.577 222021 WARNING nova.compute.manager [req-b92ee1d9-f748-43bd-bf97-4bb9d1291d90 req-6e74776a-6a91-45e0-a2ff-9bf276b8916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received unexpected event network-vif-plugged-9d36ace4-3322-4523-86d1-eac3c616f01e for instance with vm_state active and task_state None.#033[00m
Jan 23 04:56:41 np0005593233 nova_compute[222017]: 2026-01-23 09:56:41.831 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:42.658 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:42.658 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:56:42.659 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:42 np0005593233 nova_compute[222017]: 2026-01-23 09:56:42.909 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:42 np0005593233 NetworkManager[48871]: <info>  [1769162202.9106] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 23 04:56:42 np0005593233 NetworkManager[48871]: <info>  [1769162202.9121] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Jan 23 04:56:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:42.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:43.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:43 np0005593233 nova_compute[222017]: 2026-01-23 09:56:43.087 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:56:43Z|00382|binding|INFO|Releasing lport d639003c-0ddd-4298-9a67-1b2dc8b5dc4c from this chassis (sb_readonly=0)
Jan 23 04:56:43 np0005593233 nova_compute[222017]: 2026-01-23 09:56:43.112 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:43 np0005593233 nova_compute[222017]: 2026-01-23 09:56:43.930 222021 DEBUG nova.compute.manager [req-5075d306-da18-43b9-8126-479a7cc7b987 req-74772434-9c5e-4c62-9d82-4d718440977f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received event network-changed-9d36ace4-3322-4523-86d1-eac3c616f01e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:56:43 np0005593233 nova_compute[222017]: 2026-01-23 09:56:43.931 222021 DEBUG nova.compute.manager [req-5075d306-da18-43b9-8126-479a7cc7b987 req-74772434-9c5e-4c62-9d82-4d718440977f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Refreshing instance network info cache due to event network-changed-9d36ace4-3322-4523-86d1-eac3c616f01e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:56:43 np0005593233 nova_compute[222017]: 2026-01-23 09:56:43.931 222021 DEBUG oslo_concurrency.lockutils [req-5075d306-da18-43b9-8126-479a7cc7b987 req-74772434-9c5e-4c62-9d82-4d718440977f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6f55b474-788d-4957-9995-50c03af64105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:56:43 np0005593233 nova_compute[222017]: 2026-01-23 09:56:43.931 222021 DEBUG oslo_concurrency.lockutils [req-5075d306-da18-43b9-8126-479a7cc7b987 req-74772434-9c5e-4c62-9d82-4d718440977f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6f55b474-788d-4957-9995-50c03af64105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:56:43 np0005593233 nova_compute[222017]: 2026-01-23 09:56:43.931 222021 DEBUG nova.network.neutron [req-5075d306-da18-43b9-8126-479a7cc7b987 req-74772434-9c5e-4c62-9d82-4d718440977f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Refreshing network info cache for port 9d36ace4-3322-4523-86d1-eac3c616f01e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:56:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:44.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:45.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:45 np0005593233 nova_compute[222017]: 2026-01-23 09:56:45.392 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:46 np0005593233 nova_compute[222017]: 2026-01-23 09:56:46.836 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:46.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:47.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:47 np0005593233 nova_compute[222017]: 2026-01-23 09:56:47.753 222021 DEBUG nova.network.neutron [req-5075d306-da18-43b9-8126-479a7cc7b987 req-74772434-9c5e-4c62-9d82-4d718440977f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Updated VIF entry in instance network info cache for port 9d36ace4-3322-4523-86d1-eac3c616f01e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:56:47 np0005593233 nova_compute[222017]: 2026-01-23 09:56:47.753 222021 DEBUG nova.network.neutron [req-5075d306-da18-43b9-8126-479a7cc7b987 req-74772434-9c5e-4c62-9d82-4d718440977f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Updating instance_info_cache with network_info: [{"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:56:47 np0005593233 nova_compute[222017]: 2026-01-23 09:56:47.795 222021 DEBUG oslo_concurrency.lockutils [req-5075d306-da18-43b9-8126-479a7cc7b987 req-74772434-9c5e-4c62-9d82-4d718440977f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6f55b474-788d-4957-9995-50c03af64105" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:56:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:48.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:49 np0005593233 podman[257751]: 2026-01-23 09:56:49.069955259 +0000 UTC m=+0.071164626 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 23 04:56:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:49.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:50 np0005593233 nova_compute[222017]: 2026-01-23 09:56:50.394 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:50.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:51.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:51 np0005593233 nova_compute[222017]: 2026-01-23 09:56:51.840 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:52.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:53.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:53 np0005593233 ovn_controller[130653]: 2026-01-23T09:56:53Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f1:f6:e6 10.100.0.10
Jan 23 04:56:53 np0005593233 ovn_controller[130653]: 2026-01-23T09:56:53Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f1:f6:e6 10.100.0.10
Jan 23 04:56:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:54.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:55.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:55 np0005593233 nova_compute[222017]: 2026-01-23 09:56:55.397 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:56 np0005593233 nova_compute[222017]: 2026-01-23 09:56:56.842 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:56.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:57.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:56:58 np0005593233 ovn_controller[130653]: 2026-01-23T09:56:58Z|00383|binding|INFO|Releasing lport d639003c-0ddd-4298-9a67-1b2dc8b5dc4c from this chassis (sb_readonly=0)
Jan 23 04:56:58 np0005593233 nova_compute[222017]: 2026-01-23 09:56:58.859 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:58.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:56:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:56:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:59.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:00 np0005593233 nova_compute[222017]: 2026-01-23 09:57:00.401 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:00.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:01.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:01 np0005593233 nova_compute[222017]: 2026-01-23 09:57:01.844 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:02.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:03.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:04.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:05.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:05 np0005593233 nova_compute[222017]: 2026-01-23 09:57:05.403 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:06 np0005593233 nova_compute[222017]: 2026-01-23 09:57:06.848 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:06.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:07 np0005593233 podman[257771]: 2026-01-23 09:57:07.096212828 +0000 UTC m=+0.104224053 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:57:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:07.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:07 np0005593233 nova_compute[222017]: 2026-01-23 09:57:07.991 222021 DEBUG oslo_concurrency.lockutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquiring lock "6f55b474-788d-4957-9995-50c03af64105" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:07 np0005593233 nova_compute[222017]: 2026-01-23 09:57:07.991 222021 DEBUG oslo_concurrency.lockutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:07 np0005593233 nova_compute[222017]: 2026-01-23 09:57:07.991 222021 DEBUG oslo_concurrency.lockutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquiring lock "6f55b474-788d-4957-9995-50c03af64105-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:07 np0005593233 nova_compute[222017]: 2026-01-23 09:57:07.992 222021 DEBUG oslo_concurrency.lockutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:07 np0005593233 nova_compute[222017]: 2026-01-23 09:57:07.992 222021 DEBUG oslo_concurrency.lockutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:07 np0005593233 nova_compute[222017]: 2026-01-23 09:57:07.993 222021 INFO nova.compute.manager [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Terminating instance#033[00m
Jan 23 04:57:07 np0005593233 nova_compute[222017]: 2026-01-23 09:57:07.994 222021 DEBUG nova.compute.manager [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:57:08 np0005593233 kernel: tap9d36ace4-33 (unregistering): left promiscuous mode
Jan 23 04:57:08 np0005593233 NetworkManager[48871]: <info>  [1769162228.2919] device (tap9d36ace4-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:57:08 np0005593233 ovn_controller[130653]: 2026-01-23T09:57:08Z|00384|binding|INFO|Releasing lport 9d36ace4-3322-4523-86d1-eac3c616f01e from this chassis (sb_readonly=0)
Jan 23 04:57:08 np0005593233 ovn_controller[130653]: 2026-01-23T09:57:08Z|00385|binding|INFO|Setting lport 9d36ace4-3322-4523-86d1-eac3c616f01e down in Southbound
Jan 23 04:57:08 np0005593233 ovn_controller[130653]: 2026-01-23T09:57:08Z|00386|binding|INFO|Removing iface tap9d36ace4-33 ovn-installed in OVS
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.304 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.335 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:08 np0005593233 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000056.scope: Deactivated successfully.
Jan 23 04:57:08 np0005593233 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000056.scope: Consumed 14.749s CPU time.
Jan 23 04:57:08 np0005593233 systemd-machined[190954]: Machine qemu-44-instance-00000056 terminated.
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.432 222021 INFO nova.virt.libvirt.driver [-] [instance: 6f55b474-788d-4957-9995-50c03af64105] Instance destroyed successfully.#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.434 222021 DEBUG nova.objects.instance [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lazy-loading 'resources' on Instance uuid 6f55b474-788d-4957-9995-50c03af64105 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.593 222021 DEBUG nova.virt.libvirt.vif [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:56:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-308505705',display_name='tempest-ServersTestManualDisk-server-308505705',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-308505705',id=86,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOU8eQR+8Y9RC/S/pQ59ooN7JT/YeHcnzHqPJxmt+RjQc8s2N+KyYphVd5+7IEv+C/C3rcyOoMwEU2u2hGPpN/qjblNtMeO8XAXAacoj0c+QTX1b6zq+yD+gEoxZKMYh3g==',key_name='tempest-keypair-1962091146',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:56:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='18d7959ce7f646d6a690e5976ef97cb3',ramdisk_id='',reservation_id='r-tet96c7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-1015917862',owner_user_name='tempest-ServersTestManualDisk-1015917862-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:56:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e3fbf576a49e4d37ac2a826cce5ae7c8',uuid=6f55b474-788d-4957-9995-50c03af64105,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.595 222021 DEBUG nova.network.os_vif_util [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Converting VIF {"id": "9d36ace4-3322-4523-86d1-eac3c616f01e", "address": "fa:16:3e:f1:f6:e6", "network": {"id": "e5f1d16a-8f59-44b1-a9ed-dc65f415ceac", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-99550732-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18d7959ce7f646d6a690e5976ef97cb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d36ace4-33", "ovs_interfaceid": "9d36ace4-3322-4523-86d1-eac3c616f01e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.597 222021 DEBUG nova.network.os_vif_util [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f1:f6:e6,bridge_name='br-int',has_traffic_filtering=True,id=9d36ace4-3322-4523-86d1-eac3c616f01e,network=Network(e5f1d16a-8f59-44b1-a9ed-dc65f415ceac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d36ace4-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.598 222021 DEBUG os_vif [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:f6:e6,bridge_name='br-int',has_traffic_filtering=True,id=9d36ace4-3322-4523-86d1-eac3c616f01e,network=Network(e5f1d16a-8f59-44b1-a9ed-dc65f415ceac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d36ace4-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.601 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.602 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d36ace4-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.604 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.607 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.610 222021 INFO os_vif [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:f6:e6,bridge_name='br-int',has_traffic_filtering=True,id=9d36ace4-3322-4523-86d1-eac3c616f01e,network=Network(e5f1d16a-8f59-44b1-a9ed-dc65f415ceac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d36ace4-33')#033[00m
Jan 23 04:57:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:08.694 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:f6:e6 10.100.0.10'], port_security=['fa:16:3e:f1:f6:e6 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6f55b474-788d-4957-9995-50c03af64105', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '18d7959ce7f646d6a690e5976ef97cb3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cc6ff8d1-bdd6-4a8d-a84a-7fcf0da73fd8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.197'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=154f7cd7-0259-4e5f-957a-4a973b009bfe, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=9d36ace4-3322-4523-86d1-eac3c616f01e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:57:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:08.697 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 9d36ace4-3322-4523-86d1-eac3c616f01e in datapath e5f1d16a-8f59-44b1-a9ed-dc65f415ceac unbound from our chassis#033[00m
Jan 23 04:57:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:08.699 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e5f1d16a-8f59-44b1-a9ed-dc65f415ceac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:57:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:08.701 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e8461167-c4fe-4d84-8597-913579822820]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:08.702 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac namespace which is not needed anymore#033[00m
Jan 23 04:57:08 np0005593233 nova_compute[222017]: 2026-01-23 09:57:08.804 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:08.804 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:57:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:08.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:09 np0005593233 neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac[257685]: [NOTICE]   (257689) : haproxy version is 2.8.14-c23fe91
Jan 23 04:57:09 np0005593233 neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac[257685]: [NOTICE]   (257689) : path to executable is /usr/sbin/haproxy
Jan 23 04:57:09 np0005593233 neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac[257685]: [WARNING]  (257689) : Exiting Master process...
Jan 23 04:57:09 np0005593233 neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac[257685]: [ALERT]    (257689) : Current worker (257691) exited with code 143 (Terminated)
Jan 23 04:57:09 np0005593233 neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac[257685]: [WARNING]  (257689) : All workers exited. Exiting... (0)
Jan 23 04:57:09 np0005593233 systemd[1]: libpod-4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4.scope: Deactivated successfully.
Jan 23 04:57:09 np0005593233 podman[257850]: 2026-01-23 09:57:09.02704669 +0000 UTC m=+0.192291571 container died 4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:57:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:09.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:09 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4-userdata-shm.mount: Deactivated successfully.
Jan 23 04:57:09 np0005593233 systemd[1]: var-lib-containers-storage-overlay-3b8b615d40805ba2cdcc50c2a0ca34b1cbc799e129c1dc6dd89de08d886f9465-merged.mount: Deactivated successfully.
Jan 23 04:57:09 np0005593233 podman[257850]: 2026-01-23 09:57:09.194522895 +0000 UTC m=+0.359767716 container cleanup 4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:57:09 np0005593233 podman[257879]: 2026-01-23 09:57:09.327464551 +0000 UTC m=+0.111461485 container remove 4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 04:57:09 np0005593233 systemd[1]: libpod-conmon-4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4.scope: Deactivated successfully.
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.336 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5726b3-b0c6-4f47-90a5-cfbda0290d80]: (4, ('Fri Jan 23 09:57:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac (4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4)\n4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4\nFri Jan 23 09:57:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac (4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4)\n4668a2f89c593da57f4b10afe1921cf307647cc8992ecf8ea63f7fb1c2cdc7a4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.338 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cd22a9cc-8cca-4f9e-9b41-b548f55b42fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.340 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5f1d16a-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:09 np0005593233 kernel: tape5f1d16a-80: left promiscuous mode
Jan 23 04:57:09 np0005593233 nova_compute[222017]: 2026-01-23 09:57:09.344 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:09 np0005593233 nova_compute[222017]: 2026-01-23 09:57:09.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.362 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2ab28a40-b7bd-442a-a818-94875b593f86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.384 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b2c0a4-c5b5-4457-be27-033d71cde4de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.385 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce90ac6-9521-4f4b-9709-1e14aa0bccee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.411 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0af7b33c-3b88-4b17-af88-e05ca78989a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 611093, 'reachable_time': 40043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257895, 'error': None, 'target': 'ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.415 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e5f1d16a-8f59-44b1-a9ed-dc65f415ceac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:57:09 np0005593233 systemd[1]: run-netns-ovnmeta\x2de5f1d16a\x2d8f59\x2d44b1\x2da9ed\x2ddc65f415ceac.mount: Deactivated successfully.
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.416 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[a83c1115-0639-489a-8ec9-5fdcfabcc10d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:09.417 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:57:09 np0005593233 nova_compute[222017]: 2026-01-23 09:57:09.778 222021 INFO nova.virt.libvirt.driver [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Deleting instance files /var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105_del#033[00m
Jan 23 04:57:09 np0005593233 nova_compute[222017]: 2026-01-23 09:57:09.779 222021 INFO nova.virt.libvirt.driver [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Deletion of /var/lib/nova/instances/6f55b474-788d-4957-9995-50c03af64105_del complete#033[00m
Jan 23 04:57:10 np0005593233 nova_compute[222017]: 2026-01-23 09:57:10.405 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:10.419 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:10.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:11.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.612 222021 DEBUG nova.compute.manager [req-72a6b1de-9e5f-47ce-b968-2fb6cf8bb567 req-fc6ea09f-5992-4f91-8614-c83f7244fead 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received event network-vif-unplugged-9d36ace4-3322-4523-86d1-eac3c616f01e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.612 222021 DEBUG oslo_concurrency.lockutils [req-72a6b1de-9e5f-47ce-b968-2fb6cf8bb567 req-fc6ea09f-5992-4f91-8614-c83f7244fead 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6f55b474-788d-4957-9995-50c03af64105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.613 222021 DEBUG oslo_concurrency.lockutils [req-72a6b1de-9e5f-47ce-b968-2fb6cf8bb567 req-fc6ea09f-5992-4f91-8614-c83f7244fead 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.613 222021 DEBUG oslo_concurrency.lockutils [req-72a6b1de-9e5f-47ce-b968-2fb6cf8bb567 req-fc6ea09f-5992-4f91-8614-c83f7244fead 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.614 222021 DEBUG nova.compute.manager [req-72a6b1de-9e5f-47ce-b968-2fb6cf8bb567 req-fc6ea09f-5992-4f91-8614-c83f7244fead 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] No waiting events found dispatching network-vif-unplugged-9d36ace4-3322-4523-86d1-eac3c616f01e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.614 222021 DEBUG nova.compute.manager [req-72a6b1de-9e5f-47ce-b968-2fb6cf8bb567 req-fc6ea09f-5992-4f91-8614-c83f7244fead 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received event network-vif-unplugged-9d36ace4-3322-4523-86d1-eac3c616f01e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.939 222021 INFO nova.compute.manager [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Took 3.95 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.941 222021 DEBUG oslo.service.loopingcall [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.941 222021 DEBUG nova.compute.manager [-] [instance: 6f55b474-788d-4957-9995-50c03af64105] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:57:11 np0005593233 nova_compute[222017]: 2026-01-23 09:57:11.941 222021 DEBUG nova.network.neutron [-] [instance: 6f55b474-788d-4957-9995-50c03af64105] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:57:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:12.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:13.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:13 np0005593233 nova_compute[222017]: 2026-01-23 09:57:13.605 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:14 np0005593233 nova_compute[222017]: 2026-01-23 09:57:14.004 222021 DEBUG nova.compute.manager [req-6e754c1c-454e-4ae5-bc67-a89d6dd92654 req-fa139de1-3e19-41dc-8c92-536c722ae7c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received event network-vif-plugged-9d36ace4-3322-4523-86d1-eac3c616f01e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:57:14 np0005593233 nova_compute[222017]: 2026-01-23 09:57:14.004 222021 DEBUG oslo_concurrency.lockutils [req-6e754c1c-454e-4ae5-bc67-a89d6dd92654 req-fa139de1-3e19-41dc-8c92-536c722ae7c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6f55b474-788d-4957-9995-50c03af64105-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:14 np0005593233 nova_compute[222017]: 2026-01-23 09:57:14.004 222021 DEBUG oslo_concurrency.lockutils [req-6e754c1c-454e-4ae5-bc67-a89d6dd92654 req-fa139de1-3e19-41dc-8c92-536c722ae7c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:14 np0005593233 nova_compute[222017]: 2026-01-23 09:57:14.004 222021 DEBUG oslo_concurrency.lockutils [req-6e754c1c-454e-4ae5-bc67-a89d6dd92654 req-fa139de1-3e19-41dc-8c92-536c722ae7c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:14 np0005593233 nova_compute[222017]: 2026-01-23 09:57:14.005 222021 DEBUG nova.compute.manager [req-6e754c1c-454e-4ae5-bc67-a89d6dd92654 req-fa139de1-3e19-41dc-8c92-536c722ae7c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] No waiting events found dispatching network-vif-plugged-9d36ace4-3322-4523-86d1-eac3c616f01e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:57:14 np0005593233 nova_compute[222017]: 2026-01-23 09:57:14.005 222021 WARNING nova.compute.manager [req-6e754c1c-454e-4ae5-bc67-a89d6dd92654 req-fa139de1-3e19-41dc-8c92-536c722ae7c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received unexpected event network-vif-plugged-9d36ace4-3322-4523-86d1-eac3c616f01e for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:57:14 np0005593233 nova_compute[222017]: 2026-01-23 09:57:14.417 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:14.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:15.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:15 np0005593233 nova_compute[222017]: 2026-01-23 09:57:15.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:16.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:17.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:17 np0005593233 nova_compute[222017]: 2026-01-23 09:57:17.439 222021 DEBUG nova.network.neutron [-] [instance: 6f55b474-788d-4957-9995-50c03af64105] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:57:17 np0005593233 nova_compute[222017]: 2026-01-23 09:57:17.479 222021 INFO nova.compute.manager [-] [instance: 6f55b474-788d-4957-9995-50c03af64105] Took 5.54 seconds to deallocate network for instance.#033[00m
Jan 23 04:57:17 np0005593233 nova_compute[222017]: 2026-01-23 09:57:17.558 222021 DEBUG oslo_concurrency.lockutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:17 np0005593233 nova_compute[222017]: 2026-01-23 09:57:17.559 222021 DEBUG oslo_concurrency.lockutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:17 np0005593233 nova_compute[222017]: 2026-01-23 09:57:17.684 222021 DEBUG nova.compute.manager [req-f3806dd6-fc5c-4fce-b4b8-3edf0351e978 req-f7601a5c-8374-4eee-a3cd-3236e9693900 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6f55b474-788d-4957-9995-50c03af64105] Received event network-vif-deleted-9d36ace4-3322-4523-86d1-eac3c616f01e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:57:17 np0005593233 nova_compute[222017]: 2026-01-23 09:57:17.687 222021 DEBUG oslo_concurrency.processutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:57:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3335486013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:57:18 np0005593233 nova_compute[222017]: 2026-01-23 09:57:18.174 222021 DEBUG oslo_concurrency.processutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:57:18 np0005593233 nova_compute[222017]: 2026-01-23 09:57:18.182 222021 DEBUG nova.compute.provider_tree [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:57:18 np0005593233 nova_compute[222017]: 2026-01-23 09:57:18.217 222021 DEBUG nova.scheduler.client.report [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:57:18 np0005593233 nova_compute[222017]: 2026-01-23 09:57:18.267 222021 DEBUG oslo_concurrency.lockutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:18 np0005593233 nova_compute[222017]: 2026-01-23 09:57:18.408 222021 INFO nova.scheduler.client.report [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Deleted allocations for instance 6f55b474-788d-4957-9995-50c03af64105#033[00m
Jan 23 04:57:18 np0005593233 nova_compute[222017]: 2026-01-23 09:57:18.532 222021 DEBUG oslo_concurrency.lockutils [None req-1e04c096-7c96-4e8c-985b-85b0a3eff35b e3fbf576a49e4d37ac2a826cce5ae7c8 18d7959ce7f646d6a690e5976ef97cb3 - - default default] Lock "6f55b474-788d-4957-9995-50c03af64105" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:18 np0005593233 nova_compute[222017]: 2026-01-23 09:57:18.607 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:18.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:19.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:20 np0005593233 podman[257918]: 2026-01-23 09:57:20.045139917 +0000 UTC m=+0.054622862 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 04:57:20 np0005593233 nova_compute[222017]: 2026-01-23 09:57:20.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:20 np0005593233 nova_compute[222017]: 2026-01-23 09:57:20.411 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:20.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:21.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:22 np0005593233 nova_compute[222017]: 2026-01-23 09:57:22.890 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:22.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:23.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:23 np0005593233 nova_compute[222017]: 2026-01-23 09:57:23.430 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162228.4285738, 6f55b474-788d-4957-9995-50c03af64105 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:57:23 np0005593233 nova_compute[222017]: 2026-01-23 09:57:23.431 222021 INFO nova.compute.manager [-] [instance: 6f55b474-788d-4957-9995-50c03af64105] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:57:23 np0005593233 nova_compute[222017]: 2026-01-23 09:57:23.462 222021 DEBUG nova.compute.manager [None req-ca40b214-2f42-4069-a3b0-1161d0ff2434 - - - - - -] [instance: 6f55b474-788d-4957-9995-50c03af64105] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:57:23 np0005593233 nova_compute[222017]: 2026-01-23 09:57:23.610 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:24 np0005593233 nova_compute[222017]: 2026-01-23 09:57:24.041 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:25.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:25.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:25 np0005593233 nova_compute[222017]: 2026-01-23 09:57:25.413 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:26 np0005593233 nova_compute[222017]: 2026-01-23 09:57:26.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:26 np0005593233 nova_compute[222017]: 2026-01-23 09:57:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:27.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:27.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:27 np0005593233 nova_compute[222017]: 2026-01-23 09:57:27.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:27 np0005593233 nova_compute[222017]: 2026-01-23 09:57:27.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:27 np0005593233 nova_compute[222017]: 2026-01-23 09:57:27.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:27 np0005593233 nova_compute[222017]: 2026-01-23 09:57:27.416 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:27 np0005593233 nova_compute[222017]: 2026-01-23 09:57:27.416 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:27 np0005593233 nova_compute[222017]: 2026-01-23 09:57:27.416 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:27 np0005593233 nova_compute[222017]: 2026-01-23 09:57:27.417 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:57:27 np0005593233 nova_compute[222017]: 2026-01-23 09:57:27.417 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:57:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3801758805' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:57:27 np0005593233 nova_compute[222017]: 2026-01-23 09:57:27.854 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.040 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.041 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4657MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.042 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.042 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.115 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.115 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.156 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:57:28 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1410584276' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.612 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.615 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.623 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.650 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.688 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:57:28 np0005593233 nova_compute[222017]: 2026-01-23 09:57:28.689 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:29.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:29.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:30 np0005593233 nova_compute[222017]: 2026-01-23 09:57:30.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:30 np0005593233 nova_compute[222017]: 2026-01-23 09:57:30.189 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:30 np0005593233 nova_compute[222017]: 2026-01-23 09:57:30.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:30 np0005593233 nova_compute[222017]: 2026-01-23 09:57:30.689 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:30 np0005593233 nova_compute[222017]: 2026-01-23 09:57:30.690 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:57:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:31.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:31.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:32 np0005593233 nova_compute[222017]: 2026-01-23 09:57:32.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:32 np0005593233 nova_compute[222017]: 2026-01-23 09:57:32.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:57:32 np0005593233 nova_compute[222017]: 2026-01-23 09:57:32.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:57:32 np0005593233 nova_compute[222017]: 2026-01-23 09:57:32.418 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:57:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:33.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:33.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:33 np0005593233 nova_compute[222017]: 2026-01-23 09:57:33.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:35.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:35.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:35 np0005593233 nova_compute[222017]: 2026-01-23 09:57:35.419 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:37.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:37.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:37 np0005593233 nova_compute[222017]: 2026-01-23 09:57:37.412 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:38 np0005593233 podman[257985]: 2026-01-23 09:57:38.087228589 +0000 UTC m=+0.093338128 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Jan 23 04:57:38 np0005593233 nova_compute[222017]: 2026-01-23 09:57:38.617 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:39.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:39.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:40 np0005593233 nova_compute[222017]: 2026-01-23 09:57:40.423 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:40 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:57:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:41.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:41.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:41 np0005593233 nova_compute[222017]: 2026-01-23 09:57:41.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:41 np0005593233 podman[258183]: 2026-01-23 09:57:41.686793987 +0000 UTC m=+0.093315007 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Jan 23 04:57:41 np0005593233 podman[258183]: 2026-01-23 09:57:41.803417636 +0000 UTC m=+0.209938696 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:57:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:42.659 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:42.660 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:42.660 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:43.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:43.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:57:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:57:43 np0005593233 nova_compute[222017]: 2026-01-23 09:57:43.620 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:45.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:45.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:45 np0005593233 nova_compute[222017]: 2026-01-23 09:57:45.427 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:46.096 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:57:46 np0005593233 nova_compute[222017]: 2026-01-23 09:57:46.097 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:46.098 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:57:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:47.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:47.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:57:48.100 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:48 np0005593233 nova_compute[222017]: 2026-01-23 09:57:48.659 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:49.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:49.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:50 np0005593233 nova_compute[222017]: 2026-01-23 09:57:50.428 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:51.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:51 np0005593233 podman[258486]: 2026-01-23 09:57:51.059593594 +0000 UTC m=+0.065379054 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 04:57:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:51.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:53.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:57:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:53.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:57:53 np0005593233 nova_compute[222017]: 2026-01-23 09:57:53.661 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:55.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:55.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:55 np0005593233 nova_compute[222017]: 2026-01-23 09:57:55.430 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:57.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:57.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:58 np0005593233 nova_compute[222017]: 2026-01-23 09:57:58.665 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:59.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:57:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:59.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:59 np0005593233 nova_compute[222017]: 2026-01-23 09:57:59.619 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:59 np0005593233 nova_compute[222017]: 2026-01-23 09:57:59.619 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:59 np0005593233 nova_compute[222017]: 2026-01-23 09:57:59.655 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:57:59 np0005593233 nova_compute[222017]: 2026-01-23 09:57:59.925 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:59 np0005593233 nova_compute[222017]: 2026-01-23 09:57:59.926 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:59 np0005593233 nova_compute[222017]: 2026-01-23 09:57:59.953 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:57:59 np0005593233 nova_compute[222017]: 2026-01-23 09:57:59.954 222021 INFO nova.compute.claims [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:58:00 np0005593233 nova_compute[222017]: 2026-01-23 09:58:00.247 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:00 np0005593233 nova_compute[222017]: 2026-01-23 09:58:00.432 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:58:00 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3811722513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:58:00 np0005593233 nova_compute[222017]: 2026-01-23 09:58:00.700 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:00 np0005593233 nova_compute[222017]: 2026-01-23 09:58:00.708 222021 DEBUG nova.compute.provider_tree [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:58:00 np0005593233 nova_compute[222017]: 2026-01-23 09:58:00.732 222021 DEBUG nova.scheduler.client.report [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:58:00 np0005593233 nova_compute[222017]: 2026-01-23 09:58:00.761 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:00 np0005593233 nova_compute[222017]: 2026-01-23 09:58:00.763 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:58:00 np0005593233 nova_compute[222017]: 2026-01-23 09:58:00.837 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:58:00 np0005593233 nova_compute[222017]: 2026-01-23 09:58:00.837 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:58:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:01.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.166 222021 INFO nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:58:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:01.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.251 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.342 222021 INFO nova.virt.block_device [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Booting with volume d7d5138e-60ed-44f2-b4e5-1e53de5cf442 at /dev/vda#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.613 222021 DEBUG os_brick.utils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.616 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.631 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.631 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ce3d5e-07ca-49c8-b9fe-350d36b6b9ce]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.633 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.643 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.644 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[301fce9a-f33e-40b0-b1ec-566f4d3ce2b1]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.646 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.656 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.657 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcd7940-72fa-4353-9767-61d0194089de]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.658 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[d37cbc16-cafd-4054-bfb6-865ce4fadce2]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.659 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.690 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CMD "nvme version" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.696 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.696 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.697 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.698 222021 DEBUG os_brick.utils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] <== get_connector_properties: return (83ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:58:01 np0005593233 nova_compute[222017]: 2026-01-23 09:58:01.699 222021 DEBUG nova.virt.block_device [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating existing volume attachment record: c67fc8c9-4438-4662-915a-db444ba9c2e4 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:58:02 np0005593233 nova_compute[222017]: 2026-01-23 09:58:02.142 222021 DEBUG nova.policy [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '35b29e4a06884f7d88683d00f85d4630', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:58:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:03.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:03.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:03 np0005593233 nova_compute[222017]: 2026-01-23 09:58:03.667 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:03 np0005593233 nova_compute[222017]: 2026-01-23 09:58:03.866 222021 INFO nova.virt.block_device [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Booting with volume 3fa8e844-6545-43af-a18d-ac7dfc7d2071 at /dev/vdb#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.335 222021 DEBUG os_brick.utils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.336 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.383 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.384 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c1703c1b-fa78-4d04-9465-ad906ddcca33]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.387 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully created port: 446c7501-f73f-4cbf-8a43-e78948d8bec1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.386 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.400 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.401 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[ae6479fa-5f7b-4e63-875f-0722085a740e]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.403 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.417 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.418 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca416e0-edd3-4ae2-8d85-32b70b467da7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.419 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[ce814f09-9cdc-487f-b33e-6c64039b619c]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.420 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.456 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CMD "nvme version" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.460 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.461 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.461 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.462 222021 DEBUG os_brick.utils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] <== get_connector_properties: return (126ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:58:04 np0005593233 nova_compute[222017]: 2026-01-23 09:58:04.462 222021 DEBUG nova.virt.block_device [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating existing volume attachment record: 7cf3deb7-daa6-4dca-9273-14482997f86b _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:58:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:05.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:05.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:05 np0005593233 nova_compute[222017]: 2026-01-23 09:58:05.435 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:05 np0005593233 nova_compute[222017]: 2026-01-23 09:58:05.640 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully created port: 79d4ff96-8918-4a77-9ba5-62ac2bc78903 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:58:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:06 np0005593233 nova_compute[222017]: 2026-01-23 09:58:06.833 222021 INFO nova.virt.block_device [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Booting with volume 4c100131-c84f-4546-b7f0-7e22ffc499a0 at /dev/vdc#033[00m
Jan 23 04:58:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:07.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:07.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.487 222021 DEBUG os_brick.utils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.489 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.503 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.504 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[521fd8ed-d908-4a30-b43b-8e5aa7791df4]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.505 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.516 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.516 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[1dad1a94-43c8-478a-8adb-4cc03ee0a9a1]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.518 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.529 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.530 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[23fd34e2-a96c-44c7-86a7-cd43adec882b]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.531 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a2c00d-289c-44ea-9df0-9f0e32182a6a]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.533 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.577 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully created port: 5598d21a-d2b3-4fe1-ae77-65ec863bd48e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.582 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CMD "nvme version" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.584 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.585 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.585 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.586 222021 DEBUG os_brick.utils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] <== get_connector_properties: return (98ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:58:07 np0005593233 nova_compute[222017]: 2026-01-23 09:58:07.586 222021 DEBUG nova.virt.block_device [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating existing volume attachment record: 59e30594-12a6-4ea0-9446-9cd96f4d4dd0 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:58:08 np0005593233 nova_compute[222017]: 2026-01-23 09:58:08.673 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:08 np0005593233 nova_compute[222017]: 2026-01-23 09:58:08.965 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:58:08 np0005593233 nova_compute[222017]: 2026-01-23 09:58:08.967 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:58:08 np0005593233 nova_compute[222017]: 2026-01-23 09:58:08.967 222021 INFO nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Creating image(s)#033[00m
Jan 23 04:58:08 np0005593233 nova_compute[222017]: 2026-01-23 09:58:08.968 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 04:58:08 np0005593233 nova_compute[222017]: 2026-01-23 09:58:08.968 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Ensure instance console log exists: /var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:58:08 np0005593233 nova_compute[222017]: 2026-01-23 09:58:08.969 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:08 np0005593233 nova_compute[222017]: 2026-01-23 09:58:08.969 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:08 np0005593233 nova_compute[222017]: 2026-01-23 09:58:08.970 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:09.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:09 np0005593233 podman[258549]: 2026-01-23 09:58:09.14253112 +0000 UTC m=+0.146134257 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:58:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:09.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:10 np0005593233 nova_compute[222017]: 2026-01-23 09:58:10.228 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully created port: a98412a1-0341-4835-956a-c4201becd7ef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:58:10 np0005593233 nova_compute[222017]: 2026-01-23 09:58:10.438 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:11.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:11.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:12 np0005593233 nova_compute[222017]: 2026-01-23 09:58:12.673 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully created port: 6d043e19-5004-46eb-b144-cc1b3476f21f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:58:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:13.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:13.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:13 np0005593233 nova_compute[222017]: 2026-01-23 09:58:13.725 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:15.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:15.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:15 np0005593233 nova_compute[222017]: 2026-01-23 09:58:15.441 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:17.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:17.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:18 np0005593233 nova_compute[222017]: 2026-01-23 09:58:18.728 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:18 np0005593233 nova_compute[222017]: 2026-01-23 09:58:18.797 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully updated port: 446c7501-f73f-4cbf-8a43-e78948d8bec1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:58:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:19.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:19.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:19 np0005593233 nova_compute[222017]: 2026-01-23 09:58:19.442 222021 DEBUG nova.compute.manager [req-32f6ba83-568e-4715-9975-6b2a2bc89906 req-5e2acc50-8e36-47bf-8502-2d887d5f5f15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-changed-446c7501-f73f-4cbf-8a43-e78948d8bec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:19 np0005593233 nova_compute[222017]: 2026-01-23 09:58:19.442 222021 DEBUG nova.compute.manager [req-32f6ba83-568e-4715-9975-6b2a2bc89906 req-5e2acc50-8e36-47bf-8502-2d887d5f5f15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing instance network info cache due to event network-changed-446c7501-f73f-4cbf-8a43-e78948d8bec1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:58:19 np0005593233 nova_compute[222017]: 2026-01-23 09:58:19.443 222021 DEBUG oslo_concurrency.lockutils [req-32f6ba83-568e-4715-9975-6b2a2bc89906 req-5e2acc50-8e36-47bf-8502-2d887d5f5f15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:19 np0005593233 nova_compute[222017]: 2026-01-23 09:58:19.443 222021 DEBUG oslo_concurrency.lockutils [req-32f6ba83-568e-4715-9975-6b2a2bc89906 req-5e2acc50-8e36-47bf-8502-2d887d5f5f15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:19 np0005593233 nova_compute[222017]: 2026-01-23 09:58:19.443 222021 DEBUG nova.network.neutron [req-32f6ba83-568e-4715-9975-6b2a2bc89906 req-5e2acc50-8e36-47bf-8502-2d887d5f5f15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing network info cache for port 446c7501-f73f-4cbf-8a43-e78948d8bec1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:58:20 np0005593233 nova_compute[222017]: 2026-01-23 09:58:20.242 222021 DEBUG nova.network.neutron [req-32f6ba83-568e-4715-9975-6b2a2bc89906 req-5e2acc50-8e36-47bf-8502-2d887d5f5f15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:58:20 np0005593233 nova_compute[222017]: 2026-01-23 09:58:20.443 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:21.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:21.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:22 np0005593233 podman[258576]: 2026-01-23 09:58:22.061354016 +0000 UTC m=+0.071568307 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 04:58:22 np0005593233 nova_compute[222017]: 2026-01-23 09:58:22.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:22 np0005593233 nova_compute[222017]: 2026-01-23 09:58:22.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:22 np0005593233 nova_compute[222017]: 2026-01-23 09:58:22.754 222021 DEBUG nova.network.neutron [req-32f6ba83-568e-4715-9975-6b2a2bc89906 req-5e2acc50-8e36-47bf-8502-2d887d5f5f15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:22 np0005593233 nova_compute[222017]: 2026-01-23 09:58:22.799 222021 DEBUG oslo_concurrency.lockutils [req-32f6ba83-568e-4715-9975-6b2a2bc89906 req-5e2acc50-8e36-47bf-8502-2d887d5f5f15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:23.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:23.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:23 np0005593233 nova_compute[222017]: 2026-01-23 09:58:23.789 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:25.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:25 np0005593233 nova_compute[222017]: 2026-01-23 09:58:25.118 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully updated port: 14f0f686-d5a3-4f53-a3d8-30c646ece1c3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:58:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:25.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:25 np0005593233 nova_compute[222017]: 2026-01-23 09:58:25.446 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:25 np0005593233 nova_compute[222017]: 2026-01-23 09:58:25.628 222021 DEBUG nova.compute.manager [req-0229d229-bb34-4eb5-83f1-95d1eac93c7f req-6b39e819-0e45-44cd-824f-0e1471fc4202 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-changed-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:25 np0005593233 nova_compute[222017]: 2026-01-23 09:58:25.628 222021 DEBUG nova.compute.manager [req-0229d229-bb34-4eb5-83f1-95d1eac93c7f req-6b39e819-0e45-44cd-824f-0e1471fc4202 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing instance network info cache due to event network-changed-14f0f686-d5a3-4f53-a3d8-30c646ece1c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:58:25 np0005593233 nova_compute[222017]: 2026-01-23 09:58:25.628 222021 DEBUG oslo_concurrency.lockutils [req-0229d229-bb34-4eb5-83f1-95d1eac93c7f req-6b39e819-0e45-44cd-824f-0e1471fc4202 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:25 np0005593233 nova_compute[222017]: 2026-01-23 09:58:25.629 222021 DEBUG oslo_concurrency.lockutils [req-0229d229-bb34-4eb5-83f1-95d1eac93c7f req-6b39e819-0e45-44cd-824f-0e1471fc4202 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:25 np0005593233 nova_compute[222017]: 2026-01-23 09:58:25.629 222021 DEBUG nova.network.neutron [req-0229d229-bb34-4eb5-83f1-95d1eac93c7f req-6b39e819-0e45-44cd-824f-0e1471fc4202 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing network info cache for port 14f0f686-d5a3-4f53-a3d8-30c646ece1c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:58:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:26 np0005593233 nova_compute[222017]: 2026-01-23 09:58:26.147 222021 DEBUG nova.network.neutron [req-0229d229-bb34-4eb5-83f1-95d1eac93c7f req-6b39e819-0e45-44cd-824f-0e1471fc4202 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:58:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:58:26.476 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:58:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:58:26.477 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:58:26 np0005593233 nova_compute[222017]: 2026-01-23 09:58:26.478 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:27.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:27.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:27 np0005593233 nova_compute[222017]: 2026-01-23 09:58:27.459 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:27 np0005593233 nova_compute[222017]: 2026-01-23 09:58:27.460 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:27 np0005593233 nova_compute[222017]: 2026-01-23 09:58:27.461 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:27 np0005593233 nova_compute[222017]: 2026-01-23 09:58:27.530 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:27 np0005593233 nova_compute[222017]: 2026-01-23 09:58:27.530 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:27 np0005593233 nova_compute[222017]: 2026-01-23 09:58:27.531 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:27 np0005593233 nova_compute[222017]: 2026-01-23 09:58:27.531 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:58:27 np0005593233 nova_compute[222017]: 2026-01-23 09:58:27.531 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:58:28 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3736700102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.031 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.100 222021 DEBUG nova.network.neutron [req-0229d229-bb34-4eb5-83f1-95d1eac93c7f req-6b39e819-0e45-44cd-824f-0e1471fc4202 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.123 222021 DEBUG oslo_concurrency.lockutils [req-0229d229-bb34-4eb5-83f1-95d1eac93c7f req-6b39e819-0e45-44cd-824f-0e1471fc4202 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.220 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.222 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4624MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.222 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.222 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.485 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 6d680830-de0e-445d-9d57-b3b0724cb5a8 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.485 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.486 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.600 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully updated port: e75ad2d5-c059-4218-adfe-89823d98a762 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.791 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.920 222021 DEBUG nova.compute.manager [req-27471cee-e539-4be1-86fa-6b468438992b req-7a9f357a-5694-4f4e-b895-2dc833f26d64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-changed-e75ad2d5-c059-4218-adfe-89823d98a762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.921 222021 DEBUG nova.compute.manager [req-27471cee-e539-4be1-86fa-6b468438992b req-7a9f357a-5694-4f4e-b895-2dc833f26d64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing instance network info cache due to event network-changed-e75ad2d5-c059-4218-adfe-89823d98a762. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.921 222021 DEBUG oslo_concurrency.lockutils [req-27471cee-e539-4be1-86fa-6b468438992b req-7a9f357a-5694-4f4e-b895-2dc833f26d64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.922 222021 DEBUG oslo_concurrency.lockutils [req-27471cee-e539-4be1-86fa-6b468438992b req-7a9f357a-5694-4f4e-b895-2dc833f26d64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:28 np0005593233 nova_compute[222017]: 2026-01-23 09:58:28.922 222021 DEBUG nova.network.neutron [req-27471cee-e539-4be1-86fa-6b468438992b req-7a9f357a-5694-4f4e-b895-2dc833f26d64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing network info cache for port e75ad2d5-c059-4218-adfe-89823d98a762 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:58:29 np0005593233 nova_compute[222017]: 2026-01-23 09:58:29.009 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:29.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:29.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:58:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/58949388' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:58:29 np0005593233 nova_compute[222017]: 2026-01-23 09:58:29.501 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:29 np0005593233 nova_compute[222017]: 2026-01-23 09:58:29.508 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:58:29 np0005593233 nova_compute[222017]: 2026-01-23 09:58:29.520 222021 DEBUG nova.network.neutron [req-27471cee-e539-4be1-86fa-6b468438992b req-7a9f357a-5694-4f4e-b895-2dc833f26d64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:58:29 np0005593233 nova_compute[222017]: 2026-01-23 09:58:29.532 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:58:29 np0005593233 nova_compute[222017]: 2026-01-23 09:58:29.561 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:58:29 np0005593233 nova_compute[222017]: 2026-01-23 09:58:29.562 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:30 np0005593233 nova_compute[222017]: 2026-01-23 09:58:30.448 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:30 np0005593233 nova_compute[222017]: 2026-01-23 09:58:30.621 222021 DEBUG nova.network.neutron [req-27471cee-e539-4be1-86fa-6b468438992b req-7a9f357a-5694-4f4e-b895-2dc833f26d64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:30 np0005593233 nova_compute[222017]: 2026-01-23 09:58:30.659 222021 DEBUG oslo_concurrency.lockutils [req-27471cee-e539-4be1-86fa-6b468438992b req-7a9f357a-5694-4f4e-b895-2dc833f26d64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:31.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:31.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.374 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully updated port: 79d4ff96-8918-4a77-9ba5-62ac2bc78903 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:58:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:58:31.479 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.488 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.489 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.489 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.490 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.552 222021 DEBUG nova.compute.manager [req-5db6cee7-5f85-43bd-89c0-e962ac8eeffb req-fb9162f3-205b-4922-9012-7bc25ec2a5aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-changed-79d4ff96-8918-4a77-9ba5-62ac2bc78903 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.553 222021 DEBUG nova.compute.manager [req-5db6cee7-5f85-43bd-89c0-e962ac8eeffb req-fb9162f3-205b-4922-9012-7bc25ec2a5aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing instance network info cache due to event network-changed-79d4ff96-8918-4a77-9ba5-62ac2bc78903. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.554 222021 DEBUG oslo_concurrency.lockutils [req-5db6cee7-5f85-43bd-89c0-e962ac8eeffb req-fb9162f3-205b-4922-9012-7bc25ec2a5aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.554 222021 DEBUG oslo_concurrency.lockutils [req-5db6cee7-5f85-43bd-89c0-e962ac8eeffb req-fb9162f3-205b-4922-9012-7bc25ec2a5aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.554 222021 DEBUG nova.network.neutron [req-5db6cee7-5f85-43bd-89c0-e962ac8eeffb req-fb9162f3-205b-4922-9012-7bc25ec2a5aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing network info cache for port 79d4ff96-8918-4a77-9ba5-62ac2bc78903 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:58:31 np0005593233 nova_compute[222017]: 2026-01-23 09:58:31.887 222021 DEBUG nova.network.neutron [req-5db6cee7-5f85-43bd-89c0-e962ac8eeffb req-fb9162f3-205b-4922-9012-7bc25ec2a5aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:58:32 np0005593233 nova_compute[222017]: 2026-01-23 09:58:32.511 222021 DEBUG nova.network.neutron [req-5db6cee7-5f85-43bd-89c0-e962ac8eeffb req-fb9162f3-205b-4922-9012-7bc25ec2a5aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:32 np0005593233 nova_compute[222017]: 2026-01-23 09:58:32.543 222021 DEBUG oslo_concurrency.lockutils [req-5db6cee7-5f85-43bd-89c0-e962ac8eeffb req-fb9162f3-205b-4922-9012-7bc25ec2a5aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:33.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:33.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:33 np0005593233 nova_compute[222017]: 2026-01-23 09:58:33.795 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:33.999 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully updated port: 5598d21a-d2b3-4fe1-ae77-65ec863bd48e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.149 222021 DEBUG nova.compute.manager [req-c4fb8d8f-8adb-4d2f-a346-3c3ea7b0e764 req-e745dc4b-404a-4d60-85d6-789c414bf9bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-changed-5598d21a-d2b3-4fe1-ae77-65ec863bd48e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.150 222021 DEBUG nova.compute.manager [req-c4fb8d8f-8adb-4d2f-a346-3c3ea7b0e764 req-e745dc4b-404a-4d60-85d6-789c414bf9bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing instance network info cache due to event network-changed-5598d21a-d2b3-4fe1-ae77-65ec863bd48e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.150 222021 DEBUG oslo_concurrency.lockutils [req-c4fb8d8f-8adb-4d2f-a346-3c3ea7b0e764 req-e745dc4b-404a-4d60-85d6-789c414bf9bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.151 222021 DEBUG oslo_concurrency.lockutils [req-c4fb8d8f-8adb-4d2f-a346-3c3ea7b0e764 req-e745dc4b-404a-4d60-85d6-789c414bf9bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.151 222021 DEBUG nova.network.neutron [req-c4fb8d8f-8adb-4d2f-a346-3c3ea7b0e764 req-e745dc4b-404a-4d60-85d6-789c414bf9bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing network info cache for port 5598d21a-d2b3-4fe1-ae77-65ec863bd48e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.417 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.417 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:58:34 np0005593233 nova_compute[222017]: 2026-01-23 09:58:34.819 222021 DEBUG nova.network.neutron [req-c4fb8d8f-8adb-4d2f-a346-3c3ea7b0e764 req-e745dc4b-404a-4d60-85d6-789c414bf9bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:58:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:35.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:35.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:35 np0005593233 nova_compute[222017]: 2026-01-23 09:58:35.450 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:35 np0005593233 nova_compute[222017]: 2026-01-23 09:58:35.586 222021 DEBUG nova.network.neutron [req-c4fb8d8f-8adb-4d2f-a346-3c3ea7b0e764 req-e745dc4b-404a-4d60-85d6-789c414bf9bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:35 np0005593233 nova_compute[222017]: 2026-01-23 09:58:35.617 222021 DEBUG oslo_concurrency.lockutils [req-c4fb8d8f-8adb-4d2f-a346-3c3ea7b0e764 req-e745dc4b-404a-4d60-85d6-789c414bf9bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:37.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:37.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:38 np0005593233 nova_compute[222017]: 2026-01-23 09:58:38.411 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:38 np0005593233 nova_compute[222017]: 2026-01-23 09:58:38.562 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully updated port: a98412a1-0341-4835-956a-c4201becd7ef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:58:38 np0005593233 nova_compute[222017]: 2026-01-23 09:58:38.748 222021 DEBUG nova.compute.manager [req-636c2a53-c5fe-4bea-9ff7-31e81328be48 req-ddeebff9-e8e1-4c5b-a1f6-8087bfe0d1dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-changed-a98412a1-0341-4835-956a-c4201becd7ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:38 np0005593233 nova_compute[222017]: 2026-01-23 09:58:38.748 222021 DEBUG nova.compute.manager [req-636c2a53-c5fe-4bea-9ff7-31e81328be48 req-ddeebff9-e8e1-4c5b-a1f6-8087bfe0d1dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing instance network info cache due to event network-changed-a98412a1-0341-4835-956a-c4201becd7ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:58:38 np0005593233 nova_compute[222017]: 2026-01-23 09:58:38.749 222021 DEBUG oslo_concurrency.lockutils [req-636c2a53-c5fe-4bea-9ff7-31e81328be48 req-ddeebff9-e8e1-4c5b-a1f6-8087bfe0d1dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:38 np0005593233 nova_compute[222017]: 2026-01-23 09:58:38.749 222021 DEBUG oslo_concurrency.lockutils [req-636c2a53-c5fe-4bea-9ff7-31e81328be48 req-ddeebff9-e8e1-4c5b-a1f6-8087bfe0d1dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:38 np0005593233 nova_compute[222017]: 2026-01-23 09:58:38.749 222021 DEBUG nova.network.neutron [req-636c2a53-c5fe-4bea-9ff7-31e81328be48 req-ddeebff9-e8e1-4c5b-a1f6-8087bfe0d1dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing network info cache for port a98412a1-0341-4835-956a-c4201becd7ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:58:38 np0005593233 nova_compute[222017]: 2026-01-23 09:58:38.798 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 04:58:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:39.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 04:58:39 np0005593233 nova_compute[222017]: 2026-01-23 09:58:39.170 222021 DEBUG nova.network.neutron [req-636c2a53-c5fe-4bea-9ff7-31e81328be48 req-ddeebff9-e8e1-4c5b-a1f6-8087bfe0d1dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:58:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:39.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:40 np0005593233 podman[258639]: 2026-01-23 09:58:40.106850083 +0000 UTC m=+0.112685750 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 04:58:40 np0005593233 nova_compute[222017]: 2026-01-23 09:58:40.452 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:41.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:41.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:41 np0005593233 nova_compute[222017]: 2026-01-23 09:58:41.923 222021 DEBUG nova.network.neutron [req-636c2a53-c5fe-4bea-9ff7-31e81328be48 req-ddeebff9-e8e1-4c5b-a1f6-8087bfe0d1dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:41 np0005593233 nova_compute[222017]: 2026-01-23 09:58:41.969 222021 DEBUG oslo_concurrency.lockutils [req-636c2a53-c5fe-4bea-9ff7-31e81328be48 req-ddeebff9-e8e1-4c5b-a1f6-8087bfe0d1dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:58:42Z|00387|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 04:58:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:58:42.660 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:58:42.661 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:58:42.661 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:43.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:43.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:43 np0005593233 nova_compute[222017]: 2026-01-23 09:58:43.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:45.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:45.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:45 np0005593233 nova_compute[222017]: 2026-01-23 09:58:45.507 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:46 np0005593233 nova_compute[222017]: 2026-01-23 09:58:46.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:46 np0005593233 nova_compute[222017]: 2026-01-23 09:58:46.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:58:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:47.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:47.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:48 np0005593233 nova_compute[222017]: 2026-01-23 09:58:48.001 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Successfully updated port: 6d043e19-5004-46eb-b144-cc1b3476f21f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:58:48 np0005593233 nova_compute[222017]: 2026-01-23 09:58:48.031 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:48 np0005593233 nova_compute[222017]: 2026-01-23 09:58:48.032 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:48 np0005593233 nova_compute[222017]: 2026-01-23 09:58:48.032 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:58:48 np0005593233 nova_compute[222017]: 2026-01-23 09:58:48.531 222021 DEBUG nova.compute.manager [req-dbeb3f52-13eb-486a-9f8f-77aab80c6af3 req-bb23983c-5766-4cad-9152-264e67e73ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-changed-6d043e19-5004-46eb-b144-cc1b3476f21f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:48 np0005593233 nova_compute[222017]: 2026-01-23 09:58:48.532 222021 DEBUG nova.compute.manager [req-dbeb3f52-13eb-486a-9f8f-77aab80c6af3 req-bb23983c-5766-4cad-9152-264e67e73ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing instance network info cache due to event network-changed-6d043e19-5004-46eb-b144-cc1b3476f21f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:58:48 np0005593233 nova_compute[222017]: 2026-01-23 09:58:48.532 222021 DEBUG oslo_concurrency.lockutils [req-dbeb3f52-13eb-486a-9f8f-77aab80c6af3 req-bb23983c-5766-4cad-9152-264e67e73ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:48 np0005593233 nova_compute[222017]: 2026-01-23 09:58:48.849 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:49 np0005593233 nova_compute[222017]: 2026-01-23 09:58:49.008 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:58:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:49.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:49.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:50 np0005593233 nova_compute[222017]: 2026-01-23 09:58:50.509 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:51.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:51.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:58:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:58:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:58:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:58:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:58:53 np0005593233 podman[258797]: 2026-01-23 09:58:53.112896964 +0000 UTC m=+0.114908352 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 23 04:58:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:53.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:53.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:53 np0005593233 nova_compute[222017]: 2026-01-23 09:58:53.853 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:55.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:55.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:55 np0005593233 nova_compute[222017]: 2026-01-23 09:58:55.512 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:57.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:57.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:57 np0005593233 nova_compute[222017]: 2026-01-23 09:58:57.414 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:57 np0005593233 nova_compute[222017]: 2026-01-23 09:58:57.415 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:58:57 np0005593233 nova_compute[222017]: 2026-01-23 09:58:57.477 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:58:58 np0005593233 nova_compute[222017]: 2026-01-23 09:58:58.856 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:59.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:58:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:58:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:58:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:59.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:00 np0005593233 nova_compute[222017]: 2026-01-23 09:59:00.515 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:59:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:01.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #91. Immutable memtables: 0.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.246120) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 55] Flushing memtable with next log file: 91
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341246185, "job": 55, "event": "flush_started", "num_memtables": 1, "num_entries": 1857, "num_deletes": 256, "total_data_size": 4152210, "memory_usage": 4222416, "flush_reason": "Manual Compaction"}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 55] Level-0 flush table #92: started
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341267283, "cf_name": "default", "job": 55, "event": "table_file_creation", "file_number": 92, "file_size": 2708623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 46437, "largest_seqno": 48289, "table_properties": {"data_size": 2701150, "index_size": 4352, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16110, "raw_average_key_size": 19, "raw_value_size": 2685897, "raw_average_value_size": 3320, "num_data_blocks": 191, "num_entries": 809, "num_filter_entries": 809, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162190, "oldest_key_time": 1769162190, "file_creation_time": 1769162341, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 92, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 55] Flush lasted 21240 microseconds, and 7545 cpu microseconds.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.267359) [db/flush_job.cc:967] [default] [JOB 55] Level-0 flush table #92: 2708623 bytes OK
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.267389) [db/memtable_list.cc:519] [default] Level-0 commit table #92 started
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.270023) [db/memtable_list.cc:722] [default] Level-0 commit table #92: memtable #1 done
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.270039) EVENT_LOG_v1 {"time_micros": 1769162341270033, "job": 55, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.270065) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 55] Try to delete WAL files size 4143671, prev total WAL file size 4164753, number of live WAL files 2.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000088.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.271395) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031353034' seq:72057594037927935, type:22 .. '6C6F676D0031373536' seq:0, type:0; will stop at (end)
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 56] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 55 Base level 0, inputs: [92(2645KB)], [90(10MB)]
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341271494, "job": 56, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [92], "files_L6": [90], "score": -1, "input_data_size": 14231155, "oldest_snapshot_seqno": -1}
Jan 23 04:59:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:01.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 56] Generated table #93: 7313 keys, 14086540 bytes, temperature: kUnknown
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341425553, "cf_name": "default", "job": 56, "event": "table_file_creation", "file_number": 93, "file_size": 14086540, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14033986, "index_size": 33178, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18309, "raw_key_size": 187660, "raw_average_key_size": 25, "raw_value_size": 13899637, "raw_average_value_size": 1900, "num_data_blocks": 1328, "num_entries": 7313, "num_filter_entries": 7313, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769162341, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 93, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.426009) [db/compaction/compaction_job.cc:1663] [default] [JOB 56] Compacted 1@0 + 1@6 files to L6 => 14086540 bytes
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.427380) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 92.3 rd, 91.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 11.0 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(10.5) write-amplify(5.2) OK, records in: 7840, records dropped: 527 output_compression: NoCompression
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.427399) EVENT_LOG_v1 {"time_micros": 1769162341427389, "job": 56, "event": "compaction_finished", "compaction_time_micros": 154175, "compaction_time_cpu_micros": 35158, "output_level": 6, "num_output_files": 1, "total_output_size": 14086540, "num_input_records": 7840, "num_output_records": 7313, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000092.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341428045, "job": 56, "event": "table_file_deletion", "file_number": 92}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000090.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341430654, "job": 56, "event": "table_file_deletion", "file_number": 90}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.271220) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.430696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.430703) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.430704) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.430706) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.430708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #94. Immutable memtables: 0.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.431085) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 1
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 57] Flushing memtable with next log file: 94
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341431114, "job": 57, "event": "flush_started", "num_memtables": 1, "num_entries": 257, "num_deletes": 251, "total_data_size": 23108, "memory_usage": 28952, "flush_reason": "Manual Compaction"}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 57] Level-0 flush table #95: started
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341433201, "cf_name": "default", "job": 57, "event": "table_file_creation", "file_number": 95, "file_size": 13893, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 48291, "largest_seqno": 48546, "table_properties": {"data_size": 12141, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 645, "raw_key_size": 5144, "raw_average_key_size": 20, "raw_value_size": 8731, "raw_average_value_size": 34, "num_data_blocks": 2, "num_entries": 256, "num_filter_entries": 256, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162341, "oldest_key_time": 1769162341, "file_creation_time": 1769162341, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 95, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 57] Flush lasted 2164 microseconds, and 574 cpu microseconds.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.433248) [db/flush_job.cc:967] [default] [JOB 57] Level-0 flush table #95: 13893 bytes OK
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.433267) [db/memtable_list.cc:519] [default] Level-0 commit table #95 started
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.434404) [db/memtable_list.cc:722] [default] Level-0 commit table #95: memtable #1 done
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.434415) EVENT_LOG_v1 {"time_micros": 1769162341434412, "job": 57, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.434424) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 57] Try to delete WAL files size 21082, prev total WAL file size 21082, number of live WAL files 2.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000091.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.434814) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031353035' seq:72057594037927935, type:22 .. '6D6772737461740031373537' seq:0, type:0; will stop at (end)
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 58] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 57 Base level 0, inputs: [95(13KB)], [93(13MB)]
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341434905, "job": 58, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [95], "files_L6": [93], "score": -1, "input_data_size": 14100433, "oldest_snapshot_seqno": -1}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 58] Generated table #96: 7063 keys, 10259466 bytes, temperature: kUnknown
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341537473, "cf_name": "default", "job": 58, "event": "table_file_creation", "file_number": 96, "file_size": 10259466, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10213527, "index_size": 27203, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17669, "raw_key_size": 182658, "raw_average_key_size": 25, "raw_value_size": 10088430, "raw_average_value_size": 1428, "num_data_blocks": 1078, "num_entries": 7063, "num_filter_entries": 7063, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769162341, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 96, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.537751) [db/compaction/compaction_job.cc:1663] [default] [JOB 58] Compacted 1@0 + 1@6 files to L6 => 10259466 bytes
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.546026) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 137.4 rd, 99.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 13.4 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(1753.4) write-amplify(738.5) OK, records in: 7569, records dropped: 506 output_compression: NoCompression
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.546072) EVENT_LOG_v1 {"time_micros": 1769162341546056, "job": 58, "event": "compaction_finished", "compaction_time_micros": 102652, "compaction_time_cpu_micros": 28812, "output_level": 6, "num_output_files": 1, "total_output_size": 10259466, "num_input_records": 7569, "num_output_records": 7063, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000095.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341546358, "job": 58, "event": "table_file_deletion", "file_number": 95}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000093.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341548823, "job": 58, "event": "table_file_deletion", "file_number": 93}
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.434697) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.548950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.548961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.548963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.548965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:01.548967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:03.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:59:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:03.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:59:03 np0005593233 nova_compute[222017]: 2026-01-23 09:59:03.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:05.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:05.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:05 np0005593233 nova_compute[222017]: 2026-01-23 09:59:05.517 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:07.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:07.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:08 np0005593233 nova_compute[222017]: 2026-01-23 09:59:08.886 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:09.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:09.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:10 np0005593233 nova_compute[222017]: 2026-01-23 09:59:10.519 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:11 np0005593233 podman[258866]: 2026-01-23 09:59:11.098988267 +0000 UTC m=+0.107536355 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 04:59:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:11.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:11.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.522 222021 DEBUG nova.network.neutron [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [{"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.573 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.573 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance network_info: |[{"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.574 222021 DEBUG oslo_concurrency.lockutils [req-dbeb3f52-13eb-486a-9f8f-77aab80c6af3 req-bb23983c-5766-4cad-9152-264e67e73ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.575 222021 DEBUG nova.network.neutron [req-dbeb3f52-13eb-486a-9f8f-77aab80c6af3 req-bb23983c-5766-4cad-9152-264e67e73ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing network info cache for port 6d043e19-5004-46eb-b144-cc1b3476f21f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.583 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Start _get_guest_xml network_info=[{"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk', 'boot_index': '2'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk', 'boot_index': '3'}, 
'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-
Jan 23 04:59:11 np0005593233 nova_compute[222017]: =0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-d7d5138e-60ed-44f2-b4e5-1e53de5cf442', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'd7d5138e-60ed-44f2-b4e5-1e53de5cf442', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'attached_at': '', 'detached_at': '', 'volume_id': 'd7d5138e-60ed-44f2-b4e5-1e53de5cf442', 'serial': 'd7d5138e-60ed-44f2-b4e5-1e53de5cf442'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'c67fc8c9-4438-4662-915a-db444ba9c2e4', 'volume_type': None}, {'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-3fa8e844-6545-43af-a18d-ac7dfc7d2071', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '3fa8e844-6545-43af-a18d-ac7dfc7d2071', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'attached_at': '', 'detached_at': '', 'volume_id': 
'3fa8e844-6545-43af-a18d-ac7dfc7d2071', 'serial': '3fa8e844-6545-43af-a18d-ac7dfc7d2071'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vdb', 'boot_index': 1, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '7cf3deb7-daa6-4dca-9273-14482997f86b', 'volume_type': None}, {'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-4c100131-c84f-4546-b7f0-7e22ffc499a0', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '4c100131-c84f-4546-b7f0-7e22ffc499a0', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'attached_at': '', 'detached_at': '', 'volume_id': '4c100131-c84f-4546-b7f0-7e22ffc499a0', 'serial': '4c100131-c84f-4546-b7f0-7e22ffc499a0'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vdc', 'boot_index': 2, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '59e30594-12a6-4ea0-9446-9cd96f4d4dd0', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.589 222021 WARNING nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.599 222021 DEBUG nova.virt.libvirt.host [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.600 222021 DEBUG nova.virt.libvirt.host [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.604 222021 DEBUG nova.virt.libvirt.host [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.604 222021 DEBUG nova.virt.libvirt.host [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.606 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.606 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.607 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.607 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.607 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.607 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.607 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.608 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.608 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.608 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.608 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.609 222021 DEBUG nova.virt.hardware [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.643 222021 DEBUG nova.storage.rbd_utils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] rbd image 6d680830-de0e-445d-9d57-b3b0724cb5a8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:11 np0005593233 nova_compute[222017]: 2026-01-23 09:59:11.652 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:11 np0005593233 rsyslogd[1009]: message too long (8192) with configured size 8096, begin of message is: 2026-01-23 09:59:11.583 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Jan 23 04:59:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:59:12 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2358792409' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.142 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.313 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',i
mage_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.314 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.316 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:a6:73,bridge_name='br-int',has_traffic_filtering=True,id=446c7501-f73f-4cbf-8a43-e78948d8bec1,network=Network(9bd04a8e-3b21-48a4-942d-6ede17d32ccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap446c7501-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.317 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.318 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.319 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d3:23,bridge_name='br-int',has_traffic_filtering=True,id=14f0f686-d5a3-4f53-a3d8-30c646ece1c3,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap14f0f686-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.320 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.321 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.322 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:d2:12,bridge_name='br-int',has_traffic_filtering=True,id=e75ad2d5-c059-4218-adfe-89823d98a762,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape75ad2d5-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.323 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.323 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.324 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:85:d8,bridge_name='br-int',has_traffic_filtering=True,id=79d4ff96-8918-4a77-9ba5-62ac2bc78903,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d4ff96-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.326 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.326 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.327 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:b8:70,bridge_name='br-int',has_traffic_filtering=True,id=5598d21a-d2b3-4fe1-ae77-65ec863bd48e,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5598d21a-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.328 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.329 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.330 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:eb:a0,bridge_name='br-int',has_traffic_filtering=True,id=a98412a1-0341-4835-956a-c4201becd7ef,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa98412a1-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.331 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.331 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.332 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:ac:0a,bridge_name='br-int',has_traffic_filtering=True,id=6d043e19-5004-46eb-b144-cc1b3476f21f,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d043e19-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.335 222021 DEBUG nova.objects.instance [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d680830-de0e-445d-9d57-b3b0724cb5a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.380 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <uuid>6d680830-de0e-445d-9d57-b3b0724cb5a8</uuid>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <name>instance-00000059</name>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <nova:name>tempest-device-tagging-server-135862627</nova:name>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:59:11</nova:creationTime>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:user uuid="35b29e4a06884f7d88683d00f85d4630">tempest-TaggedBootDevicesTest_v242-1752959028-project-member</nova:user>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:project uuid="8924c80a71a94fdeb114c6bdbdb2939c">tempest-TaggedBootDevicesTest_v242-1752959028</nova:project>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:port uuid="446c7501-f73f-4cbf-8a43-e78948d8bec1">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:port uuid="14f0f686-d5a3-4f53-a3d8-30c646ece1c3">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.1.1.42" ipVersion="4"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:port uuid="e75ad2d5-c059-4218-adfe-89823d98a762">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.1.1.51" ipVersion="4"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:port uuid="79d4ff96-8918-4a77-9ba5-62ac2bc78903">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.1.1.184" ipVersion="4"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:port uuid="5598d21a-d2b3-4fe1-ae77-65ec863bd48e">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.1.1.102" ipVersion="4"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:port uuid="a98412a1-0341-4835-956a-c4201becd7ef">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.2.2.100" ipVersion="4"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <nova:port uuid="6d043e19-5004-46eb-b144-cc1b3476f21f">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.2.2.200" ipVersion="4"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <entry name="serial">6d680830-de0e-445d-9d57-b3b0724cb5a8</entry>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <entry name="uuid">6d680830-de0e-445d-9d57-b3b0724cb5a8</entry>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/6d680830-de0e-445d-9d57-b3b0724cb5a8_disk.config">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-d7d5138e-60ed-44f2-b4e5-1e53de5cf442">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <serial>d7d5138e-60ed-44f2-b4e5-1e53de5cf442</serial>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-3fa8e844-6545-43af-a18d-ac7dfc7d2071">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="vdb" bus="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <serial>3fa8e844-6545-43af-a18d-ac7dfc7d2071</serial>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-4c100131-c84f-4546-b7f0-7e22ffc499a0">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="vdc" bus="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <serial>4c100131-c84f-4546-b7f0-7e22ffc499a0</serial>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:7f:a6:73"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="tap446c7501-f7"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:9c:d3:23"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="tap14f0f686-d5"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:a4:d2:12"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="tape75ad2d5-c0"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:0a:85:d8"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="tap79d4ff96-89"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:c2:b8:70"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="tap5598d21a-d2"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:eb:eb:a0"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="tapa98412a1-03"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:9e:ac:0a"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <target dev="tap6d043e19-50"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8/console.log" append="off"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:59:12 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:59:12 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:59:12 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:59:12 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.382 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Preparing to wait for external event network-vif-plugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.383 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.383 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.383 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.384 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Preparing to wait for external event network-vif-plugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.384 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.384 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.384 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.384 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Preparing to wait for external event network-vif-plugged-e75ad2d5-c059-4218-adfe-89823d98a762 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.384 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.385 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.385 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.385 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Preparing to wait for external event network-vif-plugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.385 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.385 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.386 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.386 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Preparing to wait for external event network-vif-plugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.386 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.386 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.386 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.387 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Preparing to wait for external event network-vif-plugged-a98412a1-0341-4835-956a-c4201becd7ef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.387 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.387 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.387 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.387 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Preparing to wait for external event network-vif-plugged-6d043e19-5004-46eb-b144-cc1b3476f21f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.388 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.388 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.388 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.389 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.390 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.390 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:a6:73,bridge_name='br-int',has_traffic_filtering=True,id=446c7501-f73f-4cbf-8a43-e78948d8bec1,network=Network(9bd04a8e-3b21-48a4-942d-6ede17d32ccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap446c7501-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.391 222021 DEBUG os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:a6:73,bridge_name='br-int',has_traffic_filtering=True,id=446c7501-f73f-4cbf-8a43-e78948d8bec1,network=Network(9bd04a8e-3b21-48a4-942d-6ede17d32ccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap446c7501-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.392 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.393 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.393 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.398 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.398 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap446c7501-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.399 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap446c7501-f7, col_values=(('external_ids', {'iface-id': '446c7501-f73f-4cbf-8a43-e78948d8bec1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:a6:73', 'vm-uuid': '6d680830-de0e-445d-9d57-b3b0724cb5a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.401 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 NetworkManager[48871]: <info>  [1769162352.4025] manager: (tap446c7501-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.405 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.410 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.412 222021 INFO os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:a6:73,bridge_name='br-int',has_traffic_filtering=True,id=446c7501-f73f-4cbf-8a43-e78948d8bec1,network=Network(9bd04a8e-3b21-48a4-942d-6ede17d32ccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap446c7501-f7')#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.413 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.413 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.413 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d3:23,bridge_name='br-int',has_traffic_filtering=True,id=14f0f686-d5a3-4f53-a3d8-30c646ece1c3,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap14f0f686-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.414 222021 DEBUG os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d3:23,bridge_name='br-int',has_traffic_filtering=True,id=14f0f686-d5a3-4f53-a3d8-30c646ece1c3,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap14f0f686-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.414 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.415 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.418 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.418 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14f0f686-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.419 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap14f0f686-d5, col_values=(('external_ids', {'iface-id': '14f0f686-d5a3-4f53-a3d8-30c646ece1c3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9c:d3:23', 'vm-uuid': '6d680830-de0e-445d-9d57-b3b0724cb5a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:12 np0005593233 NetworkManager[48871]: <info>  [1769162352.4211] manager: (tap14f0f686-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.426 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.427 222021 INFO os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d3:23,bridge_name='br-int',has_traffic_filtering=True,id=14f0f686-d5a3-4f53-a3d8-30c646ece1c3,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap14f0f686-d5')#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.428 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.429 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.429 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:d2:12,bridge_name='br-int',has_traffic_filtering=True,id=e75ad2d5-c059-4218-adfe-89823d98a762,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape75ad2d5-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.430 222021 DEBUG os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:d2:12,bridge_name='br-int',has_traffic_filtering=True,id=e75ad2d5-c059-4218-adfe-89823d98a762,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape75ad2d5-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.430 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.430 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.430 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.432 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.433 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape75ad2d5-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.433 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape75ad2d5-c0, col_values=(('external_ids', {'iface-id': 'e75ad2d5-c059-4218-adfe-89823d98a762', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:d2:12', 'vm-uuid': '6d680830-de0e-445d-9d57-b3b0724cb5a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 NetworkManager[48871]: <info>  [1769162352.4352] manager: (tape75ad2d5-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.436 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.444 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.445 222021 INFO os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:d2:12,bridge_name='br-int',has_traffic_filtering=True,id=e75ad2d5-c059-4218-adfe-89823d98a762,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape75ad2d5-c0')
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.446 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.446 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.447 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:85:d8,bridge_name='br-int',has_traffic_filtering=True,id=79d4ff96-8918-4a77-9ba5-62ac2bc78903,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d4ff96-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.447 222021 DEBUG os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:85:d8,bridge_name='br-int',has_traffic_filtering=True,id=79d4ff96-8918-4a77-9ba5-62ac2bc78903,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d4ff96-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.447 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.448 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.448 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.450 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.450 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79d4ff96-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.450 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap79d4ff96-89, col_values=(('external_ids', {'iface-id': '79d4ff96-8918-4a77-9ba5-62ac2bc78903', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:85:d8', 'vm-uuid': '6d680830-de0e-445d-9d57-b3b0724cb5a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 NetworkManager[48871]: <info>  [1769162352.4526] manager: (tap79d4ff96-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.454 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.463 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.465 222021 INFO os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:85:d8,bridge_name='br-int',has_traffic_filtering=True,id=79d4ff96-8918-4a77-9ba5-62ac2bc78903,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d4ff96-89')
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.466 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.466 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.467 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:b8:70,bridge_name='br-int',has_traffic_filtering=True,id=5598d21a-d2b3-4fe1-ae77-65ec863bd48e,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5598d21a-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.467 222021 DEBUG os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:b8:70,bridge_name='br-int',has_traffic_filtering=True,id=5598d21a-d2b3-4fe1-ae77-65ec863bd48e,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5598d21a-d2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.468 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.468 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.468 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.471 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.471 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5598d21a-d2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.471 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5598d21a-d2, col_values=(('external_ids', {'iface-id': '5598d21a-d2b3-4fe1-ae77-65ec863bd48e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:b8:70', 'vm-uuid': '6d680830-de0e-445d-9d57-b3b0724cb5a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 NetworkManager[48871]: <info>  [1769162352.4738] manager: (tap5598d21a-d2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.475 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.486 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.487 222021 INFO os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:b8:70,bridge_name='br-int',has_traffic_filtering=True,id=5598d21a-d2b3-4fe1-ae77-65ec863bd48e,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5598d21a-d2')
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.488 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.488 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.489 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:eb:a0,bridge_name='br-int',has_traffic_filtering=True,id=a98412a1-0341-4835-956a-c4201becd7ef,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa98412a1-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.489 222021 DEBUG os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:eb:a0,bridge_name='br-int',has_traffic_filtering=True,id=a98412a1-0341-4835-956a-c4201becd7ef,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa98412a1-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.489 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.490 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.490 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.492 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.492 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa98412a1-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.492 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa98412a1-03, col_values=(('external_ids', {'iface-id': 'a98412a1-0341-4835-956a-c4201becd7ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:eb:a0', 'vm-uuid': '6d680830-de0e-445d-9d57-b3b0724cb5a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:59:12 np0005593233 NetworkManager[48871]: <info>  [1769162352.4948] manager: (tapa98412a1-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.497 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.511 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.512 222021 INFO os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:eb:a0,bridge_name='br-int',has_traffic_filtering=True,id=a98412a1-0341-4835-956a-c4201becd7ef,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa98412a1-03')
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.513 222021 DEBUG nova.virt.libvirt.vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model=
'virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.513 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.514 222021 DEBUG nova.network.os_vif_util [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:ac:0a,bridge_name='br-int',has_traffic_filtering=True,id=6d043e19-5004-46eb-b144-cc1b3476f21f,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d043e19-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.514 222021 DEBUG os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:ac:0a,bridge_name='br-int',has_traffic_filtering=True,id=6d043e19-5004-46eb-b144-cc1b3476f21f,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d043e19-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.515 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.515 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.516 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.519 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.520 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d043e19-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.521 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6d043e19-50, col_values=(('external_ids', {'iface-id': '6d043e19-5004-46eb-b144-cc1b3476f21f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:ac:0a', 'vm-uuid': '6d680830-de0e-445d-9d57-b3b0724cb5a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.523 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 NetworkManager[48871]: <info>  [1769162352.5239] manager: (tap6d043e19-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.525 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.546 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.547 222021 INFO os_vif [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:ac:0a,bridge_name='br-int',has_traffic_filtering=True,id=6d043e19-5004-46eb-b144-cc1b3476f21f,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d043e19-50')#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.730 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.731 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.731 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] No VIF found with MAC fa:16:3e:7f:a6:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.731 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] No VIF found with MAC fa:16:3e:c2:b8:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.732 222021 INFO nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Using config drive#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.762 222021 DEBUG nova.storage.rbd_utils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] rbd image 6d680830-de0e-445d-9d57-b3b0724cb5a8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.810 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "2e38e1c5-8495-488c-a1d1-082804b39c88" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.811 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593233 nova_compute[222017]: 2026-01-23 09:59:12.855 222021 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:59:13 np0005593233 nova_compute[222017]: 2026-01-23 09:59:13.031 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:13 np0005593233 nova_compute[222017]: 2026-01-23 09:59:13.032 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:13 np0005593233 nova_compute[222017]: 2026-01-23 09:59:13.051 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:59:13 np0005593233 nova_compute[222017]: 2026-01-23 09:59:13.051 222021 INFO nova.compute.claims [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 04:59:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:13.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:13.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.140 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.576 222021 INFO nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Creating config drive at /var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8/disk.config#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.583 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf4btblzn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:14 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1209784302' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.641 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.651 222021 DEBUG nova.compute.provider_tree [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.689 222021 DEBUG nova.scheduler.client.report [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.731 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf4btblzn" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.782 222021 DEBUG nova.storage.rbd_utils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] rbd image 6d680830-de0e-445d-9d57-b3b0724cb5a8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.788 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8/disk.config 6d680830-de0e-445d-9d57-b3b0724cb5a8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.829 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.830 222021 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.913 222021 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.914 222021 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.940 222021 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:59:14 np0005593233 nova_compute[222017]: 2026-01-23 09:59:14.981 222021 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.156 222021 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.158 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.159 222021 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Creating image(s)#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.188 222021 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e38e1c5-8495-488c-a1d1-082804b39c88_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:15.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.219 222021 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e38e1c5-8495-488c-a1d1-082804b39c88_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.252 222021 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e38e1c5-8495-488c-a1d1-082804b39c88_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.258 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:15.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.350 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.351 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.352 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.352 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.379 222021 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e38e1c5-8495-488c-a1d1-082804b39c88_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.385 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2e38e1c5-8495-488c-a1d1-082804b39c88_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:15 np0005593233 nova_compute[222017]: 2026-01-23 09:59:15.522 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.076 222021 DEBUG nova.policy [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd83df80213fd40f99fdc68c146fe9a2a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c288779980de4f03be20b7eed343b775', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:59:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:59:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:17.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.245 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2e38e1c5-8495-488c-a1d1-082804b39c88_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.860s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:17.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.326 222021 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] resizing rbd image 2e38e1c5-8495-488c-a1d1-082804b39c88_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.367 222021 DEBUG oslo_concurrency.processutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8/disk.config 6d680830-de0e-445d-9d57-b3b0724cb5a8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.580s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.368 222021 INFO nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Deleting local config drive /var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8/disk.config because it was imported into RBD.#033[00m
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.4429] manager: (tap446c7501-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Jan 23 04:59:17 np0005593233 kernel: tap446c7501-f7: entered promiscuous mode
Jan 23 04:59:17 np0005593233 kernel: tap14f0f686-d5: entered promiscuous mode
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.4603] manager: (tap14f0f686-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/193)
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00388|binding|INFO|Claiming lport 446c7501-f73f-4cbf-8a43-e78948d8bec1 for this chassis.
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00389|binding|INFO|446c7501-f73f-4cbf-8a43-e78948d8bec1: Claiming fa:16:3e:7f:a6:73 10.100.0.9
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00390|binding|INFO|Claiming lport 14f0f686-d5a3-4f53-a3d8-30c646ece1c3 for this chassis.
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00391|binding|INFO|14f0f686-d5a3-4f53-a3d8-30c646ece1c3: Claiming fa:16:3e:9c:d3:23 10.1.1.42
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5196] manager: (tape75ad2d5-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/194)
Jan 23 04:59:17 np0005593233 kernel: tape75ad2d5-c0: entered promiscuous mode
Jan 23 04:59:17 np0005593233 systemd-udevd[259218]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:59:17 np0005593233 systemd-udevd[259223]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:59:17 np0005593233 systemd-udevd[259222]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.535 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:a6:73 10.100.0.9'], port_security=['fa:16:3e:7f:a6:73 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bd04a8e-3b21-48a4-942d-6ede17d32ccd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7dcd690-014c-42e5-b3de-8c4087e847b3, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=446c7501-f73f-4cbf-8a43-e78948d8bec1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5382] manager: (tap79d4ff96-89): new Tun device (/org/freedesktop/NetworkManager/Devices/195)
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.537 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:d3:23 10.1.1.42'], port_security=['fa:16:3e:9c:d3:23 10.1.1.42'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-1194929224', 'neutron:cidrs': '10.1.1.42/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c166c58f-c448-497c-a470-dba6713fe726', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-1194929224', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'e99a8f8e-4511-419c-93e3-9e8ca4222a4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0d1561f-5200-4636-8708-07c79c136e52, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=14f0f686-d5a3-4f53-a3d8-30c646ece1c3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.537 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 446c7501-f73f-4cbf-8a43-e78948d8bec1 in datapath 9bd04a8e-3b21-48a4-942d-6ede17d32ccd bound to our chassis#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.539 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9bd04a8e-3b21-48a4-942d-6ede17d32ccd#033[00m
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5459] device (tap446c7501-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5470] device (tap446c7501-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:59:17 np0005593233 systemd-udevd[259232]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5508] device (tape75ad2d5-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5518] device (tape75ad2d5-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5545] device (tap14f0f686-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.553 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[19ca529e-1df7-4c14-bb1b-4d55e5b5c5b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.554 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9bd04a8e-31 in ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5556] device (tap14f0f686-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.556 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9bd04a8e-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.557 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[763ab369-28c4-4e23-910a-69090a13bbff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.558 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9c05b0b9-1c02-41a2-9967-9de9db955beb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 systemd-udevd[259233]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5634] manager: (tap5598d21a-d2): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.572 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1b378a-6912-4852-b577-5db2c1c218d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.5822] manager: (tapa98412a1-03): new Tun device (/org/freedesktop/NetworkManager/Devices/197)
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6045] manager: (tap6d043e19-50): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.605 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5843b64e-7391-4d1e-913c-aff39253a294]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00392|binding|INFO|Claiming lport e75ad2d5-c059-4218-adfe-89823d98a762 for this chassis.
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00393|binding|INFO|e75ad2d5-c059-4218-adfe-89823d98a762: Claiming fa:16:3e:a4:d2:12 10.1.1.51
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6260] device (tapa98412a1-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:59:17 np0005593233 kernel: tapa98412a1-03: entered promiscuous mode
Jan 23 04:59:17 np0005593233 kernel: tap5598d21a-d2: entered promiscuous mode
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6280] device (tap5598d21a-d2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6289] device (tapa98412a1-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:59:17 np0005593233 kernel: tap79d4ff96-89: entered promiscuous mode
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6295] device (tap79d4ff96-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6303] device (tap5598d21a-d2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:59:17 np0005593233 kernel: tap6d043e19-50: entered promiscuous mode
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6308] device (tap6d043e19-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.630 222021 DEBUG nova.objects.instance [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lazy-loading 'migration_context' on Instance uuid 2e38e1c5-8495-488c-a1d1-082804b39c88 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6318] device (tap79d4ff96-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6323] device (tap6d043e19-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.635 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00394|if_status|INFO|Not updating pb chassis for 5598d21a-d2b3-4fe1-ae77-65ec863bd48e now as sb is readonly
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00395|binding|INFO|Claiming lport 6d043e19-5004-46eb-b144-cc1b3476f21f for this chassis.
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00396|binding|INFO|6d043e19-5004-46eb-b144-cc1b3476f21f: Claiming fa:16:3e:9e:ac:0a 10.2.2.200
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00397|binding|INFO|Claiming lport 79d4ff96-8918-4a77-9ba5-62ac2bc78903 for this chassis.
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00398|binding|INFO|79d4ff96-8918-4a77-9ba5-62ac2bc78903: Claiming fa:16:3e:0a:85:d8 10.1.1.184
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00399|binding|INFO|Claiming lport a98412a1-0341-4835-956a-c4201becd7ef for this chassis.
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00400|binding|INFO|a98412a1-0341-4835-956a-c4201becd7ef: Claiming fa:16:3e:eb:eb:a0 10.2.2.100
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00401|binding|INFO|Claiming lport 5598d21a-d2b3-4fe1-ae77-65ec863bd48e for this chassis.
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00402|binding|INFO|5598d21a-d2b3-4fe1-ae77-65ec863bd48e: Claiming fa:16:3e:c2:b8:70 10.1.1.102
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.644 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:d2:12 10.1.1.51'], port_security=['fa:16:3e:a4:d2:12 10.1.1.51'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-504651905', 'neutron:cidrs': '10.1.1.51/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c166c58f-c448-497c-a470-dba6713fe726', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-504651905', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'e99a8f8e-4511-419c-93e3-9e8ca4222a4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0d1561f-5200-4636-8708-07c79c136e52, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=e75ad2d5-c059-4218-adfe-89823d98a762) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:17 np0005593233 systemd-machined[190954]: New machine qemu-45-instance-00000059.
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.654 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00403|binding|INFO|Setting lport 446c7501-f73f-4cbf-8a43-e78948d8bec1 ovn-installed in OVS
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00404|binding|INFO|Setting lport 14f0f686-d5a3-4f53-a3d8-30c646ece1c3 ovn-installed in OVS
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00405|binding|INFO|Setting lport 446c7501-f73f-4cbf-8a43-e78948d8bec1 up in Southbound
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00406|binding|INFO|Setting lport 14f0f686-d5a3-4f53-a3d8-30c646ece1c3 up in Southbound
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.657 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Ensure instance console log exists: /var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.658 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.658 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.658 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.659 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.660 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:b8:70 10.1.1.102'], port_security=['fa:16:3e:c2:b8:70 10.1.1.102'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.102/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c166c58f-c448-497c-a470-dba6713fe726', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0d1561f-5200-4636-8708-07c79c136e52, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5598d21a-d2b3-4fe1-ae77-65ec863bd48e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.662 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:85:d8 10.1.1.184'], port_security=['fa:16:3e:0a:85:d8 10.1.1.184'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.184/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c166c58f-c448-497c-a470-dba6713fe726', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0d1561f-5200-4636-8708-07c79c136e52, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=79d4ff96-8918-4a77-9ba5-62ac2bc78903) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.664 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:ac:0a 10.2.2.200'], port_security=['fa:16:3e:9e:ac:0a 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f53c3172-2247-4d09-9d71-c69e6158a9f6, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=6d043e19-5004-46eb-b144-cc1b3476f21f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.666 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:eb:a0 10.2.2.100'], port_security=['fa:16:3e:eb:eb:a0 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f53c3172-2247-4d09-9d71-c69e6158a9f6, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=a98412a1-0341-4835-956a-c4201becd7ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.659 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3d5129b7-883d-416d-aebc-e45ae35f3231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 systemd[1]: Started Virtual Machine qemu-45-instance-00000059.
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.6795] manager: (tap9bd04a8e-30): new Veth device (/org/freedesktop/NetworkManager/Devices/199)
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.678 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cf2c874b-b1d0-4bfc-ac63-951a9bfbfd13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.729 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f09f90-5993-4a8b-bbf2-42e6120a0f32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.733 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[42d64ab3-6538-472f-901f-8eb169066414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00407|binding|INFO|Setting lport e75ad2d5-c059-4218-adfe-89823d98a762 ovn-installed in OVS
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00408|binding|INFO|Setting lport e75ad2d5-c059-4218-adfe-89823d98a762 up in Southbound
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00409|binding|INFO|Setting lport 5598d21a-d2b3-4fe1-ae77-65ec863bd48e ovn-installed in OVS
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00410|binding|INFO|Setting lport 5598d21a-d2b3-4fe1-ae77-65ec863bd48e up in Southbound
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00411|binding|INFO|Setting lport a98412a1-0341-4835-956a-c4201becd7ef ovn-installed in OVS
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00412|binding|INFO|Setting lport a98412a1-0341-4835-956a-c4201becd7ef up in Southbound
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00413|binding|INFO|Setting lport 6d043e19-5004-46eb-b144-cc1b3476f21f ovn-installed in OVS
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00414|binding|INFO|Setting lport 6d043e19-5004-46eb-b144-cc1b3476f21f up in Southbound
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00415|binding|INFO|Setting lport 79d4ff96-8918-4a77-9ba5-62ac2bc78903 ovn-installed in OVS
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00416|binding|INFO|Setting lport 79d4ff96-8918-4a77-9ba5-62ac2bc78903 up in Southbound
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.742 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.7626] device (tap9bd04a8e-30): carrier: link connected
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.769 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b93885cb-c6b8-4253-a981-0274f5e5ac01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.790 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e135fa23-3a4e-4739-9df8-5aa189b1a935]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bd04a8e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:61:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627039, 'reachable_time': 25901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259287, 'error': None, 'target': 'ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.809 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2488240a-3d6d-42c6-a363-806e36d4e880]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:61ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627039, 'tstamp': 627039}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259288, 'error': None, 'target': 'ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.831 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d558d2-0f6e-49eb-a31d-976178ed81dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bd04a8e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:61:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627039, 'reachable_time': 25901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259289, 'error': None, 'target': 'ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.866 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1a07ccb9-8052-45a5-a748-81914f1f5a09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.940 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b79c13d0-9a45-4f14-9093-41a7ae90bfd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.941 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bd04a8e-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.941 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.942 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bd04a8e-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:17 np0005593233 NetworkManager[48871]: <info>  [1769162357.9445] manager: (tap9bd04a8e-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.944 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:17 np0005593233 kernel: tap9bd04a8e-30: entered promiscuous mode
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.948 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.948 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9bd04a8e-30, col_values=(('external_ids', {'iface-id': 'b4b2029e-fa11-4534-9927-8291416ede63'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.949 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:17 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:17Z|00417|binding|INFO|Releasing lport b4b2029e-fa11-4534-9927-8291416ede63 from this chassis (sb_readonly=0)
Jan 23 04:59:17 np0005593233 nova_compute[222017]: 2026-01-23 09:59:17.966 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.967 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9bd04a8e-3b21-48a4-942d-6ede17d32ccd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9bd04a8e-3b21-48a4-942d-6ede17d32ccd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.968 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf13c77-b9af-43a8-89a7-9a5bc5d036d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.969 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-9bd04a8e-3b21-48a4-942d-6ede17d32ccd
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/9bd04a8e-3b21-48a4-942d-6ede17d32ccd.pid.haproxy
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 9bd04a8e-3b21-48a4-942d-6ede17d32ccd
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:59:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:17.970 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd', 'env', 'PROCESS_TAG=haproxy-9bd04a8e-3b21-48a4-942d-6ede17d32ccd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9bd04a8e-3b21-48a4-942d-6ede17d32ccd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.307 222021 DEBUG nova.network.neutron [req-dbeb3f52-13eb-486a-9f8f-77aab80c6af3 req-bb23983c-5766-4cad-9152-264e67e73ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updated VIF entry in instance network info cache for port 6d043e19-5004-46eb-b144-cc1b3476f21f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.308 222021 DEBUG nova.network.neutron [req-dbeb3f52-13eb-486a-9f8f-77aab80c6af3 req-bb23983c-5766-4cad-9152-264e67e73ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [{"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.376 222021 DEBUG oslo_concurrency.lockutils [req-dbeb3f52-13eb-486a-9f8f-77aab80c6af3 req-bb23983c-5766-4cad-9152-264e67e73ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:59:18 np0005593233 podman[259390]: 2026-01-23 09:59:18.463230223 +0000 UTC m=+0.067716169 container create a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 04:59:18 np0005593233 systemd[1]: Started libpod-conmon-a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992.scope.
Jan 23 04:59:18 np0005593233 podman[259390]: 2026-01-23 09:59:18.427735128 +0000 UTC m=+0.032221104 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:59:18 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:59:18 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3794ea3b2c83101d50c549cb1e3f3aa769a3c4b6aab8c190b1fc70a0a30ac333/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.575 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162358.5743916, 6d680830-de0e-445d-9d57-b3b0724cb5a8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.577 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] VM Started (Lifecycle Event)#033[00m
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.618 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.623 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162358.5751214, 6d680830-de0e-445d-9d57-b3b0724cb5a8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.624 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:59:18 np0005593233 podman[259390]: 2026-01-23 09:59:18.629686929 +0000 UTC m=+0.234172885 container init a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:59:18 np0005593233 podman[259390]: 2026-01-23 09:59:18.636939782 +0000 UTC m=+0.241425728 container start a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 04:59:18 np0005593233 neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd[259415]: [NOTICE]   (259420) : New worker (259422) forked
Jan 23 04:59:18 np0005593233 neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd[259415]: [NOTICE]   (259420) : Loading success.
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.676 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.681 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.718 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 14f0f686-d5a3-4f53-a3d8-30c646ece1c3 in datapath c166c58f-c448-497c-a470-dba6713fe726 unbound from our chassis#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.722 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c166c58f-c448-497c-a470-dba6713fe726#033[00m
Jan 23 04:59:18 np0005593233 nova_compute[222017]: 2026-01-23 09:59:18.729 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.737 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a82f3eee-e1ee-422f-85d9-61e6c4ee3261]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.738 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc166c58f-c1 in ovnmeta-c166c58f-c448-497c-a470-dba6713fe726 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.743 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc166c58f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.743 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[745b676a-c84b-47d2-a1be-d44c5dbf0e6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.744 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e2404225-9606-4fe1-a843-08f5f528d9ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.761 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[d08b81eb-6433-4ca9-9a22-ffa83d63bef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.778 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9b812fb4-8f39-47f0-b599-5aa5710264b2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.826 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[232ee0a1-1116-4692-8b59-1d218f2f2e0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 NetworkManager[48871]: <info>  [1769162358.8363] manager: (tapc166c58f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.836 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[056f803c-0975-461b-a115-e1853ac78a15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 systemd-udevd[259269]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.885 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[65d297ce-45d0-49eb-8902-737d535aac61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.889 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3591a7a3-c64f-45b0-95ba-d2be992dd160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 NetworkManager[48871]: <info>  [1769162358.9210] device (tapc166c58f-c0): carrier: link connected
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.926 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e0074ac5-77c2-4920-8a13-d16ea061621b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.951 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d6253d57-4c3b-4fb8-ab20-823af6f3ad53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc166c58f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:27:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627155, 'reachable_time': 20523, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259441, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.972 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[76d99ad0-d37c-40d2-9465-cc333da127ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:2774'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627155, 'tstamp': 627155}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259442, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:18.996 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[573e40ae-446e-40d3-8e6d-34204556a646]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc166c58f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:27:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627155, 'reachable_time': 20523, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259443, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.040 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ddd7031d-e517-44ea-b3f5-dee188b8a4c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.120 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[651df2c3-ce37-4b05-883c-62a0dcdcf661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.125 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc166c58f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.125 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.126 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc166c58f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.128 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:19 np0005593233 NetworkManager[48871]: <info>  [1769162359.1295] manager: (tapc166c58f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 23 04:59:19 np0005593233 kernel: tapc166c58f-c0: entered promiscuous mode
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.132 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.133 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc166c58f-c0, col_values=(('external_ids', {'iface-id': '6cef1fb4-85cb-4b31-819a-b0e3e6793f34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:19 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:19Z|00418|binding|INFO|Releasing lport 6cef1fb4-85cb-4b31-819a-b0e3e6793f34 from this chassis (sb_readonly=0)
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.135 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.152 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.153 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c166c58f-c448-497c-a470-dba6713fe726.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c166c58f-c448-497c-a470-dba6713fe726.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.155 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6f23b2-d174-436b-9af6-ed9f9f12773e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.156 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-c166c58f-c448-497c-a470-dba6713fe726
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/c166c58f-c448-497c-a470-dba6713fe726.pid.haproxy
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID c166c58f-c448-497c-a470-dba6713fe726
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.157 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'env', 'PROCESS_TAG=haproxy-c166c58f-c448-497c-a470-dba6713fe726', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c166c58f-c448-497c-a470-dba6713fe726.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:59:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:19.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.306 222021 DEBUG nova.compute.manager [req-221b4d0f-1327-4728-b722-08f7e1c27bf2 req-8dab42be-4504-49ac-b186-a16378f7d5d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.307 222021 DEBUG oslo_concurrency.lockutils [req-221b4d0f-1327-4728-b722-08f7e1c27bf2 req-8dab42be-4504-49ac-b186-a16378f7d5d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.307 222021 DEBUG oslo_concurrency.lockutils [req-221b4d0f-1327-4728-b722-08f7e1c27bf2 req-8dab42be-4504-49ac-b186-a16378f7d5d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.308 222021 DEBUG oslo_concurrency.lockutils [req-221b4d0f-1327-4728-b722-08f7e1c27bf2 req-8dab42be-4504-49ac-b186-a16378f7d5d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.308 222021 DEBUG nova.compute.manager [req-221b4d0f-1327-4728-b722-08f7e1c27bf2 req-8dab42be-4504-49ac-b186-a16378f7d5d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Processing event network-vif-plugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:59:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:19.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.424 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.424 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.527 222021 DEBUG nova.compute.manager [req-0fcaf274-7f1f-48cc-83da-1e7f920f085d req-f6850a7c-09f5-4fec-924a-f9351e933ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-6d043e19-5004-46eb-b144-cc1b3476f21f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.528 222021 DEBUG oslo_concurrency.lockutils [req-0fcaf274-7f1f-48cc-83da-1e7f920f085d req-f6850a7c-09f5-4fec-924a-f9351e933ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.528 222021 DEBUG oslo_concurrency.lockutils [req-0fcaf274-7f1f-48cc-83da-1e7f920f085d req-f6850a7c-09f5-4fec-924a-f9351e933ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.528 222021 DEBUG oslo_concurrency.lockutils [req-0fcaf274-7f1f-48cc-83da-1e7f920f085d req-f6850a7c-09f5-4fec-924a-f9351e933ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.529 222021 DEBUG nova.compute.manager [req-0fcaf274-7f1f-48cc-83da-1e7f920f085d req-f6850a7c-09f5-4fec-924a-f9351e933ef8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Processing event network-vif-plugged-6d043e19-5004-46eb-b144-cc1b3476f21f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:59:19 np0005593233 podman[259475]: 2026-01-23 09:59:19.552130356 +0000 UTC m=+0.056729421 container create 593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:59:19 np0005593233 systemd[1]: Started libpod-conmon-593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc.scope.
Jan 23 04:59:19 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:59:19 np0005593233 podman[259475]: 2026-01-23 09:59:19.523970347 +0000 UTC m=+0.028569442 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:59:19 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01cdf5c626a41c62d7a4320133484d11f3d12b5f7b8375a62b188d73465c9632/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:19 np0005593233 podman[259475]: 2026-01-23 09:59:19.634372972 +0000 UTC m=+0.138972057 container init 593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:59:19 np0005593233 podman[259475]: 2026-01-23 09:59:19.64143502 +0000 UTC m=+0.146034085 container start 593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:59:19 np0005593233 neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726[259490]: [NOTICE]   (259494) : New worker (259496) forked
Jan 23 04:59:19 np0005593233 neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726[259490]: [NOTICE]   (259494) : Loading success.
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.731 140224 INFO neutron.agent.ovn.metadata.agent [-] Port e75ad2d5-c059-4218-adfe-89823d98a762 in datapath c166c58f-c448-497c-a470-dba6713fe726 unbound from our chassis#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.734 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c166c58f-c448-497c-a470-dba6713fe726#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.755 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3b29e5b9-fb65-4bbc-87ee-09c7d0af3c68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.802 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d30ecd1a-a895-47c8-beaa-89ec8e8bcdcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.807 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0b7834dc-902e-4b07-989c-219ccf3b4e93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.844 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6518d8-2761-4ab1-b015-d4585e9a7285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.868 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6fd7ebab-9f26-4c0c-ace6-da3f6c7106e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc166c58f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:27:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627155, 'reachable_time': 20523, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259510, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.891 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d193bb01-34bd-466c-913d-e0b834afb4ef]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapc166c58f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627170, 'tstamp': 627170}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259511, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc166c58f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627174, 'tstamp': 627174}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259511, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.893 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc166c58f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.937 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:19 np0005593233 nova_compute[222017]: 2026-01-23 09:59:19.938 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.939 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc166c58f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.941 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.942 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc166c58f-c0, col_values=(('external_ids', {'iface-id': '6cef1fb4-85cb-4b31-819a-b0e3e6793f34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.942 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.943 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5598d21a-d2b3-4fe1-ae77-65ec863bd48e in datapath c166c58f-c448-497c-a470-dba6713fe726 unbound from our chassis#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.945 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c166c58f-c448-497c-a470-dba6713fe726#033[00m
Jan 23 04:59:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:19.966 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[14480cab-7d92-464b-ac6e-7bfaa84dd5f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.006 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[45dca124-7bfe-4d26-95aa-c20b1b8a6451]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.012 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d52ab8a6-fb06-4211-aef6-876ada075439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.055 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[4057bfb4-7379-476f-bacb-e96aab3c258a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.080 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[da489417-ad67-4ad8-8a4f-a2707d37b8b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc166c58f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:27:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 7, 'rx_bytes': 266, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 7, 'rx_bytes': 266, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627155, 'reachable_time': 20523, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259517, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.105 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c650db84-f438-4e51-94a3-3ac3623780f3]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapc166c58f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627170, 'tstamp': 627170}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259518, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc166c58f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627174, 'tstamp': 627174}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259518, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.107 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc166c58f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:20 np0005593233 nova_compute[222017]: 2026-01-23 09:59:20.109 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.111 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc166c58f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.112 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.112 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc166c58f-c0, col_values=(('external_ids', {'iface-id': '6cef1fb4-85cb-4b31-819a-b0e3e6793f34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.113 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.114 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 79d4ff96-8918-4a77-9ba5-62ac2bc78903 in datapath c166c58f-c448-497c-a470-dba6713fe726 unbound from our chassis#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.117 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c166c58f-c448-497c-a470-dba6713fe726#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.136 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1230eccf-e7c8-40b9-91da-b1f677e5f314]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.176 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0c5a91ef-28cb-4a28-8d28-ab114e689fa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.181 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7adf8877-3751-4dcc-90dd-f0034133e7bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.225 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6b00fdec-9383-4a66-b2fb-11b918ecee2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.253 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[baa1fe87-5bf2-476e-8a05-ccf5ffb061ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc166c58f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:27:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 9, 'rx_bytes': 266, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 9, 'rx_bytes': 266, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627155, 'reachable_time': 20523, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259524, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.282 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4cc150-75e0-4ae1-8669-0db2314d4dbf]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.1.2'], ['IFA_LOCAL', '10.1.1.2'], ['IFA_BROADCAST', '10.1.1.255'], ['IFA_LABEL', 'tapc166c58f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627170, 'tstamp': 627170}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259525, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc166c58f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627174, 'tstamp': 627174}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259525, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.284 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc166c58f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:20 np0005593233 nova_compute[222017]: 2026-01-23 09:59:20.287 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.289 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc166c58f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.289 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.289 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc166c58f-c0, col_values=(('external_ids', {'iface-id': '6cef1fb4-85cb-4b31-819a-b0e3e6793f34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.290 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.291 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 6d043e19-5004-46eb-b144-cc1b3476f21f in datapath dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3 unbound from our chassis#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.295 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.312 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[732d4a1f-6c3d-4be4-b799-cdff8534afa8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.313 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdcd83a8b-31 in ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.316 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdcd83a8b-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.316 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3fa94559-6de8-467e-b744-247d31840ad7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.317 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9f83245c-f0a8-493e-a454-0d359a3f85f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.333 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[2290422d-bc6e-4c62-b468-0fba425015a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.351 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[31a6cd41-d865-4489-8b46-261ecee9e645]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.395 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[1bcd9719-2c24-4636-8204-36d4e573de66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 NetworkManager[48871]: <info>  [1769162360.4022] manager: (tapdcd83a8b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/203)
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.403 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[04cb5b82-7de2-4917-809a-953224fdbbd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.451 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8859cded-d7eb-4cb6-a4d0-190a46903feb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.455 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2417f1ef-042a-46a2-bfbf-b86e7cf72182]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 NetworkManager[48871]: <info>  [1769162360.4841] device (tapdcd83a8b-30): carrier: link connected
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.492 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a376cecb-8e36-4ebd-a852-74c3168531e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.517 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[04acfdba-151f-4960-84c6-8345219fd2a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcd83a8b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:3c:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627311, 'reachable_time': 22952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259536, 'error': None, 'target': 'ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 nova_compute[222017]: 2026-01-23 09:59:20.526 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.538 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e1928036-ea25-44f8-9e84-8fd1b1d42a82]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0d:3c79'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627311, 'tstamp': 627311}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259537, 'error': None, 'target': 'ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.562 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c5940b44-2917-4f46-900b-b7c017d377ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcd83a8b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:3c:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627311, 'reachable_time': 22952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259538, 'error': None, 'target': 'ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.602 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[31029177-b2d0-4289-b612-dc3c2ccb60b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.675 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aeec2b53-c33b-4e49-9f23-a79bd5b5ffb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.677 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcd83a8b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.678 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.678 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcd83a8b-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:20 np0005593233 NetworkManager[48871]: <info>  [1769162360.6813] manager: (tapdcd83a8b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Jan 23 04:59:20 np0005593233 nova_compute[222017]: 2026-01-23 09:59:20.680 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:20 np0005593233 kernel: tapdcd83a8b-30: entered promiscuous mode
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.685 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdcd83a8b-30, col_values=(('external_ids', {'iface-id': '734f76ca-5797-4a08-bce7-bcdc6ad19947'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:20 np0005593233 nova_compute[222017]: 2026-01-23 09:59:20.687 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:20 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:20Z|00419|binding|INFO|Releasing lport 734f76ca-5797-4a08-bce7-bcdc6ad19947 from this chassis (sb_readonly=0)
Jan 23 04:59:20 np0005593233 nova_compute[222017]: 2026-01-23 09:59:20.700 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.702 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.703 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b4df8b-b507-40ce-b3c6-6ed0cf4c49cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.705 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3.pid.haproxy
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:59:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:20.706 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'env', 'PROCESS_TAG=haproxy-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:59:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:21 np0005593233 podman[259571]: 2026-01-23 09:59:21.135713635 +0000 UTC m=+0.060927198 container create d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 04:59:21 np0005593233 systemd[1]: Started libpod-conmon-d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506.scope.
Jan 23 04:59:21 np0005593233 podman[259571]: 2026-01-23 09:59:21.10593786 +0000 UTC m=+0.031151463 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:59:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:21.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:21 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:59:21 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62d5d3493acace1641cd29182f3a40e30188e1f0e633b26106eba08e38528d2b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:21 np0005593233 podman[259571]: 2026-01-23 09:59:21.237457817 +0000 UTC m=+0.162671380 container init d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:59:21 np0005593233 podman[259571]: 2026-01-23 09:59:21.243811035 +0000 UTC m=+0.169024608 container start d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 04:59:21 np0005593233 neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3[259586]: [NOTICE]   (259590) : New worker (259592) forked
Jan 23 04:59:21 np0005593233 neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3[259586]: [NOTICE]   (259590) : Loading success.
Jan 23 04:59:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:21.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.339 140224 INFO neutron.agent.ovn.metadata.agent [-] Port a98412a1-0341-4835-956a-c4201becd7ef in datapath dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3 unbound from our chassis#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.342 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.360 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d62278e0-d304-45c5-b4f4-05f9b6a2fbc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.400 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[83023b2b-7722-4d77-9344-43f75816b3c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.405 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5caafb22-c244-400b-81e4-b64732589282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.439 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b17e3df9-c5ac-45a4-925f-8154ea3f3ca3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.460 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ea0fdebe-9acf-4b73-a69f-545509bb1be9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcd83a8b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0d:3c:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627311, 'reachable_time': 22952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259606, 'error': None, 'target': 'ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.484 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[21dba5af-26bc-4725-bdd0-965419a4376a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdcd83a8b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627326, 'tstamp': 627326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259607, 'error': None, 'target': 'ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.2.2.2'], ['IFA_LOCAL', '10.2.2.2'], ['IFA_BROADCAST', '10.2.2.255'], ['IFA_LABEL', 'tapdcd83a8b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627330, 'tstamp': 627330}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259607, 'error': None, 'target': 'ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.486 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcd83a8b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:21 np0005593233 nova_compute[222017]: 2026-01-23 09:59:21.488 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:21 np0005593233 nova_compute[222017]: 2026-01-23 09:59:21.490 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.490 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcd83a8b-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.491 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.491 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdcd83a8b-30, col_values=(('external_ids', {'iface-id': '734f76ca-5797-4a08-bce7-bcdc6ad19947'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.492 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:21.493 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.388 222021 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Successfully created port: 60096294-f3c4-470e-9e79-14735e5625ef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:59:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:22.495 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.608 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.745 222021 DEBUG nova.compute.manager [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.746 222021 DEBUG oslo_concurrency.lockutils [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.746 222021 DEBUG oslo_concurrency.lockutils [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.746 222021 DEBUG oslo_concurrency.lockutils [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.746 222021 DEBUG nova.compute.manager [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No event matching network-vif-plugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 in dict_keys([('network-vif-plugged', '446c7501-f73f-4cbf-8a43-e78948d8bec1'), ('network-vif-plugged', 'e75ad2d5-c059-4218-adfe-89823d98a762'), ('network-vif-plugged', '79d4ff96-8918-4a77-9ba5-62ac2bc78903'), ('network-vif-plugged', '5598d21a-d2b3-4fe1-ae77-65ec863bd48e'), ('network-vif-plugged', 'a98412a1-0341-4835-956a-c4201becd7ef')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.747 222021 WARNING nova.compute.manager [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.747 222021 DEBUG nova.compute.manager [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.747 222021 DEBUG oslo_concurrency.lockutils [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.748 222021 DEBUG oslo_concurrency.lockutils [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.748 222021 DEBUG oslo_concurrency.lockutils [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.748 222021 DEBUG nova.compute.manager [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Processing event network-vif-plugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.748 222021 DEBUG nova.compute.manager [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.749 222021 DEBUG oslo_concurrency.lockutils [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.749 222021 DEBUG oslo_concurrency.lockutils [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.749 222021 DEBUG oslo_concurrency.lockutils [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.749 222021 DEBUG nova.compute.manager [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No event matching network-vif-plugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 in dict_keys([('network-vif-plugged', 'e75ad2d5-c059-4218-adfe-89823d98a762'), ('network-vif-plugged', '79d4ff96-8918-4a77-9ba5-62ac2bc78903'), ('network-vif-plugged', '5598d21a-d2b3-4fe1-ae77-65ec863bd48e'), ('network-vif-plugged', 'a98412a1-0341-4835-956a-c4201becd7ef')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.750 222021 WARNING nova.compute.manager [req-4ac0b8f0-d5e5-48d8-8d80-d31a6260b7c7 req-c693e4b7-1e0e-4d55-8ae1-1702fb23cc4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.901 222021 DEBUG nova.compute.manager [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-6d043e19-5004-46eb-b144-cc1b3476f21f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.902 222021 DEBUG oslo_concurrency.lockutils [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.902 222021 DEBUG oslo_concurrency.lockutils [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.902 222021 DEBUG oslo_concurrency.lockutils [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.903 222021 DEBUG nova.compute.manager [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No event matching network-vif-plugged-6d043e19-5004-46eb-b144-cc1b3476f21f in dict_keys([('network-vif-plugged', 'e75ad2d5-c059-4218-adfe-89823d98a762'), ('network-vif-plugged', '79d4ff96-8918-4a77-9ba5-62ac2bc78903'), ('network-vif-plugged', '5598d21a-d2b3-4fe1-ae77-65ec863bd48e'), ('network-vif-plugged', 'a98412a1-0341-4835-956a-c4201becd7ef')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.903 222021 WARNING nova.compute.manager [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-6d043e19-5004-46eb-b144-cc1b3476f21f for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.903 222021 DEBUG nova.compute.manager [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.903 222021 DEBUG oslo_concurrency.lockutils [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.903 222021 DEBUG oslo_concurrency.lockutils [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.904 222021 DEBUG oslo_concurrency.lockutils [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.904 222021 DEBUG nova.compute.manager [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Processing event network-vif-plugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.904 222021 DEBUG nova.compute.manager [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.904 222021 DEBUG oslo_concurrency.lockutils [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.904 222021 DEBUG oslo_concurrency.lockutils [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.905 222021 DEBUG oslo_concurrency.lockutils [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.905 222021 DEBUG nova.compute.manager [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No event matching network-vif-plugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 in dict_keys([('network-vif-plugged', 'e75ad2d5-c059-4218-adfe-89823d98a762'), ('network-vif-plugged', '5598d21a-d2b3-4fe1-ae77-65ec863bd48e'), ('network-vif-plugged', 'a98412a1-0341-4835-956a-c4201becd7ef')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:59:22 np0005593233 nova_compute[222017]: 2026-01-23 09:59:22.905 222021 WARNING nova.compute.manager [req-2a29dcf3-104f-462d-af13-eec9abc898d8 req-e22b5d46-2800-44fe-bff1-d91f2e40408e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.051 222021 DEBUG nova.compute.manager [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-a98412a1-0341-4835-956a-c4201becd7ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.052 222021 DEBUG oslo_concurrency.lockutils [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.053 222021 DEBUG oslo_concurrency.lockutils [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.054 222021 DEBUG oslo_concurrency.lockutils [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.054 222021 DEBUG nova.compute.manager [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Processing event network-vif-plugged-a98412a1-0341-4835-956a-c4201becd7ef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.055 222021 DEBUG nova.compute.manager [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-a98412a1-0341-4835-956a-c4201becd7ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.055 222021 DEBUG oslo_concurrency.lockutils [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.056 222021 DEBUG oslo_concurrency.lockutils [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.056 222021 DEBUG oslo_concurrency.lockutils [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.057 222021 DEBUG nova.compute.manager [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No event matching network-vif-plugged-a98412a1-0341-4835-956a-c4201becd7ef in dict_keys([('network-vif-plugged', 'e75ad2d5-c059-4218-adfe-89823d98a762'), ('network-vif-plugged', '5598d21a-d2b3-4fe1-ae77-65ec863bd48e')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.057 222021 WARNING nova.compute.manager [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-a98412a1-0341-4835-956a-c4201becd7ef for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.058 222021 DEBUG nova.compute.manager [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.058 222021 DEBUG oslo_concurrency.lockutils [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.059 222021 DEBUG oslo_concurrency.lockutils [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.059 222021 DEBUG oslo_concurrency.lockutils [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:23 np0005593233 nova_compute[222017]: 2026-01-23 09:59:23.060 222021 DEBUG nova.compute.manager [req-4da1d534-9729-4c55-ac7c-fb308e1bf127 req-67cba60b-d216-4ba4-85fe-7baff9def157 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Processing event network-vif-plugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:59:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:23.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:23.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:24 np0005593233 podman[259608]: 2026-01-23 09:59:24.044688707 +0000 UTC m=+0.056000111 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 04:59:24 np0005593233 nova_compute[222017]: 2026-01-23 09:59:24.448 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:25.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:25.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:25 np0005593233 nova_compute[222017]: 2026-01-23 09:59:25.531 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.338 222021 DEBUG nova.compute.manager [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-e75ad2d5-c059-4218-adfe-89823d98a762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.339 222021 DEBUG oslo_concurrency.lockutils [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.339 222021 DEBUG oslo_concurrency.lockutils [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.339 222021 DEBUG oslo_concurrency.lockutils [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.340 222021 DEBUG nova.compute.manager [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Processing event network-vif-plugged-e75ad2d5-c059-4218-adfe-89823d98a762 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.340 222021 DEBUG nova.compute.manager [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-e75ad2d5-c059-4218-adfe-89823d98a762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.340 222021 DEBUG oslo_concurrency.lockutils [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.341 222021 DEBUG oslo_concurrency.lockutils [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.341 222021 DEBUG oslo_concurrency.lockutils [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.341 222021 DEBUG nova.compute.manager [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-plugged-e75ad2d5-c059-4218-adfe-89823d98a762 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.341 222021 WARNING nova.compute.manager [req-b4df6ff8-d4b1-447d-ab0c-9ba9da498058 req-7f4fe6bb-4221-45d8-a3d9-7043b9ef449b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-e75ad2d5-c059-4218-adfe-89823d98a762 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.342 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance event wait completed in 7 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.348 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162366.3481178, 6d680830-de0e-445d-9d57-b3b0724cb5a8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.348 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.351 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.356 222021 INFO nova.virt.libvirt.driver [-] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance spawned successfully.#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.357 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.455 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.462 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.546 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.552 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.553 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.554 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.554 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.555 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.555 222021 DEBUG nova.virt.libvirt.driver [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.966 222021 INFO nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Took 78.00 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:59:26 np0005593233 nova_compute[222017]: 2026-01-23 09:59:26.967 222021 DEBUG nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:27.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:27.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.360 222021 DEBUG nova.compute.manager [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.361 222021 DEBUG oslo_concurrency.lockutils [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.361 222021 DEBUG oslo_concurrency.lockutils [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.361 222021 DEBUG oslo_concurrency.lockutils [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.361 222021 DEBUG nova.compute.manager [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-plugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.362 222021 WARNING nova.compute.manager [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.462 222021 INFO nova.compute.manager [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Took 87.69 seconds to build instance.#033[00m
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.652 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:27 np0005593233 nova_compute[222017]: 2026-01-23 09:59:27.755 222021 DEBUG oslo_concurrency.lockutils [None req-c5d3151e-08c0-4775-a874-aa6c18958d1c 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 88.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.089 222021 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Successfully updated port: 60096294-f3c4-470e-9e79-14735e5625ef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.134 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "refresh_cache-2e38e1c5-8495-488c-a1d1-082804b39c88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.135 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquired lock "refresh_cache-2e38e1c5-8495-488c-a1d1-082804b39c88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.135 222021 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:59:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:29.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:29.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.590 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.591 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.591 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.591 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.592 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.787 222021 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.897 222021 DEBUG nova.compute.manager [req-5d8d98dd-1271-40ff-954f-666097254da1 req-394f39aa-2e8f-455c-b817-8978440ec10b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Received event network-changed-60096294-f3c4-470e-9e79-14735e5625ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.898 222021 DEBUG nova.compute.manager [req-5d8d98dd-1271-40ff-954f-666097254da1 req-394f39aa-2e8f-455c-b817-8978440ec10b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Refreshing instance network info cache due to event network-changed-60096294-f3c4-470e-9e79-14735e5625ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:59:29 np0005593233 nova_compute[222017]: 2026-01-23 09:59:29.899 222021 DEBUG oslo_concurrency.lockutils [req-5d8d98dd-1271-40ff-954f-666097254da1 req-394f39aa-2e8f-455c-b817-8978440ec10b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-2e38e1c5-8495-488c-a1d1-082804b39c88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:59:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2278170754' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.075 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.188 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000059 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.188 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000059 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.189 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000059 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.189 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000059 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.442 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.443 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4352MB free_disk=20.92584228515625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.444 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.444 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:30 np0005593233 nova_compute[222017]: 2026-01-23 09:59:30.533 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:31.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:31.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.449 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 6d680830-de0e-445d-9d57-b3b0724cb5a8 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.451 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 2e38e1c5-8495-488c-a1d1-082804b39c88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.451 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.452 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.480 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.519 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.521 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.542 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.582 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:59:31 np0005593233 nova_compute[222017]: 2026-01-23 09:59:31.757 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2315961543' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.307 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.549s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.317 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.364 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.414 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.970s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.655 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.770 222021 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Updating instance_info_cache with network_info: [{"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.820 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Releasing lock "refresh_cache-2e38e1c5-8495-488c-a1d1-082804b39c88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.821 222021 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Instance network_info: |[{"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.822 222021 DEBUG oslo_concurrency.lockutils [req-5d8d98dd-1271-40ff-954f-666097254da1 req-394f39aa-2e8f-455c-b817-8978440ec10b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-2e38e1c5-8495-488c-a1d1-082804b39c88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.822 222021 DEBUG nova.network.neutron [req-5d8d98dd-1271-40ff-954f-666097254da1 req-394f39aa-2e8f-455c-b817-8978440ec10b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Refreshing network info cache for port 60096294-f3c4-470e-9e79-14735e5625ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.826 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Start _get_guest_xml network_info=[{"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.830 222021 WARNING nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.851 222021 DEBUG nova.virt.libvirt.host [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.853 222021 DEBUG nova.virt.libvirt.host [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.860 222021 DEBUG nova.virt.libvirt.host [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.860 222021 DEBUG nova.virt.libvirt.host [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.862 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.863 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.864 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.864 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.865 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.865 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.865 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.866 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.866 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.866 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.867 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.867 222021 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:59:32 np0005593233 nova_compute[222017]: 2026-01-23 09:59:32.871 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:33.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:33.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:59:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2757717692' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:59:33 np0005593233 nova_compute[222017]: 2026-01-23 09:59:33.788 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.917s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:33 np0005593233 nova_compute[222017]: 2026-01-23 09:59:33.829 222021 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e38e1c5-8495-488c-a1d1-082804b39c88_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:33 np0005593233 nova_compute[222017]: 2026-01-23 09:59:33.837 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:59:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1932111615' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.336 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.340 222021 DEBUG nova.virt.libvirt.vif [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-2087045140',display_name='tempest-MultipleCreateTestJSON-server-2087045140-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-2087045140-2',id=95,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c288779980de4f03be20b7eed343b775',ramdisk_id='',reservation_id='r-2nwggk6q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-351408189',owner_user_name='tempest-MultipleCrea
teTestJSON-351408189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:59:15Z,user_data=None,user_id='d83df80213fd40f99fdc68c146fe9a2a',uuid=2e38e1c5-8495-488c-a1d1-082804b39c88,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.341 222021 DEBUG nova.network.os_vif_util [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converting VIF {"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.342 222021 DEBUG nova.network.os_vif_util [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:2e:f3,bridge_name='br-int',has_traffic_filtering=True,id=60096294-f3c4-470e-9e79-14735e5625ef,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60096294-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.343 222021 DEBUG nova.objects.instance [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e38e1c5-8495-488c-a1d1-082804b39c88 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.384 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <uuid>2e38e1c5-8495-488c-a1d1-082804b39c88</uuid>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <name>instance-0000005f</name>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <nova:name>tempest-MultipleCreateTestJSON-server-2087045140-2</nova:name>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 09:59:32</nova:creationTime>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <nova:user uuid="d83df80213fd40f99fdc68c146fe9a2a">tempest-MultipleCreateTestJSON-351408189-project-member</nova:user>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <nova:project uuid="c288779980de4f03be20b7eed343b775">tempest-MultipleCreateTestJSON-351408189</nova:project>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <nova:port uuid="60096294-f3c4-470e-9e79-14735e5625ef">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <system>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <entry name="serial">2e38e1c5-8495-488c-a1d1-082804b39c88</entry>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <entry name="uuid">2e38e1c5-8495-488c-a1d1-082804b39c88</entry>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    </system>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <os>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  </os>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <features>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  </features>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  </clock>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  <devices>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/2e38e1c5-8495-488c-a1d1-082804b39c88_disk">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/2e38e1c5-8495-488c-a1d1-082804b39c88_disk.config">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      </source>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      </auth>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    </disk>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:6d:2e:f3"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <target dev="tap60096294-f3"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    </interface>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88/console.log" append="off"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    </serial>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <video>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    </video>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    </rng>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 04:59:34 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 04:59:34 np0005593233 nova_compute[222017]:  </devices>
Jan 23 04:59:34 np0005593233 nova_compute[222017]: </domain>
Jan 23 04:59:34 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.392 222021 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Preparing to wait for external event network-vif-plugged-60096294-f3c4-470e-9e79-14735e5625ef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.393 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.393 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.394 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.395 222021 DEBUG nova.virt.libvirt.vif [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-2087045140',display_name='tempest-MultipleCreateTestJSON-server-2087045140-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-2087045140-2',id=95,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c288779980de4f03be20b7eed343b775',ramdisk_id='',reservation_id='r-2nwggk6q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-351408189',owner_user_name='tempest-MultipleCreateTestJSON-351408189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:59:15Z,user_data=None,user_id='d83df80213fd40f99fdc68c146fe9a2a',uuid=2e38e1c5-8495-488c-a1d1-082804b39c88,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.395 222021 DEBUG nova.network.os_vif_util [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converting VIF {"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.396 222021 DEBUG nova.network.os_vif_util [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:2e:f3,bridge_name='br-int',has_traffic_filtering=True,id=60096294-f3c4-470e-9e79-14735e5625ef,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60096294-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.397 222021 DEBUG os_vif [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:2e:f3,bridge_name='br-int',has_traffic_filtering=True,id=60096294-f3c4-470e-9e79-14735e5625ef,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60096294-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.398 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.400 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.400 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.403 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.404 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60096294-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.404 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60096294-f3, col_values=(('external_ids', {'iface-id': '60096294-f3c4-470e-9e79-14735e5625ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:2e:f3', 'vm-uuid': '2e38e1c5-8495-488c-a1d1-082804b39c88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.406 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:34 np0005593233 NetworkManager[48871]: <info>  [1769162374.4075] manager: (tap60096294-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.414 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.414 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.415 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.417 222021 INFO os_vif [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:2e:f3,bridge_name='br-int',has_traffic_filtering=True,id=60096294-f3c4-470e-9e79-14735e5625ef,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60096294-f3')#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.700 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.701 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.701 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] No VIF found with MAC fa:16:3e:6d:2e:f3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.702 222021 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Using config drive#033[00m
Jan 23 04:59:34 np0005593233 nova_compute[222017]: 2026-01-23 09:59:34.730 222021 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e38e1c5-8495-488c-a1d1-082804b39c88_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:35 np0005593233 NetworkManager[48871]: <info>  [1769162375.1405] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Jan 23 04:59:35 np0005593233 nova_compute[222017]: 2026-01-23 09:59:35.139 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:35 np0005593233 NetworkManager[48871]: <info>  [1769162375.1416] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Jan 23 04:59:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:35.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:35Z|00420|binding|INFO|Releasing lport b4b2029e-fa11-4534-9927-8291416ede63 from this chassis (sb_readonly=0)
Jan 23 04:59:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:35Z|00421|binding|INFO|Releasing lport 734f76ca-5797-4a08-bce7-bcdc6ad19947 from this chassis (sb_readonly=0)
Jan 23 04:59:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:35Z|00422|binding|INFO|Releasing lport 6cef1fb4-85cb-4b31-819a-b0e3e6793f34 from this chassis (sb_readonly=0)
Jan 23 04:59:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:35.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:35Z|00423|binding|INFO|Releasing lport b4b2029e-fa11-4534-9927-8291416ede63 from this chassis (sb_readonly=0)
Jan 23 04:59:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:35Z|00424|binding|INFO|Releasing lport 734f76ca-5797-4a08-bce7-bcdc6ad19947 from this chassis (sb_readonly=0)
Jan 23 04:59:35 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:35Z|00425|binding|INFO|Releasing lport 6cef1fb4-85cb-4b31-819a-b0e3e6793f34 from this chassis (sb_readonly=0)
Jan 23 04:59:35 np0005593233 nova_compute[222017]: 2026-01-23 09:59:35.386 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:35 np0005593233 nova_compute[222017]: 2026-01-23 09:59:35.537 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:35 np0005593233 nova_compute[222017]: 2026-01-23 09:59:35.733 222021 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Creating config drive at /var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88/disk.config#033[00m
Jan 23 04:59:35 np0005593233 nova_compute[222017]: 2026-01-23 09:59:35.740 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmd7cnxu_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:35 np0005593233 nova_compute[222017]: 2026-01-23 09:59:35.888 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmd7cnxu_" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:35 np0005593233 nova_compute[222017]: 2026-01-23 09:59:35.924 222021 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e38e1c5-8495-488c-a1d1-082804b39c88_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:35 np0005593233 nova_compute[222017]: 2026-01-23 09:59:35.930 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88/disk.config 2e38e1c5-8495-488c-a1d1-082804b39c88_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.116 222021 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88/disk.config 2e38e1c5-8495-488c-a1d1-082804b39c88_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.186s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.117 222021 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Deleting local config drive /var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88/disk.config because it was imported into RBD.#033[00m
Jan 23 04:59:36 np0005593233 kernel: tap60096294-f3: entered promiscuous mode
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.190 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:36 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:36Z|00426|binding|INFO|Claiming lport 60096294-f3c4-470e-9e79-14735e5625ef for this chassis.
Jan 23 04:59:36 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:36Z|00427|binding|INFO|60096294-f3c4-470e-9e79-14735e5625ef: Claiming fa:16:3e:6d:2e:f3 10.100.0.3
Jan 23 04:59:36 np0005593233 NetworkManager[48871]: <info>  [1769162376.2019] manager: (tap60096294-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Jan 23 04:59:36 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:36Z|00428|binding|INFO|Setting lport 60096294-f3c4-470e-9e79-14735e5625ef ovn-installed in OVS
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.216 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.223 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:36 np0005593233 systemd-udevd[259808]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:59:36 np0005593233 systemd-machined[190954]: New machine qemu-46-instance-0000005f.
Jan 23 04:59:36 np0005593233 NetworkManager[48871]: <info>  [1769162376.2618] device (tap60096294-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:59:36 np0005593233 NetworkManager[48871]: <info>  [1769162376.2630] device (tap60096294-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:59:36 np0005593233 systemd[1]: Started Virtual Machine qemu-46-instance-0000005f.
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.381 222021 DEBUG nova.network.neutron [req-5d8d98dd-1271-40ff-954f-666097254da1 req-394f39aa-2e8f-455c-b817-8978440ec10b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Updated VIF entry in instance network info cache for port 60096294-f3c4-470e-9e79-14735e5625ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.382 222021 DEBUG nova.network.neutron [req-5d8d98dd-1271-40ff-954f-666097254da1 req-394f39aa-2e8f-455c-b817-8978440ec10b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Updating instance_info_cache with network_info: [{"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:59:36 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:36Z|00429|binding|INFO|Setting lport 60096294-f3c4-470e-9e79-14735e5625ef up in Southbound
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.506 222021 DEBUG nova.compute.manager [req-2431d31c-70de-486a-948a-d7b3a939ce24 req-4976f247-a924-45c1-a523-1198af6d719f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-changed-446c7501-f73f-4cbf-8a43-e78948d8bec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.506 222021 DEBUG nova.compute.manager [req-2431d31c-70de-486a-948a-d7b3a939ce24 req-4976f247-a924-45c1-a523-1198af6d719f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing instance network info cache due to event network-changed-446c7501-f73f-4cbf-8a43-e78948d8bec1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.506 222021 DEBUG oslo_concurrency.lockutils [req-2431d31c-70de-486a-948a-d7b3a939ce24 req-4976f247-a924-45c1-a523-1198af6d719f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.507 222021 DEBUG oslo_concurrency.lockutils [req-2431d31c-70de-486a-948a-d7b3a939ce24 req-4976f247-a924-45c1-a523-1198af6d719f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.507 222021 DEBUG nova.network.neutron [req-2431d31c-70de-486a-948a-d7b3a939ce24 req-4976f247-a924-45c1-a523-1198af6d719f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Refreshing network info cache for port 446c7501-f73f-4cbf-8a43-e78948d8bec1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.505 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:2e:f3 10.100.0.3'], port_security=['fa:16:3e:6d:2e:f3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2e38e1c5-8495-488c-a1d1-082804b39c88', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c5732b3-3484-43db-a231-53d04de40d61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c288779980de4f03be20b7eed343b775', 'neutron:revision_number': '2', 'neutron:security_group_ids': '288ecf98-3e6e-478c-8e27-86a4106b4ef8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2529943-1c00-4757-827e-798919a83756, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=60096294-f3c4-470e-9e79-14735e5625ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.508 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 60096294-f3c4-470e-9e79-14735e5625ef in datapath 6c5732b3-3484-43db-a231-53d04de40d61 bound to our chassis#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.511 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c5732b3-3484-43db-a231-53d04de40d61#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.529 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5391b7f5-4c4e-4ff8-9e5e-dde0c1205be2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.530 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c5732b3-31 in ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.533 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c5732b3-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.533 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f5644a-04e6-440d-96b4-0282296d00a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.534 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a13d4add-c2f6-4b6f-9e67-dd0c8f344988]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.555 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[842e747b-34a9-4bfd-8290-da7f5fa6847a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.582 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.583 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[008fbcb6-45e4-404c-b42b-0d70e9edaf37]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 nova_compute[222017]: 2026-01-23 09:59:36.588 222021 DEBUG oslo_concurrency.lockutils [req-5d8d98dd-1271-40ff-954f-666097254da1 req-394f39aa-2e8f-455c-b817-8978440ec10b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-2e38e1c5-8495-488c-a1d1-082804b39c88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.622 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[52828d96-4f55-41ac-9e25-e8c05dc96f30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 systemd-udevd[259810]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.636 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[814433f0-2f01-48e3-90ac-c8a49294ca6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 NetworkManager[48871]: <info>  [1769162376.6380] manager: (tap6c5732b3-30): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.801 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5c237bae-93af-49d7-ae93-9e53d96b100c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.806 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5c32f8da-8d71-443d-9e6e-18016a24f7bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 NetworkManager[48871]: <info>  [1769162376.8411] device (tap6c5732b3-30): carrier: link connected
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.854 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[cf27a5a1-35df-4d29-ba09-395496de7695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.885 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[de3387c7-fa14-4f7a-933d-7056f901333a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c5732b3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:ad:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628947, 'reachable_time': 24488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259841, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.903 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[41b66300-af9d-400a-8935-3da2c8430256]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:adb9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628947, 'tstamp': 628947}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259842, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.937 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[209d8846-c015-48e2-a676-66d86c0c7647]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c5732b3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:ad:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628947, 'reachable_time': 24488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259850, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:36.980 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cb48e9-e236-4354-bd08-b77ae87ca202]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:37.062 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b7f149d9-7871-465f-8bd4-76ba987585eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:37.067 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c5732b3-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:37.067 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:37.068 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c5732b3-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:37 np0005593233 kernel: tap6c5732b3-30: entered promiscuous mode
Jan 23 04:59:37 np0005593233 NetworkManager[48871]: <info>  [1769162377.0714] manager: (tap6c5732b3-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.070 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.073 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:37.075 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c5732b3-30, col_values=(('external_ids', {'iface-id': '4f372140-9451-4bb5-99b3-fc5570b8346b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.076 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.078 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:37.078 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c5732b3-3484-43db-a231-53d04de40d61.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c5732b3-3484-43db-a231-53d04de40d61.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:59:37 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:37Z|00430|binding|INFO|Releasing lport 4f372140-9451-4bb5-99b3-fc5570b8346b from this chassis (sb_readonly=0)
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:37.084 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8b8e08-848c-439b-b86a-e05bd7745d0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:37.085 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-6c5732b3-3484-43db-a231-53d04de40d61
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/6c5732b3-3484-43db-a231-53d04de40d61.pid.haproxy
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 6c5732b3-3484-43db-a231-53d04de40d61
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:59:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:37.085 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'env', 'PROCESS_TAG=haproxy-6c5732b3-3484-43db-a231-53d04de40d61', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c5732b3-3484-43db-a231-53d04de40d61.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.094 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.153 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162377.1529896, 2e38e1c5-8495-488c-a1d1-082804b39c88 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.154 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] VM Started (Lifecycle Event)#033[00m
Jan 23 04:59:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:37.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:37.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:37 np0005593233 podman[259917]: 2026-01-23 09:59:37.542718485 +0000 UTC m=+0.086615849 container create 203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:59:37 np0005593233 systemd[1]: Started libpod-conmon-203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8.scope.
Jan 23 04:59:37 np0005593233 podman[259917]: 2026-01-23 09:59:37.51041676 +0000 UTC m=+0.054314134 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:59:37 np0005593233 systemd[1]: Started libcrun container.
Jan 23 04:59:37 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2ad7fa6c531294103037cf9d3da5ed4595938fe4d84a37eac095da73db800e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:37 np0005593233 podman[259917]: 2026-01-23 09:59:37.66383406 +0000 UTC m=+0.207731524 container init 203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.678 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:37 np0005593233 podman[259917]: 2026-01-23 09:59:37.678762349 +0000 UTC m=+0.222659743 container start 203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.684 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162377.1532493, 2e38e1c5-8495-488c-a1d1-082804b39c88 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.690 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:59:37 np0005593233 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[259932]: [NOTICE]   (259936) : New worker (259938) forked
Jan 23 04:59:37 np0005593233 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[259932]: [NOTICE]   (259936) : Loading success.
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.729 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.736 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.907 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:59:37 np0005593233 nova_compute[222017]: 2026-01-23 09:59:37.936 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:59:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:39.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.369 222021 DEBUG nova.compute.manager [req-cbf22941-cd62-43a3-8bf7-684fa291638b req-a28499e0-87a7-4d57-983d-d6f8e8cd88a2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Received event network-vif-plugged-60096294-f3c4-470e-9e79-14735e5625ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.370 222021 DEBUG oslo_concurrency.lockutils [req-cbf22941-cd62-43a3-8bf7-684fa291638b req-a28499e0-87a7-4d57-983d-d6f8e8cd88a2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.371 222021 DEBUG oslo_concurrency.lockutils [req-cbf22941-cd62-43a3-8bf7-684fa291638b req-a28499e0-87a7-4d57-983d-d6f8e8cd88a2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.371 222021 DEBUG oslo_concurrency.lockutils [req-cbf22941-cd62-43a3-8bf7-684fa291638b req-a28499e0-87a7-4d57-983d-d6f8e8cd88a2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.372 222021 DEBUG nova.compute.manager [req-cbf22941-cd62-43a3-8bf7-684fa291638b req-a28499e0-87a7-4d57-983d-d6f8e8cd88a2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Processing event network-vif-plugged-60096294-f3c4-470e-9e79-14735e5625ef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.376 222021 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.381 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162379.3810294, 2e38e1c5-8495-488c-a1d1-082804b39c88 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.383 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.385 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:59:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:39.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.389 222021 INFO nova.virt.libvirt.driver [-] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Instance spawned successfully.#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.390 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.562 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.575 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.601 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.602 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.603 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.603 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.604 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.604 222021 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:39 np0005593233 nova_compute[222017]: 2026-01-23 09:59:39.727 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:59:40 np0005593233 nova_compute[222017]: 2026-01-23 09:59:40.226 222021 INFO nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Took 25.07 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:59:40 np0005593233 nova_compute[222017]: 2026-01-23 09:59:40.227 222021 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:40 np0005593233 nova_compute[222017]: 2026-01-23 09:59:40.464 222021 INFO nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Took 27.48 seconds to build instance.#033[00m
Jan 23 04:59:40 np0005593233 nova_compute[222017]: 2026-01-23 09:59:40.488 222021 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 27.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:40 np0005593233 nova_compute[222017]: 2026-01-23 09:59:40.571 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:41.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:41.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:41 np0005593233 nova_compute[222017]: 2026-01-23 09:59:41.554 222021 DEBUG nova.compute.manager [req-d5a07e8b-1081-45a6-8160-f0621d372ca8 req-01809d1d-b674-495a-8ac3-ba3e146af7fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Received event network-vif-plugged-60096294-f3c4-470e-9e79-14735e5625ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:41 np0005593233 nova_compute[222017]: 2026-01-23 09:59:41.554 222021 DEBUG oslo_concurrency.lockutils [req-d5a07e8b-1081-45a6-8160-f0621d372ca8 req-01809d1d-b674-495a-8ac3-ba3e146af7fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:41 np0005593233 nova_compute[222017]: 2026-01-23 09:59:41.554 222021 DEBUG oslo_concurrency.lockutils [req-d5a07e8b-1081-45a6-8160-f0621d372ca8 req-01809d1d-b674-495a-8ac3-ba3e146af7fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:41 np0005593233 nova_compute[222017]: 2026-01-23 09:59:41.555 222021 DEBUG oslo_concurrency.lockutils [req-d5a07e8b-1081-45a6-8160-f0621d372ca8 req-01809d1d-b674-495a-8ac3-ba3e146af7fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:41 np0005593233 nova_compute[222017]: 2026-01-23 09:59:41.555 222021 DEBUG nova.compute.manager [req-d5a07e8b-1081-45a6-8160-f0621d372ca8 req-01809d1d-b674-495a-8ac3-ba3e146af7fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] No waiting events found dispatching network-vif-plugged-60096294-f3c4-470e-9e79-14735e5625ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:41 np0005593233 nova_compute[222017]: 2026-01-23 09:59:41.555 222021 WARNING nova.compute.manager [req-d5a07e8b-1081-45a6-8160-f0621d372ca8 req-01809d1d-b674-495a-8ac3-ba3e146af7fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Received unexpected event network-vif-plugged-60096294-f3c4-470e-9e79-14735e5625ef for instance with vm_state active and task_state None.#033[00m
Jan 23 04:59:42 np0005593233 podman[259947]: 2026-01-23 09:59:42.105078622 +0000 UTC m=+0.104954143 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 04:59:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:42Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:eb:a0 10.2.2.100
Jan 23 04:59:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:42Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:eb:a0 10.2.2.100
Jan 23 04:59:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:42Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:85:d8 10.1.1.184
Jan 23 04:59:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:42Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:85:d8 10.1.1.184
Jan 23 04:59:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:42Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a4:d2:12 10.1.1.51
Jan 23 04:59:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:42Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a4:d2:12 10.1.1.51
Jan 23 04:59:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:42Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:a6:73 10.100.0.9
Jan 23 04:59:42 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:42Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:a6:73 10.100.0.9
Jan 23 04:59:42 np0005593233 nova_compute[222017]: 2026-01-23 09:59:42.554 222021 DEBUG nova.network.neutron [req-2431d31c-70de-486a-948a-d7b3a939ce24 req-4976f247-a924-45c1-a523-1198af6d719f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updated VIF entry in instance network info cache for port 446c7501-f73f-4cbf-8a43-e78948d8bec1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:59:42 np0005593233 nova_compute[222017]: 2026-01-23 09:59:42.555 222021 DEBUG nova.network.neutron [req-2431d31c-70de-486a-948a-d7b3a939ce24 req-4976f247-a924-45c1-a523-1198af6d719f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [{"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:42 np0005593233 nova_compute[222017]: 2026-01-23 09:59:42.649 222021 DEBUG oslo_concurrency.lockutils [req-2431d31c-70de-486a-948a-d7b3a939ce24 req-4976f247-a924-45c1-a523-1198af6d719f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:59:42 np0005593233 nova_compute[222017]: 2026-01-23 09:59:42.651 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:59:42 np0005593233 nova_compute[222017]: 2026-01-23 09:59:42.651 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:59:42 np0005593233 nova_compute[222017]: 2026-01-23 09:59:42.651 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6d680830-de0e-445d-9d57-b3b0724cb5a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:42.662 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:42.663 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:42.665 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:43.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:43Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9e:ac:0a 10.2.2.200
Jan 23 04:59:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:43Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:ac:0a 10.2.2.200
Jan 23 04:59:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:43.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:43Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9c:d3:23 10.1.1.42
Jan 23 04:59:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:43Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9c:d3:23 10.1.1.42
Jan 23 04:59:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:43Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:b8:70 10.1.1.102
Jan 23 04:59:43 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:43Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:b8:70 10.1.1.102
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.411 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.902 222021 DEBUG oslo_concurrency.lockutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "2e38e1c5-8495-488c-a1d1-082804b39c88" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.903 222021 DEBUG oslo_concurrency.lockutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.903 222021 DEBUG oslo_concurrency.lockutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.903 222021 DEBUG oslo_concurrency.lockutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.904 222021 DEBUG oslo_concurrency.lockutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.906 222021 INFO nova.compute.manager [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Terminating instance#033[00m
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.908 222021 DEBUG nova.compute.manager [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:59:44 np0005593233 kernel: tap60096294-f3 (unregistering): left promiscuous mode
Jan 23 04:59:44 np0005593233 NetworkManager[48871]: <info>  [1769162384.9566] device (tap60096294-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.967 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:44Z|00431|binding|INFO|Releasing lport 60096294-f3c4-470e-9e79-14735e5625ef from this chassis (sb_readonly=0)
Jan 23 04:59:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:44Z|00432|binding|INFO|Setting lport 60096294-f3c4-470e-9e79-14735e5625ef down in Southbound
Jan 23 04:59:44 np0005593233 ovn_controller[130653]: 2026-01-23T09:59:44Z|00433|binding|INFO|Removing iface tap60096294-f3 ovn-installed in OVS
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.970 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593233 nova_compute[222017]: 2026-01-23 09:59:44.982 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.001 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:2e:f3 10.100.0.3'], port_security=['fa:16:3e:6d:2e:f3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2e38e1c5-8495-488c-a1d1-082804b39c88', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c5732b3-3484-43db-a231-53d04de40d61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c288779980de4f03be20b7eed343b775', 'neutron:revision_number': '4', 'neutron:security_group_ids': '288ecf98-3e6e-478c-8e27-86a4106b4ef8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2529943-1c00-4757-827e-798919a83756, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=60096294-f3c4-470e-9e79-14735e5625ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.002 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 60096294-f3c4-470e-9e79-14735e5625ef in datapath 6c5732b3-3484-43db-a231-53d04de40d61 unbound from our chassis#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.004 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c5732b3-3484-43db-a231-53d04de40d61, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.005 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c522f0-fb62-4398-85d6-45ce116e427b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.006 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 namespace which is not needed anymore#033[00m
Jan 23 04:59:45 np0005593233 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Jan 23 04:59:45 np0005593233 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000005f.scope: Consumed 6.520s CPU time.
Jan 23 04:59:45 np0005593233 systemd-machined[190954]: Machine qemu-46-instance-0000005f terminated.
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.139 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:45 np0005593233 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[259932]: [NOTICE]   (259936) : haproxy version is 2.8.14-c23fe91
Jan 23 04:59:45 np0005593233 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[259932]: [NOTICE]   (259936) : path to executable is /usr/sbin/haproxy
Jan 23 04:59:45 np0005593233 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[259932]: [WARNING]  (259936) : Exiting Master process...
Jan 23 04:59:45 np0005593233 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[259932]: [WARNING]  (259936) : Exiting Master process...
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.147 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:45 np0005593233 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[259932]: [ALERT]    (259936) : Current worker (259938) exited with code 143 (Terminated)
Jan 23 04:59:45 np0005593233 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[259932]: [WARNING]  (259936) : All workers exited. Exiting... (0)
Jan 23 04:59:45 np0005593233 systemd[1]: libpod-203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8.scope: Deactivated successfully.
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.159 222021 INFO nova.virt.libvirt.driver [-] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Instance destroyed successfully.#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.160 222021 DEBUG nova.objects.instance [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lazy-loading 'resources' on Instance uuid 2e38e1c5-8495-488c-a1d1-082804b39c88 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:45 np0005593233 podman[259999]: 2026-01-23 09:59:45.161492765 +0000 UTC m=+0.057740629 container died 203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.191 222021 DEBUG nova.virt.libvirt.vif [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-2087045140',display_name='tempest-MultipleCreateTestJSON-server-2087045140-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-2087045140-2',id=95,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-23T09:59:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c288779980de4f03be20b7eed343b775',ramdisk_id='',reservation_id='r-2nwggk6q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-351408189',owner_user_name='tempest-MultipleCreateTestJSON-351408189-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:40Z,user_data=None,user_id='d83df80213fd40f99fdc68c146fe9a2a',uuid=2e38e1c5-8495-488c-a1d1-082804b39c88,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.191 222021 DEBUG nova.network.os_vif_util [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converting VIF {"id": "60096294-f3c4-470e-9e79-14735e5625ef", "address": "fa:16:3e:6d:2e:f3", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60096294-f3", "ovs_interfaceid": "60096294-f3c4-470e-9e79-14735e5625ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.193 222021 DEBUG nova.network.os_vif_util [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:2e:f3,bridge_name='br-int',has_traffic_filtering=True,id=60096294-f3c4-470e-9e79-14735e5625ef,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60096294-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:45 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8-userdata-shm.mount: Deactivated successfully.
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.193 222021 DEBUG os_vif [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:2e:f3,bridge_name='br-int',has_traffic_filtering=True,id=60096294-f3c4-470e-9e79-14735e5625ef,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60096294-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.197 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.198 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60096294-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:45 np0005593233 systemd[1]: var-lib-containers-storage-overlay-d2ad7fa6c531294103037cf9d3da5ed4595938fe4d84a37eac095da73db800e9-merged.mount: Deactivated successfully.
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.202 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.204 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.208 222021 INFO os_vif [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:2e:f3,bridge_name='br-int',has_traffic_filtering=True,id=60096294-f3c4-470e-9e79-14735e5625ef,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60096294-f3')#033[00m
Jan 23 04:59:45 np0005593233 podman[259999]: 2026-01-23 09:59:45.22267202 +0000 UTC m=+0.118919874 container cleanup 203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:59:45 np0005593233 systemd[1]: libpod-conmon-203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8.scope: Deactivated successfully.
Jan 23 04:59:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:45.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:45 np0005593233 podman[260049]: 2026-01-23 09:59:45.38857615 +0000 UTC m=+0.137080673 container remove 203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.395 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[85ee4b83-55e2-4429-9ab5-f374fa8b1ff1]: (4, ('Fri Jan 23 09:59:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 (203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8)\n203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8\nFri Jan 23 09:59:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 (203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8)\n203245e2275b32357548637115bc584108cf55918c5bc07c2668a397d1e1b6a8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.397 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[54586461-62bc-49ba-ba16-f00f8bfa4af0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:45.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.399 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c5732b3-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:45 np0005593233 kernel: tap6c5732b3-30: left promiscuous mode
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.402 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.404 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.411 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dc72ecaa-8252-402d-8c0c-fec2972c29d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.435 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3b72195f-7ed6-43a8-8954-2428f8172fdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.438 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2ba8c94e-8803-4099-9f02-a6c637d0e122]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.458 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[95174ed5-a94b-45b9-bce0-4885091a3e74]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628924, 'reachable_time': 40155, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260067, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.461 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:59:45 np0005593233 systemd[1]: run-netns-ovnmeta\x2d6c5732b3\x2d3484\x2d43db\x2da231\x2d53d04de40d61.mount: Deactivated successfully.
Jan 23 04:59:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:45.462 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0a14ac-9a37-44fa-9646-32f04e0595c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:45 np0005593233 nova_compute[222017]: 2026-01-23 09:59:45.574 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.139 222021 DEBUG nova.compute.manager [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Received event network-vif-unplugged-60096294-f3c4-470e-9e79-14735e5625ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.139 222021 DEBUG oslo_concurrency.lockutils [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.139 222021 DEBUG oslo_concurrency.lockutils [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.140 222021 DEBUG oslo_concurrency.lockutils [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.140 222021 DEBUG nova.compute.manager [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] No waiting events found dispatching network-vif-unplugged-60096294-f3c4-470e-9e79-14735e5625ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.140 222021 DEBUG nova.compute.manager [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Received event network-vif-unplugged-60096294-f3c4-470e-9e79-14735e5625ef for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.140 222021 DEBUG nova.compute.manager [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Received event network-vif-plugged-60096294-f3c4-470e-9e79-14735e5625ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.140 222021 DEBUG oslo_concurrency.lockutils [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.140 222021 DEBUG oslo_concurrency.lockutils [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.140 222021 DEBUG oslo_concurrency.lockutils [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.141 222021 DEBUG nova.compute.manager [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] No waiting events found dispatching network-vif-plugged-60096294-f3c4-470e-9e79-14735e5625ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:47 np0005593233 nova_compute[222017]: 2026-01-23 09:59:47.141 222021 WARNING nova.compute.manager [req-dd8b201a-a045-4c03-8ece-23c26407f4cb req-1fb36d61-d4b1-4bcf-8fc1-02905e61d07d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Received unexpected event network-vif-plugged-60096294-f3c4-470e-9e79-14735e5625ef for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:59:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:47.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:47.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:48 np0005593233 nova_compute[222017]: 2026-01-23 09:59:48.348 222021 INFO nova.virt.libvirt.driver [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Deleting instance files /var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88_del#033[00m
Jan 23 04:59:48 np0005593233 nova_compute[222017]: 2026-01-23 09:59:48.349 222021 INFO nova.virt.libvirt.driver [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Deletion of /var/lib/nova/instances/2e38e1c5-8495-488c-a1d1-082804b39c88_del complete#033[00m
Jan 23 04:59:48 np0005593233 nova_compute[222017]: 2026-01-23 09:59:48.553 222021 INFO nova.compute.manager [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Took 3.65 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:59:48 np0005593233 nova_compute[222017]: 2026-01-23 09:59:48.554 222021 DEBUG oslo.service.loopingcall [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:59:48 np0005593233 nova_compute[222017]: 2026-01-23 09:59:48.554 222021 DEBUG nova.compute.manager [-] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:59:48 np0005593233 nova_compute[222017]: 2026-01-23 09:59:48.555 222021 DEBUG nova.network.neutron [-] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #97. Immutable memtables: 0.
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.004293) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 59] Flushing memtable with next log file: 97
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389004397, "job": 59, "event": "flush_started", "num_memtables": 1, "num_entries": 721, "num_deletes": 251, "total_data_size": 1218215, "memory_usage": 1243952, "flush_reason": "Manual Compaction"}
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 59] Level-0 flush table #98: started
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389013936, "cf_name": "default", "job": 59, "event": "table_file_creation", "file_number": 98, "file_size": 803810, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 48551, "largest_seqno": 49267, "table_properties": {"data_size": 800371, "index_size": 1283, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8049, "raw_average_key_size": 19, "raw_value_size": 793505, "raw_average_value_size": 1902, "num_data_blocks": 57, "num_entries": 417, "num_filter_entries": 417, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162342, "oldest_key_time": 1769162342, "file_creation_time": 1769162389, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 98, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 59] Flush lasted 9688 microseconds, and 5214 cpu microseconds.
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.013979) [db/flush_job.cc:967] [default] [JOB 59] Level-0 flush table #98: 803810 bytes OK
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.014009) [db/memtable_list.cc:519] [default] Level-0 commit table #98 started
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.015992) [db/memtable_list.cc:722] [default] Level-0 commit table #98: memtable #1 done
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.016015) EVENT_LOG_v1 {"time_micros": 1769162389016008, "job": 59, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.016044) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 59] Try to delete WAL files size 1214321, prev total WAL file size 1214321, number of live WAL files 2.
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000094.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.017137) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034303136' seq:72057594037927935, type:22 .. '7061786F730034323638' seq:0, type:0; will stop at (end)
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 60] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 59 Base level 0, inputs: [98(784KB)], [96(10019KB)]
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389017231, "job": 60, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [98], "files_L6": [96], "score": -1, "input_data_size": 11063276, "oldest_snapshot_seqno": -1}
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 60] Generated table #99: 6970 keys, 9139697 bytes, temperature: kUnknown
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389122303, "cf_name": "default", "job": 60, "event": "table_file_creation", "file_number": 99, "file_size": 9139697, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9095295, "index_size": 25858, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17477, "raw_key_size": 181453, "raw_average_key_size": 26, "raw_value_size": 8972715, "raw_average_value_size": 1287, "num_data_blocks": 1014, "num_entries": 6970, "num_filter_entries": 6970, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769162389, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 99, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.122712) [db/compaction/compaction_job.cc:1663] [default] [JOB 60] Compacted 1@0 + 1@6 files to L6 => 9139697 bytes
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.124602) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.2 rd, 86.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 9.8 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(25.1) write-amplify(11.4) OK, records in: 7480, records dropped: 510 output_compression: NoCompression
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.124644) EVENT_LOG_v1 {"time_micros": 1769162389124626, "job": 60, "event": "compaction_finished", "compaction_time_micros": 105177, "compaction_time_cpu_micros": 40439, "output_level": 6, "num_output_files": 1, "total_output_size": 9139697, "num_input_records": 7480, "num_output_records": 6970, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000098.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389125303, "job": 60, "event": "table_file_deletion", "file_number": 98}
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000096.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389129166, "job": 60, "event": "table_file_deletion", "file_number": 96}
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.016996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.129237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.129242) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.129244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.129246) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-09:59:49.129248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:49.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:49.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:50 np0005593233 nova_compute[222017]: 2026-01-23 09:59:50.204 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:50 np0005593233 nova_compute[222017]: 2026-01-23 09:59:50.598 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:51.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:51.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:51 np0005593233 nova_compute[222017]: 2026-01-23 09:59:51.744 222021 DEBUG nova.network.neutron [-] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:51 np0005593233 nova_compute[222017]: 2026-01-23 09:59:51.802 222021 INFO nova.compute.manager [-] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Took 3.25 seconds to deallocate network for instance.#033[00m
Jan 23 04:59:51 np0005593233 nova_compute[222017]: 2026-01-23 09:59:51.889 222021 DEBUG oslo_concurrency.lockutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:51 np0005593233 nova_compute[222017]: 2026-01-23 09:59:51.890 222021 DEBUG oslo_concurrency.lockutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:51 np0005593233 nova_compute[222017]: 2026-01-23 09:59:51.936 222021 DEBUG nova.compute.manager [req-1fd1b2c4-14b0-47b7-aaee-aff18b164736 req-6a5645b0-0c45-471d-b7ee-9b84c55bf70d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Received event network-vif-deleted-60096294-f3c4-470e-9e79-14735e5625ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:52 np0005593233 nova_compute[222017]: 2026-01-23 09:59:52.044 222021 DEBUG oslo_concurrency.processutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3004145331' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:52 np0005593233 nova_compute[222017]: 2026-01-23 09:59:52.518 222021 DEBUG oslo_concurrency.processutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:52 np0005593233 nova_compute[222017]: 2026-01-23 09:59:52.530 222021 DEBUG nova.compute.provider_tree [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:59:52 np0005593233 nova_compute[222017]: 2026-01-23 09:59:52.773 222021 DEBUG nova.scheduler.client.report [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:59:52 np0005593233 nova_compute[222017]: 2026-01-23 09:59:52.813 222021 DEBUG oslo_concurrency.lockutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.923s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:52 np0005593233 nova_compute[222017]: 2026-01-23 09:59:52.888 222021 INFO nova.scheduler.client.report [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Deleted allocations for instance 2e38e1c5-8495-488c-a1d1-082804b39c88#033[00m
Jan 23 04:59:53 np0005593233 nova_compute[222017]: 2026-01-23 09:59:53.018 222021 DEBUG oslo_concurrency.lockutils [None req-53d7dc56-824d-4a89-9ae3-d249dcddfc94 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e38e1c5-8495-488c-a1d1-082804b39c88" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:53.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:53.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:55 np0005593233 podman[260092]: 2026-01-23 09:59:55.088036153 +0000 UTC m=+0.079437588 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:59:55 np0005593233 nova_compute[222017]: 2026-01-23 09:59:55.243 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:55.259 140481 DEBUG eventlet.wsgi.server [-] (140481) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 23 04:59:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:55.262 140481 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Jan 23 04:59:55 np0005593233 ovn_metadata_agent[140219]: Accept: */*#015
Jan 23 04:59:55 np0005593233 ovn_metadata_agent[140219]: Connection: close#015
Jan 23 04:59:55 np0005593233 ovn_metadata_agent[140219]: Content-Type: text/plain#015
Jan 23 04:59:55 np0005593233 ovn_metadata_agent[140219]: Host: 169.254.169.254#015
Jan 23 04:59:55 np0005593233 ovn_metadata_agent[140219]: User-Agent: curl/7.84.0#015
Jan 23 04:59:55 np0005593233 ovn_metadata_agent[140219]: X-Forwarded-For: 10.100.0.9#015
Jan 23 04:59:55 np0005593233 ovn_metadata_agent[140219]: X-Ovn-Network-Id: 9bd04a8e-3b21-48a4-942d-6ede17d32ccd __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 23 04:59:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:55.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:55.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:55 np0005593233 nova_compute[222017]: 2026-01-23 09:59:55.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:57.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:57.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:58.757 140481 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 23 04:59:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 09:59:58.757 140481 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 2548 time: 3.4960716#033[00m
Jan 23 04:59:58 np0005593233 haproxy-metadata-proxy-9bd04a8e-3b21-48a4-942d-6ede17d32ccd[259422]: 10.100.0.9:55918 [23/Jan/2026:09:59:55.258] listener listener/metadata 0/0/0/3499/3499 200 2532 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Jan 23 04:59:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:59.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:59:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 04:59:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:59:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:59.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.157 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162385.1563983, 2e38e1c5-8495-488c-a1d1-082804b39c88 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.158 222021 INFO nova.compute.manager [-] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.247 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.606 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:00.710 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.711 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:00.712 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.735 222021 DEBUG nova.compute.manager [None req-1af5afba-2295-40da-9e02-5b18624a4ed2 - - - - - -] [instance: 2e38e1c5-8495-488c-a1d1-082804b39c88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.884 222021 DEBUG oslo_concurrency.lockutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.885 222021 DEBUG oslo_concurrency.lockutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.885 222021 DEBUG oslo_concurrency.lockutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.885 222021 DEBUG oslo_concurrency.lockutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.886 222021 DEBUG oslo_concurrency.lockutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.887 222021 INFO nova.compute.manager [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Terminating instance#033[00m
Jan 23 05:00:00 np0005593233 nova_compute[222017]: 2026-01-23 10:00:00.888 222021 DEBUG nova.compute.manager [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:00:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:01.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:01.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:01 np0005593233 kernel: tap446c7501-f7 (unregistering): left promiscuous mode
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.6266] device (tap446c7501-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00434|binding|INFO|Releasing lport 446c7501-f73f-4cbf-8a43-e78948d8bec1 from this chassis (sb_readonly=0)
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00435|binding|INFO|Setting lport 446c7501-f73f-4cbf-8a43-e78948d8bec1 down in Southbound
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00436|binding|INFO|Removing iface tap446c7501-f7 ovn-installed in OVS
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.642 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.652 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:a6:73 10.100.0.9'], port_security=['fa:16:3e:7f:a6:73 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bd04a8e-3b21-48a4-942d-6ede17d32ccd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.206'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7dcd690-014c-42e5-b3de-8c4087e847b3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=446c7501-f73f-4cbf-8a43-e78948d8bec1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:01 np0005593233 kernel: tap14f0f686-d5 (unregistering): left promiscuous mode
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.653 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 446c7501-f73f-4cbf-8a43-e78948d8bec1 in datapath 9bd04a8e-3b21-48a4-942d-6ede17d32ccd unbound from our chassis#033[00m
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.655 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9bd04a8e-3b21-48a4-942d-6ede17d32ccd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.657 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8d62b00e-1464-4668-8a73-7d0c845b814c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.658 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd namespace which is not needed anymore#033[00m
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.6597] device (tap14f0f686-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.663 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.678 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00437|binding|INFO|Releasing lport 14f0f686-d5a3-4f53-a3d8-30c646ece1c3 from this chassis (sb_readonly=0)
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00438|binding|INFO|Setting lport 14f0f686-d5a3-4f53-a3d8-30c646ece1c3 down in Southbound
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00439|binding|INFO|Removing iface tap14f0f686-d5 ovn-installed in OVS
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.681 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 kernel: tape75ad2d5-c0 (unregistering): left promiscuous mode
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.692 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:d3:23 10.1.1.42'], port_security=['fa:16:3e:9c:d3:23 10.1.1.42'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-1194929224', 'neutron:cidrs': '10.1.1.42/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c166c58f-c448-497c-a470-dba6713fe726', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-1194929224', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e99a8f8e-4511-419c-93e3-9e8ca4222a4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0d1561f-5200-4636-8708-07c79c136e52, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=14f0f686-d5a3-4f53-a3d8-30c646ece1c3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.6964] device (tape75ad2d5-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.699 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.712 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00440|binding|INFO|Releasing lport e75ad2d5-c059-4218-adfe-89823d98a762 from this chassis (sb_readonly=0)
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00441|binding|INFO|Setting lport e75ad2d5-c059-4218-adfe-89823d98a762 down in Southbound
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00442|binding|INFO|Removing iface tape75ad2d5-c0 ovn-installed in OVS
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.714 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.719 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:d2:12 10.1.1.51'], port_security=['fa:16:3e:a4:d2:12 10.1.1.51'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TaggedBootDevicesTest_v242-504651905', 'neutron:cidrs': '10.1.1.51/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c166c58f-c448-497c-a470-dba6713fe726', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TaggedBootDevicesTest_v242-504651905', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e99a8f8e-4511-419c-93e3-9e8ca4222a4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0d1561f-5200-4636-8708-07c79c136e52, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=e75ad2d5-c059-4218-adfe-89823d98a762) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:01 np0005593233 kernel: tap79d4ff96-89 (unregistering): left promiscuous mode
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.7287] device (tap79d4ff96-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.733 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 kernel: tap5598d21a-d2 (unregistering): left promiscuous mode
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.7473] device (tap5598d21a-d2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00443|binding|INFO|Releasing lport 79d4ff96-8918-4a77-9ba5-62ac2bc78903 from this chassis (sb_readonly=0)
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00444|binding|INFO|Setting lport 79d4ff96-8918-4a77-9ba5-62ac2bc78903 down in Southbound
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.749 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00445|binding|INFO|Removing iface tap79d4ff96-89 ovn-installed in OVS
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.752 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.757 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:85:d8 10.1.1.184'], port_security=['fa:16:3e:0a:85:d8 10.1.1.184'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.184/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c166c58f-c448-497c-a470-dba6713fe726', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0d1561f-5200-4636-8708-07c79c136e52, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=79d4ff96-8918-4a77-9ba5-62ac2bc78903) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:01 np0005593233 kernel: tapa98412a1-03 (unregistering): left promiscuous mode
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.7740] device (tapa98412a1-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00446|binding|INFO|Releasing lport 5598d21a-d2b3-4fe1-ae77-65ec863bd48e from this chassis (sb_readonly=0)
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00447|binding|INFO|Setting lport 5598d21a-d2b3-4fe1-ae77-65ec863bd48e down in Southbound
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.783 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00448|binding|INFO|Removing iface tap5598d21a-d2 ovn-installed in OVS
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.787 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 kernel: tap6d043e19-50 (unregistering): left promiscuous mode
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.800 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:b8:70 10.1.1.102'], port_security=['fa:16:3e:c2:b8:70 10.1.1.102'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.1.102/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c166c58f-c448-497c-a470-dba6713fe726', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f0d1561f-5200-4636-8708-07c79c136e52, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5598d21a-d2b3-4fe1-ae77-65ec863bd48e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.8037] device (tap6d043e19-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00449|binding|INFO|Releasing lport a98412a1-0341-4835-956a-c4201becd7ef from this chassis (sb_readonly=0)
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.819 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00450|binding|INFO|Setting lport a98412a1-0341-4835-956a-c4201becd7ef down in Southbound
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00451|binding|INFO|Removing iface tapa98412a1-03 ovn-installed in OVS
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.821 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00452|binding|INFO|Releasing lport 6d043e19-5004-46eb-b144-cc1b3476f21f from this chassis (sb_readonly=1)
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.854 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00453|if_status|INFO|Dropped 2 log messages in last 274 seconds (most recently, 274 seconds ago) due to excessive rate
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00454|if_status|INFO|Not setting lport 6d043e19-5004-46eb-b144-cc1b3476f21f down as sb is readonly
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00455|binding|INFO|Removing iface tap6d043e19-50 ovn-installed in OVS
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.856 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd[259415]: [NOTICE]   (259420) : haproxy version is 2.8.14-c23fe91
Jan 23 05:00:01 np0005593233 neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd[259415]: [NOTICE]   (259420) : path to executable is /usr/sbin/haproxy
Jan 23 05:00:01 np0005593233 neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd[259415]: [WARNING]  (259420) : Exiting Master process...
Jan 23 05:00:01 np0005593233 neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd[259415]: [ALERT]    (259420) : Current worker (259422) exited with code 143 (Terminated)
Jan 23 05:00:01 np0005593233 neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd[259415]: [WARNING]  (259420) : All workers exited. Exiting... (0)
Jan 23 05:00:01 np0005593233 systemd[1]: libpod-a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992.scope: Deactivated successfully.
Jan 23 05:00:01 np0005593233 nova_compute[222017]: 2026-01-23 10:00:01.869 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:01 np0005593233 podman[260285]: 2026-01-23 10:00:01.870678774 +0000 UTC m=+0.071876115 container died a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:00:01 np0005593233 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000059.scope: Deactivated successfully.
Jan 23 05:00:01 np0005593233 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000059.scope: Consumed 18.525s CPU time.
Jan 23 05:00:01 np0005593233 systemd-machined[190954]: Machine qemu-45-instance-00000059 terminated.
Jan 23 05:00:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:00:01Z|00456|binding|INFO|Setting lport 6d043e19-5004-46eb-b144-cc1b3476f21f down in Southbound
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.887 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:eb:a0 10.2.2.100'], port_security=['fa:16:3e:eb:eb:a0 10.2.2.100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.100/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f53c3172-2247-4d09-9d71-c69e6158a9f6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=a98412a1-0341-4835-956a-c4201becd7ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:01.896 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:ac:0a 10.2.2.200'], port_security=['fa:16:3e:9e:ac:0a 10.2.2.200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.2.2.200/24', 'neutron:device_id': '6d680830-de0e-445d-9d57-b3b0724cb5a8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8924c80a71a94fdeb114c6bdbdb2939c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02743f37-2b9a-4823-b968-b5d7e1cbeb57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f53c3172-2247-4d09-9d71-c69e6158a9f6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=6d043e19-5004-46eb-b144-cc1b3476f21f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:01 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992-userdata-shm.mount: Deactivated successfully.
Jan 23 05:00:01 np0005593233 systemd[1]: var-lib-containers-storage-overlay-3794ea3b2c83101d50c549cb1e3f3aa769a3c4b6aab8c190b1fc70a0a30ac333-merged.mount: Deactivated successfully.
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.9220] manager: (tap446c7501-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Jan 23 05:00:01 np0005593233 podman[260285]: 2026-01-23 10:00:01.92368646 +0000 UTC m=+0.124883801 container cleanup a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.9362] manager: (tap14f0f686-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.9447] manager: (tape75ad2d5-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/213)
Jan 23 05:00:01 np0005593233 systemd[1]: libpod-conmon-a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992.scope: Deactivated successfully.
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.9614] manager: (tap79d4ff96-89): new Tun device (/org/freedesktop/NetworkManager/Devices/214)
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.9722] manager: (tap5598d21a-d2): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Jan 23 05:00:01 np0005593233 NetworkManager[48871]: <info>  [1769162401.9862] manager: (tapa98412a1-03): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.009 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.020 222021 INFO nova.virt.libvirt.driver [-] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Instance destroyed successfully.#033[00m
Jan 23 05:00:02 np0005593233 podman[260339]: 2026-01-23 10:00:02.020422062 +0000 UTC m=+0.063865281 container remove a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.020 222021 DEBUG nova.objects.instance [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lazy-loading 'resources' on Instance uuid 6d680830-de0e-445d-9d57-b3b0724cb5a8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.033 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7515c9c7-2299-4fdd-a20f-07486c582f30]: (4, ('Fri Jan 23 10:00:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd (a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992)\na60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992\nFri Jan 23 10:00:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd (a60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992)\na60b8be9379a7d4f6e29d5475c49e95ef86ba214b99690392609284be3e6f992\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.035 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d115fa19-d967-4323-a164-36fd9d8d73e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.036 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bd04a8e-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.038 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.041 222021 DEBUG nova.virt.libvirt.vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.042 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.043 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:a6:73,bridge_name='br-int',has_traffic_filtering=True,id=446c7501-f73f-4cbf-8a43-e78948d8bec1,network=Network(9bd04a8e-3b21-48a4-942d-6ede17d32ccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap446c7501-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.044 222021 DEBUG os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:a6:73,bridge_name='br-int',has_traffic_filtering=True,id=446c7501-f73f-4cbf-8a43-e78948d8bec1,network=Network(9bd04a8e-3b21-48a4-942d-6ede17d32ccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap446c7501-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.046 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.046 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap446c7501-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.048 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.051 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.056 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 kernel: tap9bd04a8e-30: left promiscuous mode
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.083 222021 INFO os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:a6:73,bridge_name='br-int',has_traffic_filtering=True,id=446c7501-f73f-4cbf-8a43-e78948d8bec1,network=Network(9bd04a8e-3b21-48a4-942d-6ede17d32ccd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap446c7501-f7')#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.084 222021 DEBUG nova.virt.libvirt.vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.084 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.083 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3d2f3f-74c0-48c9-b8d4-c656daa6a5e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.085 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d3:23,bridge_name='br-int',has_traffic_filtering=True,id=14f0f686-d5a3-4f53-a3d8-30c646ece1c3,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap14f0f686-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.085 222021 DEBUG os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d3:23,bridge_name='br-int',has_traffic_filtering=True,id=14f0f686-d5a3-4f53-a3d8-30c646ece1c3,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap14f0f686-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.088 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.088 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14f0f686-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.090 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.093 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.099 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0f18689f-d06c-4f03-bfd5-4697ec29903b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.102 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[97ca290c-ad67-44c2-9543-5ec2eac8bb5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.121 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.123 222021 INFO os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d3:23,bridge_name='br-int',has_traffic_filtering=True,id=14f0f686-d5a3-4f53-a3d8-30c646ece1c3,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap14f0f686-d5')#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.124 222021 DEBUG nova.virt.libvirt.vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.125 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.125 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:d2:12,bridge_name='br-int',has_traffic_filtering=True,id=e75ad2d5-c059-4218-adfe-89823d98a762,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape75ad2d5-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.126 222021 DEBUG os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:d2:12,bridge_name='br-int',has_traffic_filtering=True,id=e75ad2d5-c059-4218-adfe-89823d98a762,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape75ad2d5-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.127 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.126 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4c0dd9ae-a69b-457f-b477-7eca931105d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627028, 'reachable_time': 22726, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260438, 'error': None, 'target': 'ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.127 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape75ad2d5-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.129 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.131 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:02 np0005593233 systemd[1]: run-netns-ovnmeta\x2d9bd04a8e\x2d3b21\x2d48a4\x2d942d\x2d6ede17d32ccd.mount: Deactivated successfully.
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.132 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9bd04a8e-3b21-48a4-942d-6ede17d32ccd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.132 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[c83cd91e-d3e2-4284-abdb-21a9afac4cff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.133 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 14f0f686-d5a3-4f53-a3d8-30c646ece1c3 in datapath c166c58f-c448-497c-a470-dba6713fe726 unbound from our chassis#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.136 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c166c58f-c448-497c-a470-dba6713fe726, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.137 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f67a6960-6bd3-4e1c-888e-0907a05a7665]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.137 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c166c58f-c448-497c-a470-dba6713fe726 namespace which is not needed anymore#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.159 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.163 222021 INFO os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:d2:12,bridge_name='br-int',has_traffic_filtering=True,id=e75ad2d5-c059-4218-adfe-89823d98a762,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape75ad2d5-c0')#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.164 222021 DEBUG nova.virt.libvirt.vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.164 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.165 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:85:d8,bridge_name='br-int',has_traffic_filtering=True,id=79d4ff96-8918-4a77-9ba5-62ac2bc78903,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d4ff96-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.165 222021 DEBUG os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:85:d8,bridge_name='br-int',has_traffic_filtering=True,id=79d4ff96-8918-4a77-9ba5-62ac2bc78903,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d4ff96-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.167 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.167 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79d4ff96-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.169 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.171 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.178 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.182 222021 INFO os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:85:d8,bridge_name='br-int',has_traffic_filtering=True,id=79d4ff96-8918-4a77-9ba5-62ac2bc78903,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d4ff96-89')#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.183 222021 DEBUG nova.virt.libvirt.vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.184 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.184 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:b8:70,bridge_name='br-int',has_traffic_filtering=True,id=5598d21a-d2b3-4fe1-ae77-65ec863bd48e,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5598d21a-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.185 222021 DEBUG os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:b8:70,bridge_name='br-int',has_traffic_filtering=True,id=5598d21a-d2b3-4fe1-ae77-65ec863bd48e,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5598d21a-d2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.186 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.186 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5598d21a-d2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.187 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.189 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.195 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.198 222021 INFO os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:b8:70,bridge_name='br-int',has_traffic_filtering=True,id=5598d21a-d2b3-4fe1-ae77-65ec863bd48e,network=Network(c166c58f-c448-497c-a470-dba6713fe726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5598d21a-d2')#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.199 222021 DEBUG nova.virt.libvirt.vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.199 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.199 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:eb:a0,bridge_name='br-int',has_traffic_filtering=True,id=a98412a1-0341-4835-956a-c4201becd7ef,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa98412a1-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.200 222021 DEBUG os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:eb:a0,bridge_name='br-int',has_traffic_filtering=True,id=a98412a1-0341-4835-956a-c4201becd7ef,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa98412a1-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.201 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.201 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa98412a1-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.202 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.204 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.207 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.211 222021 INFO os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:eb:a0,bridge_name='br-int',has_traffic_filtering=True,id=a98412a1-0341-4835-956a-c4201becd7ef,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa98412a1-03')#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.212 222021 DEBUG nova.virt.libvirt.vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-135862627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-device-tagging-server-135862627',id=89,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDh4FieLz2bnYCsuaarY3Ac1hh0loRH+PaFGHCiieHB9NjhrRZfzcMi7ZUHIDZYCV83WMqwK4oFO+yqIrF+iL1IO0RboDLVpEqw8VipY/91NpY1H8k61wH/EO04ontnXzA==',key_name='tempest-keypair-954535776',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8924c80a71a94fdeb114c6bdbdb2939c',ramdisk_id='',reservation_id='r-9fcvb5t6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_b
us='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TaggedBootDevicesTest_v242-1752959028',owner_user_name='tempest-TaggedBootDevicesTest_v242-1752959028-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='35b29e4a06884f7d88683d00f85d4630',uuid=6d680830-de0e-445d-9d57-b3b0724cb5a8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.213 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converting VIF {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.214 222021 DEBUG nova.network.os_vif_util [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:ac:0a,bridge_name='br-int',has_traffic_filtering=True,id=6d043e19-5004-46eb-b144-cc1b3476f21f,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d043e19-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.214 222021 DEBUG os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:ac:0a,bridge_name='br-int',has_traffic_filtering=True,id=6d043e19-5004-46eb-b144-cc1b3476f21f,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d043e19-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.217 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.217 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d043e19-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.219 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.222 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.225 222021 INFO os_vif [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:ac:0a,bridge_name='br-int',has_traffic_filtering=True,id=6d043e19-5004-46eb-b144-cc1b3476f21f,network=Network(dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d043e19-50')#033[00m
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726[259490]: [NOTICE]   (259494) : haproxy version is 2.8.14-c23fe91
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726[259490]: [NOTICE]   (259494) : path to executable is /usr/sbin/haproxy
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726[259490]: [WARNING]  (259494) : Exiting Master process...
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726[259490]: [ALERT]    (259494) : Current worker (259496) exited with code 143 (Terminated)
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726[259490]: [WARNING]  (259494) : All workers exited. Exiting... (0)
Jan 23 05:00:02 np0005593233 systemd[1]: libpod-593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc.scope: Deactivated successfully.
Jan 23 05:00:02 np0005593233 podman[260472]: 2026-01-23 10:00:02.305676588 +0000 UTC m=+0.058056188 container died 593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:00:02 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc-userdata-shm.mount: Deactivated successfully.
Jan 23 05:00:02 np0005593233 systemd[1]: var-lib-containers-storage-overlay-01cdf5c626a41c62d7a4320133484d11f3d12b5f7b8375a62b188d73465c9632-merged.mount: Deactivated successfully.
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.356 222021 DEBUG nova.compute.manager [req-8977b48d-4b00-42d4-8492-c891a4aabf89 req-cf10c93a-1694-42f3-a93a-defab1db73d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.356 222021 DEBUG oslo_concurrency.lockutils [req-8977b48d-4b00-42d4-8492-c891a4aabf89 req-cf10c93a-1694-42f3-a93a-defab1db73d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.357 222021 DEBUG oslo_concurrency.lockutils [req-8977b48d-4b00-42d4-8492-c891a4aabf89 req-cf10c93a-1694-42f3-a93a-defab1db73d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.357 222021 DEBUG oslo_concurrency.lockutils [req-8977b48d-4b00-42d4-8492-c891a4aabf89 req-cf10c93a-1694-42f3-a93a-defab1db73d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.358 222021 DEBUG nova.compute.manager [req-8977b48d-4b00-42d4-8492-c891a4aabf89 req-cf10c93a-1694-42f3-a93a-defab1db73d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-unplugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.358 222021 DEBUG nova.compute.manager [req-8977b48d-4b00-42d4-8492-c891a4aabf89 req-cf10c93a-1694-42f3-a93a-defab1db73d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:00:02 np0005593233 podman[260472]: 2026-01-23 10:00:02.359192468 +0000 UTC m=+0.111572068 container cleanup 593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.361 222021 DEBUG nova.compute.manager [req-37a7450e-9770-4217-92f1-10bc02972351 req-5f46aa61-ed23-4b15-900a-99e753926a7c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.361 222021 DEBUG oslo_concurrency.lockutils [req-37a7450e-9770-4217-92f1-10bc02972351 req-5f46aa61-ed23-4b15-900a-99e753926a7c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.361 222021 DEBUG oslo_concurrency.lockutils [req-37a7450e-9770-4217-92f1-10bc02972351 req-5f46aa61-ed23-4b15-900a-99e753926a7c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.361 222021 DEBUG oslo_concurrency.lockutils [req-37a7450e-9770-4217-92f1-10bc02972351 req-5f46aa61-ed23-4b15-900a-99e753926a7c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.362 222021 DEBUG nova.compute.manager [req-37a7450e-9770-4217-92f1-10bc02972351 req-5f46aa61-ed23-4b15-900a-99e753926a7c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-unplugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.362 222021 DEBUG nova.compute.manager [req-37a7450e-9770-4217-92f1-10bc02972351 req-5f46aa61-ed23-4b15-900a-99e753926a7c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:00:02 np0005593233 systemd[1]: libpod-conmon-593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc.scope: Deactivated successfully.
Jan 23 05:00:02 np0005593233 podman[260521]: 2026-01-23 10:00:02.429984722 +0000 UTC m=+0.043813599 container remove 593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.437 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6e96b907-3d63-4e11-88ad-ce089ca2d7c8]: (4, ('Fri Jan 23 10:00:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726 (593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc)\n593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc\nFri Jan 23 10:00:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c166c58f-c448-497c-a470-dba6713fe726 (593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc)\n593dea7fb81f00c4fc61cae3c3e38c74f2125698ccfd529cd2874c961804cdbc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.439 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[47dc72be-79c6-430d-993e-6fb06847adff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.440 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc166c58f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 kernel: tapc166c58f-c0: left promiscuous mode
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.442 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.454 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.458 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4135f406-c26f-4354-b313-3b532f41bb17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.477 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[09ac6dea-9777-4230-a1c6-cacf74020c63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.479 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5b0038-e815-470b-879b-b4b9202eb10d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.480 222021 INFO nova.virt.libvirt.driver [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Deleting instance files /var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8_del#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.481 222021 INFO nova.virt.libvirt.driver [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Deletion of /var/lib/nova/instances/6d680830-de0e-445d-9d57-b3b0724cb5a8_del complete#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.496 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[15c94efe-5896-4ac8-a0ba-4b3f13905e6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627145, 'reachable_time': 28422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260536, 'error': None, 'target': 'ovnmeta-c166c58f-c448-497c-a470-dba6713fe726', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.498 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c166c58f-c448-497c-a470-dba6713fe726 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.498 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[9c254c66-57ee-4ef5-b1ba-e90e722b187c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.499 140224 INFO neutron.agent.ovn.metadata.agent [-] Port e75ad2d5-c059-4218-adfe-89823d98a762 in datapath c166c58f-c448-497c-a470-dba6713fe726 unbound from our chassis#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.501 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c166c58f-c448-497c-a470-dba6713fe726, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.501 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d28e08ee-0806-4011-8798-3ad8346a4a5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.502 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 79d4ff96-8918-4a77-9ba5-62ac2bc78903 in datapath c166c58f-c448-497c-a470-dba6713fe726 unbound from our chassis#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.503 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c166c58f-c448-497c-a470-dba6713fe726, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.503 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aed6e7c8-726c-4748-863f-102f6a882e50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.504 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5598d21a-d2b3-4fe1-ae77-65ec863bd48e in datapath c166c58f-c448-497c-a470-dba6713fe726 unbound from our chassis#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.505 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c166c58f-c448-497c-a470-dba6713fe726, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.505 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1b404846-a209-4d0c-a2f2-55513cb08b09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.506 140224 INFO neutron.agent.ovn.metadata.agent [-] Port a98412a1-0341-4835-956a-c4201becd7ef in datapath dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3 unbound from our chassis#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.507 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.508 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d2184e-b958-417e-b376-604f71c13d47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.508 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3 namespace which is not needed anymore#033[00m
Jan 23 05:00:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:00:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:00:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.620 222021 INFO nova.compute.manager [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Took 1.73 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.620 222021 DEBUG oslo.service.loopingcall [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.621 222021 DEBUG nova.compute.manager [-] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.622 222021 DEBUG nova.network.neutron [-] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3[259586]: [NOTICE]   (259590) : haproxy version is 2.8.14-c23fe91
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3[259586]: [NOTICE]   (259590) : path to executable is /usr/sbin/haproxy
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3[259586]: [WARNING]  (259590) : Exiting Master process...
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3[259586]: [ALERT]    (259590) : Current worker (259592) exited with code 143 (Terminated)
Jan 23 05:00:02 np0005593233 neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3[259586]: [WARNING]  (259590) : All workers exited. Exiting... (0)
Jan 23 05:00:02 np0005593233 systemd[1]: libpod-d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506.scope: Deactivated successfully.
Jan 23 05:00:02 np0005593233 podman[260553]: 2026-01-23 10:00:02.658080066 +0000 UTC m=+0.054808327 container died d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 05:00:02 np0005593233 podman[260553]: 2026-01-23 10:00:02.707858762 +0000 UTC m=+0.104587063 container cleanup d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:00:02 np0005593233 systemd[1]: libpod-conmon-d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506.scope: Deactivated successfully.
Jan 23 05:00:02 np0005593233 podman[260582]: 2026-01-23 10:00:02.774994523 +0000 UTC m=+0.040704072 container remove d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.782 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9edbfe09-5e8b-459c-9940-252288ad7bf0]: (4, ('Fri Jan 23 10:00:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3 (d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506)\nd4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506\nFri Jan 23 10:00:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3 (d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506)\nd4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.785 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f3ac8248-345c-4552-9dbb-2bfd7e8c7aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.786 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcd83a8b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.788 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 kernel: tapdcd83a8b-30: left promiscuous mode
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.807 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.811 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c033e2d0-58b9-4c44-9f8a-9fa1f6d8dde1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.824 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[34aa1612-b9f8-4fa9-8787-74f4bbd491c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.826 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c605dcfd-c84f-4b6e-83a2-467516dd8deb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.845 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e11e4f23-841a-44f8-ab04-6aed61972502]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627301, 'reachable_time': 36037, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260597, 'error': None, 'target': 'ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.848 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.848 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[247c005a-3172-44d0-904a-de4f18833937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.849 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 6d043e19-5004-46eb-b144-cc1b3476f21f in datapath dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3 unbound from our chassis#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.851 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:00:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:02.852 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c2c4ec7d-61af-41cf-9491-bfd9553b7851]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:02 np0005593233 systemd[1]: var-lib-containers-storage-overlay-62d5d3493acace1641cd29182f3a40e30188e1f0e633b26106eba08e38528d2b-merged.mount: Deactivated successfully.
Jan 23 05:00:02 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d4cc0b21f6f7f8dae8421c3188cf401be6ba9908ddf3f4a463711ce094cc8506-userdata-shm.mount: Deactivated successfully.
Jan 23 05:00:02 np0005593233 systemd[1]: run-netns-ovnmeta\x2ddcd83a8b\x2d3a0f\x2d4178\x2da0e3\x2d112ef9a89aa3.mount: Deactivated successfully.
Jan 23 05:00:02 np0005593233 systemd[1]: run-netns-ovnmeta\x2dc166c58f\x2dc448\x2d497c\x2da470\x2ddba6713fe726.mount: Deactivated successfully.
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.988 222021 DEBUG nova.compute.manager [req-9c2ed9c2-965d-4c11-83b9-08d064aa8341 req-ab74aa58-124c-4ad7-a971-55443c90c9e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.989 222021 DEBUG oslo_concurrency.lockutils [req-9c2ed9c2-965d-4c11-83b9-08d064aa8341 req-ab74aa58-124c-4ad7-a971-55443c90c9e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.989 222021 DEBUG oslo_concurrency.lockutils [req-9c2ed9c2-965d-4c11-83b9-08d064aa8341 req-ab74aa58-124c-4ad7-a971-55443c90c9e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.989 222021 DEBUG oslo_concurrency.lockutils [req-9c2ed9c2-965d-4c11-83b9-08d064aa8341 req-ab74aa58-124c-4ad7-a971-55443c90c9e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.990 222021 DEBUG nova.compute.manager [req-9c2ed9c2-965d-4c11-83b9-08d064aa8341 req-ab74aa58-124c-4ad7-a971-55443c90c9e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-unplugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:02 np0005593233 nova_compute[222017]: 2026-01-23 10:00:02.990 222021 DEBUG nova.compute.manager [req-9c2ed9c2-965d-4c11-83b9-08d064aa8341 req-ab74aa58-124c-4ad7-a971-55443c90c9e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:00:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:03.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:03.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:03.713 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.592 222021 DEBUG nova.compute.manager [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.593 222021 DEBUG oslo_concurrency.lockutils [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.593 222021 DEBUG oslo_concurrency.lockutils [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.593 222021 DEBUG oslo_concurrency.lockutils [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.594 222021 DEBUG nova.compute.manager [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-plugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.594 222021 WARNING nova.compute.manager [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-79d4ff96-8918-4a77-9ba5-62ac2bc78903 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.594 222021 DEBUG nova.compute.manager [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-6d043e19-5004-46eb-b144-cc1b3476f21f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.595 222021 DEBUG oslo_concurrency.lockutils [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.595 222021 DEBUG oslo_concurrency.lockutils [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.595 222021 DEBUG oslo_concurrency.lockutils [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.595 222021 DEBUG nova.compute.manager [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-unplugged-6d043e19-5004-46eb-b144-cc1b3476f21f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:04 np0005593233 nova_compute[222017]: 2026-01-23 10:00:04.596 222021 DEBUG nova.compute.manager [req-b0f34781-de5a-45ab-9805-4c52b4bcf162 req-4399ae17-9e59-4737-bf95-785246c35586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-6d043e19-5004-46eb-b144-cc1b3476f21f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:00:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:05.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:05.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:05 np0005593233 nova_compute[222017]: 2026-01-23 10:00:05.609 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.221 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.226 222021 DEBUG nova.compute.manager [req-dd762153-5a70-4a57-8614-74f3e2a366ce req-73c683ba-792d-459c-9d8b-f578defbc25a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-6d043e19-5004-46eb-b144-cc1b3476f21f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.226 222021 DEBUG oslo_concurrency.lockutils [req-dd762153-5a70-4a57-8614-74f3e2a366ce req-73c683ba-792d-459c-9d8b-f578defbc25a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.227 222021 DEBUG oslo_concurrency.lockutils [req-dd762153-5a70-4a57-8614-74f3e2a366ce req-73c683ba-792d-459c-9d8b-f578defbc25a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.227 222021 DEBUG oslo_concurrency.lockutils [req-dd762153-5a70-4a57-8614-74f3e2a366ce req-73c683ba-792d-459c-9d8b-f578defbc25a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.227 222021 DEBUG nova.compute.manager [req-dd762153-5a70-4a57-8614-74f3e2a366ce req-73c683ba-792d-459c-9d8b-f578defbc25a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-plugged-6d043e19-5004-46eb-b144-cc1b3476f21f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.227 222021 WARNING nova.compute.manager [req-dd762153-5a70-4a57-8614-74f3e2a366ce req-73c683ba-792d-459c-9d8b-f578defbc25a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-6d043e19-5004-46eb-b144-cc1b3476f21f for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:00:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:07.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:07.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.456 222021 DEBUG nova.compute.manager [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.457 222021 DEBUG oslo_concurrency.lockutils [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.457 222021 DEBUG oslo_concurrency.lockutils [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.458 222021 DEBUG oslo_concurrency.lockutils [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.458 222021 DEBUG nova.compute.manager [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-plugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.458 222021 WARNING nova.compute.manager [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-446c7501-f73f-4cbf-8a43-e78948d8bec1 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.458 222021 DEBUG nova.compute.manager [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.459 222021 DEBUG oslo_concurrency.lockutils [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.459 222021 DEBUG oslo_concurrency.lockutils [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.459 222021 DEBUG oslo_concurrency.lockutils [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.459 222021 DEBUG nova.compute.manager [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-unplugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.459 222021 DEBUG nova.compute.manager [req-e250e656-6157-45eb-b724-9d01a280c44c req-8458b84a-3cb5-4f33-ab4e-ecae5d609f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.461 222021 DEBUG nova.compute.manager [req-b35ba04b-632a-4c74-a2b1-35d84b5898b2 req-8c9f9fee-18c9-4c7f-8b33-876008e7353b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.461 222021 DEBUG oslo_concurrency.lockutils [req-b35ba04b-632a-4c74-a2b1-35d84b5898b2 req-8c9f9fee-18c9-4c7f-8b33-876008e7353b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.462 222021 DEBUG oslo_concurrency.lockutils [req-b35ba04b-632a-4c74-a2b1-35d84b5898b2 req-8c9f9fee-18c9-4c7f-8b33-876008e7353b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.462 222021 DEBUG oslo_concurrency.lockutils [req-b35ba04b-632a-4c74-a2b1-35d84b5898b2 req-8c9f9fee-18c9-4c7f-8b33-876008e7353b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.462 222021 DEBUG nova.compute.manager [req-b35ba04b-632a-4c74-a2b1-35d84b5898b2 req-8c9f9fee-18c9-4c7f-8b33-876008e7353b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-plugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:07 np0005593233 nova_compute[222017]: 2026-01-23 10:00:07.462 222021 WARNING nova.compute.manager [req-b35ba04b-632a-4c74-a2b1-35d84b5898b2 req-8c9f9fee-18c9-4c7f-8b33-876008e7353b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-5598d21a-d2b3-4fe1-ae77-65ec863bd48e for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:00:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:00:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:00:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:09.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:09.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.783 222021 DEBUG nova.compute.manager [req-6ebc9d25-792d-43db-8698-ee7b84a88b21 req-909819af-deb3-48e5-a296-180b06d4c380 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-a98412a1-0341-4835-956a-c4201becd7ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.784 222021 DEBUG oslo_concurrency.lockutils [req-6ebc9d25-792d-43db-8698-ee7b84a88b21 req-909819af-deb3-48e5-a296-180b06d4c380 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.784 222021 DEBUG oslo_concurrency.lockutils [req-6ebc9d25-792d-43db-8698-ee7b84a88b21 req-909819af-deb3-48e5-a296-180b06d4c380 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.785 222021 DEBUG oslo_concurrency.lockutils [req-6ebc9d25-792d-43db-8698-ee7b84a88b21 req-909819af-deb3-48e5-a296-180b06d4c380 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.785 222021 DEBUG nova.compute.manager [req-6ebc9d25-792d-43db-8698-ee7b84a88b21 req-909819af-deb3-48e5-a296-180b06d4c380 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-unplugged-a98412a1-0341-4835-956a-c4201becd7ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.785 222021 DEBUG nova.compute.manager [req-6ebc9d25-792d-43db-8698-ee7b84a88b21 req-909819af-deb3-48e5-a296-180b06d4c380 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-a98412a1-0341-4835-956a-c4201becd7ef for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.968 222021 DEBUG nova.compute.manager [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.968 222021 DEBUG oslo_concurrency.lockutils [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.969 222021 DEBUG oslo_concurrency.lockutils [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.969 222021 DEBUG oslo_concurrency.lockutils [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.969 222021 DEBUG nova.compute.manager [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-plugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.969 222021 WARNING nova.compute.manager [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-14f0f686-d5a3-4f53-a3d8-30c646ece1c3 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.970 222021 DEBUG nova.compute.manager [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-e75ad2d5-c059-4218-adfe-89823d98a762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.970 222021 DEBUG oslo_concurrency.lockutils [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.971 222021 DEBUG oslo_concurrency.lockutils [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.971 222021 DEBUG oslo_concurrency.lockutils [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.972 222021 DEBUG nova.compute.manager [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-unplugged-e75ad2d5-c059-4218-adfe-89823d98a762 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.972 222021 DEBUG nova.compute.manager [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-unplugged-e75ad2d5-c059-4218-adfe-89823d98a762 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.972 222021 DEBUG nova.compute.manager [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-e75ad2d5-c059-4218-adfe-89823d98a762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.973 222021 DEBUG oslo_concurrency.lockutils [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.973 222021 DEBUG oslo_concurrency.lockutils [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.973 222021 DEBUG oslo_concurrency.lockutils [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.974 222021 DEBUG nova.compute.manager [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-plugged-e75ad2d5-c059-4218-adfe-89823d98a762 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:09 np0005593233 nova_compute[222017]: 2026-01-23 10:00:09.974 222021 WARNING nova.compute.manager [req-4e823a5a-528e-4830-ae2e-41d1ef2380ef req-12242831-cdd7-4cd6-8732-7b6e59c2560e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-e75ad2d5-c059-4218-adfe-89823d98a762 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:00:10 np0005593233 nova_compute[222017]: 2026-01-23 10:00:10.612 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:11 np0005593233 nova_compute[222017]: 2026-01-23 10:00:11.058 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [{"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "address": "fa:16:3e:c2:b8:70", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.102", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5598d21a-d2", "ovs_interfaceid": "5598d21a-d2b3-4fe1-ae77-65ec863bd48e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:11 np0005593233 nova_compute[222017]: 2026-01-23 10:00:11.191 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-6d680830-de0e-445d-9d57-b3b0724cb5a8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:00:11 np0005593233 nova_compute[222017]: 2026-01-23 10:00:11.191 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:00:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:11.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:11.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:11 np0005593233 nova_compute[222017]: 2026-01-23 10:00:11.963 222021 DEBUG nova.compute.manager [req-aeca91fa-1438-45f4-b937-378899771738 req-8f49cf2b-4a4a-4347-a19f-f9430f97b2e7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-plugged-a98412a1-0341-4835-956a-c4201becd7ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:11 np0005593233 nova_compute[222017]: 2026-01-23 10:00:11.964 222021 DEBUG oslo_concurrency.lockutils [req-aeca91fa-1438-45f4-b937-378899771738 req-8f49cf2b-4a4a-4347-a19f-f9430f97b2e7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:11 np0005593233 nova_compute[222017]: 2026-01-23 10:00:11.964 222021 DEBUG oslo_concurrency.lockutils [req-aeca91fa-1438-45f4-b937-378899771738 req-8f49cf2b-4a4a-4347-a19f-f9430f97b2e7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:11 np0005593233 nova_compute[222017]: 2026-01-23 10:00:11.964 222021 DEBUG oslo_concurrency.lockutils [req-aeca91fa-1438-45f4-b937-378899771738 req-8f49cf2b-4a4a-4347-a19f-f9430f97b2e7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:11 np0005593233 nova_compute[222017]: 2026-01-23 10:00:11.965 222021 DEBUG nova.compute.manager [req-aeca91fa-1438-45f4-b937-378899771738 req-8f49cf2b-4a4a-4347-a19f-f9430f97b2e7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] No waiting events found dispatching network-vif-plugged-a98412a1-0341-4835-956a-c4201becd7ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:11 np0005593233 nova_compute[222017]: 2026-01-23 10:00:11.965 222021 WARNING nova.compute.manager [req-aeca91fa-1438-45f4-b937-378899771738 req-8f49cf2b-4a4a-4347-a19f-f9430f97b2e7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received unexpected event network-vif-plugged-a98412a1-0341-4835-956a-c4201becd7ef for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:00:12 np0005593233 nova_compute[222017]: 2026-01-23 10:00:12.225 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:13 np0005593233 podman[260648]: 2026-01-23 10:00:13.117577394 +0000 UTC m=+0.103974245 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 05:00:13 np0005593233 nova_compute[222017]: 2026-01-23 10:00:13.186 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:13 np0005593233 nova_compute[222017]: 2026-01-23 10:00:13.187 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:13.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:13.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:15.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:15.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:15 np0005593233 nova_compute[222017]: 2026-01-23 10:00:15.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:17 np0005593233 nova_compute[222017]: 2026-01-23 10:00:17.016 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162402.0136058, 6d680830-de0e-445d-9d57-b3b0724cb5a8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:00:17 np0005593233 nova_compute[222017]: 2026-01-23 10:00:17.016 222021 INFO nova.compute.manager [-] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:00:17 np0005593233 nova_compute[222017]: 2026-01-23 10:00:17.119 222021 DEBUG nova.compute.manager [None req-c569cc85-a2fa-4d47-a5c6-335864f91bf7 - - - - - -] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:00:17 np0005593233 nova_compute[222017]: 2026-01-23 10:00:17.228 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:17.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:17.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:19.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:19 np0005593233 nova_compute[222017]: 2026-01-23 10:00:19.361 222021 DEBUG nova.compute.manager [req-5e22b1eb-aa9a-4f4d-ab4a-1bfdd8b5b30b req-27ac701c-3619-472f-8b7b-06e156fc393f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-deleted-5598d21a-d2b3-4fe1-ae77-65ec863bd48e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:19 np0005593233 nova_compute[222017]: 2026-01-23 10:00:19.362 222021 INFO nova.compute.manager [req-5e22b1eb-aa9a-4f4d-ab4a-1bfdd8b5b30b req-27ac701c-3619-472f-8b7b-06e156fc393f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Neutron deleted interface 5598d21a-d2b3-4fe1-ae77-65ec863bd48e; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:00:19 np0005593233 nova_compute[222017]: 2026-01-23 10:00:19.362 222021 DEBUG nova.network.neutron [req-5e22b1eb-aa9a-4f4d-ab4a-1bfdd8b5b30b req-27ac701c-3619-472f-8b7b-06e156fc393f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [{"id": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "address": "fa:16:3e:7f:a6:73", "network": {"id": "9bd04a8e-3b21-48a4-942d-6ede17d32ccd", "bridge": "br-int", "label": "tempest-TaggedBootDevicesTest_v242-282900962-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap446c7501-f7", "ovs_interfaceid": "446c7501-f73f-4cbf-8a43-e78948d8bec1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:19.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:19 np0005593233 nova_compute[222017]: 2026-01-23 10:00:19.553 222021 DEBUG nova.compute.manager [req-5e22b1eb-aa9a-4f4d-ab4a-1bfdd8b5b30b req-27ac701c-3619-472f-8b7b-06e156fc393f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Detach interface failed, port_id=5598d21a-d2b3-4fe1-ae77-65ec863bd48e, reason: Instance 6d680830-de0e-445d-9d57-b3b0724cb5a8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:00:19 np0005593233 nova_compute[222017]: 2026-01-23 10:00:19.595 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:19 np0005593233 nova_compute[222017]: 2026-01-23 10:00:19.901 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:20 np0005593233 nova_compute[222017]: 2026-01-23 10:00:20.620 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:21.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:21.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:21 np0005593233 nova_compute[222017]: 2026-01-23 10:00:21.647 222021 DEBUG nova.compute.manager [req-7498c411-ca9d-4ce6-92ab-95cccf6d44b0 req-80586ec6-d534-4556-ac53-5028d850b714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-deleted-446c7501-f73f-4cbf-8a43-e78948d8bec1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:21 np0005593233 nova_compute[222017]: 2026-01-23 10:00:21.648 222021 INFO nova.compute.manager [req-7498c411-ca9d-4ce6-92ab-95cccf6d44b0 req-80586ec6-d534-4556-ac53-5028d850b714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Neutron deleted interface 446c7501-f73f-4cbf-8a43-e78948d8bec1; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:00:21 np0005593233 nova_compute[222017]: 2026-01-23 10:00:21.648 222021 DEBUG nova.network.neutron [req-7498c411-ca9d-4ce6-92ab-95cccf6d44b0 req-80586ec6-d534-4556-ac53-5028d850b714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [{"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "address": "fa:16:3e:0a:85:d8", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.184", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d4ff96-89", "ovs_interfaceid": "79d4ff96-8918-4a77-9ba5-62ac2bc78903", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:21 np0005593233 nova_compute[222017]: 2026-01-23 10:00:21.690 222021 DEBUG nova.compute.manager [req-7498c411-ca9d-4ce6-92ab-95cccf6d44b0 req-80586ec6-d534-4556-ac53-5028d850b714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Detach interface failed, port_id=446c7501-f73f-4cbf-8a43-e78948d8bec1, reason: Instance 6d680830-de0e-445d-9d57-b3b0724cb5a8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:00:22 np0005593233 nova_compute[222017]: 2026-01-23 10:00:22.232 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 05:00:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:23.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 05:00:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:23.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:24 np0005593233 nova_compute[222017]: 2026-01-23 10:00:24.016 222021 DEBUG nova.compute.manager [req-162a03c2-30ba-4c1c-a59f-ddbf01313c87 req-0a3602de-2a2f-48fb-8905-72e75507822e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-deleted-79d4ff96-8918-4a77-9ba5-62ac2bc78903 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:24 np0005593233 nova_compute[222017]: 2026-01-23 10:00:24.017 222021 INFO nova.compute.manager [req-162a03c2-30ba-4c1c-a59f-ddbf01313c87 req-0a3602de-2a2f-48fb-8905-72e75507822e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Neutron deleted interface 79d4ff96-8918-4a77-9ba5-62ac2bc78903; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:00:24 np0005593233 nova_compute[222017]: 2026-01-23 10:00:24.017 222021 DEBUG nova.network.neutron [req-162a03c2-30ba-4c1c-a59f-ddbf01313c87 req-0a3602de-2a2f-48fb-8905-72e75507822e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [{"id": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "address": "fa:16:3e:9c:d3:23", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap14f0f686-d5", "ovs_interfaceid": "14f0f686-d5a3-4f53-a3d8-30c646ece1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e75ad2d5-c059-4218-adfe-89823d98a762", "address": "fa:16:3e:a4:d2:12", "network": {"id": "c166c58f-c448-497c-a470-dba6713fe726", "bridge": "br-int", "label": "tempest-device-tagging-net1-671529903", "subnets": [{"cidr": "10.1.1.0/24", "dns": [], "gateway": {"address": "10.1.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.1.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75ad2d5-c0", "ovs_interfaceid": "e75ad2d5-c059-4218-adfe-89823d98a762", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "a98412a1-0341-4835-956a-c4201becd7ef", "address": "fa:16:3e:eb:eb:a0", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa98412a1-03", "ovs_interfaceid": "a98412a1-0341-4835-956a-c4201becd7ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6d043e19-5004-46eb-b144-cc1b3476f21f", "address": "fa:16:3e:9e:ac:0a", "network": {"id": "dcd83a8b-3a0f-4178-a0e3-112ef9a89aa3", "bridge": "br-int", "label": "tempest-device-tagging-net2-1874800537", "subnets": [{"cidr": "10.2.2.0/24", "dns": [], "gateway": {"address": "10.2.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.2.2.200", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8924c80a71a94fdeb114c6bdbdb2939c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d043e19-50", "ovs_interfaceid": "6d043e19-5004-46eb-b144-cc1b3476f21f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:24 np0005593233 nova_compute[222017]: 2026-01-23 10:00:24.083 222021 DEBUG nova.compute.manager [req-162a03c2-30ba-4c1c-a59f-ddbf01313c87 req-0a3602de-2a2f-48fb-8905-72e75507822e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Detach interface failed, port_id=79d4ff96-8918-4a77-9ba5-62ac2bc78903, reason: Instance 6d680830-de0e-445d-9d57-b3b0724cb5a8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:00:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:25.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:25 np0005593233 nova_compute[222017]: 2026-01-23 10:00:25.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:25.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:25 np0005593233 nova_compute[222017]: 2026-01-23 10:00:25.623 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593233 nova_compute[222017]: 2026-01-23 10:00:25.649 222021 DEBUG nova.network.neutron [-] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:25 np0005593233 nova_compute[222017]: 2026-01-23 10:00:25.686 222021 INFO nova.compute.manager [-] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Took 23.06 seconds to deallocate network for instance.#033[00m
Jan 23 05:00:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:26 np0005593233 podman[260676]: 2026-01-23 10:00:26.045881315 +0000 UTC m=+0.057408776 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 05:00:26 np0005593233 nova_compute[222017]: 2026-01-23 10:00:26.539 222021 DEBUG nova.compute.manager [req-d902750f-cdb8-450d-a7e4-53c2ac096a61 req-9c5a28ce-3575-452a-95cd-6a5cccc25c90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-deleted-a98412a1-0341-4835-956a-c4201becd7ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:26 np0005593233 nova_compute[222017]: 2026-01-23 10:00:26.539 222021 DEBUG nova.compute.manager [req-d902750f-cdb8-450d-a7e4-53c2ac096a61 req-9c5a28ce-3575-452a-95cd-6a5cccc25c90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Received event network-vif-deleted-6d043e19-5004-46eb-b144-cc1b3476f21f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:27 np0005593233 nova_compute[222017]: 2026-01-23 10:00:27.235 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:27.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:27 np0005593233 nova_compute[222017]: 2026-01-23 10:00:27.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:27.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:28 np0005593233 nova_compute[222017]: 2026-01-23 10:00:28.584 222021 INFO nova.compute.manager [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] [instance: 6d680830-de0e-445d-9d57-b3b0724cb5a8] Took 2.90 seconds to detach 3 volumes for instance.#033[00m
Jan 23 05:00:28 np0005593233 nova_compute[222017]: 2026-01-23 10:00:28.841 222021 DEBUG oslo_concurrency.lockutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:28 np0005593233 nova_compute[222017]: 2026-01-23 10:00:28.842 222021 DEBUG oslo_concurrency.lockutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:28 np0005593233 nova_compute[222017]: 2026-01-23 10:00:28.982 222021 DEBUG oslo_concurrency.processutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:29.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:29 np0005593233 nova_compute[222017]: 2026-01-23 10:00:29.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:00:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1395936862' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:00:29 np0005593233 nova_compute[222017]: 2026-01-23 10:00:29.471 222021 DEBUG oslo_concurrency.processutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:29.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:29 np0005593233 nova_compute[222017]: 2026-01-23 10:00:29.480 222021 DEBUG nova.compute.provider_tree [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:00:29 np0005593233 nova_compute[222017]: 2026-01-23 10:00:29.507 222021 DEBUG nova.scheduler.client.report [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:00:29 np0005593233 nova_compute[222017]: 2026-01-23 10:00:29.546 222021 DEBUG oslo_concurrency.lockutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:29 np0005593233 nova_compute[222017]: 2026-01-23 10:00:29.609 222021 INFO nova.scheduler.client.report [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Deleted allocations for instance 6d680830-de0e-445d-9d57-b3b0724cb5a8#033[00m
Jan 23 05:00:29 np0005593233 nova_compute[222017]: 2026-01-23 10:00:29.752 222021 DEBUG oslo_concurrency.lockutils [None req-4c414ac7-79a6-4099-8029-87c707f6a4ab 35b29e4a06884f7d88683d00f85d4630 8924c80a71a94fdeb114c6bdbdb2939c - - default default] Lock "6d680830-de0e-445d-9d57-b3b0724cb5a8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 28.867s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:30 np0005593233 nova_compute[222017]: 2026-01-23 10:00:30.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:30 np0005593233 nova_compute[222017]: 2026-01-23 10:00:30.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:30 np0005593233 nova_compute[222017]: 2026-01-23 10:00:30.418 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:30 np0005593233 nova_compute[222017]: 2026-01-23 10:00:30.419 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:30 np0005593233 nova_compute[222017]: 2026-01-23 10:00:30.419 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:30 np0005593233 nova_compute[222017]: 2026-01-23 10:00:30.419 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:00:30 np0005593233 nova_compute[222017]: 2026-01-23 10:00:30.419 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:30 np0005593233 nova_compute[222017]: 2026-01-23 10:00:30.627 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:00:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/448409966' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:00:30 np0005593233 nova_compute[222017]: 2026-01-23 10:00:30.937 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:31 np0005593233 nova_compute[222017]: 2026-01-23 10:00:31.171 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:00:31 np0005593233 nova_compute[222017]: 2026-01-23 10:00:31.174 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4612MB free_disk=20.921916961669922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:00:31 np0005593233 nova_compute[222017]: 2026-01-23 10:00:31.175 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:31 np0005593233 nova_compute[222017]: 2026-01-23 10:00:31.175 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:31.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:31.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:31 np0005593233 nova_compute[222017]: 2026-01-23 10:00:31.671 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:00:31 np0005593233 nova_compute[222017]: 2026-01-23 10:00:31.673 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:00:31 np0005593233 nova_compute[222017]: 2026-01-23 10:00:31.699 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:00:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/225902992' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:00:32 np0005593233 nova_compute[222017]: 2026-01-23 10:00:32.199 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:32 np0005593233 nova_compute[222017]: 2026-01-23 10:00:32.207 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:00:32 np0005593233 nova_compute[222017]: 2026-01-23 10:00:32.230 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:00:32 np0005593233 nova_compute[222017]: 2026-01-23 10:00:32.239 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593233 nova_compute[222017]: 2026-01-23 10:00:32.290 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:00:32 np0005593233 nova_compute[222017]: 2026-01-23 10:00:32.290 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:33.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:33.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:34 np0005593233 nova_compute[222017]: 2026-01-23 10:00:34.291 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:34 np0005593233 nova_compute[222017]: 2026-01-23 10:00:34.292 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:34 np0005593233 nova_compute[222017]: 2026-01-23 10:00:34.292 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:00:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:35.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:35.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:35 np0005593233 nova_compute[222017]: 2026-01-23 10:00:35.630 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:37 np0005593233 nova_compute[222017]: 2026-01-23 10:00:37.244 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:37.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:37 np0005593233 nova_compute[222017]: 2026-01-23 10:00:37.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:37 np0005593233 nova_compute[222017]: 2026-01-23 10:00:37.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:00:37 np0005593233 nova_compute[222017]: 2026-01-23 10:00:37.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:00:37 np0005593233 nova_compute[222017]: 2026-01-23 10:00:37.414 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:00:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:37.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:39.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:39 np0005593233 nova_compute[222017]: 2026-01-23 10:00:39.408 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:39.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:40 np0005593233 nova_compute[222017]: 2026-01-23 10:00:40.632 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 05:00:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:41.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 05:00:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:41.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:42 np0005593233 nova_compute[222017]: 2026-01-23 10:00:42.248 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:42.663 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:42.663 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:00:42.663 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:43.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:43.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:44 np0005593233 podman[260765]: 2026-01-23 10:00:44.101288891 +0000 UTC m=+0.105897012 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:00:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:45.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:45.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:45 np0005593233 nova_compute[222017]: 2026-01-23 10:00:45.633 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:47 np0005593233 nova_compute[222017]: 2026-01-23 10:00:47.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:47.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:47.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:49.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:49.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:50 np0005593233 nova_compute[222017]: 2026-01-23 10:00:50.635 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:51.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:51.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:52 np0005593233 nova_compute[222017]: 2026-01-23 10:00:52.255 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:53.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:53.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:00:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:55.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:00:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:55.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:55 np0005593233 nova_compute[222017]: 2026-01-23 10:00:55.636 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:00:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/724490874' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:00:57 np0005593233 podman[260791]: 2026-01-23 10:00:57.042699856 +0000 UTC m=+0.054507815 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:00:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:00:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/724490874' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:00:57 np0005593233 nova_compute[222017]: 2026-01-23 10:00:57.258 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:57.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:57.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:59.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:00:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:59.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:00 np0005593233 nova_compute[222017]: 2026-01-23 10:01:00.640 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:01.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:01.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:02 np0005593233 nova_compute[222017]: 2026-01-23 10:01:02.262 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:03.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:03.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:03.901 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:01:03 np0005593233 nova_compute[222017]: 2026-01-23 10:01:03.901 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:03.902 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:01:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:05.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:05.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:05 np0005593233 nova_compute[222017]: 2026-01-23 10:01:05.640 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:06.905 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:07 np0005593233 nova_compute[222017]: 2026-01-23 10:01:07.265 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:07.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:07.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:09.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:09.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:10 np0005593233 nova_compute[222017]: 2026-01-23 10:01:10.643 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:11.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:11.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:01:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:01:12 np0005593233 nova_compute[222017]: 2026-01-23 10:01:12.254 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "51a66602-3548-4341-add1-988bd6c7aa57" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:12 np0005593233 nova_compute[222017]: 2026-01-23 10:01:12.254 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:12 np0005593233 nova_compute[222017]: 2026-01-23 10:01:12.267 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:12 np0005593233 nova_compute[222017]: 2026-01-23 10:01:12.378 222021 DEBUG nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:01:12 np0005593233 nova_compute[222017]: 2026-01-23 10:01:12.625 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:12 np0005593233 nova_compute[222017]: 2026-01-23 10:01:12.626 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:12 np0005593233 nova_compute[222017]: 2026-01-23 10:01:12.636 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:01:12 np0005593233 nova_compute[222017]: 2026-01-23 10:01:12.637 222021 INFO nova.compute.claims [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.028 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:13.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:13.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.555 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.565 222021 DEBUG nova.compute.provider_tree [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.640 222021 DEBUG nova.scheduler.client.report [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.680 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.681 222021 DEBUG nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.755 222021 DEBUG nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.756 222021 DEBUG nova.network.neutron [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.786 222021 INFO nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:01:13 np0005593233 nova_compute[222017]: 2026-01-23 10:01:13.954 222021 DEBUG nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.129 222021 DEBUG nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.130 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.131 222021 INFO nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Creating image(s)#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.170 222021 DEBUG nova.storage.rbd_utils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image 51a66602-3548-4341-add1-988bd6c7aa57_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.208 222021 DEBUG nova.storage.rbd_utils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image 51a66602-3548-4341-add1-988bd6c7aa57_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.243 222021 DEBUG nova.storage.rbd_utils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image 51a66602-3548-4341-add1-988bd6c7aa57_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.248 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.324 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.325 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.326 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.326 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.358 222021 DEBUG nova.storage.rbd_utils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image 51a66602-3548-4341-add1-988bd6c7aa57_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.363 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 51a66602-3548-4341-add1-988bd6c7aa57_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.404 222021 DEBUG nova.policy [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '29710db389c842df836944048225740f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.848 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 51a66602-3548-4341-add1-988bd6c7aa57_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:14 np0005593233 nova_compute[222017]: 2026-01-23 10:01:14.938 222021 DEBUG nova.storage.rbd_utils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] resizing rbd image 51a66602-3548-4341-add1-988bd6c7aa57_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:01:15 np0005593233 nova_compute[222017]: 2026-01-23 10:01:15.071 222021 DEBUG nova.objects.instance [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'migration_context' on Instance uuid 51a66602-3548-4341-add1-988bd6c7aa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:01:15 np0005593233 nova_compute[222017]: 2026-01-23 10:01:15.100 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:01:15 np0005593233 nova_compute[222017]: 2026-01-23 10:01:15.101 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Ensure instance console log exists: /var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:01:15 np0005593233 nova_compute[222017]: 2026-01-23 10:01:15.102 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:15 np0005593233 nova_compute[222017]: 2026-01-23 10:01:15.102 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:15 np0005593233 nova_compute[222017]: 2026-01-23 10:01:15.102 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:15 np0005593233 podman[261238]: 2026-01-23 10:01:15.114146891 +0000 UTC m=+0.116087966 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:01:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:15.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:15.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:15 np0005593233 nova_compute[222017]: 2026-01-23 10:01:15.645 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:17 np0005593233 nova_compute[222017]: 2026-01-23 10:01:17.270 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:17.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:17.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:17 np0005593233 nova_compute[222017]: 2026-01-23 10:01:17.624 222021 DEBUG nova.network.neutron [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Successfully created port: 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.8 total, 600.0 interval#012Cumulative writes: 30K writes, 117K keys, 30K commit groups, 1.0 writes per commit group, ingest: 0.11 GB, 0.03 MB/s#012Cumulative WAL: 30K writes, 10K syncs, 2.82 writes per sync, written: 0.11 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4438 writes, 16K keys, 4438 commit groups, 1.0 writes per commit group, ingest: 15.04 MB, 0.03 MB/s#012Interval WAL: 4438 writes, 1788 syncs, 2.48 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:01:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 05:01:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:19.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 05:01:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:19.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:20 np0005593233 nova_compute[222017]: 2026-01-23 10:01:20.647 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:21.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:21.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:22 np0005593233 nova_compute[222017]: 2026-01-23 10:01:22.275 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:22 np0005593233 nova_compute[222017]: 2026-01-23 10:01:22.528 222021 DEBUG nova.network.neutron [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Successfully updated port: 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:01:22 np0005593233 nova_compute[222017]: 2026-01-23 10:01:22.670 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:01:22 np0005593233 nova_compute[222017]: 2026-01-23 10:01:22.670 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquired lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:01:22 np0005593233 nova_compute[222017]: 2026-01-23 10:01:22.670 222021 DEBUG nova.network.neutron [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:01:22 np0005593233 nova_compute[222017]: 2026-01-23 10:01:22.780 222021 DEBUG nova.compute.manager [req-2193633f-bff8-4590-b006-305114243ba9 req-bf2df0a6-14d7-4a39-b4c4-729ed48de01b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received event network-changed-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:01:22 np0005593233 nova_compute[222017]: 2026-01-23 10:01:22.781 222021 DEBUG nova.compute.manager [req-2193633f-bff8-4590-b006-305114243ba9 req-bf2df0a6-14d7-4a39-b4c4-729ed48de01b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Refreshing instance network info cache due to event network-changed-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:01:22 np0005593233 nova_compute[222017]: 2026-01-23 10:01:22.781 222021 DEBUG oslo_concurrency.lockutils [req-2193633f-bff8-4590-b006-305114243ba9 req-bf2df0a6-14d7-4a39-b4c4-729ed48de01b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:01:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:23.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:23.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:23 np0005593233 nova_compute[222017]: 2026-01-23 10:01:23.692 222021 DEBUG nova.network.neutron [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:01:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:25.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:25.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:25 np0005593233 nova_compute[222017]: 2026-01-23 10:01:25.649 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.541 222021 DEBUG nova.network.neutron [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updating instance_info_cache with network_info: [{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.597 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Releasing lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.598 222021 DEBUG nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Instance network_info: |[{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.599 222021 DEBUG oslo_concurrency.lockutils [req-2193633f-bff8-4590-b006-305114243ba9 req-bf2df0a6-14d7-4a39-b4c4-729ed48de01b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.599 222021 DEBUG nova.network.neutron [req-2193633f-bff8-4590-b006-305114243ba9 req-bf2df0a6-14d7-4a39-b4c4-729ed48de01b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Refreshing network info cache for port 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.602 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Start _get_guest_xml network_info=[{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.607 222021 WARNING nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.630 222021 DEBUG nova.virt.libvirt.host [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.631 222021 DEBUG nova.virt.libvirt.host [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.637 222021 DEBUG nova.virt.libvirt.host [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.637 222021 DEBUG nova.virt.libvirt.host [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.639 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.639 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.639 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.640 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.640 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.640 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.640 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.640 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.641 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.641 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.641 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.641 222021 DEBUG nova.virt.hardware [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:01:26 np0005593233 nova_compute[222017]: 2026-01-23 10:01:26.645 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:01:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2924492294' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.153 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.188 222021 DEBUG nova.storage.rbd_utils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image 51a66602-3548-4341-add1-988bd6c7aa57_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.193 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.415 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:27.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:27.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:01:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2840674231' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.714 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.717 222021 DEBUG nova.virt.libvirt.vif [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:01:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-953780140',display_name='tempest-ServerActionsTestOtherA-server-953780140',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-953780140',id=97,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPDPMbCnqcp11s7OR05vsDdiZlZSU5ZbBJSLaqQpawTODCANj+91AmOb6Hdh0FgzlQPvmSu+VYXOLfZik0SA3L4m61/nruOol9dJ9Mz34f8cV2NJKksVR2Ar2t+W5r4M6w==',key_name='tempest-keypair-2078677939',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-502m022b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:01:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='29710db389c842df836944048225740f',uuid=51a66602-3548-4341-add1-988bd6c7aa57,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.717 222021 DEBUG nova.network.os_vif_util [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.719 222021 DEBUG nova.network.os_vif_util [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:9c:08,bridge_name='br-int',has_traffic_filtering=True,id=0ef6c9e2-66fb-4264-8cb0-7cd8ba296828,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ef6c9e2-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.721 222021 DEBUG nova.objects.instance [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 51a66602-3548-4341-add1-988bd6c7aa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.888 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <uuid>51a66602-3548-4341-add1-988bd6c7aa57</uuid>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <name>instance-00000061</name>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerActionsTestOtherA-server-953780140</nova:name>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:01:26</nova:creationTime>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <nova:user uuid="29710db389c842df836944048225740f">tempest-ServerActionsTestOtherA-882763067-project-member</nova:user>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <nova:project uuid="8c16cd713fa74a88b43e4edf01c273bd">tempest-ServerActionsTestOtherA-882763067</nova:project>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <nova:port uuid="0ef6c9e2-66fb-4264-8cb0-7cd8ba296828">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <entry name="serial">51a66602-3548-4341-add1-988bd6c7aa57</entry>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <entry name="uuid">51a66602-3548-4341-add1-988bd6c7aa57</entry>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/51a66602-3548-4341-add1-988bd6c7aa57_disk">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/51a66602-3548-4341-add1-988bd6c7aa57_disk.config">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:de:9c:08"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <target dev="tap0ef6c9e2-66"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57/console.log" append="off"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:01:27 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:01:27 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:01:27 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:01:27 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.890 222021 DEBUG nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Preparing to wait for external event network-vif-plugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.891 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "51a66602-3548-4341-add1-988bd6c7aa57-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.891 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.892 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.893 222021 DEBUG nova.virt.libvirt.vif [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:01:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-953780140',display_name='tempest-ServerActionsTestOtherA-server-953780140',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-953780140',id=97,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPDPMbCnqcp11s7OR05vsDdiZlZSU5ZbBJSLaqQpawTODCANj+91AmOb6Hdh0FgzlQPvmSu+VYXOLfZik0SA3L4m61/nruOol9dJ9Mz34f8cV2NJKksVR2Ar2t+W5r4M6w==',key_name='tempest-keypair-2078677939',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-502m022b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:01:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='29710db389c842df836944048225740f',uuid=51a66602-3548-4341-add1-988bd6c7aa57,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.893 222021 DEBUG nova.network.os_vif_util [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.894 222021 DEBUG nova.network.os_vif_util [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:9c:08,bridge_name='br-int',has_traffic_filtering=True,id=0ef6c9e2-66fb-4264-8cb0-7cd8ba296828,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ef6c9e2-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.894 222021 DEBUG os_vif [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:9c:08,bridge_name='br-int',has_traffic_filtering=True,id=0ef6c9e2-66fb-4264-8cb0-7cd8ba296828,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ef6c9e2-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.895 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.896 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.896 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.901 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.902 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ef6c9e2-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.903 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ef6c9e2-66, col_values=(('external_ids', {'iface-id': '0ef6c9e2-66fb-4264-8cb0-7cd8ba296828', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:9c:08', 'vm-uuid': '51a66602-3548-4341-add1-988bd6c7aa57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:27 np0005593233 NetworkManager[48871]: <info>  [1769162487.9074] manager: (tap0ef6c9e2-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.909 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.916 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:27 np0005593233 nova_compute[222017]: 2026-01-23 10:01:27.918 222021 INFO os_vif [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:9c:08,bridge_name='br-int',has_traffic_filtering=True,id=0ef6c9e2-66fb-4264-8cb0-7cd8ba296828,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ef6c9e2-66')#033[00m
Jan 23 05:01:28 np0005593233 podman[261400]: 2026-01-23 10:01:28.06674215 +0000 UTC m=+0.067324513 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:01:28 np0005593233 nova_compute[222017]: 2026-01-23 10:01:28.472 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:01:28 np0005593233 nova_compute[222017]: 2026-01-23 10:01:28.472 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:01:28 np0005593233 nova_compute[222017]: 2026-01-23 10:01:28.473 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No VIF found with MAC fa:16:3e:de:9c:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:01:28 np0005593233 nova_compute[222017]: 2026-01-23 10:01:28.473 222021 INFO nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Using config drive#033[00m
Jan 23 05:01:28 np0005593233 nova_compute[222017]: 2026-01-23 10:01:28.597 222021 DEBUG nova.storage.rbd_utils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image 51a66602-3548-4341-add1-988bd6c7aa57_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:29.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:29.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:30 np0005593233 nova_compute[222017]: 2026-01-23 10:01:30.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:30 np0005593233 nova_compute[222017]: 2026-01-23 10:01:30.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:30 np0005593233 nova_compute[222017]: 2026-01-23 10:01:30.651 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:30 np0005593233 nova_compute[222017]: 2026-01-23 10:01:30.746 222021 INFO nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Creating config drive at /var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57/disk.config#033[00m
Jan 23 05:01:30 np0005593233 nova_compute[222017]: 2026-01-23 10:01:30.755 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe8b046c7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:30 np0005593233 nova_compute[222017]: 2026-01-23 10:01:30.911 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe8b046c7" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:30 np0005593233 nova_compute[222017]: 2026-01-23 10:01:30.948 222021 DEBUG nova.storage.rbd_utils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image 51a66602-3548-4341-add1-988bd6c7aa57_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:30 np0005593233 nova_compute[222017]: 2026-01-23 10:01:30.951 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57/disk.config 51a66602-3548-4341-add1-988bd6c7aa57_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.309 222021 DEBUG oslo_concurrency.processutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57/disk.config 51a66602-3548-4341-add1-988bd6c7aa57_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.358s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.311 222021 INFO nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Deleting local config drive /var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57/disk.config because it was imported into RBD.#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:31 np0005593233 kernel: tap0ef6c9e2-66: entered promiscuous mode
Jan 23 05:01:31 np0005593233 NetworkManager[48871]: <info>  [1769162491.3917] manager: (tap0ef6c9e2-66): new Tun device (/org/freedesktop/NetworkManager/Devices/218)
Jan 23 05:01:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:01:31Z|00457|binding|INFO|Claiming lport 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 for this chassis.
Jan 23 05:01:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:01:31Z|00458|binding|INFO|0ef6c9e2-66fb-4264-8cb0-7cd8ba296828: Claiming fa:16:3e:de:9c:08 10.100.0.6
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.393 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.400 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:31 np0005593233 systemd-udevd[261487]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:01:31 np0005593233 NetworkManager[48871]: <info>  [1769162491.4522] device (tap0ef6c9e2-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:01:31 np0005593233 NetworkManager[48871]: <info>  [1769162491.4532] device (tap0ef6c9e2-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:01:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:31.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.456 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:9c:08 10.100.0.6'], port_security=['fa:16:3e:de:9c:08 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '51a66602-3548-4341-add1-988bd6c7aa57', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9910180-8b38-41b2-8cb3-4e4af7eb2c2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3aed5f-30b8-4c57-808e-87764ab67fc8, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=0ef6c9e2-66fb-4264-8cb0-7cd8ba296828) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.458 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 in datapath 8575e824-4be0-4206-873e-2f9a3d1ded0b bound to our chassis#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.459 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8575e824-4be0-4206-873e-2f9a3d1ded0b#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.459 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:01:31Z|00459|binding|INFO|Setting lport 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 ovn-installed in OVS
Jan 23 05:01:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:01:31Z|00460|binding|INFO|Setting lport 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 up in Southbound
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.468 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:31 np0005593233 systemd-machined[190954]: New machine qemu-47-instance-00000061.
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.475 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c4636fea-7463-4688-b44e-5f8f98084cf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.476 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8575e824-41 in ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.479 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8575e824-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.479 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9a593b4f-1d2b-462c-a36c-768a6de70f2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.480 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[88ae4924-91f2-4e32-b5cc-32bad452e5b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 systemd[1]: Started Virtual Machine qemu-47-instance-00000061.
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.496 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[910d2b24-a3ad-433b-bc9d-928cf71663b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.511 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c7aa1b94-42f0-4e9b-a601-8f125d196082]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.535 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.536 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.536 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.536 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.537 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.555 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9a6f4a80-7df2-4338-b6e0-b92116b8ce54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 systemd-udevd[261491]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:01:31 np0005593233 NetworkManager[48871]: <info>  [1769162491.5639] manager: (tap8575e824-40): new Veth device (/org/freedesktop/NetworkManager/Devices/219)
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.563 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[34e17e5f-86f2-4bc3-9149-b25f988f2b57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:31.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.622 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[62d6372d-2997-4805-b9c2-d264dfd4d52f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.627 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[76bce167-c867-4153-b1a8-c81a6e0e46e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 NetworkManager[48871]: <info>  [1769162491.6639] device (tap8575e824-40): carrier: link connected
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.672 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[bfabf87e-1570-4d61-b48a-76d273217b2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.691 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ddedfd18-09a5-4abd-89ce-b34e220dcd3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8575e824-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:16:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640429, 'reachable_time': 42751, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261525, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.709 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a293f32e-c27e-4bad-9a9c-5c3e5d3f19c1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef2:16ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 640429, 'tstamp': 640429}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261542, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.730 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[898d7161-1636-48a1-8639-ec00311f2fd6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8575e824-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:16:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640429, 'reachable_time': 42751, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 261545, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.772 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f40af8-55f5-4bbc-919c-53658cb6e951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.860 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c983f5e9-1b2e-43b0-9b71-55bdb8ffb9c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.861 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8575e824-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.862 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.862 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8575e824-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:31 np0005593233 NetworkManager[48871]: <info>  [1769162491.8650] manager: (tap8575e824-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.865 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:31 np0005593233 kernel: tap8575e824-40: entered promiscuous mode
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.873 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8575e824-40, col_values=(('external_ids', {'iface-id': 'f7023d86-3158-4cc4-b690-f57bb76e92b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:01:31Z|00461|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.876 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:31 np0005593233 nova_compute[222017]: 2026-01-23 10:01:31.898 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.899 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.901 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[776dbb1b-f1c7-4700-8ffb-d5a678fd3667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.901 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-8575e824-4be0-4206-873e-2f9a3d1ded0b
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 8575e824-4be0-4206-873e-2f9a3d1ded0b
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:01:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:31.903 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'env', 'PROCESS_TAG=haproxy-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8575e824-4be0-4206-873e-2f9a3d1ded0b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:01:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:01:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3933134386' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.027 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.055 222021 DEBUG nova.network.neutron [req-2193633f-bff8-4590-b006-305114243ba9 req-bf2df0a6-14d7-4a39-b4c4-729ed48de01b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updated VIF entry in instance network info cache for port 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.056 222021 DEBUG nova.network.neutron [req-2193633f-bff8-4590-b006-305114243ba9 req-bf2df0a6-14d7-4a39-b4c4-729ed48de01b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updating instance_info_cache with network_info: [{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.108 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162492.107502, 51a66602-3548-4341-add1-988bd6c7aa57 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.109 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] VM Started (Lifecycle Event)#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.180 222021 DEBUG oslo_concurrency.lockutils [req-2193633f-bff8-4590-b006-305114243ba9 req-bf2df0a6-14d7-4a39-b4c4-729ed48de01b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.182 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.186 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162492.107996, 51a66602-3548-4341-add1-988bd6c7aa57 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.186 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.316 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.322 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.322 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.323 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:01:32 np0005593233 podman[261620]: 2026-01-23 10:01:32.395355885 +0000 UTC m=+0.080132631 container create 93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.407 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:01:32 np0005593233 podman[261620]: 2026-01-23 10:01:32.349673478 +0000 UTC m=+0.034450354 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:01:32 np0005593233 systemd[1]: Started libpod-conmon-93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44.scope.
Jan 23 05:01:32 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:01:32 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/458a2ef67bc34737649f0557f0123cb318adab6330bc1c29c9e6f131863e2bd2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.522 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.525 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4490MB free_disk=20.921844482421875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.525 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:32 np0005593233 podman[261620]: 2026-01-23 10:01:32.525749341 +0000 UTC m=+0.210526117 container init 93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.525 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:32 np0005593233 podman[261620]: 2026-01-23 10:01:32.530985047 +0000 UTC m=+0.215761793 container start 93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 05:01:32 np0005593233 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[261636]: [NOTICE]   (261640) : New worker (261642) forked
Jan 23 05:01:32 np0005593233 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[261636]: [NOTICE]   (261640) : Loading success.
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.688 222021 DEBUG nova.compute.manager [req-977aeff0-3b43-4961-9833-ed0c5bb3d7db req-8839043f-a1b2-4df1-b8c2-ebb2e9017257 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received event network-vif-plugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.688 222021 DEBUG oslo_concurrency.lockutils [req-977aeff0-3b43-4961-9833-ed0c5bb3d7db req-8839043f-a1b2-4df1-b8c2-ebb2e9017257 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "51a66602-3548-4341-add1-988bd6c7aa57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.689 222021 DEBUG oslo_concurrency.lockutils [req-977aeff0-3b43-4961-9833-ed0c5bb3d7db req-8839043f-a1b2-4df1-b8c2-ebb2e9017257 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.689 222021 DEBUG oslo_concurrency.lockutils [req-977aeff0-3b43-4961-9833-ed0c5bb3d7db req-8839043f-a1b2-4df1-b8c2-ebb2e9017257 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.689 222021 DEBUG nova.compute.manager [req-977aeff0-3b43-4961-9833-ed0c5bb3d7db req-8839043f-a1b2-4df1-b8c2-ebb2e9017257 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Processing event network-vif-plugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.690 222021 DEBUG nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.694 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162492.693804, 51a66602-3548-4341-add1-988bd6c7aa57 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.695 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] VM Resumed (Lifecycle Event)
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.699 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.702 222021 INFO nova.virt.libvirt.driver [-] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Instance spawned successfully.
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.702 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.789 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.789 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.790 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.791 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.791 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.792 222021 DEBUG nova.virt.libvirt.driver [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.832 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.842 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.889 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 51a66602-3548-4341-add1-988bd6c7aa57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.891 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.891 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.905 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.919 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.972 222021 INFO nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Took 18.84 seconds to spawn the instance on the hypervisor.
Jan 23 05:01:32 np0005593233 nova_compute[222017]: 2026-01-23 10:01:32.973 222021 DEBUG nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:01:33 np0005593233 nova_compute[222017]: 2026-01-23 10:01:33.011 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:01:33 np0005593233 nova_compute[222017]: 2026-01-23 10:01:33.217 222021 INFO nova.compute.manager [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Took 20.64 seconds to build instance.
Jan 23 05:01:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:33.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:01:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/126903498' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:01:33 np0005593233 nova_compute[222017]: 2026-01-23 10:01:33.557 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.546s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:01:33 np0005593233 nova_compute[222017]: 2026-01-23 10:01:33.563 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:01:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:33.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.160 222021 DEBUG oslo_concurrency.lockutils [None req-3575cd37-7b1a-44df-8435-d1cc32bf1028 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.906s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.329 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.636 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.636 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.937 222021 DEBUG nova.compute.manager [req-e6b18231-856b-479f-9180-04124f4f5dfc req-f6bb2adb-1053-4767-8b0c-807a5e668b60 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received event network-vif-plugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.938 222021 DEBUG oslo_concurrency.lockutils [req-e6b18231-856b-479f-9180-04124f4f5dfc req-f6bb2adb-1053-4767-8b0c-807a5e668b60 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "51a66602-3548-4341-add1-988bd6c7aa57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.938 222021 DEBUG oslo_concurrency.lockutils [req-e6b18231-856b-479f-9180-04124f4f5dfc req-f6bb2adb-1053-4767-8b0c-807a5e668b60 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.938 222021 DEBUG oslo_concurrency.lockutils [req-e6b18231-856b-479f-9180-04124f4f5dfc req-f6bb2adb-1053-4767-8b0c-807a5e668b60 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.938 222021 DEBUG nova.compute.manager [req-e6b18231-856b-479f-9180-04124f4f5dfc req-f6bb2adb-1053-4767-8b0c-807a5e668b60 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] No waiting events found dispatching network-vif-plugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:01:34 np0005593233 nova_compute[222017]: 2026-01-23 10:01:34.939 222021 WARNING nova.compute.manager [req-e6b18231-856b-479f-9180-04124f4f5dfc req-f6bb2adb-1053-4767-8b0c-807a5e668b60 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received unexpected event network-vif-plugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 for instance with vm_state active and task_state None.
Jan 23 05:01:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:35.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:35.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:35 np0005593233 nova_compute[222017]: 2026-01-23 10:01:35.637 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:01:35 np0005593233 nova_compute[222017]: 2026-01-23 10:01:35.638 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:01:35 np0005593233 nova_compute[222017]: 2026-01-23 10:01:35.638 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:01:35 np0005593233 nova_compute[222017]: 2026-01-23 10:01:35.655 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:01:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.003000084s ======
Jan 23 05:01:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:37.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000084s
Jan 23 05:01:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:37.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:37 np0005593233 nova_compute[222017]: 2026-01-23 10:01:37.978 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:01:38 np0005593233 nova_compute[222017]: 2026-01-23 10:01:38.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:01:38 np0005593233 nova_compute[222017]: 2026-01-23 10:01:38.388 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:01:38 np0005593233 nova_compute[222017]: 2026-01-23 10:01:38.388 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:01:38 np0005593233 nova_compute[222017]: 2026-01-23 10:01:38.936 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:01:38 np0005593233 nova_compute[222017]: 2026-01-23 10:01:38.937 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:01:38 np0005593233 nova_compute[222017]: 2026-01-23 10:01:38.938 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:01:38 np0005593233 nova_compute[222017]: 2026-01-23 10:01:38.938 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 51a66602-3548-4341-add1-988bd6c7aa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:01:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:39.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:39.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:40 np0005593233 nova_compute[222017]: 2026-01-23 10:01:40.658 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:01:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:41.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:41.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:42.664 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:01:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:42.665 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:01:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:01:42.665 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:01:42 np0005593233 NetworkManager[48871]: <info>  [1769162502.7246] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Jan 23 05:01:42 np0005593233 NetworkManager[48871]: <info>  [1769162502.7253] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/222)
Jan 23 05:01:42 np0005593233 nova_compute[222017]: 2026-01-23 10:01:42.722 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:01:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:01:42Z|00462|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:01:42 np0005593233 nova_compute[222017]: 2026-01-23 10:01:42.799 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:01:42 np0005593233 nova_compute[222017]: 2026-01-23 10:01:42.813 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:01:42 np0005593233 nova_compute[222017]: 2026-01-23 10:01:42.981 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:01:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:43.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:43.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:44 np0005593233 nova_compute[222017]: 2026-01-23 10:01:44.382 222021 DEBUG nova.compute.manager [req-e07cab8b-71b5-4454-bebf-bdee1e145c12 req-aa825593-bc06-453d-b123-88e827d5d70b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received event network-changed-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:01:44 np0005593233 nova_compute[222017]: 2026-01-23 10:01:44.382 222021 DEBUG nova.compute.manager [req-e07cab8b-71b5-4454-bebf-bdee1e145c12 req-aa825593-bc06-453d-b123-88e827d5d70b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Refreshing instance network info cache due to event network-changed-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:01:44 np0005593233 nova_compute[222017]: 2026-01-23 10:01:44.382 222021 DEBUG oslo_concurrency.lockutils [req-e07cab8b-71b5-4454-bebf-bdee1e145c12 req-aa825593-bc06-453d-b123-88e827d5d70b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:01:44 np0005593233 nova_compute[222017]: 2026-01-23 10:01:44.566 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updating instance_info_cache with network_info: [{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:01:45 np0005593233 nova_compute[222017]: 2026-01-23 10:01:45.212 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:01:45 np0005593233 nova_compute[222017]: 2026-01-23 10:01:45.213 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 05:01:45 np0005593233 nova_compute[222017]: 2026-01-23 10:01:45.213 222021 DEBUG oslo_concurrency.lockutils [req-e07cab8b-71b5-4454-bebf-bdee1e145c12 req-aa825593-bc06-453d-b123-88e827d5d70b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:01:45 np0005593233 nova_compute[222017]: 2026-01-23 10:01:45.213 222021 DEBUG nova.network.neutron [req-e07cab8b-71b5-4454-bebf-bdee1e145c12 req-aa825593-bc06-453d-b123-88e827d5d70b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Refreshing network info cache for port 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:01:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:45.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:45.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:45 np0005593233 nova_compute[222017]: 2026-01-23 10:01:45.661 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:01:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:46 np0005593233 podman[261675]: 2026-01-23 10:01:46.094011059 +0000 UTC m=+0.098474214 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 23 05:01:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:01:47Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:de:9c:08 10.100.0.6
Jan 23 05:01:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:01:47Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:de:9c:08 10.100.0.6
Jan 23 05:01:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:47.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:47.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:47 np0005593233 nova_compute[222017]: 2026-01-23 10:01:47.984 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:48 np0005593233 nova_compute[222017]: 2026-01-23 10:01:48.207 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:49 np0005593233 nova_compute[222017]: 2026-01-23 10:01:49.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:49.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:49.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:50 np0005593233 nova_compute[222017]: 2026-01-23 10:01:50.663 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.083 222021 DEBUG nova.network.neutron [req-e07cab8b-71b5-4454-bebf-bdee1e145c12 req-aa825593-bc06-453d-b123-88e827d5d70b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updated VIF entry in instance network info cache for port 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.084 222021 DEBUG nova.network.neutron [req-e07cab8b-71b5-4454-bebf-bdee1e145c12 req-aa825593-bc06-453d-b123-88e827d5d70b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updating instance_info_cache with network_info: [{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.122 222021 DEBUG oslo_concurrency.lockutils [req-e07cab8b-71b5-4454-bebf-bdee1e145c12 req-aa825593-bc06-453d-b123-88e827d5d70b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.224 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.225 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.255 222021 DEBUG nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:01:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:51.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.508 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.509 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.518 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.519 222021 INFO nova.compute.claims [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:01:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:01:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:51.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:01:51 np0005593233 nova_compute[222017]: 2026-01-23 10:01:51.721 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:01:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/14336678' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.199 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.210 222021 DEBUG nova.compute.provider_tree [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.249 222021 DEBUG nova.scheduler.client.report [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.307 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.309 222021 DEBUG nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.468 222021 DEBUG nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.468 222021 DEBUG nova.network.neutron [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.501 222021 INFO nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.526 222021 DEBUG nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.692 222021 DEBUG nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.695 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.696 222021 INFO nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Creating image(s)#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.732 222021 DEBUG nova.storage.rbd_utils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.768 222021 DEBUG nova.storage.rbd_utils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.800 222021 DEBUG nova.storage.rbd_utils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.805 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.881 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.882 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.882 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.883 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.915 222021 DEBUG nova.storage.rbd_utils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.919 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:52 np0005593233 nova_compute[222017]: 2026-01-23 10:01:52.986 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:53 np0005593233 nova_compute[222017]: 2026-01-23 10:01:53.008 222021 DEBUG nova.policy [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c09e682996b940dc97c866f9e4f1e74e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0f5ca0233c1a490aa2d596b88a0ec503', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:01:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:53.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:53.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:54 np0005593233 nova_compute[222017]: 2026-01-23 10:01:54.293 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.374s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:54 np0005593233 nova_compute[222017]: 2026-01-23 10:01:54.390 222021 DEBUG nova.storage.rbd_utils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] resizing rbd image 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:01:54 np0005593233 nova_compute[222017]: 2026-01-23 10:01:54.513 222021 DEBUG nova.objects.instance [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'migration_context' on Instance uuid 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:01:54 np0005593233 nova_compute[222017]: 2026-01-23 10:01:54.562 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:01:54 np0005593233 nova_compute[222017]: 2026-01-23 10:01:54.563 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Ensure instance console log exists: /var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:01:54 np0005593233 nova_compute[222017]: 2026-01-23 10:01:54.563 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:54 np0005593233 nova_compute[222017]: 2026-01-23 10:01:54.564 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:54 np0005593233 nova_compute[222017]: 2026-01-23 10:01:54.564 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:54 np0005593233 nova_compute[222017]: 2026-01-23 10:01:54.842 222021 DEBUG nova.network.neutron [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Successfully created port: 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:01:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:55.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:55.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:55 np0005593233 nova_compute[222017]: 2026-01-23 10:01:55.666 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.0 total, 600.0 interval#012Cumulative writes: 9715 writes, 50K keys, 9715 commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s#012Cumulative WAL: 9715 writes, 9715 syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1541 writes, 7826 keys, 1541 commit groups, 1.0 writes per commit group, ingest: 15.76 MB, 0.03 MB/s#012Interval WAL: 1541 writes, 1541 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     45.4      1.35              0.27        30    0.045       0      0       0.0       0.0#012  L6      1/0    8.72 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   4.3     85.9     71.7      3.73              0.90        29    0.129    170K    16K       0.0       0.0#012 Sum      1/0    8.72 MB   0.0      0.3     0.1      0.3       0.3      0.1       0.0   5.3     63.0     64.7      5.08              1.17        59    0.086    170K    16K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.6     49.7     49.8      1.45              0.24        12    0.121     44K   3119       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0     85.9     71.7      3.73              0.90        29    0.129    170K    16K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     45.5      1.35              0.27        29    0.047       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3600.0 total, 600.0 interval#012Flush(GB): cumulative 0.060, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.32 GB write, 0.09 MB/s write, 0.31 GB read, 0.09 MB/s read, 5.1 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 35.55 MB table_size: 0 occupancy: 18446744073709551615 collections: 7 last_copies: 0 last_secs: 0.000368 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2054,34.24 MB,11.2638%) FilterBlock(59,489.55 KB,0.157261%) IndexBlock(59,846.81 KB,0.272028%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:01:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:57.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:57.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:01:57 np0005593233 nova_compute[222017]: 2026-01-23 10:01:57.990 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:59 np0005593233 podman[261891]: 2026-01-23 10:01:59.060292709 +0000 UTC m=+0.065936024 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 05:01:59 np0005593233 nova_compute[222017]: 2026-01-23 10:01:59.122 222021 DEBUG nova.network.neutron [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Successfully updated port: 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:01:59 np0005593233 nova_compute[222017]: 2026-01-23 10:01:59.168 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "refresh_cache-0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:01:59 np0005593233 nova_compute[222017]: 2026-01-23 10:01:59.168 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquired lock "refresh_cache-0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:01:59 np0005593233 nova_compute[222017]: 2026-01-23 10:01:59.169 222021 DEBUG nova.network.neutron [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:01:59 np0005593233 nova_compute[222017]: 2026-01-23 10:01:59.339 222021 DEBUG nova.compute.manager [req-1ee08023-ffa1-468e-9fbd-d6c6e62a80bc req-180c78e9-595b-43f5-b570-8a55b18625e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Received event network-changed-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:01:59 np0005593233 nova_compute[222017]: 2026-01-23 10:01:59.340 222021 DEBUG nova.compute.manager [req-1ee08023-ffa1-468e-9fbd-d6c6e62a80bc req-180c78e9-595b-43f5-b570-8a55b18625e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Refreshing instance network info cache due to event network-changed-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:01:59 np0005593233 nova_compute[222017]: 2026-01-23 10:01:59.340 222021 DEBUG oslo_concurrency.lockutils [req-1ee08023-ffa1-468e-9fbd-d6c6e62a80bc req-180c78e9-595b-43f5-b570-8a55b18625e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:01:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 05:01:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:59.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 05:01:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:01:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:01:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:59.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:00 np0005593233 nova_compute[222017]: 2026-01-23 10:02:00.099 222021 DEBUG nova.network.neutron [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:02:00 np0005593233 nova_compute[222017]: 2026-01-23 10:02:00.385 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:00 np0005593233 nova_compute[222017]: 2026-01-23 10:02:00.683 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:01.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:01.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:01.804 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:01.805 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:02:01 np0005593233 nova_compute[222017]: 2026-01-23 10:02:01.808 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:02 np0005593233 nova_compute[222017]: 2026-01-23 10:02:02.994 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:03.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:03.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.283 222021 DEBUG nova.network.neutron [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Updating instance_info_cache with network_info: [{"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.353 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Releasing lock "refresh_cache-0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.353 222021 DEBUG nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Instance network_info: |[{"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.354 222021 DEBUG oslo_concurrency.lockutils [req-1ee08023-ffa1-468e-9fbd-d6c6e62a80bc req-180c78e9-595b-43f5-b570-8a55b18625e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.355 222021 DEBUG nova.network.neutron [req-1ee08023-ffa1-468e-9fbd-d6c6e62a80bc req-180c78e9-595b-43f5-b570-8a55b18625e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Refreshing network info cache for port 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.359 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Start _get_guest_xml network_info=[{"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.365 222021 WARNING nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.374 222021 DEBUG nova.virt.libvirt.host [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.375 222021 DEBUG nova.virt.libvirt.host [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.381 222021 DEBUG nova.virt.libvirt.host [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.382 222021 DEBUG nova.virt.libvirt.host [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.383 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.383 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.384 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.384 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.384 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.384 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.384 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.385 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.385 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.385 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.386 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.386 222021 DEBUG nova.virt.hardware [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.389 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1960478362' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.895 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.932 222021 DEBUG nova.storage.rbd_utils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:04 np0005593233 nova_compute[222017]: 2026-01-23 10:02:04.938 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2823693882' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.463 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.465 222021 DEBUG nova.virt.libvirt.vif [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:01:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1887091937',display_name='tempest-ListServerFiltersTestJSON-instance-1887091937',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1887091937',id=100,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0f5ca0233c1a490aa2d596b88a0ec503',ramdisk_id='',reservation_id='r-m0c5gvil',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1524131674',owner_user_name='tempest-ListServerFiltersTestJSON-1524131674-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:01:52Z,user_data=None,user_id='c09e682996b940dc97c866f9e4f1e74e',uuid=0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.466 222021 DEBUG nova.network.os_vif_util [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converting VIF {"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.467 222021 DEBUG nova.network.os_vif_util [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:8a:76,bridge_name='br-int',has_traffic_filtering=True,id=98a4c1db-f302-41c5-8f32-8dd7bcfddf9b,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a4c1db-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.468 222021 DEBUG nova.objects.instance [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:05.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.547 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <uuid>0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc</uuid>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <name>instance-00000064</name>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <memory>196608</memory>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1887091937</nova:name>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:02:04</nova:creationTime>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.micro">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <nova:memory>192</nova:memory>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <nova:user uuid="c09e682996b940dc97c866f9e4f1e74e">tempest-ListServerFiltersTestJSON-1524131674-project-member</nova:user>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <nova:project uuid="0f5ca0233c1a490aa2d596b88a0ec503">tempest-ListServerFiltersTestJSON-1524131674</nova:project>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <nova:port uuid="98a4c1db-f302-41c5-8f32-8dd7bcfddf9b">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <entry name="serial">0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc</entry>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <entry name="uuid">0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc</entry>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk.config">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:7f:8a:76"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <target dev="tap98a4c1db-f3"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc/console.log" append="off"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:02:05 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:02:05 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:02:05 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:02:05 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.548 222021 DEBUG nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Preparing to wait for external event network-vif-plugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.549 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.549 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.550 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.551 222021 DEBUG nova.virt.libvirt.vif [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:01:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1887091937',display_name='tempest-ListServerFiltersTestJSON-instance-1887091937',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1887091937',id=100,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0f5ca0233c1a490aa2d596b88a0ec503',ramdisk_id='',reservation_id='r-m0c5gvil',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1524131674',owner_user_name='tempest-ListServerFiltersTestJSON-1524131674-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:01:52Z,user_data=None,user_id='c09e682996b940dc97c866f9e4f1e74e',uuid=0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.552 222021 DEBUG nova.network.os_vif_util [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converting VIF {"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.553 222021 DEBUG nova.network.os_vif_util [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:8a:76,bridge_name='br-int',has_traffic_filtering=True,id=98a4c1db-f302-41c5-8f32-8dd7bcfddf9b,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a4c1db-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.553 222021 DEBUG os_vif [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:8a:76,bridge_name='br-int',has_traffic_filtering=True,id=98a4c1db-f302-41c5-8f32-8dd7bcfddf9b,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a4c1db-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.555 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.556 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.561 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.561 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap98a4c1db-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.562 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap98a4c1db-f3, col_values=(('external_ids', {'iface-id': '98a4c1db-f302-41c5-8f32-8dd7bcfddf9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:8a:76', 'vm-uuid': '0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.565 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:05 np0005593233 NetworkManager[48871]: <info>  [1769162525.5667] manager: (tap98a4c1db-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.568 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.575 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.577 222021 INFO os_vif [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:8a:76,bridge_name='br-int',has_traffic_filtering=True,id=98a4c1db-f302-41c5-8f32-8dd7bcfddf9b,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a4c1db-f3')#033[00m
Jan 23 05:02:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:05.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.651 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.652 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.652 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] No VIF found with MAC fa:16:3e:7f:8a:76, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.652 222021 INFO nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Using config drive#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.686 222021 DEBUG nova.storage.rbd_utils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:05 np0005593233 nova_compute[222017]: 2026-01-23 10:02:05.694 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:07 np0005593233 nova_compute[222017]: 2026-01-23 10:02:07.513 222021 INFO nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Creating config drive at /var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc/disk.config#033[00m
Jan 23 05:02:07 np0005593233 nova_compute[222017]: 2026-01-23 10:02:07.518 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe7f2vlck execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:07.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:07.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:07 np0005593233 nova_compute[222017]: 2026-01-23 10:02:07.662 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe7f2vlck" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:07 np0005593233 nova_compute[222017]: 2026-01-23 10:02:07.699 222021 DEBUG nova.storage.rbd_utils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:07 np0005593233 nova_compute[222017]: 2026-01-23 10:02:07.704 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc/disk.config 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:07 np0005593233 nova_compute[222017]: 2026-01-23 10:02:07.899 222021 DEBUG oslo_concurrency.processutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc/disk.config 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.195s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:07 np0005593233 nova_compute[222017]: 2026-01-23 10:02:07.900 222021 INFO nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Deleting local config drive /var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc/disk.config because it was imported into RBD.#033[00m
Jan 23 05:02:07 np0005593233 NetworkManager[48871]: <info>  [1769162527.9691] manager: (tap98a4c1db-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/224)
Jan 23 05:02:07 np0005593233 kernel: tap98a4c1db-f3: entered promiscuous mode
Jan 23 05:02:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:07Z|00463|binding|INFO|Claiming lport 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b for this chassis.
Jan 23 05:02:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:07Z|00464|binding|INFO|98a4c1db-f302-41c5-8f32-8dd7bcfddf9b: Claiming fa:16:3e:7f:8a:76 10.100.0.11
Jan 23 05:02:07 np0005593233 nova_compute[222017]: 2026-01-23 10:02:07.973 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:07 np0005593233 nova_compute[222017]: 2026-01-23 10:02:07.978 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:07.995 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:8a:76 10.100.0.11'], port_security=['fa:16:3e:7f:8a:76 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f5ca0233c1a490aa2d596b88a0ec503', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ad8a7362-692a-4044-8393-1c10014f8bab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83406af9-ea42-4cda-96ee-b8c04ab0651a, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=98a4c1db-f302-41c5-8f32-8dd7bcfddf9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:07.997 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b in datapath 969bd83a-7542-46e3-90f0-1a81f26ba6b8 bound to our chassis#033[00m
Jan 23 05:02:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:07.998 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 969bd83a-7542-46e3-90f0-1a81f26ba6b8#033[00m
Jan 23 05:02:08 np0005593233 systemd-udevd[262044]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:02:08 np0005593233 systemd-machined[190954]: New machine qemu-48-instance-00000064.
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.019 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[54d6d4aa-080e-4394-a61e-57759405d924]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.021 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap969bd83a-71 in ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.025 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap969bd83a-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.025 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aadb725a-2fda-42a8-90ac-291d6815ab66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 NetworkManager[48871]: <info>  [1769162528.0278] device (tap98a4c1db-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.026 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1a883778-4020-4d19-9f78-86e21bc4d570]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 NetworkManager[48871]: <info>  [1769162528.0291] device (tap98a4c1db-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:02:08 np0005593233 systemd[1]: Started Virtual Machine qemu-48-instance-00000064.
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.035 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:08 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:08Z|00465|binding|INFO|Setting lport 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b ovn-installed in OVS
Jan 23 05:02:08 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:08Z|00466|binding|INFO|Setting lport 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b up in Southbound
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.042 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.042 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[95797c5c-898b-4660-a70a-6b441812e07f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.064 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3406e8ea-0db5-4847-bf8a-195e7a9c8251]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.108 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[40a6e80c-4ef7-4bf7-a61d-b1f49ab3d0a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.116 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f975718d-21dd-483c-adf2-5064592ba407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 NetworkManager[48871]: <info>  [1769162528.1193] manager: (tap969bd83a-70): new Veth device (/org/freedesktop/NetworkManager/Devices/225)
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.162 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[cdbb7d1b-a407-402f-b4e3-7a61a078108c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.166 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[75002914-0389-41c6-ae71-a70928b805c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 NetworkManager[48871]: <info>  [1769162528.1983] device (tap969bd83a-70): carrier: link connected
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.205 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[652bafd4-a228-4b14-82d2-efdf3513676b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.227 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9617c880-a713-4cd0-8ba4-a03ffe9463c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap969bd83a-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fe:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 644082, 'reachable_time': 34853, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262078, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.250 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b678de36-4a68-44d6-ba74-9d97c850061f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:fef5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 644082, 'tstamp': 644082}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262079, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.275 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[beec8968-05ee-4d15-923e-c166c32cbcf6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap969bd83a-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fe:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 644082, 'reachable_time': 34853, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262080, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.320 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c250dc74-d457-42ce-98df-98f638313465]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.395 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9ea885-3551-4387-bd65-46b77b1fdd4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.397 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap969bd83a-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.398 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.398 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap969bd83a-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.402 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:08 np0005593233 NetworkManager[48871]: <info>  [1769162528.4031] manager: (tap969bd83a-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 23 05:02:08 np0005593233 kernel: tap969bd83a-70: entered promiscuous mode
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.406 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.407 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap969bd83a-70, col_values=(('external_ids', {'iface-id': '9ee89271-3ee7-4672-8800-56bb900c4dd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:08 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:08Z|00467|binding|INFO|Releasing lport 9ee89271-3ee7-4672-8800-56bb900c4dd0 from this chassis (sb_readonly=0)
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.423 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.427 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/969bd83a-7542-46e3-90f0-1a81f26ba6b8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/969bd83a-7542-46e3-90f0-1a81f26ba6b8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.428 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[46e3c1e8-1f80-48f9-9d0e-40dfe37085a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.429 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-969bd83a-7542-46e3-90f0-1a81f26ba6b8
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/969bd83a-7542-46e3-90f0-1a81f26ba6b8.pid.haproxy
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 969bd83a-7542-46e3-90f0-1a81f26ba6b8
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:02:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:08.429 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'env', 'PROCESS_TAG=haproxy-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/969bd83a-7542-46e3-90f0-1a81f26ba6b8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.560 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162528.559974, 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.562 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] VM Started (Lifecycle Event)#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.610 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.615 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162528.5613203, 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.616 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.676 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.682 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:08 np0005593233 nova_compute[222017]: 2026-01-23 10:02:08.725 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:08 np0005593233 podman[262154]: 2026-01-23 10:02:08.834442175 +0000 UTC m=+0.027221222 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:02:08 np0005593233 podman[262154]: 2026-01-23 10:02:08.937332461 +0000 UTC m=+0.130111478 container create c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:02:08 np0005593233 systemd[1]: Started libpod-conmon-c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d.scope.
Jan 23 05:02:09 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:02:09 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff3e200e037d714a1bb98c2b8b51aa693b553c0694afc5c0cce3b09fa148902/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:09 np0005593233 podman[262154]: 2026-01-23 10:02:09.041996297 +0000 UTC m=+0.234775334 container init c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:02:09 np0005593233 podman[262154]: 2026-01-23 10:02:09.049175037 +0000 UTC m=+0.241954054 container start c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:02:09 np0005593233 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[262169]: [NOTICE]   (262173) : New worker (262175) forked
Jan 23 05:02:09 np0005593233 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[262169]: [NOTICE]   (262173) : Loading success.
Jan 23 05:02:09 np0005593233 nova_compute[222017]: 2026-01-23 10:02:09.148 222021 DEBUG nova.network.neutron [req-1ee08023-ffa1-468e-9fbd-d6c6e62a80bc req-180c78e9-595b-43f5-b570-8a55b18625e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Updated VIF entry in instance network info cache for port 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:02:09 np0005593233 nova_compute[222017]: 2026-01-23 10:02:09.149 222021 DEBUG nova.network.neutron [req-1ee08023-ffa1-468e-9fbd-d6c6e62a80bc req-180c78e9-595b-43f5-b570-8a55b18625e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Updating instance_info_cache with network_info: [{"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:09 np0005593233 nova_compute[222017]: 2026-01-23 10:02:09.171 222021 DEBUG oslo_concurrency.lockutils [req-1ee08023-ffa1-468e-9fbd-d6c6e62a80bc req-180c78e9-595b-43f5-b570-8a55b18625e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:09.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:09.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:10 np0005593233 nova_compute[222017]: 2026-01-23 10:02:10.622 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:10 np0005593233 nova_compute[222017]: 2026-01-23 10:02:10.689 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:10.808 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:11.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:11.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.861 222021 DEBUG nova.compute.manager [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Received event network-vif-plugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.862 222021 DEBUG oslo_concurrency.lockutils [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.863 222021 DEBUG oslo_concurrency.lockutils [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.863 222021 DEBUG oslo_concurrency.lockutils [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.864 222021 DEBUG nova.compute.manager [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Processing event network-vif-plugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.864 222021 DEBUG nova.compute.manager [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Received event network-vif-plugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.864 222021 DEBUG oslo_concurrency.lockutils [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.865 222021 DEBUG oslo_concurrency.lockutils [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.865 222021 DEBUG oslo_concurrency.lockutils [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.866 222021 DEBUG nova.compute.manager [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] No waiting events found dispatching network-vif-plugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.866 222021 WARNING nova.compute.manager [req-5096538d-8a79-4b6b-8771-1964d8a9ddcc req-975ff61f-7f77-4ca3-9a7e-28aa8277c3b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Received unexpected event network-vif-plugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b for instance with vm_state building and task_state spawning.#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.867 222021 DEBUG nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.887 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162531.887262, 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.887 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.889 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.895 222021 INFO nova.virt.libvirt.driver [-] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Instance spawned successfully.#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.895 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.929 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.935 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.936 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.937 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.938 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.939 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.939 222021 DEBUG nova.virt.libvirt.driver [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.943 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:11 np0005593233 nova_compute[222017]: 2026-01-23 10:02:11.988 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:12 np0005593233 nova_compute[222017]: 2026-01-23 10:02:12.011 222021 INFO nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Took 19.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:02:12 np0005593233 nova_compute[222017]: 2026-01-23 10:02:12.011 222021 DEBUG nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:12 np0005593233 nova_compute[222017]: 2026-01-23 10:02:12.135 222021 INFO nova.compute.manager [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Took 20.72 seconds to build instance.#033[00m
Jan 23 05:02:12 np0005593233 nova_compute[222017]: 2026-01-23 10:02:12.225 222021 DEBUG oslo_concurrency.lockutils [None req-f77f4407-4708-4e64-8233-ab7ef149e87a c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:13.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:13.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:15.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:15 np0005593233 nova_compute[222017]: 2026-01-23 10:02:15.627 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:15.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:15 np0005593233 nova_compute[222017]: 2026-01-23 10:02:15.691 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:17 np0005593233 podman[262184]: 2026-01-23 10:02:17.1301765 +0000 UTC m=+0.134162391 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 05:02:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:17.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:17.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:19.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:19.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:20 np0005593233 nova_compute[222017]: 2026-01-23 10:02:20.619 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquiring lock "e7f91a83-ca8d-4833-817c-282f4d8aec99" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:20 np0005593233 nova_compute[222017]: 2026-01-23 10:02:20.620 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:20 np0005593233 nova_compute[222017]: 2026-01-23 10:02:20.631 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:20 np0005593233 nova_compute[222017]: 2026-01-23 10:02:20.677 222021 DEBUG nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:02:20 np0005593233 nova_compute[222017]: 2026-01-23 10:02:20.694 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:20 np0005593233 nova_compute[222017]: 2026-01-23 10:02:20.822 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:20 np0005593233 nova_compute[222017]: 2026-01-23 10:02:20.823 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:20 np0005593233 nova_compute[222017]: 2026-01-23 10:02:20.839 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:02:20 np0005593233 nova_compute[222017]: 2026-01-23 10:02:20.840 222021 INFO nova.compute.claims [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:02:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:21 np0005593233 nova_compute[222017]: 2026-01-23 10:02:21.245 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:21 np0005593233 podman[262484]: 2026-01-23 10:02:21.402487012 +0000 UTC m=+0.058851656 container create 1bfca4ed096526423460203bc5d4ad5ffff6b4c1f185ad522eb169aa71177efc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_thompson, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Jan 23 05:02:21 np0005593233 systemd[1]: Started libpod-conmon-1bfca4ed096526423460203bc5d4ad5ffff6b4c1f185ad522eb169aa71177efc.scope.
Jan 23 05:02:21 np0005593233 podman[262484]: 2026-01-23 10:02:21.376261449 +0000 UTC m=+0.032626123 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 05:02:21 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:02:21 np0005593233 podman[262484]: 2026-01-23 10:02:21.537886807 +0000 UTC m=+0.194251481 container init 1bfca4ed096526423460203bc5d4ad5ffff6b4c1f185ad522eb169aa71177efc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_thompson, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:02:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:21.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:21 np0005593233 podman[262484]: 2026-01-23 10:02:21.552217728 +0000 UTC m=+0.208582372 container start 1bfca4ed096526423460203bc5d4ad5ffff6b4c1f185ad522eb169aa71177efc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_thompson, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:02:21 np0005593233 podman[262484]: 2026-01-23 10:02:21.556856858 +0000 UTC m=+0.213221502 container attach 1bfca4ed096526423460203bc5d4ad5ffff6b4c1f185ad522eb169aa71177efc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_thompson, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Jan 23 05:02:21 np0005593233 dazzling_thompson[262520]: 167 167
Jan 23 05:02:21 np0005593233 systemd[1]: libpod-1bfca4ed096526423460203bc5d4ad5ffff6b4c1f185ad522eb169aa71177efc.scope: Deactivated successfully.
Jan 23 05:02:21 np0005593233 podman[262484]: 2026-01-23 10:02:21.564320776 +0000 UTC m=+0.220685420 container died 1bfca4ed096526423460203bc5d4ad5ffff6b4c1f185ad522eb169aa71177efc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 23 05:02:21 np0005593233 systemd[1]: var-lib-containers-storage-overlay-9e44db9ba11cd74ac878879f574e4459adf970ce966fb7169a37329b95c6e826-merged.mount: Deactivated successfully.
Jan 23 05:02:21 np0005593233 podman[262484]: 2026-01-23 10:02:21.615701163 +0000 UTC m=+0.272065807 container remove 1bfca4ed096526423460203bc5d4ad5ffff6b4c1f185ad522eb169aa71177efc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dazzling_thompson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:02:21 np0005593233 systemd[1]: libpod-conmon-1bfca4ed096526423460203bc5d4ad5ffff6b4c1f185ad522eb169aa71177efc.scope: Deactivated successfully.
Jan 23 05:02:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:21.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:02:21 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2734779104' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:02:21 np0005593233 nova_compute[222017]: 2026-01-23 10:02:21.800 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.555s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:21 np0005593233 nova_compute[222017]: 2026-01-23 10:02:21.812 222021 DEBUG nova.compute.provider_tree [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:02:21 np0005593233 podman[262543]: 2026-01-23 10:02:21.855619649 +0000 UTC m=+0.059012680 container create 8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_euclid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True)
Jan 23 05:02:21 np0005593233 nova_compute[222017]: 2026-01-23 10:02:21.888 222021 DEBUG nova.scheduler.client.report [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:02:21 np0005593233 systemd[1]: Started libpod-conmon-8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83.scope.
Jan 23 05:02:21 np0005593233 podman[262543]: 2026-01-23 10:02:21.83382334 +0000 UTC m=+0.037216391 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 05:02:21 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:02:21 np0005593233 nova_compute[222017]: 2026-01-23 10:02:21.936 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:21 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad2ca142bc31a2b30788255bca6b8c71cd2748c224899d334d6c0065fbaf37c5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:21 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad2ca142bc31a2b30788255bca6b8c71cd2748c224899d334d6c0065fbaf37c5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:21 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad2ca142bc31a2b30788255bca6b8c71cd2748c224899d334d6c0065fbaf37c5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:21 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad2ca142bc31a2b30788255bca6b8c71cd2748c224899d334d6c0065fbaf37c5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:21 np0005593233 nova_compute[222017]: 2026-01-23 10:02:21.953 222021 DEBUG nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:02:21 np0005593233 podman[262543]: 2026-01-23 10:02:21.95651664 +0000 UTC m=+0.159909681 container init 8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_euclid, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:02:21 np0005593233 podman[262543]: 2026-01-23 10:02:21.966690224 +0000 UTC m=+0.170083255 container start 8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_euclid, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:02:21 np0005593233 podman[262543]: 2026-01-23 10:02:21.970799329 +0000 UTC m=+0.174192380 container attach 8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_euclid, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 05:02:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.121 222021 DEBUG nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.121 222021 DEBUG nova.network.neutron [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.232 222021 INFO nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.266 222021 DEBUG nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.526 222021 DEBUG nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.527 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.528 222021 INFO nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Creating image(s)#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.557 222021 DEBUG nova.storage.rbd_utils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] rbd image e7f91a83-ca8d-4833-817c-282f4d8aec99_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.592 222021 DEBUG nova.storage.rbd_utils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] rbd image e7f91a83-ca8d-4833-817c-282f4d8aec99_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.624 222021 DEBUG nova.storage.rbd_utils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] rbd image e7f91a83-ca8d-4833-817c-282f4d8aec99_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.628 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.716 222021 DEBUG nova.policy [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9127d08a3bf5404e8cb8c84ed7152834', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '449f402258804f41b10f91a13da1176d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.723 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.724 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.726 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.726 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.767 222021 DEBUG nova.storage.rbd_utils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] rbd image e7f91a83-ca8d-4833-817c-282f4d8aec99_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:22 np0005593233 nova_compute[222017]: 2026-01-23 10:02:22.772 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 e7f91a83-ca8d-4833-817c-282f4d8aec99_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:23 np0005593233 nova_compute[222017]: 2026-01-23 10:02:23.388 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 e7f91a83-ca8d-4833-817c-282f4d8aec99_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.616s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:23 np0005593233 nova_compute[222017]: 2026-01-23 10:02:23.509 222021 DEBUG nova.storage.rbd_utils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] resizing rbd image e7f91a83-ca8d-4833-817c-282f4d8aec99_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]: [
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:    {
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        "available": false,
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        "ceph_device": false,
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        "lsm_data": {},
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        "lvs": [],
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        "path": "/dev/sr0",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        "rejected_reasons": [
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "Has a FileSystem",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "Insufficient space (<5GB)"
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        ],
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        "sys_api": {
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "actuators": null,
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "device_nodes": "sr0",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "devname": "sr0",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "human_readable_size": "482.00 KB",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "id_bus": "ata",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "model": "QEMU DVD-ROM",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "nr_requests": "2",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "parent": "/dev/sr0",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "partitions": {},
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "path": "/dev/sr0",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "removable": "1",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "rev": "2.5+",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "ro": "0",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "rotational": "1",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "sas_address": "",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "sas_device_handle": "",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "scheduler_mode": "mq-deadline",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "sectors": 0,
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "sectorsize": "2048",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "size": 493568.0,
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "support_discard": "2048",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "type": "disk",
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:            "vendor": "QEMU"
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:        }
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]:    }
Jan 23 05:02:23 np0005593233 optimistic_euclid[262562]: ]
Jan 23 05:02:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:23.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:23 np0005593233 systemd[1]: libpod-8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83.scope: Deactivated successfully.
Jan 23 05:02:23 np0005593233 podman[262543]: 2026-01-23 10:02:23.573844812 +0000 UTC m=+1.777237844 container died 8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_euclid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:02:23 np0005593233 systemd[1]: libpod-8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83.scope: Consumed 1.528s CPU time.
Jan 23 05:02:23 np0005593233 systemd[1]: var-lib-containers-storage-overlay-ad2ca142bc31a2b30788255bca6b8c71cd2748c224899d334d6c0065fbaf37c5-merged.mount: Deactivated successfully.
Jan 23 05:02:23 np0005593233 podman[262543]: 2026-01-23 10:02:23.63354642 +0000 UTC m=+1.836939451 container remove 8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_euclid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:02:23 np0005593233 systemd[1]: libpod-conmon-8ad134061ecbd98899578393a6b17a6ffab8bafbdcc59df616db9c40356a5e83.scope: Deactivated successfully.
Jan 23 05:02:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:23.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:23 np0005593233 nova_compute[222017]: 2026-01-23 10:02:23.670 222021 DEBUG nova.objects.instance [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lazy-loading 'migration_context' on Instance uuid e7f91a83-ca8d-4833-817c-282f4d8aec99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:23 np0005593233 nova_compute[222017]: 2026-01-23 10:02:23.720 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:02:23 np0005593233 nova_compute[222017]: 2026-01-23 10:02:23.720 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Ensure instance console log exists: /var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:02:23 np0005593233 nova_compute[222017]: 2026-01-23 10:02:23.721 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:23 np0005593233 nova_compute[222017]: 2026-01-23 10:02:23.721 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:23 np0005593233 nova_compute[222017]: 2026-01-23 10:02:23.721 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:02:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:02:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:25.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:25 np0005593233 nova_compute[222017]: 2026-01-23 10:02:25.635 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:25.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:25 np0005593233 nova_compute[222017]: 2026-01-23 10:02:25.698 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:26 np0005593233 nova_compute[222017]: 2026-01-23 10:02:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:26 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:26Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:8a:76 10.100.0.11
Jan 23 05:02:26 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:26Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:8a:76 10.100.0.11
Jan 23 05:02:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:27.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:27.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:28 np0005593233 nova_compute[222017]: 2026-01-23 10:02:28.293 222021 DEBUG nova.network.neutron [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Successfully created port: 04adf614-da27-47e4-b969-26b9c2003d95 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:02:29 np0005593233 nova_compute[222017]: 2026-01-23 10:02:29.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:29.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:29.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:30 np0005593233 podman[263888]: 2026-01-23 10:02:30.070125663 +0000 UTC m=+0.071229282 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 05:02:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:30 np0005593233 nova_compute[222017]: 2026-01-23 10:02:30.639 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:30 np0005593233 nova_compute[222017]: 2026-01-23 10:02:30.701 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:31 np0005593233 nova_compute[222017]: 2026-01-23 10:02:31.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:31 np0005593233 nova_compute[222017]: 2026-01-23 10:02:31.529 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:31 np0005593233 nova_compute[222017]: 2026-01-23 10:02:31.530 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:31 np0005593233 nova_compute[222017]: 2026-01-23 10:02:31.530 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:31 np0005593233 nova_compute[222017]: 2026-01-23 10:02:31.530 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:02:31 np0005593233 nova_compute[222017]: 2026-01-23 10:02:31.531 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:31.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:31.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:02:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4000816430' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.041 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.221 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.222 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.225 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.226 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000064 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.421 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.422 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4096MB free_disk=20.656776428222656GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.603 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 51a66602-3548-4341-add1-988bd6c7aa57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.604 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.604 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance e7f91a83-ca8d-4833-817c-282f4d8aec99 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.604 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.605 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:02:32 np0005593233 nova_compute[222017]: 2026-01-23 10:02:32.756 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:02:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/947866104' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:02:33 np0005593233 nova_compute[222017]: 2026-01-23 10:02:33.239 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:33 np0005593233 nova_compute[222017]: 2026-01-23 10:02:33.246 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:02:33 np0005593233 nova_compute[222017]: 2026-01-23 10:02:33.275 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:02:33 np0005593233 nova_compute[222017]: 2026-01-23 10:02:33.313 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:02:33 np0005593233 nova_compute[222017]: 2026-01-23 10:02:33.314 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.891s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:33.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:33.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:34 np0005593233 nova_compute[222017]: 2026-01-23 10:02:34.313 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:34 np0005593233 nova_compute[222017]: 2026-01-23 10:02:34.314 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:34 np0005593233 nova_compute[222017]: 2026-01-23 10:02:34.314 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:34 np0005593233 nova_compute[222017]: 2026-01-23 10:02:34.314 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:02:34 np0005593233 nova_compute[222017]: 2026-01-23 10:02:34.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:35 np0005593233 nova_compute[222017]: 2026-01-23 10:02:35.123 222021 DEBUG nova.network.neutron [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Successfully updated port: 04adf614-da27-47e4-b969-26b9c2003d95 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:02:35 np0005593233 nova_compute[222017]: 2026-01-23 10:02:35.167 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquiring lock "refresh_cache-e7f91a83-ca8d-4833-817c-282f4d8aec99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:02:35 np0005593233 nova_compute[222017]: 2026-01-23 10:02:35.168 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquired lock "refresh_cache-e7f91a83-ca8d-4833-817c-282f4d8aec99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:35 np0005593233 nova_compute[222017]: 2026-01-23 10:02:35.168 222021 DEBUG nova.network.neutron [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:02:35 np0005593233 ceph-mgr[81930]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 05:02:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:35.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:35 np0005593233 nova_compute[222017]: 2026-01-23 10:02:35.596 222021 DEBUG nova.compute.manager [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Received event network-changed-04adf614-da27-47e4-b969-26b9c2003d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:35 np0005593233 nova_compute[222017]: 2026-01-23 10:02:35.597 222021 DEBUG nova.compute.manager [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Refreshing instance network info cache due to event network-changed-04adf614-da27-47e4-b969-26b9c2003d95. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:02:35 np0005593233 nova_compute[222017]: 2026-01-23 10:02:35.597 222021 DEBUG oslo_concurrency.lockutils [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-e7f91a83-ca8d-4833-817c-282f4d8aec99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:02:35 np0005593233 nova_compute[222017]: 2026-01-23 10:02:35.643 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:35.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:35 np0005593233 nova_compute[222017]: 2026-01-23 10:02:35.704 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:36 np0005593233 nova_compute[222017]: 2026-01-23 10:02:36.263 222021 DEBUG nova.network.neutron [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:02:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:37.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:37.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:39.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 05:02:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:39.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 05:02:40 np0005593233 nova_compute[222017]: 2026-01-23 10:02:40.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:40 np0005593233 nova_compute[222017]: 2026-01-23 10:02:40.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:02:40 np0005593233 nova_compute[222017]: 2026-01-23 10:02:40.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:02:40 np0005593233 nova_compute[222017]: 2026-01-23 10:02:40.466 222021 DEBUG nova.network.neutron [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Updating instance_info_cache with network_info: [{"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:40 np0005593233 nova_compute[222017]: 2026-01-23 10:02:40.647 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:40 np0005593233 nova_compute[222017]: 2026-01-23 10:02:40.707 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.071 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.232 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Releasing lock "refresh_cache-e7f91a83-ca8d-4833-817c-282f4d8aec99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.233 222021 DEBUG nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Instance network_info: |[{"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.234 222021 DEBUG oslo_concurrency.lockutils [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-e7f91a83-ca8d-4833-817c-282f4d8aec99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.234 222021 DEBUG nova.network.neutron [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Refreshing network info cache for port 04adf614-da27-47e4-b969-26b9c2003d95 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.237 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Start _get_guest_xml network_info=[{"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.242 222021 WARNING nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.250 222021 DEBUG nova.virt.libvirt.host [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.251 222021 DEBUG nova.virt.libvirt.host [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.255 222021 DEBUG nova.virt.libvirt.host [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.256 222021 DEBUG nova.virt.libvirt.host [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.258 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.258 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.258 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.259 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.259 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.259 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.259 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.260 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.260 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.260 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.261 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.261 222021 DEBUG nova.virt.hardware [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.265 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.521 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.522 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.523 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.523 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 51a66602-3548-4341-add1-988bd6c7aa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:41.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:41.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3369048120' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.768 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.799 222021 DEBUG nova.storage.rbd_utils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] rbd image e7f91a83-ca8d-4833-817c-282f4d8aec99_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:41 np0005593233 nova_compute[222017]: 2026-01-23 10:02:41.805 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.293 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.295 222021 DEBUG nova.virt.libvirt.vif [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:02:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1630527270',display_name='tempest-ListServersNegativeTestJSON-server-1630527270-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1630527270-2',id=104,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='449f402258804f41b10f91a13da1176d',ramdisk_id='',reservation_id='r-2yft4oee',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-2075106169',owner_user_name
='tempest-ListServersNegativeTestJSON-2075106169-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:22Z,user_data=None,user_id='9127d08a3bf5404e8cb8c84ed7152834',uuid=e7f91a83-ca8d-4833-817c-282f4d8aec99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.295 222021 DEBUG nova.network.os_vif_util [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Converting VIF {"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.296 222021 DEBUG nova.network.os_vif_util [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:fb:b5,bridge_name='br-int',has_traffic_filtering=True,id=04adf614-da27-47e4-b969-26b9c2003d95,network=Network(5d65fb2c-55c9-4b50-aff7-9502add4a8c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04adf614-da') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.298 222021 DEBUG nova.objects.instance [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lazy-loading 'pci_devices' on Instance uuid e7f91a83-ca8d-4833-817c-282f4d8aec99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.502 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <uuid>e7f91a83-ca8d-4833-817c-282f4d8aec99</uuid>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <name>instance-00000068</name>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <nova:name>tempest-ListServersNegativeTestJSON-server-1630527270-2</nova:name>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:02:41</nova:creationTime>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <nova:user uuid="9127d08a3bf5404e8cb8c84ed7152834">tempest-ListServersNegativeTestJSON-2075106169-project-member</nova:user>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <nova:project uuid="449f402258804f41b10f91a13da1176d">tempest-ListServersNegativeTestJSON-2075106169</nova:project>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <nova:port uuid="04adf614-da27-47e4-b969-26b9c2003d95">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <entry name="serial">e7f91a83-ca8d-4833-817c-282f4d8aec99</entry>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <entry name="uuid">e7f91a83-ca8d-4833-817c-282f4d8aec99</entry>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/e7f91a83-ca8d-4833-817c-282f4d8aec99_disk">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/e7f91a83-ca8d-4833-817c-282f4d8aec99_disk.config">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:9e:fb:b5"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <target dev="tap04adf614-da"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99/console.log" append="off"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:02:42 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:02:42 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:02:42 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:02:42 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.504 222021 DEBUG nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Preparing to wait for external event network-vif-plugged-04adf614-da27-47e4-b969-26b9c2003d95 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.504 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquiring lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.504 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.505 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.505 222021 DEBUG nova.virt.libvirt.vif [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:02:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1630527270',display_name='tempest-ListServersNegativeTestJSON-server-1630527270-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1630527270-2',id=104,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='449f402258804f41b10f91a13da1176d',ramdisk_id='',reservation_id='r-2yft4oee',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-2075106169',owner_user_name='tempest-ListServersNegativeTestJSON-2075106169-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:22Z,user_data=None,user_id='9127d08a3bf5404e8cb8c84ed7152834',uuid=e7f91a83-ca8d-4833-817c-282f4d8aec99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.505 222021 DEBUG nova.network.os_vif_util [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Converting VIF {"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.506 222021 DEBUG nova.network.os_vif_util [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:fb:b5,bridge_name='br-int',has_traffic_filtering=True,id=04adf614-da27-47e4-b969-26b9c2003d95,network=Network(5d65fb2c-55c9-4b50-aff7-9502add4a8c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04adf614-da') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.506 222021 DEBUG os_vif [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:fb:b5,bridge_name='br-int',has_traffic_filtering=True,id=04adf614-da27-47e4-b969-26b9c2003d95,network=Network(5d65fb2c-55c9-4b50-aff7-9502add4a8c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04adf614-da') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.507 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.508 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.508 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.513 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.513 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04adf614-da, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.513 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap04adf614-da, col_values=(('external_ids', {'iface-id': '04adf614-da27-47e4-b969-26b9c2003d95', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:fb:b5', 'vm-uuid': 'e7f91a83-ca8d-4833-817c-282f4d8aec99'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.549 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:42 np0005593233 NetworkManager[48871]: <info>  [1769162562.5514] manager: (tap04adf614-da): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.554 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.560 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.562 222021 INFO os_vif [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:fb:b5,bridge_name='br-int',has_traffic_filtering=True,id=04adf614-da27-47e4-b969-26b9c2003d95,network=Network(5d65fb2c-55c9-4b50-aff7-9502add4a8c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04adf614-da')#033[00m
Jan 23 05:02:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:42.665 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:42.665 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:42.666 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.736 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.737 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.737 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] No VIF found with MAC fa:16:3e:9e:fb:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.738 222021 INFO nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Using config drive#033[00m
Jan 23 05:02:42 np0005593233 nova_compute[222017]: 2026-01-23 10:02:42.772 222021 DEBUG nova.storage.rbd_utils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] rbd image e7f91a83-ca8d-4833-817c-282f4d8aec99_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:43.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:43.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.262 222021 INFO nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Creating config drive at /var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99/disk.config#033[00m
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.272 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4rshl_wl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.421 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4rshl_wl" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.460 222021 DEBUG nova.storage.rbd_utils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] rbd image e7f91a83-ca8d-4833-817c-282f4d8aec99_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.466 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99/disk.config e7f91a83-ca8d-4833-817c-282f4d8aec99_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.683 222021 DEBUG oslo_concurrency.processutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99/disk.config e7f91a83-ca8d-4833-817c-282f4d8aec99_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.217s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.684 222021 INFO nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Deleting local config drive /var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99/disk.config because it was imported into RBD.#033[00m
Jan 23 05:02:44 np0005593233 kernel: tap04adf614-da: entered promiscuous mode
Jan 23 05:02:44 np0005593233 NetworkManager[48871]: <info>  [1769162564.7415] manager: (tap04adf614-da): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Jan 23 05:02:44 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:44Z|00468|binding|INFO|Claiming lport 04adf614-da27-47e4-b969-26b9c2003d95 for this chassis.
Jan 23 05:02:44 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:44Z|00469|binding|INFO|04adf614-da27-47e4-b969-26b9c2003d95: Claiming fa:16:3e:9e:fb:b5 10.100.0.14
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.742 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:44 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:44Z|00470|binding|INFO|Setting lport 04adf614-da27-47e4-b969-26b9c2003d95 ovn-installed in OVS
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.762 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:44 np0005593233 nova_compute[222017]: 2026-01-23 10:02:44.766 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:44 np0005593233 systemd-machined[190954]: New machine qemu-49-instance-00000068.
Jan 23 05:02:44 np0005593233 systemd[1]: Started Virtual Machine qemu-49-instance-00000068.
Jan 23 05:02:44 np0005593233 systemd-udevd[264139]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:02:44 np0005593233 NetworkManager[48871]: <info>  [1769162564.8157] device (tap04adf614-da): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:02:44 np0005593233 NetworkManager[48871]: <info>  [1769162564.8162] device (tap04adf614-da): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:02:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:45Z|00471|binding|INFO|Setting lport 04adf614-da27-47e4-b969-26b9c2003d95 up in Southbound
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.193 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:fb:b5 10.100.0.14'], port_security=['fa:16:3e:9e:fb:b5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e7f91a83-ca8d-4833-817c-282f4d8aec99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d65fb2c-55c9-4b50-aff7-9502add4a8c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '449f402258804f41b10f91a13da1176d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '07d82268-5230-4e65-b516-4629d542718c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c66279a-df2c-4cb2-9aaa-3c8f92544e07, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=04adf614-da27-47e4-b969-26b9c2003d95) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.197 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 04adf614-da27-47e4-b969-26b9c2003d95 in datapath 5d65fb2c-55c9-4b50-aff7-9502add4a8c8 bound to our chassis#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.201 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5d65fb2c-55c9-4b50-aff7-9502add4a8c8#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.219 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd75cf0-38cc-4b7a-ae57-65d778317038]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.220 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5d65fb2c-51 in ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.224 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5d65fb2c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.224 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8d174713-4065-46a1-b0a9-44fb4047d3b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.226 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0a95e3-9bd1-4bfe-88e8-cf7e98dd82f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.247 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[aa6be97f-9139-4ea2-b4c4-ab06c5d64e5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.270 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a82bdbb6-5fea-4dfd-9c88-096824362269]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.318 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[53211a79-6f88-417b-a443-e8eebf13b113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.326 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a2dd1f4f-0616-4e13-bb8b-bf090461113a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 NetworkManager[48871]: <info>  [1769162565.3281] manager: (tap5d65fb2c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/229)
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.389 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ee31902b-ba77-4452-abf7-d47e53e8d69a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.393 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[646f95f6-1bc9-4d22-b460-91916b1b1637]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 nova_compute[222017]: 2026-01-23 10:02:45.411 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updating instance_info_cache with network_info: [{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:45 np0005593233 NetworkManager[48871]: <info>  [1769162565.4413] device (tap5d65fb2c-50): carrier: link connected
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.450 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb481ae-7603-4b74-a5dc-b126d8549ab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.476 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce14b26a-1153-419b-a198-683ccca6c205]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d65fb2c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:89:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647807, 'reachable_time': 32124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264172, 'error': None, 'target': 'ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.497 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[02808360-960f-41b8-91a2-ed8933ede05b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:8995'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 647807, 'tstamp': 647807}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264173, 'error': None, 'target': 'ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 nova_compute[222017]: 2026-01-23 10:02:45.518 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:45 np0005593233 nova_compute[222017]: 2026-01-23 10:02:45.526 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.526 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d3975305-2d9e-45c3-8983-5874faddaef0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d65fb2c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:89:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647807, 'reachable_time': 32124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 264174, 'error': None, 'target': 'ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.578 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1358b87e-0f20-4d63-9c79-029e5bb70aca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:45.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.656 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[46267e40-c5ed-458a-a378-3aac3cd603f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.658 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d65fb2c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.659 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.659 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d65fb2c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:45 np0005593233 nova_compute[222017]: 2026-01-23 10:02:45.661 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:45 np0005593233 kernel: tap5d65fb2c-50: entered promiscuous mode
Jan 23 05:02:45 np0005593233 NetworkManager[48871]: <info>  [1769162565.6628] manager: (tap5d65fb2c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.664 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5d65fb2c-50, col_values=(('external_ids', {'iface-id': 'd4b096a2-d3ab-4a1a-b849-ee44a5218036'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:45 np0005593233 nova_compute[222017]: 2026-01-23 10:02:45.666 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:02:45Z|00472|binding|INFO|Releasing lport d4b096a2-d3ab-4a1a-b849-ee44a5218036 from this chassis (sb_readonly=0)
Jan 23 05:02:45 np0005593233 nova_compute[222017]: 2026-01-23 10:02:45.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.669 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5d65fb2c-55c9-4b50-aff7-9502add4a8c8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5d65fb2c-55c9-4b50-aff7-9502add4a8c8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.670 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ac38107f-41ae-4037-81ef-2c80369decb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.671 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-5d65fb2c-55c9-4b50-aff7-9502add4a8c8
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/5d65fb2c-55c9-4b50-aff7-9502add4a8c8.pid.haproxy
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 5d65fb2c-55c9-4b50-aff7-9502add4a8c8
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:02:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:02:45.671 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8', 'env', 'PROCESS_TAG=haproxy-5d65fb2c-55c9-4b50-aff7-9502add4a8c8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5d65fb2c-55c9-4b50-aff7-9502add4a8c8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:02:45 np0005593233 nova_compute[222017]: 2026-01-23 10:02:45.682 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:45.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:45 np0005593233 nova_compute[222017]: 2026-01-23 10:02:45.741 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:46 np0005593233 podman[264223]: 2026-01-23 10:02:46.121186216 +0000 UTC m=+0.062754145 container create 049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:02:46 np0005593233 systemd[1]: Started libpod-conmon-049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3.scope.
Jan 23 05:02:46 np0005593233 podman[264223]: 2026-01-23 10:02:46.088518323 +0000 UTC m=+0.030086272 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:02:46 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:02:46 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc0981334f59a34631eedce7995a63344fd1c7fc8d0c386ab1d94fb9d10c8427/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:46 np0005593233 nova_compute[222017]: 2026-01-23 10:02:46.220 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162566.219371, e7f91a83-ca8d-4833-817c-282f4d8aec99 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:46 np0005593233 nova_compute[222017]: 2026-01-23 10:02:46.221 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] VM Started (Lifecycle Event)#033[00m
Jan 23 05:02:46 np0005593233 podman[264223]: 2026-01-23 10:02:46.224880165 +0000 UTC m=+0.166448124 container init 049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:02:46 np0005593233 podman[264223]: 2026-01-23 10:02:46.231107099 +0000 UTC m=+0.172675028 container start 049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 05:02:46 np0005593233 neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8[264261]: [NOTICE]   (264266) : New worker (264268) forked
Jan 23 05:02:46 np0005593233 neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8[264261]: [NOTICE]   (264266) : Loading success.
Jan 23 05:02:46 np0005593233 nova_compute[222017]: 2026-01-23 10:02:46.279 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:46 np0005593233 nova_compute[222017]: 2026-01-23 10:02:46.284 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162566.2197602, e7f91a83-ca8d-4833-817c-282f4d8aec99 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:46 np0005593233 nova_compute[222017]: 2026-01-23 10:02:46.284 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:02:46 np0005593233 nova_compute[222017]: 2026-01-23 10:02:46.382 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:46 np0005593233 nova_compute[222017]: 2026-01-23 10:02:46.386 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:46 np0005593233 nova_compute[222017]: 2026-01-23 10:02:46.453 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.498 222021 DEBUG nova.network.neutron [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Updated VIF entry in instance network info cache for port 04adf614-da27-47e4-b969-26b9c2003d95. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.499 222021 DEBUG nova.network.neutron [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Updating instance_info_cache with network_info: [{"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.521 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.550 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:47.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.633 222021 DEBUG nova.compute.manager [req-8a1a21be-f3c6-43b0-b9d7-c72f4d394785 req-f4180387-205e-42f0-8405-4b6ffe548b4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Received event network-vif-plugged-04adf614-da27-47e4-b969-26b9c2003d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.633 222021 DEBUG oslo_concurrency.lockutils [req-8a1a21be-f3c6-43b0-b9d7-c72f4d394785 req-f4180387-205e-42f0-8405-4b6ffe548b4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.634 222021 DEBUG oslo_concurrency.lockutils [req-8a1a21be-f3c6-43b0-b9d7-c72f4d394785 req-f4180387-205e-42f0-8405-4b6ffe548b4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.634 222021 DEBUG oslo_concurrency.lockutils [req-8a1a21be-f3c6-43b0-b9d7-c72f4d394785 req-f4180387-205e-42f0-8405-4b6ffe548b4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.634 222021 DEBUG nova.compute.manager [req-8a1a21be-f3c6-43b0-b9d7-c72f4d394785 req-f4180387-205e-42f0-8405-4b6ffe548b4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Processing event network-vif-plugged-04adf614-da27-47e4-b969-26b9c2003d95 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.635 222021 DEBUG nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.641 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162567.6412997, e7f91a83-ca8d-4833-817c-282f4d8aec99 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.642 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.646 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.651 222021 INFO nova.virt.libvirt.driver [-] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Instance spawned successfully.#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.651 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.657 222021 DEBUG oslo_concurrency.lockutils [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-e7f91a83-ca8d-4833-817c-282f4d8aec99" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:47.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.740 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.746 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.747 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.748 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.748 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.749 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.749 222021 DEBUG nova.virt.libvirt.driver [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.758 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:47 np0005593233 nova_compute[222017]: 2026-01-23 10:02:47.937 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:48 np0005593233 nova_compute[222017]: 2026-01-23 10:02:48.052 222021 INFO nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Took 25.53 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:02:48 np0005593233 nova_compute[222017]: 2026-01-23 10:02:48.052 222021 DEBUG nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:48 np0005593233 podman[264277]: 2026-01-23 10:02:48.101024483 +0000 UTC m=+0.098580597 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 05:02:48 np0005593233 nova_compute[222017]: 2026-01-23 10:02:48.294 222021 INFO nova.compute.manager [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Took 27.53 seconds to build instance.#033[00m
Jan 23 05:02:48 np0005593233 nova_compute[222017]: 2026-01-23 10:02:48.426 222021 DEBUG oslo_concurrency.lockutils [None req-9660114f-14e4-4f33-88bf-1d67e2d45046 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 27.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:49.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:02:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:49.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:02:50 np0005593233 nova_compute[222017]: 2026-01-23 10:02:50.252 222021 DEBUG nova.compute.manager [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Received event network-vif-plugged-04adf614-da27-47e4-b969-26b9c2003d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:50 np0005593233 nova_compute[222017]: 2026-01-23 10:02:50.253 222021 DEBUG oslo_concurrency.lockutils [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:50 np0005593233 nova_compute[222017]: 2026-01-23 10:02:50.253 222021 DEBUG oslo_concurrency.lockutils [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:50 np0005593233 nova_compute[222017]: 2026-01-23 10:02:50.253 222021 DEBUG oslo_concurrency.lockutils [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:50 np0005593233 nova_compute[222017]: 2026-01-23 10:02:50.253 222021 DEBUG nova.compute.manager [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] No waiting events found dispatching network-vif-plugged-04adf614-da27-47e4-b969-26b9c2003d95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:50 np0005593233 nova_compute[222017]: 2026-01-23 10:02:50.254 222021 WARNING nova.compute.manager [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Received unexpected event network-vif-plugged-04adf614-da27-47e4-b969-26b9c2003d95 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:02:50 np0005593233 nova_compute[222017]: 2026-01-23 10:02:50.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:51.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:51.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:52 np0005593233 nova_compute[222017]: 2026-01-23 10:02:52.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:53.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:53.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.385 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.386 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.386 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.387 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.387 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.387 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.440 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.441 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Image id 84c0ef19-7f67-4bd3-95d8-507c3e0942ed yields fingerprint a6f655456a04e1d13ef2e44ed4544c38917863a2 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.441 222021 INFO nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] image 84c0ef19-7f67-4bd3-95d8-507c3e0942ed at (/var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2): checking#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.441 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] image 84c0ef19-7f67-4bd3-95d8-507c3e0942ed at (/var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Jan 23 05:02:54 np0005593233 nova_compute[222017]: 2026-01-23 10:02:54.443 222021 INFO oslo.privsep.daemon [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmph0m324q0/privsep.sock']#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.207 222021 INFO oslo.privsep.daemon [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.083 264307 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.090 264307 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.094 264307 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.094 264307 INFO oslo.privsep.daemon [-] privsep daemon running as pid 264307#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.333 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.334 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] 51a66602-3548-4341-add1-988bd6c7aa57 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.334 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.335 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] e7f91a83-ca8d-4833-817c-282f4d8aec99 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.335 222021 WARNING nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.335 222021 INFO nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Active base files: /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.335 222021 INFO nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Removable base files: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.336 222021 INFO nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.336 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.336 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.337 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 23 05:02:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:55.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:55.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:02:55 np0005593233 nova_compute[222017]: 2026-01-23 10:02:55.745 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:57 np0005593233 nova_compute[222017]: 2026-01-23 10:02:57.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 05:02:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:57.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 05:02:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:57.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:59.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:02:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:02:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:59.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:00 np0005593233 nova_compute[222017]: 2026-01-23 10:03:00.747 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:01 np0005593233 podman[264309]: 2026-01-23 10:03:01.095583543 +0000 UTC m=+0.094482403 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:03:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:01 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #48. Immutable memtables: 5.
Jan 23 05:03:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:01.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:01.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.342 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.391 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 51a66602-3548-4341-add1-988bd6c7aa57 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.391 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.391 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid e7f91a83-ca8d-4833-817c-282f4d8aec99 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.392 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "51a66602-3548-4341-add1-988bd6c7aa57" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.392 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "51a66602-3548-4341-add1-988bd6c7aa57" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.392 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.393 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.393 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "e7f91a83-ca8d-4833-817c-282f4d8aec99" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.394 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.472 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.490 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.491 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "51a66602-3548-4341-add1-988bd6c7aa57" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:02 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:02Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9e:fb:b5 10.100.0.14
Jan 23 05:03:02 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:02Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:fb:b5 10.100.0.14
Jan 23 05:03:02 np0005593233 nova_compute[222017]: 2026-01-23 10:03:02.672 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:03.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:03.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.232 222021 DEBUG oslo_concurrency.lockutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.232 222021 DEBUG oslo_concurrency.lockutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.232 222021 DEBUG oslo_concurrency.lockutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.233 222021 DEBUG oslo_concurrency.lockutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.233 222021 DEBUG oslo_concurrency.lockutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.234 222021 INFO nova.compute.manager [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Terminating instance#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.235 222021 DEBUG nova.compute.manager [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:03:04 np0005593233 kernel: tap98a4c1db-f3 (unregistering): left promiscuous mode
Jan 23 05:03:04 np0005593233 NetworkManager[48871]: <info>  [1769162584.4217] device (tap98a4c1db-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:03:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:04Z|00473|binding|INFO|Releasing lport 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b from this chassis (sb_readonly=0)
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.428 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:04Z|00474|binding|INFO|Setting lport 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b down in Southbound
Jan 23 05:03:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:04Z|00475|binding|INFO|Removing iface tap98a4c1db-f3 ovn-installed in OVS
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.431 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.454 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.463 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:8a:76 10.100.0.11'], port_security=['fa:16:3e:7f:8a:76 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f5ca0233c1a490aa2d596b88a0ec503', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ad8a7362-692a-4044-8393-1c10014f8bab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83406af9-ea42-4cda-96ee-b8c04ab0651a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=98a4c1db-f302-41c5-8f32-8dd7bcfddf9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.465 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 98a4c1db-f302-41c5-8f32-8dd7bcfddf9b in datapath 969bd83a-7542-46e3-90f0-1a81f26ba6b8 unbound from our chassis#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.466 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 969bd83a-7542-46e3-90f0-1a81f26ba6b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.468 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[110af5e1-9159-4904-9138-111abd0b8e60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.468 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 namespace which is not needed anymore#033[00m
Jan 23 05:03:04 np0005593233 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000064.scope: Deactivated successfully.
Jan 23 05:03:04 np0005593233 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000064.scope: Consumed 16.110s CPU time.
Jan 23 05:03:04 np0005593233 systemd-machined[190954]: Machine qemu-48-instance-00000064 terminated.
Jan 23 05:03:04 np0005593233 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[262169]: [NOTICE]   (262173) : haproxy version is 2.8.14-c23fe91
Jan 23 05:03:04 np0005593233 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[262169]: [NOTICE]   (262173) : path to executable is /usr/sbin/haproxy
Jan 23 05:03:04 np0005593233 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[262169]: [WARNING]  (262173) : Exiting Master process...
Jan 23 05:03:04 np0005593233 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[262169]: [ALERT]    (262173) : Current worker (262175) exited with code 143 (Terminated)
Jan 23 05:03:04 np0005593233 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[262169]: [WARNING]  (262173) : All workers exited. Exiting... (0)
Jan 23 05:03:04 np0005593233 systemd[1]: libpod-c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d.scope: Deactivated successfully.
Jan 23 05:03:04 np0005593233 podman[264350]: 2026-01-23 10:03:04.621028775 +0000 UTC m=+0.049064463 container died c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:03:04 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d-userdata-shm.mount: Deactivated successfully.
Jan 23 05:03:04 np0005593233 systemd[1]: var-lib-containers-storage-overlay-7ff3e200e037d714a1bb98c2b8b51aa693b553c0694afc5c0cce3b09fa148902-merged.mount: Deactivated successfully.
Jan 23 05:03:04 np0005593233 podman[264350]: 2026-01-23 10:03:04.661650411 +0000 UTC m=+0.089686049 container cleanup c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:03:04 np0005593233 systemd[1]: libpod-conmon-c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d.scope: Deactivated successfully.
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.683 222021 INFO nova.virt.libvirt.driver [-] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Instance destroyed successfully.#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.684 222021 DEBUG nova.objects.instance [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'resources' on Instance uuid 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:04 np0005593233 podman[264383]: 2026-01-23 10:03:04.729791036 +0000 UTC m=+0.044119975 container remove c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.738 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[25595152-f67e-4fbc-b795-672e4891f0a9]: (4, ('Fri Jan 23 10:03:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 (c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d)\nc3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d\nFri Jan 23 10:03:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 (c3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d)\nc3d6f67c0bd233618da86be182b7332a2a2509b5ffa567f57e805f0b4615a32d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.740 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[baaa10dc-198b-46f8-afb5-f0817aeb0076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.741 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap969bd83a-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593233 kernel: tap969bd83a-70: left promiscuous mode
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.763 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.768 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[460eb0e6-30d8-4507-b9ab-5ebc6deef567]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.776 222021 DEBUG nova.virt.libvirt.vif [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:01:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1887091937',display_name='tempest-ListServerFiltersTestJSON-instance-1887091937',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1887091937',id=100,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0f5ca0233c1a490aa2d596b88a0ec503',ramdisk_id='',reservation_id='r-m0c5gvil',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1524131674',owner_user_name='tempest-ListServerFiltersTestJSON-1524131674-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:02:12Z,user_data=None,user_id='c09e682996b940dc97c866f9e4f1e74e',uuid=0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.777 222021 DEBUG nova.network.os_vif_util [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converting VIF {"id": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "address": "fa:16:3e:7f:8a:76", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a4c1db-f3", "ovs_interfaceid": "98a4c1db-f302-41c5-8f32-8dd7bcfddf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.779 222021 DEBUG nova.network.os_vif_util [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:8a:76,bridge_name='br-int',has_traffic_filtering=True,id=98a4c1db-f302-41c5-8f32-8dd7bcfddf9b,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a4c1db-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.779 222021 DEBUG os_vif [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:8a:76,bridge_name='br-int',has_traffic_filtering=True,id=98a4c1db-f302-41c5-8f32-8dd7bcfddf9b,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a4c1db-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.782 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[35788566-293a-468f-be9c-9215ab121b5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.783 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.783 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f0cbd22b-3f28-434d-94b0-d54a162d292b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.784 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98a4c1db-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.786 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.788 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593233 nova_compute[222017]: 2026-01-23 10:03:04.792 222021 INFO os_vif [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:8a:76,bridge_name='br-int',has_traffic_filtering=True,id=98a4c1db-f302-41c5-8f32-8dd7bcfddf9b,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a4c1db-f3')#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.801 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bc17a785-ee9c-4bc0-86d5-4ed2266e6cb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 644073, 'reachable_time': 22563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264405, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593233 systemd[1]: run-netns-ovnmeta\x2d969bd83a\x2d7542\x2d46e3\x2d90f0\x2d1a81f26ba6b8.mount: Deactivated successfully.
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.805 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:03:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:04.807 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[06c01eca-5f78-4009-95d7-c252e11b4515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:05 np0005593233 nova_compute[222017]: 2026-01-23 10:03:05.254 222021 INFO nova.virt.libvirt.driver [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Deleting instance files /var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_del#033[00m
Jan 23 05:03:05 np0005593233 nova_compute[222017]: 2026-01-23 10:03:05.255 222021 INFO nova.virt.libvirt.driver [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Deletion of /var/lib/nova/instances/0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc_del complete#033[00m
Jan 23 05:03:05 np0005593233 nova_compute[222017]: 2026-01-23 10:03:05.350 222021 INFO nova.compute.manager [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Took 1.11 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:03:05 np0005593233 nova_compute[222017]: 2026-01-23 10:03:05.350 222021 DEBUG oslo.service.loopingcall [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:03:05 np0005593233 nova_compute[222017]: 2026-01-23 10:03:05.351 222021 DEBUG nova.compute.manager [-] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:03:05 np0005593233 nova_compute[222017]: 2026-01-23 10:03:05.351 222021 DEBUG nova.network.neutron [-] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:03:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:05.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:05.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:05 np0005593233 nova_compute[222017]: 2026-01-23 10:03:05.750 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #100. Immutable memtables: 0.
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:05.769376) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 61] Flushing memtable with next log file: 100
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162585769498, "job": 61, "event": "flush_started", "num_memtables": 1, "num_entries": 2357, "num_deletes": 251, "total_data_size": 5624491, "memory_usage": 5705936, "flush_reason": "Manual Compaction"}
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 61] Level-0 flush table #101: started
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162585906382, "cf_name": "default", "job": 61, "event": "table_file_creation", "file_number": 101, "file_size": 3639383, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49272, "largest_seqno": 51624, "table_properties": {"data_size": 3629919, "index_size": 5894, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20166, "raw_average_key_size": 20, "raw_value_size": 3610814, "raw_average_value_size": 3665, "num_data_blocks": 257, "num_entries": 985, "num_filter_entries": 985, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162390, "oldest_key_time": 1769162390, "file_creation_time": 1769162585, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 101, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 61] Flush lasted 137049 microseconds, and 16900 cpu microseconds.
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:05.906439) [db/flush_job.cc:967] [default] [JOB 61] Level-0 flush table #101: 3639383 bytes OK
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:05.906463) [db/memtable_list.cc:519] [default] Level-0 commit table #101 started
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:05.993058) [db/memtable_list.cc:722] [default] Level-0 commit table #101: memtable #1 done
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:05.993114) EVENT_LOG_v1 {"time_micros": 1769162585993100, "job": 61, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:05.993176) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 61] Try to delete WAL files size 5613885, prev total WAL file size 5613885, number of live WAL files 2.
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000097.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:05.996763) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034323637' seq:72057594037927935, type:22 .. '7061786F730034353139' seq:0, type:0; will stop at (end)
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 62] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 61 Base level 0, inputs: [101(3554KB)], [99(8925KB)]
Jan 23 05:03:05 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162585996812, "job": 62, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [101], "files_L6": [99], "score": -1, "input_data_size": 12779080, "oldest_snapshot_seqno": -1}
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 62] Generated table #102: 7436 keys, 10854035 bytes, temperature: kUnknown
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162586188126, "cf_name": "default", "job": 62, "event": "table_file_creation", "file_number": 102, "file_size": 10854035, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10805330, "index_size": 29009, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18629, "raw_key_size": 191941, "raw_average_key_size": 25, "raw_value_size": 10673462, "raw_average_value_size": 1435, "num_data_blocks": 1144, "num_entries": 7436, "num_filter_entries": 7436, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769162585, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 102, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:06.191768) [db/compaction/compaction_job.cc:1663] [default] [JOB 62] Compacted 1@0 + 1@6 files to L6 => 10854035 bytes
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:06.194097) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 66.8 rd, 56.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.7 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(6.5) write-amplify(3.0) OK, records in: 7955, records dropped: 519 output_compression: NoCompression
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:06.194144) EVENT_LOG_v1 {"time_micros": 1769162586194127, "job": 62, "event": "compaction_finished", "compaction_time_micros": 191444, "compaction_time_cpu_micros": 35561, "output_level": 6, "num_output_files": 1, "total_output_size": 10854035, "num_input_records": 7955, "num_output_records": 7436, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000101.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162586195887, "job": 62, "event": "table_file_deletion", "file_number": 101}
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000099.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162586200016, "job": 62, "event": "table_file_deletion", "file_number": 99}
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:05.996536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:06.200205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:06.200213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:06.200217) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:06.200222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:03:06.200226) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:06 np0005593233 nova_compute[222017]: 2026-01-23 10:03:06.591 222021 DEBUG nova.compute.manager [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Received event network-vif-unplugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:06 np0005593233 nova_compute[222017]: 2026-01-23 10:03:06.592 222021 DEBUG oslo_concurrency.lockutils [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:06 np0005593233 nova_compute[222017]: 2026-01-23 10:03:06.592 222021 DEBUG oslo_concurrency.lockutils [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:06 np0005593233 nova_compute[222017]: 2026-01-23 10:03:06.592 222021 DEBUG oslo_concurrency.lockutils [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:06 np0005593233 nova_compute[222017]: 2026-01-23 10:03:06.592 222021 DEBUG nova.compute.manager [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] No waiting events found dispatching network-vif-unplugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:06 np0005593233 nova_compute[222017]: 2026-01-23 10:03:06.593 222021 DEBUG nova.compute.manager [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Received event network-vif-unplugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:03:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:06.984 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:06 np0005593233 nova_compute[222017]: 2026-01-23 10:03:06.984 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:06.985 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.018 222021 DEBUG nova.network.neutron [-] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.046 222021 INFO nova.compute.manager [-] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Took 1.69 seconds to deallocate network for instance.#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.107 222021 DEBUG oslo_concurrency.lockutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.108 222021 DEBUG oslo_concurrency.lockutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.163 222021 DEBUG nova.compute.manager [req-6af26d5c-d4da-432f-a608-d940468f30d8 req-33e4f6e3-d745-45b8-a9dc-9fc9f896703f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Received event network-vif-deleted-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.229 222021 DEBUG oslo_concurrency.processutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.339 222021 DEBUG oslo_concurrency.lockutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquiring lock "e7f91a83-ca8d-4833-817c-282f4d8aec99" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.340 222021 DEBUG oslo_concurrency.lockutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.340 222021 DEBUG oslo_concurrency.lockutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquiring lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.340 222021 DEBUG oslo_concurrency.lockutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.341 222021 DEBUG oslo_concurrency.lockutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.343 222021 INFO nova.compute.manager [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Terminating instance#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.344 222021 DEBUG nova.compute.manager [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:03:07 np0005593233 kernel: tap04adf614-da (unregistering): left promiscuous mode
Jan 23 05:03:07 np0005593233 NetworkManager[48871]: <info>  [1769162587.4085] device (tap04adf614-da): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:03:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:07Z|00476|binding|INFO|Releasing lport 04adf614-da27-47e4-b969-26b9c2003d95 from this chassis (sb_readonly=0)
Jan 23 05:03:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:07Z|00477|binding|INFO|Setting lport 04adf614-da27-47e4-b969-26b9c2003d95 down in Southbound
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.417 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:07Z|00478|binding|INFO|Removing iface tap04adf614-da ovn-installed in OVS
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.435 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593233 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000068.scope: Deactivated successfully.
Jan 23 05:03:07 np0005593233 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000068.scope: Consumed 15.920s CPU time.
Jan 23 05:03:07 np0005593233 systemd-machined[190954]: Machine qemu-49-instance-00000068 terminated.
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.556 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:fb:b5 10.100.0.14'], port_security=['fa:16:3e:9e:fb:b5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e7f91a83-ca8d-4833-817c-282f4d8aec99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d65fb2c-55c9-4b50-aff7-9502add4a8c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '449f402258804f41b10f91a13da1176d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '07d82268-5230-4e65-b516-4629d542718c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c66279a-df2c-4cb2-9aaa-3c8f92544e07, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=04adf614-da27-47e4-b969-26b9c2003d95) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.558 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 04adf614-da27-47e4-b969-26b9c2003d95 in datapath 5d65fb2c-55c9-4b50-aff7-9502add4a8c8 unbound from our chassis#033[00m
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.559 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d65fb2c-55c9-4b50-aff7-9502add4a8c8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.560 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a844298a-3ec4-4e5b-8511-191c5b256cb2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.561 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8 namespace which is not needed anymore#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.575 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.583 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.591 222021 INFO nova.virt.libvirt.driver [-] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Instance destroyed successfully.#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.593 222021 DEBUG nova.objects.instance [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lazy-loading 'resources' on Instance uuid e7f91a83-ca8d-4833-817c-282f4d8aec99 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:07.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:03:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:07.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.741 222021 DEBUG nova.virt.libvirt.vif [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:02:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1630527270',display_name='tempest-ListServersNegativeTestJSON-server-1630527270-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1630527270-2',id=104,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-23T10:02:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='449f402258804f41b10f91a13da1176d',ramdisk_id='',reservation_id='r-2yft4oee',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-2075106169',owner_user_name='tempest-ListServersNegativeTestJSON-2075106169-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:02:48Z,user_data=None,user_id='9127d08a3bf5404e8cb8c84ed7152834',uuid=e7f91a83-ca8d-4833-817c-282f4d8aec99,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.742 222021 DEBUG nova.network.os_vif_util [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Converting VIF {"id": "04adf614-da27-47e4-b969-26b9c2003d95", "address": "fa:16:3e:9e:fb:b5", "network": {"id": "5d65fb2c-55c9-4b50-aff7-9502add4a8c8", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1244961874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "449f402258804f41b10f91a13da1176d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04adf614-da", "ovs_interfaceid": "04adf614-da27-47e4-b969-26b9c2003d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.743 222021 DEBUG nova.network.os_vif_util [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:fb:b5,bridge_name='br-int',has_traffic_filtering=True,id=04adf614-da27-47e4-b969-26b9c2003d95,network=Network(5d65fb2c-55c9-4b50-aff7-9502add4a8c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04adf614-da') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.744 222021 DEBUG os_vif [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:fb:b5,bridge_name='br-int',has_traffic_filtering=True,id=04adf614-da27-47e4-b969-26b9c2003d95,network=Network(5d65fb2c-55c9-4b50-aff7-9502add4a8c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04adf614-da') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:03:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2263377619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.747 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.748 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04adf614-da, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.749 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.752 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.755 222021 INFO os_vif [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:fb:b5,bridge_name='br-int',has_traffic_filtering=True,id=04adf614-da27-47e4-b969-26b9c2003d95,network=Network(5d65fb2c-55c9-4b50-aff7-9502add4a8c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04adf614-da')#033[00m
Jan 23 05:03:07 np0005593233 neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8[264261]: [NOTICE]   (264266) : haproxy version is 2.8.14-c23fe91
Jan 23 05:03:07 np0005593233 neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8[264261]: [NOTICE]   (264266) : path to executable is /usr/sbin/haproxy
Jan 23 05:03:07 np0005593233 neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8[264261]: [WARNING]  (264266) : Exiting Master process...
Jan 23 05:03:07 np0005593233 neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8[264261]: [WARNING]  (264266) : Exiting Master process...
Jan 23 05:03:07 np0005593233 neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8[264261]: [ALERT]    (264266) : Current worker (264268) exited with code 143 (Terminated)
Jan 23 05:03:07 np0005593233 neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8[264261]: [WARNING]  (264266) : All workers exited. Exiting... (0)
Jan 23 05:03:07 np0005593233 systemd[1]: libpod-049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3.scope: Deactivated successfully.
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.778 222021 DEBUG oslo_concurrency.processutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.549s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:07 np0005593233 podman[264478]: 2026-01-23 10:03:07.786265138 +0000 UTC m=+0.114081630 container died 049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.787 222021 DEBUG nova.compute.provider_tree [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:07 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3-userdata-shm.mount: Deactivated successfully.
Jan 23 05:03:07 np0005593233 systemd[1]: var-lib-containers-storage-overlay-bc0981334f59a34631eedce7995a63344fd1c7fc8d0c386ab1d94fb9d10c8427-merged.mount: Deactivated successfully.
Jan 23 05:03:07 np0005593233 podman[264478]: 2026-01-23 10:03:07.827055798 +0000 UTC m=+0.154872300 container cleanup 049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.832 222021 DEBUG nova.scheduler.client.report [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:07 np0005593233 systemd[1]: libpod-conmon-049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3.scope: Deactivated successfully.
Jan 23 05:03:07 np0005593233 podman[264528]: 2026-01-23 10:03:07.899430072 +0000 UTC m=+0.044771573 container remove 049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.907 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d49809a8-77cc-48b4-a6c9-26d2086ddda0]: (4, ('Fri Jan 23 10:03:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8 (049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3)\n049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3\nFri Jan 23 10:03:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8 (049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3)\n049e7e27f1f0f019ace45efd4512097031e92540199b3328a737c7d764457ae3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.909 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[88aad9f5-95d0-467b-bbcd-1bd2fb198d95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.910 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d65fb2c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593233 kernel: tap5d65fb2c-50: left promiscuous mode
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.963 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.966 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b6da6c-cecc-44db-b259-a68a70092abe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593233 nova_compute[222017]: 2026-01-23 10:03:07.984 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.990 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7a58c4cd-dd32-423a-9c75-0e04f57dd544]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:07.993 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd3cf33-1bee-4f49-ac57-7227ef89f00c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.000 222021 DEBUG oslo_concurrency.lockutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:08.020 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[985bc57c-e8b1-474e-b4f5-b34cd503f29c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647794, 'reachable_time': 44461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264543, 'error': None, 'target': 'ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:08.023 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5d65fb2c-55c9-4b50-aff7-9502add4a8c8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:03:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:08.023 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[43385816-2c2c-4dbc-aaeb-97a7d9a97a66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:08 np0005593233 systemd[1]: run-netns-ovnmeta\x2d5d65fb2c\x2d55c9\x2d4b50\x2daff7\x2d9502add4a8c8.mount: Deactivated successfully.
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.080 222021 INFO nova.scheduler.client.report [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Deleted allocations for instance 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.179 222021 DEBUG oslo_concurrency.lockutils [None req-d4dc55af-fb9f-4873-8b14-6ab9ad07e1f9 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.581 222021 INFO nova.virt.libvirt.driver [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Deleting instance files /var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99_del#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.582 222021 INFO nova.virt.libvirt.driver [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Deletion of /var/lib/nova/instances/e7f91a83-ca8d-4833-817c-282f4d8aec99_del complete#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.692 222021 INFO nova.compute.manager [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Took 1.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.693 222021 DEBUG oslo.service.loopingcall [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.694 222021 DEBUG nova.compute.manager [-] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.694 222021 DEBUG nova.network.neutron [-] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.822 222021 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Received event network-vif-plugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.822 222021 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.822 222021 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.823 222021 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.823 222021 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] No waiting events found dispatching network-vif-plugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:08 np0005593233 nova_compute[222017]: 2026-01-23 10:03:08.823 222021 WARNING nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Received unexpected event network-vif-plugged-98a4c1db-f302-41c5-8f32-8dd7bcfddf9b for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:03:09 np0005593233 nova_compute[222017]: 2026-01-23 10:03:09.519 222021 DEBUG nova.network.neutron [-] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:09 np0005593233 nova_compute[222017]: 2026-01-23 10:03:09.557 222021 INFO nova.compute.manager [-] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Took 0.86 seconds to deallocate network for instance.#033[00m
Jan 23 05:03:09 np0005593233 nova_compute[222017]: 2026-01-23 10:03:09.610 222021 DEBUG oslo_concurrency.lockutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:09 np0005593233 nova_compute[222017]: 2026-01-23 10:03:09.611 222021 DEBUG oslo_concurrency.lockutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:09 np0005593233 nova_compute[222017]: 2026-01-23 10:03:09.660 222021 DEBUG nova.compute.manager [req-5717e6eb-ad41-46f2-bb4e-c0861b176e5d req-67f9ba79-95b6-469e-9a16-aea5610f68e2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Received event network-vif-deleted-04adf614-da27-47e4-b969-26b9c2003d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:09.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:09 np0005593233 nova_compute[222017]: 2026-01-23 10:03:09.695 222021 DEBUG oslo_concurrency.processutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 05:03:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:09.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 05:03:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/548151798' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.166 222021 DEBUG oslo_concurrency.processutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.175 222021 DEBUG nova.compute.provider_tree [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.358 222021 DEBUG nova.scheduler.client.report [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.392 222021 DEBUG oslo_concurrency.lockutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.450 222021 INFO nova.scheduler.client.report [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Deleted allocations for instance e7f91a83-ca8d-4833-817c-282f4d8aec99#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.753 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.882 222021 DEBUG oslo_concurrency.lockutils [None req-6a7e042c-9e64-468e-a212-b6eade1c3338 9127d08a3bf5404e8cb8c84ed7152834 449f402258804f41b10f91a13da1176d - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.955 222021 DEBUG nova.compute.manager [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Received event network-vif-unplugged-04adf614-da27-47e4-b969-26b9c2003d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.956 222021 DEBUG oslo_concurrency.lockutils [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.957 222021 DEBUG oslo_concurrency.lockutils [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.957 222021 DEBUG oslo_concurrency.lockutils [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.957 222021 DEBUG nova.compute.manager [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] No waiting events found dispatching network-vif-unplugged-04adf614-da27-47e4-b969-26b9c2003d95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.957 222021 WARNING nova.compute.manager [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Received unexpected event network-vif-unplugged-04adf614-da27-47e4-b969-26b9c2003d95 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.958 222021 DEBUG nova.compute.manager [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Received event network-vif-plugged-04adf614-da27-47e4-b969-26b9c2003d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.958 222021 DEBUG oslo_concurrency.lockutils [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.958 222021 DEBUG oslo_concurrency.lockutils [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.958 222021 DEBUG oslo_concurrency.lockutils [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e7f91a83-ca8d-4833-817c-282f4d8aec99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.959 222021 DEBUG nova.compute.manager [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] No waiting events found dispatching network-vif-plugged-04adf614-da27-47e4-b969-26b9c2003d95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:10 np0005593233 nova_compute[222017]: 2026-01-23 10:03:10.959 222021 WARNING nova.compute.manager [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Received unexpected event network-vif-plugged-04adf614-da27-47e4-b969-26b9c2003d95 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:03:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:10.988 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:11.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:11.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:12 np0005593233 nova_compute[222017]: 2026-01-23 10:03:12.751 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:13.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:13.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:15 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:15Z|00479|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:03:15 np0005593233 nova_compute[222017]: 2026-01-23 10:03:15.279 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:15.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:15.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:15 np0005593233 nova_compute[222017]: 2026-01-23 10:03:15.755 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:17.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:17.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:17 np0005593233 nova_compute[222017]: 2026-01-23 10:03:17.774 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:19 np0005593233 podman[264567]: 2026-01-23 10:03:19.128960961 +0000 UTC m=+0.137899666 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:03:19 np0005593233 nova_compute[222017]: 2026-01-23 10:03:19.681 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162584.6788123, 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:19 np0005593233 nova_compute[222017]: 2026-01-23 10:03:19.682 222021 INFO nova.compute.manager [-] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:03:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:19.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:19 np0005593233 nova_compute[222017]: 2026-01-23 10:03:19.709 222021 DEBUG nova.compute.manager [None req-1fadfda5-5363-4522-99a4-872dd27a311d - - - - - -] [instance: 0d5e75b4-f0ae-48f0-8c0b-207c58f6c2dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:19.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:20 np0005593233 nova_compute[222017]: 2026-01-23 10:03:20.757 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:21.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:21.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:22 np0005593233 nova_compute[222017]: 2026-01-23 10:03:22.591 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162587.5901895, e7f91a83-ca8d-4833-817c-282f4d8aec99 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:22 np0005593233 nova_compute[222017]: 2026-01-23 10:03:22.592 222021 INFO nova.compute.manager [-] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:03:22 np0005593233 nova_compute[222017]: 2026-01-23 10:03:22.621 222021 DEBUG nova.compute.manager [None req-635ae37c-0f86-47fb-a61f-7bddaed32d9f - - - - - -] [instance: e7f91a83-ca8d-4833-817c-282f4d8aec99] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:22 np0005593233 nova_compute[222017]: 2026-01-23 10:03:22.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.004000112s ======
Jan 23 05:03:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:23.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000112s
Jan 23 05:03:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:23.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:25.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:25 np0005593233 nova_compute[222017]: 2026-01-23 10:03:25.760 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:25.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:27.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:27.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:27 np0005593233 nova_compute[222017]: 2026-01-23 10:03:27.810 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:28 np0005593233 nova_compute[222017]: 2026-01-23 10:03:28.223 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Acquiring lock "a145fd79-f99f-49e9-9e3e-bb7605cb8e1e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:28 np0005593233 nova_compute[222017]: 2026-01-23 10:03:28.224 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "a145fd79-f99f-49e9-9e3e-bb7605cb8e1e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:28 np0005593233 nova_compute[222017]: 2026-01-23 10:03:28.247 222021 DEBUG nova.compute.manager [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:03:28 np0005593233 nova_compute[222017]: 2026-01-23 10:03:28.354 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:28 np0005593233 nova_compute[222017]: 2026-01-23 10:03:28.355 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:28 np0005593233 nova_compute[222017]: 2026-01-23 10:03:28.363 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:03:28 np0005593233 nova_compute[222017]: 2026-01-23 10:03:28.364 222021 INFO nova.compute.claims [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:03:28 np0005593233 nova_compute[222017]: 2026-01-23 10:03:28.437 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:28 np0005593233 nova_compute[222017]: 2026-01-23 10:03:28.777 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/829345005' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.259 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.267 222021 DEBUG nova.compute.provider_tree [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.287 222021 DEBUG nova.scheduler.client.report [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.323 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.324 222021 DEBUG nova.compute.manager [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.406 222021 DEBUG nova.compute.manager [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.437 222021 INFO nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.475 222021 DEBUG nova.compute.manager [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.624 222021 DEBUG nova.compute.manager [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.625 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.626 222021 INFO nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Creating image(s)#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.671 222021 DEBUG nova.storage.rbd_utils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] rbd image a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.705 222021 DEBUG nova.storage.rbd_utils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] rbd image a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:29.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.741 222021 DEBUG nova.storage.rbd_utils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] rbd image a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.747 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:29.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.829 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.831 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.832 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.832 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.862 222021 DEBUG nova.storage.rbd_utils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] rbd image a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:29 np0005593233 nova_compute[222017]: 2026-01-23 10:03:29.867 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.174 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.307s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.270 222021 DEBUG nova.storage.rbd_utils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] resizing rbd image a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.390 222021 DEBUG nova.objects.instance [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lazy-loading 'migration_context' on Instance uuid a145fd79-f99f-49e9-9e3e-bb7605cb8e1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.410 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.410 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Ensure instance console log exists: /var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.411 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.412 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.412 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.413 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.419 222021 WARNING nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.424 222021 DEBUG nova.virt.libvirt.host [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.424 222021 DEBUG nova.virt.libvirt.host [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.428 222021 DEBUG nova.virt.libvirt.host [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.429 222021 DEBUG nova.virt.libvirt.host [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.430 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.431 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.431 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.431 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.432 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.432 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.432 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.432 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.433 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.433 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.433 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.433 222021 DEBUG nova.virt.hardware [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.437 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.761 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:03:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/511920656' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.941 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.970 222021 DEBUG nova.storage.rbd_utils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] rbd image a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:30 np0005593233 nova_compute[222017]: 2026-01-23 10:03:30.976 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:31 np0005593233 podman[264955]: 2026-01-23 10:03:31.297495679 +0000 UTC m=+0.087833406 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 05:03:31 np0005593233 nova_compute[222017]: 2026-01-23 10:03:31.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:03:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1280766226' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:03:31 np0005593233 nova_compute[222017]: 2026-01-23 10:03:31.490 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:31 np0005593233 nova_compute[222017]: 2026-01-23 10:03:31.494 222021 DEBUG nova.objects.instance [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lazy-loading 'pci_devices' on Instance uuid a145fd79-f99f-49e9-9e3e-bb7605cb8e1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:31 np0005593233 nova_compute[222017]: 2026-01-23 10:03:31.524 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <uuid>a145fd79-f99f-49e9-9e3e-bb7605cb8e1e</uuid>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <name>instance-0000006a</name>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersAaction247Test-server-224488971</nova:name>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:03:30</nova:creationTime>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <nova:user uuid="175c8371597942d18b2ec02e7010ddbf">tempest-ServersAaction247Test-1955277360-project-member</nova:user>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <nova:project uuid="fc506511c3ec4a89b6bdb7d64fa0df08">tempest-ServersAaction247Test-1955277360</nova:project>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <entry name="serial">a145fd79-f99f-49e9-9e3e-bb7605cb8e1e</entry>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <entry name="uuid">a145fd79-f99f-49e9-9e3e-bb7605cb8e1e</entry>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk.config">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e/console.log" append="off"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:03:31 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:03:31 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:03:31 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:03:31 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:03:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:31.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:31 np0005593233 nova_compute[222017]: 2026-01-23 10:03:31.713 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:31 np0005593233 nova_compute[222017]: 2026-01-23 10:03:31.713 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:31 np0005593233 nova_compute[222017]: 2026-01-23 10:03:31.714 222021 INFO nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Using config drive#033[00m
Jan 23 05:03:31 np0005593233 nova_compute[222017]: 2026-01-23 10:03:31.748 222021 DEBUG nova.storage.rbd_utils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] rbd image a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:31.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.001 222021 INFO nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Creating config drive at /var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e/disk.config#033[00m
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.007 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp14ftsl_6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.152 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp14ftsl_6" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.188 222021 DEBUG nova.storage.rbd_utils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] rbd image a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.193 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e/disk.config a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.373 222021 DEBUG oslo_concurrency.processutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e/disk.config a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.375 222021 INFO nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Deleting local config drive /var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e/disk.config because it was imported into RBD.#033[00m
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:32 np0005593233 systemd-machined[190954]: New machine qemu-50-instance-0000006a.
Jan 23 05:03:32 np0005593233 systemd[1]: Started Virtual Machine qemu-50-instance-0000006a.
Jan 23 05:03:32 np0005593233 nova_compute[222017]: 2026-01-23 10:03:32.814 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e263 e263: 3 total, 3 up, 3 in
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.210 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162613.2098696, a145fd79-f99f-49e9-9e3e-bb7605cb8e1e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.210 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.213 222021 DEBUG nova.compute.manager [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.213 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.217 222021 INFO nova.virt.libvirt.driver [-] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Instance spawned successfully.#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.217 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.245 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.251 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.252 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.252 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.253 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.253 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.254 222021 DEBUG nova.virt.libvirt.driver [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.258 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.291 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.292 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162613.2129686, a145fd79-f99f-49e9-9e3e-bb7605cb8e1e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.292 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] VM Started (Lifecycle Event)#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.328 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.333 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.357 222021 INFO nova.compute.manager [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Took 3.73 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.357 222021 DEBUG nova.compute.manager [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.396 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.401 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.478 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.478 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.479 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.479 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.479 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.536 222021 INFO nova.compute.manager [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Took 5.21 seconds to build instance.#033[00m
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.580 222021 DEBUG oslo_concurrency.lockutils [None req-51d0a8fa-7973-4234-86e8-fc2ca0140762 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "a145fd79-f99f-49e9-9e3e-bb7605cb8e1e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:33.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:33.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/8282457' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:33 np0005593233 nova_compute[222017]: 2026-01-23 10:03:33.926 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:03:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.026 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.026 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.030 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000006a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.030 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000006a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.224 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.225 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4250MB free_disk=20.896987915039062GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.226 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.226 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.349 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 51a66602-3548-4341-add1-988bd6c7aa57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.349 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance a145fd79-f99f-49e9-9e3e-bb7605cb8e1e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.350 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.350 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.466 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1909147706' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.920 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.926 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:34 np0005593233 nova_compute[222017]: 2026-01-23 10:03:34.985 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:35 np0005593233 nova_compute[222017]: 2026-01-23 10:03:35.036 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:03:35 np0005593233 nova_compute[222017]: 2026-01-23 10:03:35.037 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:35.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:35 np0005593233 nova_compute[222017]: 2026-01-23 10:03:35.764 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:35.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.020 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.021 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.021 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:03:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.235 222021 DEBUG nova.compute.manager [None req-53e5b29c-1c73-4f3b-89ad-7fea6bad0774 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.288 222021 INFO nova.compute.manager [None req-53e5b29c-1c73-4f3b-89ad-7fea6bad0774 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] instance snapshotting#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.289 222021 DEBUG nova.objects.instance [None req-53e5b29c-1c73-4f3b-89ad-7fea6bad0774 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lazy-loading 'flavor' on Instance uuid a145fd79-f99f-49e9-9e3e-bb7605cb8e1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.518 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Acquiring lock "a145fd79-f99f-49e9-9e3e-bb7605cb8e1e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.519 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "a145fd79-f99f-49e9-9e3e-bb7605cb8e1e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.519 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Acquiring lock "a145fd79-f99f-49e9-9e3e-bb7605cb8e1e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.520 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "a145fd79-f99f-49e9-9e3e-bb7605cb8e1e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.520 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "a145fd79-f99f-49e9-9e3e-bb7605cb8e1e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.521 222021 INFO nova.compute.manager [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Terminating instance#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.522 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Acquiring lock "refresh_cache-a145fd79-f99f-49e9-9e3e-bb7605cb8e1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.522 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Acquired lock "refresh_cache-a145fd79-f99f-49e9-9e3e-bb7605cb8e1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.523 222021 DEBUG nova.network.neutron [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.683 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.853 222021 INFO nova.virt.libvirt.driver [None req-53e5b29c-1c73-4f3b-89ad-7fea6bad0774 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Beginning live snapshot process#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.903 222021 DEBUG nova.compute.manager [None req-53e5b29c-1c73-4f3b-89ad-7fea6bad0774 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390#033[00m
Jan 23 05:03:36 np0005593233 nova_compute[222017]: 2026-01-23 10:03:36.922 222021 DEBUG nova.network.neutron [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:03:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:37.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:37 np0005593233 nova_compute[222017]: 2026-01-23 10:03:37.732 222021 DEBUG nova.network.neutron [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:37 np0005593233 nova_compute[222017]: 2026-01-23 10:03:37.760 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Releasing lock "refresh_cache-a145fd79-f99f-49e9-9e3e-bb7605cb8e1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:03:37 np0005593233 nova_compute[222017]: 2026-01-23 10:03:37.761 222021 DEBUG nova.compute.manager [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:03:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:37.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:37 np0005593233 nova_compute[222017]: 2026-01-23 10:03:37.818 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:37 np0005593233 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Jan 23 05:03:37 np0005593233 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006a.scope: Consumed 5.383s CPU time.
Jan 23 05:03:37 np0005593233 systemd-machined[190954]: Machine qemu-50-instance-0000006a terminated.
Jan 23 05:03:37 np0005593233 nova_compute[222017]: 2026-01-23 10:03:37.848 222021 DEBUG nova.compute.manager [None req-53e5b29c-1c73-4f3b-89ad-7fea6bad0774 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Jan 23 05:03:37 np0005593233 nova_compute[222017]: 2026-01-23 10:03:37.984 222021 INFO nova.virt.libvirt.driver [-] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Instance destroyed successfully.#033[00m
Jan 23 05:03:37 np0005593233 nova_compute[222017]: 2026-01-23 10:03:37.985 222021 DEBUG nova.objects.instance [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lazy-loading 'resources' on Instance uuid a145fd79-f99f-49e9-9e3e-bb7605cb8e1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.443 222021 INFO nova.virt.libvirt.driver [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Deleting instance files /var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_del#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.444 222021 INFO nova.virt.libvirt.driver [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Deletion of /var/lib/nova/instances/a145fd79-f99f-49e9-9e3e-bb7605cb8e1e_del complete#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.518 222021 INFO nova.compute.manager [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.519 222021 DEBUG oslo.service.loopingcall [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.519 222021 DEBUG nova.compute.manager [-] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.519 222021 DEBUG nova.network.neutron [-] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.753 222021 DEBUG nova.network.neutron [-] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.770 222021 DEBUG nova.network.neutron [-] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.789 222021 INFO nova.compute.manager [-] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Took 0.27 seconds to deallocate network for instance.#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.858 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.858 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:38 np0005593233 nova_compute[222017]: 2026-01-23 10:03:38.946 222021 DEBUG oslo_concurrency.processutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:39 np0005593233 nova_compute[222017]: 2026-01-23 10:03:39.402 222021 DEBUG oslo_concurrency.processutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:39 np0005593233 nova_compute[222017]: 2026-01-23 10:03:39.410 222021 DEBUG nova.compute.provider_tree [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:39 np0005593233 nova_compute[222017]: 2026-01-23 10:03:39.435 222021 DEBUG nova.scheduler.client.report [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:39 np0005593233 nova_compute[222017]: 2026-01-23 10:03:39.465 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:39 np0005593233 nova_compute[222017]: 2026-01-23 10:03:39.509 222021 INFO nova.scheduler.client.report [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Deleted allocations for instance a145fd79-f99f-49e9-9e3e-bb7605cb8e1e#033[00m
Jan 23 05:03:39 np0005593233 nova_compute[222017]: 2026-01-23 10:03:39.592 222021 DEBUG oslo_concurrency.lockutils [None req-41278d98-0528-4c80-826d-9bb18c258d4c 175c8371597942d18b2ec02e7010ddbf fc506511c3ec4a89b6bdb7d64fa0df08 - - default default] Lock "a145fd79-f99f-49e9-9e3e-bb7605cb8e1e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:39.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:39.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:40 np0005593233 nova_compute[222017]: 2026-01-23 10:03:40.818 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e264 e264: 3 total, 3 up, 3 in
Jan 23 05:03:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:41.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:41.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:42 np0005593233 nova_compute[222017]: 2026-01-23 10:03:42.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:42 np0005593233 nova_compute[222017]: 2026-01-23 10:03:42.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:03:42 np0005593233 nova_compute[222017]: 2026-01-23 10:03:42.427 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:03:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:42.666 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:42.666 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:42.667 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:42 np0005593233 nova_compute[222017]: 2026-01-23 10:03:42.820 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:43 np0005593233 nova_compute[222017]: 2026-01-23 10:03:43.561 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:43.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:43.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:44 np0005593233 nova_compute[222017]: 2026-01-23 10:03:44.421 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:03:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3452985035' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:03:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:03:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3452985035' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:03:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:45.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:45.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:45 np0005593233 nova_compute[222017]: 2026-01-23 10:03:45.821 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:47.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:47.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:47 np0005593233 nova_compute[222017]: 2026-01-23 10:03:47.822 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:48 np0005593233 nova_compute[222017]: 2026-01-23 10:03:48.565 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquiring lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:48 np0005593233 nova_compute[222017]: 2026-01-23 10:03:48.566 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:48 np0005593233 nova_compute[222017]: 2026-01-23 10:03:48.589 222021 DEBUG nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:03:48 np0005593233 nova_compute[222017]: 2026-01-23 10:03:48.681 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:48 np0005593233 nova_compute[222017]: 2026-01-23 10:03:48.682 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:48 np0005593233 nova_compute[222017]: 2026-01-23 10:03:48.689 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:03:48 np0005593233 nova_compute[222017]: 2026-01-23 10:03:48.689 222021 INFO nova.compute.claims [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:03:48 np0005593233 nova_compute[222017]: 2026-01-23 10:03:48.855 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/297843232' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.350 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.359 222021 DEBUG nova.compute.provider_tree [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.383 222021 DEBUG nova.scheduler.client.report [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.407 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.408 222021 DEBUG nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.467 222021 DEBUG nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.468 222021 DEBUG nova.network.neutron [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.494 222021 INFO nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.522 222021 DEBUG nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.654 222021 DEBUG nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.656 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.656 222021 INFO nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Creating image(s)#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.688 222021 DEBUG nova.storage.rbd_utils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] rbd image c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.720 222021 DEBUG nova.storage.rbd_utils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] rbd image c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:49.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.756 222021 DEBUG nova.storage.rbd_utils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] rbd image c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.762 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:49.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.836 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.837 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.837 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.838 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.876 222021 DEBUG nova.storage.rbd_utils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] rbd image c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:49 np0005593233 nova_compute[222017]: 2026-01-23 10:03:49.881 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:50 np0005593233 podman[265483]: 2026-01-23 10:03:50.100264315 +0000 UTC m=+0.103393432 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.230 222021 DEBUG nova.policy [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fdaabb8d5b2d4e7caa209f6918f24078', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b6678355ecf9441c916291536c42c2cd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.378 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.468 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.756 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.875s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.837 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.845 222021 DEBUG nova.storage.rbd_utils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] resizing rbd image c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.956 222021 DEBUG nova.objects.instance [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lazy-loading 'migration_context' on Instance uuid c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.975 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.975 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Ensure instance console log exists: /var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.976 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.976 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:50 np0005593233 nova_compute[222017]: 2026-01-23 10:03:50.977 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 e265: 3 total, 3 up, 3 in
Jan 23 05:03:51 np0005593233 nova_compute[222017]: 2026-01-23 10:03:51.692 222021 DEBUG nova.network.neutron [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Successfully created port: cd413a72-0d0a-4fd3-8424-f49e75b1bfce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:03:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:51.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:51.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.594 222021 DEBUG nova.network.neutron [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Successfully updated port: cd413a72-0d0a-4fd3-8424-f49e75b1bfce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.612 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquiring lock "refresh_cache-c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.613 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquired lock "refresh_cache-c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.613 222021 DEBUG nova.network.neutron [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.779 222021 DEBUG nova.compute.manager [req-323c64cb-77a5-4eb5-aec5-7120ee89d178 req-6d6a7575-1bee-47ee-863e-5d6a06363c10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Received event network-changed-cd413a72-0d0a-4fd3-8424-f49e75b1bfce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.780 222021 DEBUG nova.compute.manager [req-323c64cb-77a5-4eb5-aec5-7120ee89d178 req-6d6a7575-1bee-47ee-863e-5d6a06363c10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Refreshing instance network info cache due to event network-changed-cd413a72-0d0a-4fd3-8424-f49e75b1bfce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.780 222021 DEBUG oslo_concurrency.lockutils [req-323c64cb-77a5-4eb5-aec5-7120ee89d178 req-6d6a7575-1bee-47ee-863e-5d6a06363c10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.813 222021 DEBUG nova.network.neutron [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.861 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.986 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162617.9824667, a145fd79-f99f-49e9-9e3e-bb7605cb8e1e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:52 np0005593233 nova_compute[222017]: 2026-01-23 10:03:52.986 222021 INFO nova.compute.manager [-] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:03:53 np0005593233 nova_compute[222017]: 2026-01-23 10:03:53.012 222021 DEBUG nova.compute.manager [None req-bde6490c-0f34-4e02-92d1-7aabd45ca0a9 - - - - - -] [instance: a145fd79-f99f-49e9-9e3e-bb7605cb8e1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:53.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:53.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.731 222021 DEBUG nova.network.neutron [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Updating instance_info_cache with network_info: [{"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.776 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Releasing lock "refresh_cache-c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.777 222021 DEBUG nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Instance network_info: |[{"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.778 222021 DEBUG oslo_concurrency.lockutils [req-323c64cb-77a5-4eb5-aec5-7120ee89d178 req-6d6a7575-1bee-47ee-863e-5d6a06363c10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.778 222021 DEBUG nova.network.neutron [req-323c64cb-77a5-4eb5-aec5-7120ee89d178 req-6d6a7575-1bee-47ee-863e-5d6a06363c10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Refreshing network info cache for port cd413a72-0d0a-4fd3-8424-f49e75b1bfce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.783 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Start _get_guest_xml network_info=[{"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.789 222021 WARNING nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.795 222021 DEBUG nova.virt.libvirt.host [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.797 222021 DEBUG nova.virt.libvirt.host [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.806 222021 DEBUG nova.virt.libvirt.host [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.807 222021 DEBUG nova.virt.libvirt.host [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.809 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.809 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.810 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.810 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.810 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.810 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.811 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.811 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.811 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.811 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.812 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.812 222021 DEBUG nova.virt.hardware [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:03:54 np0005593233 nova_compute[222017]: 2026-01-23 10:03:54.816 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.020 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:03:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/57506671' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.301 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.336 222021 DEBUG nova.storage.rbd_utils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] rbd image c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.341 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:55.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:03:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1193405051' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:03:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:55.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.827 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.834 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.836 222021 DEBUG nova.virt.libvirt.vif [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:03:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-597489643',display_name='tempest-NoVNCConsoleTestJSON-server-597489643',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-597489643',id=108,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b6678355ecf9441c916291536c42c2cd',ramdisk_id='',reservation_id='r-684aaivj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-245448120',owner_user_name='tempest-NoVNCConsoleTestJSON-24544
8120-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:03:49Z,user_data=None,user_id='fdaabb8d5b2d4e7caa209f6918f24078',uuid=c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.836 222021 DEBUG nova.network.os_vif_util [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Converting VIF {"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.838 222021 DEBUG nova.network.os_vif_util [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:3e:49,bridge_name='br-int',has_traffic_filtering=True,id=cd413a72-0d0a-4fd3-8424-f49e75b1bfce,network=Network(c6197e04-f0af-41f0-9896-46123565af56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd413a72-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.839 222021 DEBUG nova.objects.instance [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lazy-loading 'pci_devices' on Instance uuid c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.883 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <uuid>c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0</uuid>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <name>instance-0000006c</name>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <nova:name>tempest-NoVNCConsoleTestJSON-server-597489643</nova:name>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:03:54</nova:creationTime>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <nova:user uuid="fdaabb8d5b2d4e7caa209f6918f24078">tempest-NoVNCConsoleTestJSON-245448120-project-member</nova:user>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <nova:project uuid="b6678355ecf9441c916291536c42c2cd">tempest-NoVNCConsoleTestJSON-245448120</nova:project>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <nova:port uuid="cd413a72-0d0a-4fd3-8424-f49e75b1bfce">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <entry name="serial">c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0</entry>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <entry name="uuid">c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0</entry>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk.config">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:a3:3e:49"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <target dev="tapcd413a72-0d"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0/console.log" append="off"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:03:55 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:03:55 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:03:55 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:03:55 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.885 222021 DEBUG nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Preparing to wait for external event network-vif-plugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.887 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquiring lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.887 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.888 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.890 222021 DEBUG nova.virt.libvirt.vif [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:03:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-597489643',display_name='tempest-NoVNCConsoleTestJSON-server-597489643',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-597489643',id=108,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b6678355ecf9441c916291536c42c2cd',ramdisk_id='',reservation_id='r-684aaivj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-245448120',owner_user_name='tempest-NoVNCConsoleTest
JSON-245448120-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:03:49Z,user_data=None,user_id='fdaabb8d5b2d4e7caa209f6918f24078',uuid=c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.890 222021 DEBUG nova.network.os_vif_util [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Converting VIF {"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.892 222021 DEBUG nova.network.os_vif_util [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:3e:49,bridge_name='br-int',has_traffic_filtering=True,id=cd413a72-0d0a-4fd3-8424-f49e75b1bfce,network=Network(c6197e04-f0af-41f0-9896-46123565af56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd413a72-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.893 222021 DEBUG os_vif [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:3e:49,bridge_name='br-int',has_traffic_filtering=True,id=cd413a72-0d0a-4fd3-8424-f49e75b1bfce,network=Network(c6197e04-f0af-41f0-9896-46123565af56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd413a72-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.894 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.895 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.896 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.900 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.901 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd413a72-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.901 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd413a72-0d, col_values=(('external_ids', {'iface-id': 'cd413a72-0d0a-4fd3-8424-f49e75b1bfce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:3e:49', 'vm-uuid': 'c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.903 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:55 np0005593233 NetworkManager[48871]: <info>  [1769162635.9053] manager: (tapcd413a72-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.907 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.913 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.915 222021 INFO os_vif [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:3e:49,bridge_name='br-int',has_traffic_filtering=True,id=cd413a72-0d0a-4fd3-8424-f49e75b1bfce,network=Network(c6197e04-f0af-41f0-9896-46123565af56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd413a72-0d')#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.997 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.998 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.998 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] No VIF found with MAC fa:16:3e:a3:3e:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:03:55 np0005593233 nova_compute[222017]: 2026-01-23 10:03:55.999 222021 INFO nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Using config drive#033[00m
Jan 23 05:03:56 np0005593233 nova_compute[222017]: 2026-01-23 10:03:56.031 222021 DEBUG nova.storage.rbd_utils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] rbd image c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.216 222021 INFO nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Creating config drive at /var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0/disk.config#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.221 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpli61zxg0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.360 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpli61zxg0" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.395 222021 DEBUG nova.storage.rbd_utils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] rbd image c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.399 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0/disk.config c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.432 222021 DEBUG nova.network.neutron [req-323c64cb-77a5-4eb5-aec5-7120ee89d178 req-6d6a7575-1bee-47ee-863e-5d6a06363c10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Updated VIF entry in instance network info cache for port cd413a72-0d0a-4fd3-8424-f49e75b1bfce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.433 222021 DEBUG nova.network.neutron [req-323c64cb-77a5-4eb5-aec5-7120ee89d178 req-6d6a7575-1bee-47ee-863e-5d6a06363c10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Updating instance_info_cache with network_info: [{"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.472 222021 DEBUG oslo_concurrency.lockutils [req-323c64cb-77a5-4eb5-aec5-7120ee89d178 req-6d6a7575-1bee-47ee-863e-5d6a06363c10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.678 222021 DEBUG oslo_concurrency.processutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0/disk.config c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.279s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.679 222021 INFO nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Deleting local config drive /var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0/disk.config because it was imported into RBD.#033[00m
Jan 23 05:03:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:57.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:57 np0005593233 kernel: tapcd413a72-0d: entered promiscuous mode
Jan 23 05:03:57 np0005593233 NetworkManager[48871]: <info>  [1769162637.7560] manager: (tapcd413a72-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.756 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:57 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:57Z|00480|binding|INFO|Claiming lport cd413a72-0d0a-4fd3-8424-f49e75b1bfce for this chassis.
Jan 23 05:03:57 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:57Z|00481|binding|INFO|cd413a72-0d0a-4fd3-8424-f49e75b1bfce: Claiming fa:16:3e:a3:3e:49 10.100.0.13
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.764 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:3e:49 10.100.0.13'], port_security=['fa:16:3e:a3:3e:49 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6197e04-f0af-41f0-9896-46123565af56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6678355ecf9441c916291536c42c2cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f82f8d41-43d2-4e09-a6b4-ffbbab6023ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34528a28-8ef4-420e-8477-52d28e2c519c, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=cd413a72-0d0a-4fd3-8424-f49e75b1bfce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.766 140224 INFO neutron.agent.ovn.metadata.agent [-] Port cd413a72-0d0a-4fd3-8424-f49e75b1bfce in datapath c6197e04-f0af-41f0-9896-46123565af56 bound to our chassis#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.767 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c6197e04-f0af-41f0-9896-46123565af56#033[00m
Jan 23 05:03:57 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:57Z|00482|binding|INFO|Setting lport cd413a72-0d0a-4fd3-8424-f49e75b1bfce ovn-installed in OVS
Jan 23 05:03:57 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:57Z|00483|binding|INFO|Setting lport cd413a72-0d0a-4fd3-8424-f49e75b1bfce up in Southbound
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.777 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:57 np0005593233 nova_compute[222017]: 2026-01-23 10:03:57.780 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.788 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3e99a8cb-7fac-40d2-89b6-4fd908368bf4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.789 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc6197e04-f1 in ovnmeta-c6197e04-f0af-41f0-9896-46123565af56 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.792 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc6197e04-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.793 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[970cccb1-8d64-497a-b66d-d2918d9aed12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.794 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[92fe7962-996b-4840-aa40-4704214fd257]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:57 np0005593233 systemd-udevd[265722]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:03:57 np0005593233 systemd-machined[190954]: New machine qemu-51-instance-0000006c.
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.810 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[1fbc43f4-aafb-4186-86ca-8b93bb7ecf3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:57 np0005593233 NetworkManager[48871]: <info>  [1769162637.8188] device (tapcd413a72-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:03:57 np0005593233 NetworkManager[48871]: <info>  [1769162637.8197] device (tapcd413a72-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:03:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:57.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:57 np0005593233 systemd[1]: Started Virtual Machine qemu-51-instance-0000006c.
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.842 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[71aa59bf-107c-441e-913f-f87ced0a4b73]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.892 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0af72999-6a85-48a7-ad8d-5033f63eff2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.900 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bd102171-9736-45aa-8ea1-8d98938715c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:57 np0005593233 NetworkManager[48871]: <info>  [1769162637.9482] manager: (tapc6197e04-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/233)
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.987 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[902970e6-9353-4928-aa0f-d64b814b844f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:57.992 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f2cc7ed7-b14f-4f02-97fb-31ea843e5d19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:58 np0005593233 NetworkManager[48871]: <info>  [1769162638.0230] device (tapc6197e04-f0): carrier: link connected
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.030 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7d842240-39d3-449a-875b-70becbb78d9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.052 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1ff12731-8633-4857-87a0-7cfa338c4f3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc6197e04-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:14:2c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655065, 'reachable_time': 15420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265756, 'error': None, 'target': 'ovnmeta-c6197e04-f0af-41f0-9896-46123565af56', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.074 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b9571921-eed3-49eb-972f-abdf82a95a89]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:142c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655065, 'tstamp': 655065}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 265757, 'error': None, 'target': 'ovnmeta-c6197e04-f0af-41f0-9896-46123565af56', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.098 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[edc186ab-c63a-4db6-bc9f-27a65f125bc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc6197e04-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:14:2c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655065, 'reachable_time': 15420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 265758, 'error': None, 'target': 'ovnmeta-c6197e04-f0af-41f0-9896-46123565af56', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.144 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[15cda200-764e-44a7-97ad-27e977d63ed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.229 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca0dbf5-f19d-46fe-9e8b-7c25772a1ba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.232 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6197e04-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.233 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.234 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6197e04-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.237 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:58 np0005593233 kernel: tapc6197e04-f0: entered promiscuous mode
Jan 23 05:03:58 np0005593233 NetworkManager[48871]: <info>  [1769162638.2387] manager: (tapc6197e04-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.240 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.245 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc6197e04-f0, col_values=(('external_ids', {'iface-id': '58ff2900-397f-4c79-903a-e9415ec63a84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.246 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:03:58Z|00484|binding|INFO|Releasing lport 58ff2900-397f-4c79-903a-e9415ec63a84 from this chassis (sb_readonly=0)
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.247 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.248 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c6197e04-f0af-41f0-9896-46123565af56.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c6197e04-f0af-41f0-9896-46123565af56.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.249 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7209d15f-a294-4ff3-9ee9-0ec1eb99c989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.250 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-c6197e04-f0af-41f0-9896-46123565af56
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/c6197e04-f0af-41f0-9896-46123565af56.pid.haproxy
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID c6197e04-f0af-41f0-9896-46123565af56
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:03:58.252 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c6197e04-f0af-41f0-9896-46123565af56', 'env', 'PROCESS_TAG=haproxy-c6197e04-f0af-41f0-9896-46123565af56', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c6197e04-f0af-41f0-9896-46123565af56.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.262 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.412 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.412 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.437 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.587 222021 DEBUG nova.compute.manager [req-4c164082-97c0-4bcc-a643-de386317ae14 req-8b36bd67-49d1-4a01-8078-ad47afeeaf45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Received event network-vif-plugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.587 222021 DEBUG oslo_concurrency.lockutils [req-4c164082-97c0-4bcc-a643-de386317ae14 req-8b36bd67-49d1-4a01-8078-ad47afeeaf45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.588 222021 DEBUG oslo_concurrency.lockutils [req-4c164082-97c0-4bcc-a643-de386317ae14 req-8b36bd67-49d1-4a01-8078-ad47afeeaf45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.588 222021 DEBUG oslo_concurrency.lockutils [req-4c164082-97c0-4bcc-a643-de386317ae14 req-8b36bd67-49d1-4a01-8078-ad47afeeaf45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.589 222021 DEBUG nova.compute.manager [req-4c164082-97c0-4bcc-a643-de386317ae14 req-8b36bd67-49d1-4a01-8078-ad47afeeaf45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Processing event network-vif-plugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:03:58 np0005593233 podman[265790]: 2026-01-23 10:03:58.672361726 +0000 UTC m=+0.057244982 container create b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 05:03:58 np0005593233 systemd[1]: Started libpod-conmon-b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3.scope.
Jan 23 05:03:58 np0005593233 podman[265790]: 2026-01-23 10:03:58.642443959 +0000 UTC m=+0.027327245 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:03:58 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:03:58 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7de47ec55cf6d5836f36c8ced69e4f6a860dc8bb029621fd7683399e70776a22/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.839 222021 DEBUG nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.841 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162638.839157, c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.841 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] VM Started (Lifecycle Event)#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.844 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.848 222021 INFO nova.virt.libvirt.driver [-] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Instance spawned successfully.#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.848 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.875 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.880 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.880 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.881 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.881 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.881 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.882 222021 DEBUG nova.virt.libvirt.driver [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.886 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:58 np0005593233 podman[265790]: 2026-01-23 10:03:58.896965384 +0000 UTC m=+0.281848650 container init b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:03:58 np0005593233 podman[265790]: 2026-01-23 10:03:58.904846065 +0000 UTC m=+0.289729321 container start b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.921 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.921 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162638.841654, c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.921 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:03:58 np0005593233 neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56[265842]: [NOTICE]   (265852) : New worker (265854) forked
Jan 23 05:03:58 np0005593233 neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56[265842]: [NOTICE]   (265852) : Loading success.
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.948 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.950 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162638.8437698, c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.951 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.971 222021 INFO nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Took 9.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.972 222021 DEBUG nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.978 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:58 np0005593233 nova_compute[222017]: 2026-01-23 10:03:58.981 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:59 np0005593233 nova_compute[222017]: 2026-01-23 10:03:59.018 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:03:59 np0005593233 nova_compute[222017]: 2026-01-23 10:03:59.060 222021 INFO nova.compute.manager [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Took 10.40 seconds to build instance.#033[00m
Jan 23 05:03:59 np0005593233 nova_compute[222017]: 2026-01-23 10:03:59.083 222021 DEBUG oslo_concurrency.lockutils [None req-e2aac935-80d7-499d-954d-4bf5d3588b77 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:03:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:59.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:03:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:03:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:59.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:00 np0005593233 nova_compute[222017]: 2026-01-23 10:04:00.738 222021 DEBUG nova.compute.manager [req-6627982f-4fde-42ff-aa4c-0ac8e5a8c1f3 req-d79aeeac-62ed-49dd-8f42-b1c646f20732 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Received event network-vif-plugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:00 np0005593233 nova_compute[222017]: 2026-01-23 10:04:00.739 222021 DEBUG oslo_concurrency.lockutils [req-6627982f-4fde-42ff-aa4c-0ac8e5a8c1f3 req-d79aeeac-62ed-49dd-8f42-b1c646f20732 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:00 np0005593233 nova_compute[222017]: 2026-01-23 10:04:00.739 222021 DEBUG oslo_concurrency.lockutils [req-6627982f-4fde-42ff-aa4c-0ac8e5a8c1f3 req-d79aeeac-62ed-49dd-8f42-b1c646f20732 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:00 np0005593233 nova_compute[222017]: 2026-01-23 10:04:00.739 222021 DEBUG oslo_concurrency.lockutils [req-6627982f-4fde-42ff-aa4c-0ac8e5a8c1f3 req-d79aeeac-62ed-49dd-8f42-b1c646f20732 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:00 np0005593233 nova_compute[222017]: 2026-01-23 10:04:00.739 222021 DEBUG nova.compute.manager [req-6627982f-4fde-42ff-aa4c-0ac8e5a8c1f3 req-d79aeeac-62ed-49dd-8f42-b1c646f20732 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] No waiting events found dispatching network-vif-plugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:00 np0005593233 nova_compute[222017]: 2026-01-23 10:04:00.739 222021 WARNING nova.compute.manager [req-6627982f-4fde-42ff-aa4c-0ac8e5a8c1f3 req-d79aeeac-62ed-49dd-8f42-b1c646f20732 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Received unexpected event network-vif-plugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce for instance with vm_state active and task_state None.#033[00m
Jan 23 05:04:00 np0005593233 nova_compute[222017]: 2026-01-23 10:04:00.741 222021 DEBUG nova.compute.manager [None req-71e2f84d-be04-40b3-8525-2d20793b084c fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Jan 23 05:04:00 np0005593233 nova_compute[222017]: 2026-01-23 10:04:00.827 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:00 np0005593233 nova_compute[222017]: 2026-01-23 10:04:00.904 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:01 np0005593233 nova_compute[222017]: 2026-01-23 10:04:01.568 222021 DEBUG nova.compute.manager [None req-ef88a2c8-95a0-4d69-b6b0-4aa36175ac38 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Jan 23 05:04:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:01.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:01.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.046 222021 DEBUG oslo_concurrency.lockutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquiring lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.048 222021 DEBUG oslo_concurrency.lockutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.048 222021 DEBUG oslo_concurrency.lockutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquiring lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.048 222021 DEBUG oslo_concurrency.lockutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.048 222021 DEBUG oslo_concurrency.lockutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.050 222021 INFO nova.compute.manager [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Terminating instance#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.051 222021 DEBUG nova.compute.manager [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:04:02 np0005593233 podman[265863]: 2026-01-23 10:04:02.059638816 +0000 UTC m=+0.063772383 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:04:02 np0005593233 kernel: tapcd413a72-0d (unregistering): left promiscuous mode
Jan 23 05:04:02 np0005593233 NetworkManager[48871]: <info>  [1769162642.0976] device (tapcd413a72-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:04:02 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:02Z|00485|binding|INFO|Releasing lport cd413a72-0d0a-4fd3-8424-f49e75b1bfce from this chassis (sb_readonly=0)
Jan 23 05:04:02 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:02Z|00486|binding|INFO|Setting lport cd413a72-0d0a-4fd3-8424-f49e75b1bfce down in Southbound
Jan 23 05:04:02 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:02Z|00487|binding|INFO|Removing iface tapcd413a72-0d ovn-installed in OVS
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.108 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.111 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:02.117 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:3e:49 10.100.0.13'], port_security=['fa:16:3e:a3:3e:49 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6197e04-f0af-41f0-9896-46123565af56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6678355ecf9441c916291536c42c2cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f82f8d41-43d2-4e09-a6b4-ffbbab6023ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34528a28-8ef4-420e-8477-52d28e2c519c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=cd413a72-0d0a-4fd3-8424-f49e75b1bfce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:04:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:02.119 140224 INFO neutron.agent.ovn.metadata.agent [-] Port cd413a72-0d0a-4fd3-8424-f49e75b1bfce in datapath c6197e04-f0af-41f0-9896-46123565af56 unbound from our chassis#033[00m
Jan 23 05:04:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:02.121 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c6197e04-f0af-41f0-9896-46123565af56, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:04:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:02.122 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8e78bca6-8fff-4da2-bfed-dba04794cfbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:02.123 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c6197e04-f0af-41f0-9896-46123565af56 namespace which is not needed anymore#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.129 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:02 np0005593233 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Jan 23 05:04:02 np0005593233 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006c.scope: Consumed 4.325s CPU time.
Jan 23 05:04:02 np0005593233 systemd-machined[190954]: Machine qemu-51-instance-0000006c terminated.
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.294 222021 INFO nova.virt.libvirt.driver [-] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Instance destroyed successfully.#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.295 222021 DEBUG nova.objects.instance [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lazy-loading 'resources' on Instance uuid c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.316 222021 DEBUG nova.virt.libvirt.vif [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:03:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-597489643',display_name='tempest-NoVNCConsoleTestJSON-server-597489643',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-597489643',id=108,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:03:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b6678355ecf9441c916291536c42c2cd',ramdisk_id='',reservation_id='r-684aaivj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NoVNCConsoleTestJSON-245448120',owner_user_name='tempest-NoVNCConsoleTestJSON-245448120-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:03:59Z,user_data=None,user_id='fdaabb8d5b2d4e7caa209f6918f24078',uuid=c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.317 222021 DEBUG nova.network.os_vif_util [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Converting VIF {"id": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "address": "fa:16:3e:a3:3e:49", "network": {"id": "c6197e04-f0af-41f0-9896-46123565af56", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-1354956967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b6678355ecf9441c916291536c42c2cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd413a72-0d", "ovs_interfaceid": "cd413a72-0d0a-4fd3-8424-f49e75b1bfce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.318 222021 DEBUG nova.network.os_vif_util [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:3e:49,bridge_name='br-int',has_traffic_filtering=True,id=cd413a72-0d0a-4fd3-8424-f49e75b1bfce,network=Network(c6197e04-f0af-41f0-9896-46123565af56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd413a72-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.318 222021 DEBUG os_vif [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:3e:49,bridge_name='br-int',has_traffic_filtering=True,id=cd413a72-0d0a-4fd3-8424-f49e75b1bfce,network=Network(c6197e04-f0af-41f0-9896-46123565af56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd413a72-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.321 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.321 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd413a72-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.323 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.325 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.328 222021 INFO os_vif [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:3e:49,bridge_name='br-int',has_traffic_filtering=True,id=cd413a72-0d0a-4fd3-8424-f49e75b1bfce,network=Network(c6197e04-f0af-41f0-9896-46123565af56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd413a72-0d')#033[00m
Jan 23 05:04:02 np0005593233 neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56[265842]: [NOTICE]   (265852) : haproxy version is 2.8.14-c23fe91
Jan 23 05:04:02 np0005593233 neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56[265842]: [NOTICE]   (265852) : path to executable is /usr/sbin/haproxy
Jan 23 05:04:02 np0005593233 neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56[265842]: [WARNING]  (265852) : Exiting Master process...
Jan 23 05:04:02 np0005593233 neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56[265842]: [ALERT]    (265852) : Current worker (265854) exited with code 143 (Terminated)
Jan 23 05:04:02 np0005593233 neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56[265842]: [WARNING]  (265852) : All workers exited. Exiting... (0)
Jan 23 05:04:02 np0005593233 systemd[1]: libpod-b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3.scope: Deactivated successfully.
Jan 23 05:04:02 np0005593233 podman[265906]: 2026-01-23 10:04:02.430511714 +0000 UTC m=+0.199423346 container died b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:04:02 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3-userdata-shm.mount: Deactivated successfully.
Jan 23 05:04:02 np0005593233 systemd[1]: var-lib-containers-storage-overlay-7de47ec55cf6d5836f36c8ced69e4f6a860dc8bb029621fd7683399e70776a22-merged.mount: Deactivated successfully.
Jan 23 05:04:02 np0005593233 podman[265906]: 2026-01-23 10:04:02.654458095 +0000 UTC m=+0.423369707 container cleanup b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:04:02 np0005593233 systemd[1]: libpod-conmon-b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3.scope: Deactivated successfully.
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.888 222021 DEBUG nova.compute.manager [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Received event network-vif-unplugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.890 222021 DEBUG oslo_concurrency.lockutils [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.890 222021 DEBUG oslo_concurrency.lockutils [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.891 222021 DEBUG oslo_concurrency.lockutils [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.891 222021 DEBUG nova.compute.manager [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] No waiting events found dispatching network-vif-unplugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.891 222021 DEBUG nova.compute.manager [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Received event network-vif-unplugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.891 222021 DEBUG nova.compute.manager [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Received event network-vif-plugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.892 222021 DEBUG oslo_concurrency.lockutils [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.892 222021 DEBUG oslo_concurrency.lockutils [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.892 222021 DEBUG oslo_concurrency.lockutils [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.892 222021 DEBUG nova.compute.manager [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] No waiting events found dispatching network-vif-plugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:02 np0005593233 nova_compute[222017]: 2026-01-23 10:04:02.892 222021 WARNING nova.compute.manager [req-e089975f-0ba6-46b8-a5cc-10047d071860 req-afc22541-72cf-4bd6-8760-d32e1861260d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Received unexpected event network-vif-plugged-cd413a72-0d0a-4fd3-8424-f49e75b1bfce for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:04:03 np0005593233 podman[265962]: 2026-01-23 10:04:03.381828709 +0000 UTC m=+0.694406564 container remove b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:04:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:03.390 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3394b3fd-ac7b-42f0-96ac-130eb3beb489]: (4, ('Fri Jan 23 10:04:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56 (b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3)\nb1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3\nFri Jan 23 10:04:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c6197e04-f0af-41f0-9896-46123565af56 (b1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3)\nb1d4741725481c06a53db948eb182d64029e0454ec3bb4e7c0d7f63432715cf3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:03.395 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2a77029a-429c-4448-a72f-7c2d022c9e47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:03.396 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6197e04-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:03 np0005593233 kernel: tapc6197e04-f0: left promiscuous mode
Jan 23 05:04:03 np0005593233 nova_compute[222017]: 2026-01-23 10:04:03.399 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:03.403 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a42a001f-b656-4a4c-80ee-8d419cf225b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:03 np0005593233 nova_compute[222017]: 2026-01-23 10:04:03.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:03.422 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe894a1-9622-4465-8101-e411b7eab699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:03.424 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[026da1df-ede5-4c26-8ca1-ef5c9a45d3dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:03.444 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[77b831b2-2c5c-47aa-b651-9c5685fb428e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655051, 'reachable_time': 42344, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265980, 'error': None, 'target': 'ovnmeta-c6197e04-f0af-41f0-9896-46123565af56', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:03.449 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c6197e04-f0af-41f0-9896-46123565af56 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:04:03 np0005593233 systemd[1]: run-netns-ovnmeta\x2dc6197e04\x2df0af\x2d41f0\x2d9896\x2d46123565af56.mount: Deactivated successfully.
Jan 23 05:04:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:03.450 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[394facc0-d0a9-4310-a580-2202f40cef73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:03.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:03.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:03 np0005593233 nova_compute[222017]: 2026-01-23 10:04:03.915 222021 INFO nova.virt.libvirt.driver [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Deleting instance files /var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_del#033[00m
Jan 23 05:04:03 np0005593233 nova_compute[222017]: 2026-01-23 10:04:03.916 222021 INFO nova.virt.libvirt.driver [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Deletion of /var/lib/nova/instances/c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0_del complete#033[00m
Jan 23 05:04:03 np0005593233 nova_compute[222017]: 2026-01-23 10:04:03.986 222021 INFO nova.compute.manager [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Took 1.93 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:04:03 np0005593233 nova_compute[222017]: 2026-01-23 10:04:03.987 222021 DEBUG oslo.service.loopingcall [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:04:03 np0005593233 nova_compute[222017]: 2026-01-23 10:04:03.987 222021 DEBUG nova.compute.manager [-] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:04:03 np0005593233 nova_compute[222017]: 2026-01-23 10:04:03.987 222021 DEBUG nova.network.neutron [-] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:04:04 np0005593233 nova_compute[222017]: 2026-01-23 10:04:04.952 222021 DEBUG nova.network.neutron [-] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:04 np0005593233 nova_compute[222017]: 2026-01-23 10:04:04.972 222021 INFO nova.compute.manager [-] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Took 0.98 seconds to deallocate network for instance.#033[00m
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.032 222021 DEBUG oslo_concurrency.lockutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.033 222021 DEBUG oslo_concurrency.lockutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.054 222021 DEBUG nova.compute.manager [req-f839451f-1fb4-434f-a9a8-77a9c134c64c req-2d084be6-7032-4cc5-afb6-402ecb86f49b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Received event network-vif-deleted-cd413a72-0d0a-4fd3-8424-f49e75b1bfce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.152 222021 DEBUG oslo_concurrency.processutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1822775607' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2551812167' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.621 222021 DEBUG oslo_concurrency.processutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.629 222021 DEBUG nova.compute.provider_tree [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.657 222021 DEBUG nova.scheduler.client.report [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.702 222021 DEBUG oslo_concurrency.lockutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.731 222021 INFO nova.scheduler.client.report [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Deleted allocations for instance c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0#033[00m
Jan 23 05:04:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:05.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.830 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:05.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:05 np0005593233 nova_compute[222017]: 2026-01-23 10:04:05.859 222021 DEBUG oslo_concurrency.lockutils [None req-854665df-7f30-41f1-b098-acdf0ca25da0 fdaabb8d5b2d4e7caa209f6918f24078 b6678355ecf9441c916291536c42c2cd - - default default] Lock "c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:07 np0005593233 nova_compute[222017]: 2026-01-23 10:04:07.325 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:07.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:07.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:08.157 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:04:08 np0005593233 nova_compute[222017]: 2026-01-23 10:04:08.158 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:08.159 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:04:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:09.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:10.161 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:10 np0005593233 nova_compute[222017]: 2026-01-23 10:04:10.834 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:11.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:11.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:12 np0005593233 nova_compute[222017]: 2026-01-23 10:04:12.329 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:13 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:13Z|00488|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:04:13 np0005593233 nova_compute[222017]: 2026-01-23 10:04:13.302 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:13.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:13.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:15.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:15 np0005593233 nova_compute[222017]: 2026-01-23 10:04:15.836 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 05:04:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:15.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 05:04:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:17 np0005593233 nova_compute[222017]: 2026-01-23 10:04:17.292 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162642.291121, c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:17 np0005593233 nova_compute[222017]: 2026-01-23 10:04:17.293 222021 INFO nova.compute.manager [-] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:04:17 np0005593233 nova_compute[222017]: 2026-01-23 10:04:17.321 222021 DEBUG nova.compute.manager [None req-102e6457-75b6-4448-a92a-0c84a495b24b - - - - - -] [instance: c9ed8b68-94d2-4ed2-af94-9ee420d2fbb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:17 np0005593233 nova_compute[222017]: 2026-01-23 10:04:17.333 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:17.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:17.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:19.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:19.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:20 np0005593233 nova_compute[222017]: 2026-01-23 10:04:20.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:21 np0005593233 podman[266004]: 2026-01-23 10:04:21.096712489 +0000 UTC m=+0.103886065 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:04:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:21.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:21.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:22 np0005593233 nova_compute[222017]: 2026-01-23 10:04:22.336 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:23Z|00489|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:04:23 np0005593233 nova_compute[222017]: 2026-01-23 10:04:23.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:23.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:23.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:23 np0005593233 nova_compute[222017]: 2026-01-23 10:04:23.966 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:25.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:25.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:25 np0005593233 nova_compute[222017]: 2026-01-23 10:04:25.963 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:27 np0005593233 nova_compute[222017]: 2026-01-23 10:04:27.340 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:27.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:27.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:28 np0005593233 nova_compute[222017]: 2026-01-23 10:04:28.409 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:28 np0005593233 nova_compute[222017]: 2026-01-23 10:04:28.461 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:29.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:29.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:30 np0005593233 nova_compute[222017]: 2026-01-23 10:04:30.964 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.336 222021 DEBUG nova.compute.manager [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.476 222021 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.476 222021 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.517 222021 DEBUG nova.objects.instance [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_requests' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.544 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.545 222021 INFO nova.compute.claims [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.545 222021 DEBUG nova.objects.instance [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'resources' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.565 222021 DEBUG nova.objects.instance [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_devices' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.646 222021 INFO nova.compute.resource_tracker [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating resource usage from migration cb27169a-f251-4da9-9cb2-2425cc564251#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.647 222021 DEBUG nova.compute.resource_tracker [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Starting to track incoming migration cb27169a-f251-4da9-9cb2-2425cc564251 with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.696 222021 DEBUG nova.scheduler.client.report [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:04:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:31.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.837 222021 DEBUG nova.scheduler.client.report [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.838 222021 DEBUG nova.compute.provider_tree [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.858 222021 DEBUG nova.scheduler.client.report [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:04:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:31.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.898 222021 DEBUG nova.scheduler.client.report [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:04:31 np0005593233 nova_compute[222017]: 2026-01-23 10:04:31.956 222021 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:32 np0005593233 nova_compute[222017]: 2026-01-23 10:04:32.343 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1934351677' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:32 np0005593233 nova_compute[222017]: 2026-01-23 10:04:32.455 222021 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:32 np0005593233 nova_compute[222017]: 2026-01-23 10:04:32.462 222021 DEBUG nova.compute.provider_tree [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:04:32 np0005593233 nova_compute[222017]: 2026-01-23 10:04:32.488 222021 DEBUG nova.scheduler.client.report [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:04:32 np0005593233 nova_compute[222017]: 2026-01-23 10:04:32.516 222021 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:32 np0005593233 nova_compute[222017]: 2026-01-23 10:04:32.517 222021 INFO nova.compute.manager [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Migrating#033[00m
Jan 23 05:04:33 np0005593233 podman[266053]: 2026-01-23 10:04:33.066021378 +0000 UTC m=+0.068879416 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 05:04:33 np0005593233 nova_compute[222017]: 2026-01-23 10:04:33.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:33 np0005593233 nova_compute[222017]: 2026-01-23 10:04:33.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:33 np0005593233 nova_compute[222017]: 2026-01-23 10:04:33.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:33 np0005593233 nova_compute[222017]: 2026-01-23 10:04:33.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:33 np0005593233 nova_compute[222017]: 2026-01-23 10:04:33.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:33 np0005593233 nova_compute[222017]: 2026-01-23 10:04:33.432 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:33 np0005593233 nova_compute[222017]: 2026-01-23 10:04:33.432 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:04:33 np0005593233 nova_compute[222017]: 2026-01-23 10:04:33.433 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 05:04:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:33.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 05:04:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/691023125' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:33 np0005593233 nova_compute[222017]: 2026-01-23 10:04:33.890 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:33.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:34 np0005593233 nova_compute[222017]: 2026-01-23 10:04:34.365 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:04:34 np0005593233 nova_compute[222017]: 2026-01-23 10:04:34.366 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:04:34 np0005593233 nova_compute[222017]: 2026-01-23 10:04:34.549 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:04:34 np0005593233 nova_compute[222017]: 2026-01-23 10:04:34.550 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4332MB free_disk=20.89712142944336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:04:34 np0005593233 nova_compute[222017]: 2026-01-23 10:04:34.550 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:34 np0005593233 nova_compute[222017]: 2026-01-23 10:04:34.550 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:34 np0005593233 nova_compute[222017]: 2026-01-23 10:04:34.738 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Migration for instance ae2a211d-e923-498b-9ceb-97274a2fd725 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.050 222021 INFO nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating resource usage from migration cb27169a-f251-4da9-9cb2-2425cc564251
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.051 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Starting to track incoming migration cb27169a-f251-4da9-9cb2-2425cc564251 with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.100 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 51a66602-3548-4341-add1-988bd6c7aa57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.114 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.114 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.137 222021 WARNING nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance ae2a211d-e923-498b-9ceb-97274a2fd725 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.151 222021 DEBUG nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.200 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 3fe4245a-5986-4aa5-8762-24329c7e1043 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.200 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.200 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.281 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.314 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:04:35 np0005593233 systemd-logind[804]: New session 57 of user nova.
Jan 23 05:04:35 np0005593233 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 05:04:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2495449075' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:35 np0005593233 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.770 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:04:35 np0005593233 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.780 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:04:35 np0005593233 systemd[1]: Starting User Manager for UID 42436...
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.805 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:04:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:35.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.833 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.834 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.835 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.843 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.844 222021 INFO nova.compute.claims [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Claim successful on node compute-1.ctlplane.example.com
Jan 23 05:04:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:35.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:35 np0005593233 systemd[266121]: Queued start job for default target Main User Target.
Jan 23 05:04:35 np0005593233 systemd[266121]: Created slice User Application Slice.
Jan 23 05:04:35 np0005593233 systemd[266121]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:04:35 np0005593233 systemd[266121]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 05:04:35 np0005593233 systemd[266121]: Reached target Paths.
Jan 23 05:04:35 np0005593233 systemd[266121]: Reached target Timers.
Jan 23 05:04:35 np0005593233 systemd[266121]: Starting D-Bus User Message Bus Socket...
Jan 23 05:04:35 np0005593233 nova_compute[222017]: 2026-01-23 10:04:35.968 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:35 np0005593233 systemd[266121]: Starting Create User's Volatile Files and Directories...
Jan 23 05:04:35 np0005593233 systemd[266121]: Listening on D-Bus User Message Bus Socket.
Jan 23 05:04:35 np0005593233 systemd[266121]: Reached target Sockets.
Jan 23 05:04:35 np0005593233 systemd[266121]: Finished Create User's Volatile Files and Directories.
Jan 23 05:04:35 np0005593233 systemd[266121]: Reached target Basic System.
Jan 23 05:04:35 np0005593233 systemd[266121]: Reached target Main User Target.
Jan 23 05:04:35 np0005593233 systemd[266121]: Startup finished in 161ms.
Jan 23 05:04:35 np0005593233 systemd[1]: Started User Manager for UID 42436.
Jan 23 05:04:35 np0005593233 systemd[1]: Started Session 57 of User nova.
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.046 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:04:36 np0005593233 systemd[1]: session-57.scope: Deactivated successfully.
Jan 23 05:04:36 np0005593233 systemd-logind[804]: Session 57 logged out. Waiting for processes to exit.
Jan 23 05:04:36 np0005593233 systemd-logind[804]: Removed session 57.
Jan 23 05:04:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:36 np0005593233 systemd-logind[804]: New session 59 of user nova.
Jan 23 05:04:36 np0005593233 systemd[1]: Started Session 59 of User nova.
Jan 23 05:04:36 np0005593233 systemd[1]: session-59.scope: Deactivated successfully.
Jan 23 05:04:36 np0005593233 systemd-logind[804]: Session 59 logged out. Waiting for processes to exit.
Jan 23 05:04:36 np0005593233 systemd-logind[804]: Removed session 59.
Jan 23 05:04:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:36 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2847392130' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.527 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.537 222021 DEBUG nova.compute.provider_tree [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.560 222021 DEBUG nova.scheduler.client.report [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.600 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.602 222021 DEBUG nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.706 222021 DEBUG nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.707 222021 DEBUG nova.network.neutron [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.761 222021 INFO nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.793 222021 DEBUG nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.835 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.835 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.836 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.836 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.927 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.938 222021 DEBUG nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.939 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.939 222021 INFO nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Creating image(s)
Jan 23 05:04:36 np0005593233 nova_compute[222017]: 2026-01-23 10:04:36.970 222021 DEBUG nova.storage.rbd_utils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.002 222021 DEBUG nova.storage.rbd_utils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.036 222021 DEBUG nova.storage.rbd_utils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.042 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.132 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.133 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.134 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.134 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.166 222021 DEBUG nova.storage.rbd_utils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.171 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3fe4245a-5986-4aa5-8762-24329c7e1043_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.216 222021 DEBUG nova.policy [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '878babe1fbab428f98092e314b2ae0b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ca1cc631a7d348a5ad176273e81495bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.347 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.479 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3fe4245a-5986-4aa5-8762-24329c7e1043_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.308s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.583 222021 DEBUG nova.storage.rbd_utils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] resizing rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.708 222021 DEBUG nova.objects.instance [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'migration_context' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.735 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.736 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Ensure instance console log exists: /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.737 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.738 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:37 np0005593233 nova_compute[222017]: 2026-01-23 10:04:37.738 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:37.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:37.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:38 np0005593233 nova_compute[222017]: 2026-01-23 10:04:38.538 222021 DEBUG nova.network.neutron [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Successfully created port: e9cb7ed5-cf17-45b5-930e-f69b40b4feeb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:04:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:39.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:39 np0005593233 nova_compute[222017]: 2026-01-23 10:04:39.899 222021 DEBUG nova.compute.manager [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:04:39 np0005593233 nova_compute[222017]: 2026-01-23 10:04:39.900 222021 DEBUG oslo_concurrency.lockutils [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:39 np0005593233 nova_compute[222017]: 2026-01-23 10:04:39.900 222021 DEBUG oslo_concurrency.lockutils [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:39 np0005593233 nova_compute[222017]: 2026-01-23 10:04:39.900 222021 DEBUG oslo_concurrency.lockutils [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:39 np0005593233 nova_compute[222017]: 2026-01-23 10:04:39.901 222021 DEBUG nova.compute.manager [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:04:39 np0005593233 nova_compute[222017]: 2026-01-23 10:04:39.901 222021 WARNING nova.compute.manager [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state active and task_state resize_migrating.
Jan 23 05:04:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:39.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:40 np0005593233 nova_compute[222017]: 2026-01-23 10:04:40.387 222021 INFO nova.network.neutron [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating port 115f68c4-4489-4fc8-bb90-3c2d3011db2d with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 23 05:04:40 np0005593233 nova_compute[222017]: 2026-01-23 10:04:40.971 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:04:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:04:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:41.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:41 np0005593233 nova_compute[222017]: 2026-01-23 10:04:41.873 222021 DEBUG nova.network.neutron [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Successfully updated port: e9cb7ed5-cf17-45b5-930e-f69b40b4feeb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 05:04:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:41.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.350 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:42.667 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:42.667 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:42.668 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.679 222021 DEBUG nova.compute.manager [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.680 222021 DEBUG oslo_concurrency.lockutils [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.680 222021 DEBUG oslo_concurrency.lockutils [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.680 222021 DEBUG oslo_concurrency.lockutils [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.680 222021 DEBUG nova.compute.manager [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.681 222021 WARNING nova.compute.manager [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state active and task_state resize_migrated.
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.681 222021 DEBUG nova.compute.manager [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.681 222021 DEBUG nova.compute.manager [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing instance network info cache due to event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.681 222021 DEBUG oslo_concurrency.lockutils [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.681 222021 DEBUG oslo_concurrency.lockutils [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.682 222021 DEBUG nova.network.neutron [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing network info cache for port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:04:42 np0005593233 nova_compute[222017]: 2026-01-23 10:04:42.690 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.018 222021 DEBUG nova.network.neutron [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.425 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.729 222021 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.730 222021 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.730 222021 DEBUG nova.network.neutron [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:04:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:43.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.826 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.827 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.827 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:04:43 np0005593233 nova_compute[222017]: 2026-01-23 10:04:43.827 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 51a66602-3548-4341-add1-988bd6c7aa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:04:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:43.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:44 np0005593233 nova_compute[222017]: 2026-01-23 10:04:44.098 222021 DEBUG nova.network.neutron [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:04:44 np0005593233 nova_compute[222017]: 2026-01-23 10:04:44.365 222021 DEBUG nova.compute.manager [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:04:44 np0005593233 nova_compute[222017]: 2026-01-23 10:04:44.366 222021 DEBUG nova.compute.manager [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing instance network info cache due to event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:04:44 np0005593233 nova_compute[222017]: 2026-01-23 10:04:44.366 222021 DEBUG oslo_concurrency.lockutils [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:04:44 np0005593233 nova_compute[222017]: 2026-01-23 10:04:44.388 222021 DEBUG oslo_concurrency.lockutils [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:04:44 np0005593233 nova_compute[222017]: 2026-01-23 10:04:44.389 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquired lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:04:44 np0005593233 nova_compute[222017]: 2026-01-23 10:04:44.390 222021 DEBUG nova.network.neutron [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:04:44 np0005593233 nova_compute[222017]: 2026-01-23 10:04:44.439 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:04:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2525384942' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:04:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:04:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2525384942' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:04:44 np0005593233 nova_compute[222017]: 2026-01-23 10:04:44.895 222021 DEBUG nova.network.neutron [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 05:04:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:45.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:45.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:45 np0005593233 nova_compute[222017]: 2026-01-23 10:04:45.973 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:46 np0005593233 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 05:04:46 np0005593233 systemd[266121]: Activating special unit Exit the Session...
Jan 23 05:04:46 np0005593233 systemd[266121]: Stopped target Main User Target.
Jan 23 05:04:46 np0005593233 systemd[266121]: Stopped target Basic System.
Jan 23 05:04:46 np0005593233 systemd[266121]: Stopped target Paths.
Jan 23 05:04:46 np0005593233 systemd[266121]: Stopped target Sockets.
Jan 23 05:04:46 np0005593233 systemd[266121]: Stopped target Timers.
Jan 23 05:04:46 np0005593233 systemd[266121]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:04:46 np0005593233 systemd[266121]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 05:04:46 np0005593233 systemd[266121]: Closed D-Bus User Message Bus Socket.
Jan 23 05:04:46 np0005593233 systemd[266121]: Stopped Create User's Volatile Files and Directories.
Jan 23 05:04:46 np0005593233 systemd[266121]: Removed slice User Application Slice.
Jan 23 05:04:46 np0005593233 systemd[266121]: Reached target Shutdown.
Jan 23 05:04:46 np0005593233 systemd[266121]: Finished Exit the Session.
Jan 23 05:04:46 np0005593233 systemd[266121]: Reached target Exit the Session.
Jan 23 05:04:46 np0005593233 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 05:04:46 np0005593233 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 05:04:46 np0005593233 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 05:04:46 np0005593233 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 05:04:46 np0005593233 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 05:04:46 np0005593233 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 05:04:46 np0005593233 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 05:04:47 np0005593233 nova_compute[222017]: 2026-01-23 10:04:47.353 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:47.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:47.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:49 np0005593233 nova_compute[222017]: 2026-01-23 10:04:49.451 222021 DEBUG nova.network.neutron [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:04:49 np0005593233 nova_compute[222017]: 2026-01-23 10:04:49.533 222021 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:04:49 np0005593233 nova_compute[222017]: 2026-01-23 10:04:49.536 222021 DEBUG oslo_concurrency.lockutils [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:04:49 np0005593233 nova_compute[222017]: 2026-01-23 10:04:49.536 222021 DEBUG nova.network.neutron [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:04:49 np0005593233 nova_compute[222017]: 2026-01-23 10:04:49.676 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 23 05:04:49 np0005593233 nova_compute[222017]: 2026-01-23 10:04:49.679 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 05:04:49 np0005593233 nova_compute[222017]: 2026-01-23 10:04:49.680 222021 INFO nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Creating image(s)
Jan 23 05:04:49 np0005593233 nova_compute[222017]: 2026-01-23 10:04:49.735 222021 DEBUG nova.storage.rbd_utils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] creating snapshot(nova-resize) on rbd image(ae2a211d-e923-498b-9ceb-97274a2fd725_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 23 05:04:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:49.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:49.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e266 e266: 3 total, 3 up, 3 in
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.097 222021 DEBUG nova.objects.instance [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'trusted_certs' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.253 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.253 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Ensure instance console log exists: /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.254 222021 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.255 222021 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.255 222021 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.260 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Start _get_guest_xml network_info=[{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:e2:de:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.268 222021 WARNING nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.275 222021 DEBUG nova.virt.libvirt.host [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.277 222021 DEBUG nova.virt.libvirt.host [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.280 222021 DEBUG nova.virt.libvirt.host [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.281 222021 DEBUG nova.virt.libvirt.host [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.283 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.284 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.284 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.285 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.285 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.286 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.287 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.287 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.288 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.288 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.288 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.288 222021 DEBUG nova.virt.hardware [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.289 222021 DEBUG nova.objects.instance [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'vcpu_model' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.298 222021 DEBUG nova.network.neutron [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updating instance_info_cache with network_info: [{"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.336 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updating instance_info_cache with network_info: [{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.347 222021 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.387 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Releasing lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.388 222021 DEBUG nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance network_info: |[{"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.390 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Start _get_guest_xml network_info=[{"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.396 222021 WARNING nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.401 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.402 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.407 222021 DEBUG nova.virt.libvirt.host [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.408 222021 DEBUG nova.virt.libvirt.host [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.413 222021 DEBUG nova.virt.libvirt.host [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.413 222021 DEBUG nova.virt.libvirt.host [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.415 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.415 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.416 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.416 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.417 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.417 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.417 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.418 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.418 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.418 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.418 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.419 222021 DEBUG nova.virt.hardware [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.423 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3852797021' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.878 222021 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3132737334' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.929 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:50 np0005593233 nova_compute[222017]: 2026-01-23 10:04:50.934 222021 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.001 222021 DEBUG nova.storage.rbd_utils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.006 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.033 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.396 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/377568894' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.419 222021 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.422 222021 DEBUG nova.virt.libvirt.vif [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-782058218',display_name='tempest-ServerActionsTestJSON-server-782058218',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-782058218',id=109,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:04:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-05cy3qfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:04:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=ae2a211d-e923-498b-9ceb-97274a2fd725,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:e2:de:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.422 222021 DEBUG nova.network.os_vif_util [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:e2:de:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.423 222021 DEBUG nova.network.os_vif_util [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.427 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <uuid>ae2a211d-e923-498b-9ceb-97274a2fd725</uuid>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <name>instance-0000006d</name>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <memory>196608</memory>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerActionsTestJSON-server-782058218</nova:name>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:04:50</nova:creationTime>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.micro">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:memory>192</nova:memory>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:user uuid="9d4a5c201efa4992a9ef57d8abdc1675">tempest-ServerActionsTestJSON-1619235720-project-member</nova:user>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:project uuid="74c5c1d0762242f29a5d26033efd9f6d">tempest-ServerActionsTestJSON-1619235720</nova:project>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:port uuid="115f68c4-4489-4fc8-bb90-3c2d3011db2d">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="serial">ae2a211d-e923-498b-9ceb-97274a2fd725</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="uuid">ae2a211d-e923-498b-9ceb-97274a2fd725</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ae2a211d-e923-498b-9ceb-97274a2fd725_disk">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ae2a211d-e923-498b-9ceb-97274a2fd725_disk.config">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:e2:de:d3"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <target dev="tap115f68c4-44"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/console.log" append="off"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:04:51 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:04:51 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.433 222021 DEBUG nova.virt.libvirt.vif [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-782058218',display_name='tempest-ServerActionsTestJSON-server-782058218',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-782058218',id=109,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:04:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-05cy3qfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:04:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=ae2a211d-e923-498b-9ceb-97274a2fd725,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:e2:de:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.433 222021 DEBUG nova.network.os_vif_util [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:e2:de:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.434 222021 DEBUG nova.network.os_vif_util [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.434 222021 DEBUG os_vif [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.435 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.436 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.436 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.440 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.440 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap115f68c4-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.441 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap115f68c4-44, col_values=(('external_ids', {'iface-id': '115f68c4-4489-4fc8-bb90-3c2d3011db2d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:de:d3', 'vm-uuid': 'ae2a211d-e923-498b-9ceb-97274a2fd725'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.442 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 NetworkManager[48871]: <info>  [1769162691.4440] manager: (tap115f68c4-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.445 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.452 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.454 222021 INFO os_vif [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44')#033[00m
Jan 23 05:04:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3120520131' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.487 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.488 222021 DEBUG nova.virt.libvirt.vif [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:04:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1541258776',display_name='tempest-ServerRescueTestJSONUnderV235-server-1541258776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1541258776',id=111,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ca1cc631a7d348a5ad176273e81495bb',ramdisk_id='',reservation_id='r-be45v9br',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-976372845',owner_user_n
ame='tempest-ServerRescueTestJSONUnderV235-976372845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:04:36Z,user_data=None,user_id='878babe1fbab428f98092e314b2ae0b1',uuid=3fe4245a-5986-4aa5-8762-24329c7e1043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.489 222021 DEBUG nova.network.os_vif_util [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Converting VIF {"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.489 222021 DEBUG nova.network.os_vif_util [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:7e,bridge_name='br-int',has_traffic_filtering=True,id=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb,network=Network(7ed922f3-1187-4218-88c3-8aa17da9140a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9cb7ed5-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.490 222021 DEBUG nova.objects.instance [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'pci_devices' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.577 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <uuid>3fe4245a-5986-4aa5-8762-24329c7e1043</uuid>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <name>instance-0000006f</name>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1541258776</nova:name>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:04:50</nova:creationTime>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:user uuid="878babe1fbab428f98092e314b2ae0b1">tempest-ServerRescueTestJSONUnderV235-976372845-project-member</nova:user>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:project uuid="ca1cc631a7d348a5ad176273e81495bb">tempest-ServerRescueTestJSONUnderV235-976372845</nova:project>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <nova:port uuid="e9cb7ed5-cf17-45b5-930e-f69b40b4feeb">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="serial">3fe4245a-5986-4aa5-8762-24329c7e1043</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="uuid">3fe4245a-5986-4aa5-8762-24329c7e1043</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3fe4245a-5986-4aa5-8762-24329c7e1043_disk">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:50:26:7e"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <target dev="tape9cb7ed5-cf"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/console.log" append="off"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:04:51 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:04:51 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:04:51 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:04:51 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.578 222021 DEBUG nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Preparing to wait for external event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.578 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.579 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.579 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.580 222021 DEBUG nova.virt.libvirt.vif [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:04:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1541258776',display_name='tempest-ServerRescueTestJSONUnderV235-server-1541258776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1541258776',id=111,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ca1cc631a7d348a5ad176273e81495bb',ramdisk_id='',reservation_id='r-be45v9br',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-976372845',ow
ner_user_name='tempest-ServerRescueTestJSONUnderV235-976372845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:04:36Z,user_data=None,user_id='878babe1fbab428f98092e314b2ae0b1',uuid=3fe4245a-5986-4aa5-8762-24329c7e1043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.580 222021 DEBUG nova.network.os_vif_util [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Converting VIF {"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.581 222021 DEBUG nova.network.os_vif_util [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:7e,bridge_name='br-int',has_traffic_filtering=True,id=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb,network=Network(7ed922f3-1187-4218-88c3-8aa17da9140a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9cb7ed5-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.582 222021 DEBUG os_vif [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:7e,bridge_name='br-int',has_traffic_filtering=True,id=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb,network=Network(7ed922f3-1187-4218-88c3-8aa17da9140a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9cb7ed5-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.583 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.583 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.583 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.587 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.587 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9cb7ed5-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.588 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape9cb7ed5-cf, col_values=(('external_ids', {'iface-id': 'e9cb7ed5-cf17-45b5-930e-f69b40b4feeb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:26:7e', 'vm-uuid': '3fe4245a-5986-4aa5-8762-24329c7e1043'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.589 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 NetworkManager[48871]: <info>  [1769162691.5906] manager: (tape9cb7ed5-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.592 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.599 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.601 222021 INFO os_vif [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:26:7e,bridge_name='br-int',has_traffic_filtering=True,id=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb,network=Network(7ed922f3-1187-4218-88c3-8aa17da9140a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9cb7ed5-cf')#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.621 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.621 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.622 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No VIF found with MAC fa:16:3e:e2:de:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.623 222021 INFO nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Using config drive#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.716 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.717 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.717 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] No VIF found with MAC fa:16:3e:50:26:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.717 222021 INFO nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Using config drive#033[00m
Jan 23 05:04:51 np0005593233 NetworkManager[48871]: <info>  [1769162691.7290] manager: (tap115f68c4-44): new Tun device (/org/freedesktop/NetworkManager/Devices/237)
Jan 23 05:04:51 np0005593233 kernel: tap115f68c4-44: entered promiscuous mode
Jan 23 05:04:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:51Z|00490|binding|INFO|Claiming lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d for this chassis.
Jan 23 05:04:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:51Z|00491|binding|INFO|115f68c4-4489-4fc8-bb90-3c2d3011db2d: Claiming fa:16:3e:e2:de:d3 10.100.0.7
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.788 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:de:d3 10.100.0.7'], port_security=['fa:16:3e:e2:de:d3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ae2a211d-e923-498b-9ceb-97274a2fd725', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=115f68c4-4489-4fc8-bb90-3c2d3011db2d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.790 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 115f68c4-4489-4fc8-bb90-3c2d3011db2d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c bound to our chassis#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.791 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:04:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:51Z|00492|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d ovn-installed in OVS
Jan 23 05:04:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:51Z|00493|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d up in Southbound
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.805 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[42a75a07-d475-4bf2-8993-59d2f887beef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.806 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.809 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.809 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fac36a39-098c-4e47-b7f4-dd1007490131]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.809 222021 DEBUG nova.storage.rbd_utils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.810 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea20605-1781-40f2-b591-11dab47677c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 systemd-udevd[266771]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:04:51 np0005593233 nova_compute[222017]: 2026-01-23 10:04:51.821 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.828 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[4a54739e-5621-4cb2-a40a-44418fba9f8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 NetworkManager[48871]: <info>  [1769162691.8333] device (tap115f68c4-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:04:51 np0005593233 NetworkManager[48871]: <info>  [1769162691.8342] device (tap115f68c4-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:04:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:51.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:51 np0005593233 systemd-machined[190954]: New machine qemu-52-instance-0000006d.
Jan 23 05:04:51 np0005593233 systemd[1]: Started Virtual Machine qemu-52-instance-0000006d.
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.847 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[01d4f013-f842-4555-abb2-e5d506554d7e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.889 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b631a7cb-b0c5-4b34-9521-aa3dda8eae8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.895 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[06ec7f59-cdcf-4eec-8280-81112645dcec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 NetworkManager[48871]: <info>  [1769162691.8968] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/238)
Jan 23 05:04:51 np0005593233 systemd-udevd[266780]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:04:51 np0005593233 podman[266749]: 2026-01-23 10:04:51.907537022 +0000 UTC m=+0.110012087 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:04:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:51.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.933 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b40f91-ea3b-4b42-8589-61139869d4ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.937 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[556715ab-ed64-4c8d-987c-7bc89a9263c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 NetworkManager[48871]: <info>  [1769162691.9668] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.972 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b49d1f83-53f3-4887-a47a-47ae8d56e05a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:51.993 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5ba90e-efbb-4def-afbb-ea362382e799]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660459, 'reachable_time': 32379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266827, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.014 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[26ec2c7b-81ed-4137-88ee-fded95de1025]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 660459, 'tstamp': 660459}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 266828, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.035 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c3efbd01-59be-43c0-9a32-1eaedda3aa99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660459, 'reachable_time': 32379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 266829, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.075 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[322abddb-29ff-4537-b115-1edbb2da8121]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.146 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab1a2cb-78f6-4d04-8097-b7e4a0e61875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.147 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.148 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.148 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.150 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:52 np0005593233 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:04:52 np0005593233 NetworkManager[48871]: <info>  [1769162692.1506] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.156 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.157 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:52 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:52Z|00494|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.179 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.186 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.187 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.188 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[965b1f7f-64cf-47e8-8922-5ba626c4e488]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.189 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:04:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:52.192 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.284 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162692.2838562, ae2a211d-e923-498b-9ceb-97274a2fd725 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.284 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.296 222021 DEBUG nova.compute.manager [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.300 222021 INFO nova.virt.libvirt.driver [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance running successfully.#033[00m
Jan 23 05:04:52 np0005593233 virtqemud[221325]: argument unsupported: QEMU guest agent is not configured
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.303 222021 DEBUG nova.virt.libvirt.guest [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.303 222021 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.400 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.404 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.481 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.482 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162692.296325, ae2a211d-e923-498b-9ceb-97274a2fd725 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.482 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Started (Lifecycle Event)#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.532 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.538 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:04:52 np0005593233 podman[266905]: 2026-01-23 10:04:52.611979945 +0000 UTC m=+0.052113198 container create 3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:04:52 np0005593233 systemd[1]: Started libpod-conmon-3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602.scope.
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.667 222021 DEBUG nova.compute.manager [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.667 222021 DEBUG oslo_concurrency.lockutils [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.667 222021 DEBUG oslo_concurrency.lockutils [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.668 222021 DEBUG oslo_concurrency.lockutils [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.668 222021 DEBUG nova.compute.manager [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:52 np0005593233 nova_compute[222017]: 2026-01-23 10:04:52.668 222021 WARNING nova.compute.manager [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:04:52 np0005593233 podman[266905]: 2026-01-23 10:04:52.585061022 +0000 UTC m=+0.025194295 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:04:52 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:04:52 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68dc313db5f5aeec10af7e7a96964118484fd358616c2efd03b1f72cead7cdf8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:04:52 np0005593233 podman[266905]: 2026-01-23 10:04:52.731055073 +0000 UTC m=+0.171188346 container init 3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:04:52 np0005593233 podman[266905]: 2026-01-23 10:04:52.742077761 +0000 UTC m=+0.182211014 container start 3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:04:52 np0005593233 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[266920]: [NOTICE]   (266924) : New worker (266926) forked
Jan 23 05:04:52 np0005593233 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[266920]: [NOTICE]   (266924) : Loading success.
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.222 222021 INFO nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Creating config drive at /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config#033[00m
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.226 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4jcpat6i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.366 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4jcpat6i" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.402 222021 DEBUG nova.storage.rbd_utils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.408 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.580 222021 DEBUG oslo_concurrency.processutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.581 222021 INFO nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Deleting local config drive /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config because it was imported into RBD.#033[00m
Jan 23 05:04:53 np0005593233 kernel: tape9cb7ed5-cf: entered promiscuous mode
Jan 23 05:04:53 np0005593233 NetworkManager[48871]: <info>  [1769162693.6348] manager: (tape9cb7ed5-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/240)
Jan 23 05:04:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:53Z|00495|binding|INFO|Claiming lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb for this chassis.
Jan 23 05:04:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:53Z|00496|binding|INFO|e9cb7ed5-cf17-45b5-930e-f69b40b4feeb: Claiming fa:16:3e:50:26:7e 10.100.0.11
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.642 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:53.651 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:26:7e 10.100.0.11'], port_security=['fa:16:3e:50:26:7e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3fe4245a-5986-4aa5-8762-24329c7e1043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ed922f3-1187-4218-88c3-8aa17da9140a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca1cc631a7d348a5ad176273e81495bb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a17a68ae-d91e-463f-82bb-e1b60b7cf4d3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d7fa43d-e34b-4327-b645-5d2c8b16d3b6, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:04:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:53.653 140224 INFO neutron.agent.ovn.metadata.agent [-] Port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb in datapath 7ed922f3-1187-4218-88c3-8aa17da9140a bound to our chassis#033[00m
Jan 23 05:04:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:53.654 140224 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7ed922f3-1187-4218-88c3-8aa17da9140a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 23 05:04:53 np0005593233 NetworkManager[48871]: <info>  [1769162693.6571] device (tape9cb7ed5-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:04:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:53.656 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0017615a-7aa2-426e-aa7c-f44086c333d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:53 np0005593233 NetworkManager[48871]: <info>  [1769162693.6593] device (tape9cb7ed5-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:04:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:53Z|00497|binding|INFO|Setting lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb ovn-installed in OVS
Jan 23 05:04:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:53Z|00498|binding|INFO|Setting lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb up in Southbound
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.663 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:53 np0005593233 nova_compute[222017]: 2026-01-23 10:04:53.670 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:53 np0005593233 systemd-machined[190954]: New machine qemu-53-instance-0000006f.
Jan 23 05:04:53 np0005593233 systemd[1]: Started Virtual Machine qemu-53-instance-0000006f.
Jan 23 05:04:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:53.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:53.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.142 222021 DEBUG nova.network.neutron [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updated VIF entry in instance network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.143 222021 DEBUG nova.network.neutron [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.171 222021 DEBUG oslo_concurrency.lockutils [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.225 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162694.2242007, 3fe4245a-5986-4aa5-8762-24329c7e1043 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.225 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] VM Started (Lifecycle Event)#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.279 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.286 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162694.2257018, 3fe4245a-5986-4aa5-8762-24329c7e1043 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.286 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.312 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.318 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.631 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.888 222021 DEBUG nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.889 222021 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.889 222021 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.889 222021 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.889 222021 DEBUG nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.890 222021 WARNING nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.890 222021 DEBUG nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.890 222021 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.890 222021 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.890 222021 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.890 222021 DEBUG nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Processing event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.890 222021 DEBUG nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.891 222021 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.891 222021 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.891 222021 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.891 222021 DEBUG nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] No waiting events found dispatching network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.891 222021 WARNING nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received unexpected event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb for instance with vm_state building and task_state spawning.#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.892 222021 DEBUG nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.895 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162694.8953793, 3fe4245a-5986-4aa5-8762-24329c7e1043 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.895 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.897 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.901 222021 INFO nova.virt.libvirt.driver [-] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance spawned successfully.#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.901 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.925 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.929 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.953 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.954 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.954 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.955 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.955 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.955 222021 DEBUG nova.virt.libvirt.driver [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:54 np0005593233 nova_compute[222017]: 2026-01-23 10:04:54.990 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:04:55 np0005593233 nova_compute[222017]: 2026-01-23 10:04:55.052 222021 INFO nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Took 18.11 seconds to spawn the instance on the hypervisor.
Jan 23 05:04:55 np0005593233 nova_compute[222017]: 2026-01-23 10:04:55.052 222021 DEBUG nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:04:55 np0005593233 nova_compute[222017]: 2026-01-23 10:04:55.149 222021 INFO nova.compute.manager [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Took 19.91 seconds to build instance.
Jan 23 05:04:55 np0005593233 nova_compute[222017]: 2026-01-23 10:04:55.197 222021 DEBUG oslo_concurrency.lockutils [None req-5286f985-e1cc-4971-a5cf-18aab45bb067 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:55 np0005593233 nova_compute[222017]: 2026-01-23 10:04:55.278 222021 DEBUG nova.network.neutron [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Port 115f68c4-4489-4fc8-bb90-3c2d3011db2d binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Jan 23 05:04:55 np0005593233 nova_compute[222017]: 2026-01-23 10:04:55.278 222021 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:04:55 np0005593233 nova_compute[222017]: 2026-01-23 10:04:55.279 222021 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:04:55 np0005593233 nova_compute[222017]: 2026-01-23 10:04:55.279 222021 DEBUG nova.network.neutron [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:04:55 np0005593233 nova_compute[222017]: 2026-01-23 10:04:55.977 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 05:04:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:56.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:56.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:56 np0005593233 nova_compute[222017]: 2026-01-23 10:04:56.590 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.013 222021 DEBUG nova.network.neutron [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:04:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:58.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:04:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:04:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:58.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.310 222021 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:04:58 np0005593233 kernel: tap115f68c4-44 (unregistering): left promiscuous mode
Jan 23 05:04:58 np0005593233 NetworkManager[48871]: <info>  [1769162698.3825] device (tap115f68c4-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:04:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:58Z|00499|binding|INFO|Releasing lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d from this chassis (sb_readonly=0)
Jan 23 05:04:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:58Z|00500|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d down in Southbound
Jan 23 05:04:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:04:58Z|00501|binding|INFO|Removing iface tap115f68c4-44 ovn-installed in OVS
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.415 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.427 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:de:d3 10.100.0.7'], port_security=['fa:16:3e:e2:de:d3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ae2a211d-e923-498b-9ceb-97274a2fd725', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '8', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=115f68c4-4489-4fc8-bb90-3c2d3011db2d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.428 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 115f68c4-4489-4fc8-bb90-3c2d3011db2d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.429 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.431 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.432 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[516d83c4-9fa1-498d-8d56-f34b540d370f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.433 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore
Jan 23 05:04:58 np0005593233 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Jan 23 05:04:58 np0005593233 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006d.scope: Consumed 6.608s CPU time.
Jan 23 05:04:58 np0005593233 systemd-machined[190954]: Machine qemu-52-instance-0000006d terminated.
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.576 222021 INFO nova.virt.libvirt.driver [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance destroyed successfully.
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.578 222021 DEBUG nova.objects.instance [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'resources' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:04:58 np0005593233 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[266920]: [NOTICE]   (266924) : haproxy version is 2.8.14-c23fe91
Jan 23 05:04:58 np0005593233 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[266920]: [NOTICE]   (266924) : path to executable is /usr/sbin/haproxy
Jan 23 05:04:58 np0005593233 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[266920]: [WARNING]  (266924) : Exiting Master process...
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.603 222021 DEBUG nova.virt.libvirt.vif [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-782058218',display_name='tempest-ServerActionsTestJSON-server-782058218',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-782058218',id=109,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:04:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-05cy3qfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:04:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=ae2a211d-e923-498b-9ceb-97274a2fd725,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.604 222021 DEBUG nova.network.os_vif_util [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:58 np0005593233 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[266920]: [ALERT]    (266924) : Current worker (266926) exited with code 143 (Terminated)
Jan 23 05:04:58 np0005593233 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[266920]: [WARNING]  (266924) : All workers exited. Exiting... (0)
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.606 222021 DEBUG nova.network.os_vif_util [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.607 222021 DEBUG os_vif [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 05:04:58 np0005593233 systemd[1]: libpod-3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602.scope: Deactivated successfully.
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.609 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.609 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap115f68c4-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.611 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.613 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:58 np0005593233 podman[267062]: 2026-01-23 10:04:58.614894885 +0000 UTC m=+0.062361535 container died 3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.615 222021 INFO os_vif [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44')
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.621 222021 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.622 222021 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:58 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602-userdata-shm.mount: Deactivated successfully.
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.651 222021 DEBUG nova.objects.instance [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'migration_context' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:04:58 np0005593233 systemd[1]: var-lib-containers-storage-overlay-68dc313db5f5aeec10af7e7a96964118484fd358616c2efd03b1f72cead7cdf8-merged.mount: Deactivated successfully.
Jan 23 05:04:58 np0005593233 podman[267062]: 2026-01-23 10:04:58.669228904 +0000 UTC m=+0.116695544 container cleanup 3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:04:58 np0005593233 systemd[1]: libpod-conmon-3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602.scope: Deactivated successfully.
Jan 23 05:04:58 np0005593233 podman[267101]: 2026-01-23 10:04:58.770282278 +0000 UTC m=+0.068744472 container remove 3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.778 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[508c2784-b717-4c83-bd89-4a8b236f883c]: (4, ('Fri Jan 23 10:04:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602)\n3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602\nFri Jan 23 10:04:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602)\n3572bbd7c01c88309844d2e274202f01f85df9271818f50222c352be1ff49602\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.780 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[34c6320e-592d-446b-b746-0a4dfd19822f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.782 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.784 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:58 np0005593233 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.802 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.806 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.808 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9badb5fb-58c0-45c5-ba9f-c42f95b8ebb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.828 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c63ce0d4-e986-49d1-b6d3-432702b57d66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.831 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0311374c-2b0c-4bf2-af14-5c9f5a360440]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.854 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[73309313-ca65-4c0b-923e-0d146ad0fc20]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660451, 'reachable_time': 35255, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267117, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.858 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:04:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:04:58.858 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[6065ae2e-e7af-4399-9f08-f9652b8bde37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:58 np0005593233 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.867 222021 DEBUG oslo_concurrency.processutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.963 222021 INFO nova.compute.manager [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Rescuing#033[00m
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.964 222021 DEBUG oslo_concurrency.lockutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.964 222021 DEBUG oslo_concurrency.lockutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquired lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:04:58 np0005593233 nova_compute[222017]: 2026-01-23 10:04:58.965 222021 DEBUG nova.network.neutron [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.062 222021 DEBUG nova.compute.manager [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.062 222021 DEBUG oslo_concurrency.lockutils [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.063 222021 DEBUG oslo_concurrency.lockutils [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.064 222021 DEBUG oslo_concurrency.lockutils [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.065 222021 DEBUG nova.compute.manager [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.066 222021 WARNING nova.compute.manager [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:04:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3922688747' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.355 222021 DEBUG oslo_concurrency.processutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.366 222021 DEBUG nova.compute.provider_tree [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.385 222021 DEBUG nova.scheduler.client.report [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:04:59 np0005593233 nova_compute[222017]: 2026-01-23 10:04:59.448 222021 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:00.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:00.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:00 np0005593233 nova_compute[222017]: 2026-01-23 10:05:00.979 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:01 np0005593233 nova_compute[222017]: 2026-01-23 10:05:01.295 222021 DEBUG nova.compute.manager [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:01 np0005593233 nova_compute[222017]: 2026-01-23 10:05:01.296 222021 DEBUG oslo_concurrency.lockutils [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:01 np0005593233 nova_compute[222017]: 2026-01-23 10:05:01.296 222021 DEBUG oslo_concurrency.lockutils [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:01 np0005593233 nova_compute[222017]: 2026-01-23 10:05:01.296 222021 DEBUG oslo_concurrency.lockutils [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:01 np0005593233 nova_compute[222017]: 2026-01-23 10:05:01.297 222021 DEBUG nova.compute.manager [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:01 np0005593233 nova_compute[222017]: 2026-01-23 10:05:01.297 222021 WARNING nova.compute.manager [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:05:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:02.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:02.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:02 np0005593233 nova_compute[222017]: 2026-01-23 10:05:02.178 222021 DEBUG nova.network.neutron [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updating instance_info_cache with network_info: [{"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:02 np0005593233 nova_compute[222017]: 2026-01-23 10:05:02.220 222021 DEBUG oslo_concurrency.lockutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Releasing lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:02 np0005593233 nova_compute[222017]: 2026-01-23 10:05:02.640 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:05:02 np0005593233 nova_compute[222017]: 2026-01-23 10:05:02.719 222021 DEBUG nova.compute.manager [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:02 np0005593233 nova_compute[222017]: 2026-01-23 10:05:02.720 222021 DEBUG nova.compute.manager [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing instance network info cache due to event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:05:02 np0005593233 nova_compute[222017]: 2026-01-23 10:05:02.722 222021 DEBUG oslo_concurrency.lockutils [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:02 np0005593233 nova_compute[222017]: 2026-01-23 10:05:02.722 222021 DEBUG oslo_concurrency.lockutils [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:02 np0005593233 nova_compute[222017]: 2026-01-23 10:05:02.723 222021 DEBUG nova.network.neutron [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:05:03 np0005593233 nova_compute[222017]: 2026-01-23 10:05:03.613 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:04.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:04.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:04 np0005593233 podman[267140]: 2026-01-23 10:05:04.073490189 +0000 UTC m=+0.074712520 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:05:06 np0005593233 nova_compute[222017]: 2026-01-23 10:05:06.031 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:06.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:06.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e267 e267: 3 total, 3 up, 3 in
Jan 23 05:05:07 np0005593233 nova_compute[222017]: 2026-01-23 10:05:07.397 222021 DEBUG nova.network.neutron [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updated VIF entry in instance network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:05:07 np0005593233 nova_compute[222017]: 2026-01-23 10:05:07.398 222021 DEBUG nova.network.neutron [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:07 np0005593233 nova_compute[222017]: 2026-01-23 10:05:07.419 222021 DEBUG oslo_concurrency.lockutils [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:08.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:08.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:08 np0005593233 nova_compute[222017]: 2026-01-23 10:05:08.614 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:08 np0005593233 nova_compute[222017]: 2026-01-23 10:05:08.629 222021 DEBUG nova.compute.manager [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:08 np0005593233 nova_compute[222017]: 2026-01-23 10:05:08.630 222021 DEBUG oslo_concurrency.lockutils [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:05:08 np0005593233 nova_compute[222017]: 2026-01-23 10:05:08.630 222021 DEBUG oslo_concurrency.lockutils [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:05:08 np0005593233 nova_compute[222017]: 2026-01-23 10:05:08.631 222021 DEBUG oslo_concurrency.lockutils [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:05:08 np0005593233 nova_compute[222017]: 2026-01-23 10:05:08.631 222021 DEBUG nova.compute.manager [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:05:08 np0005593233 nova_compute[222017]: 2026-01-23 10:05:08.631 222021 WARNING nova.compute.manager [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state resized and task_state resize_reverting.
Jan 23 05:05:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:10.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:10.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:11 np0005593233 nova_compute[222017]: 2026-01-23 10:05:11.035 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:12.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:12.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:12 np0005593233 nova_compute[222017]: 2026-01-23 10:05:12.452 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:12.452 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:05:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:12.454 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:05:12 np0005593233 nova_compute[222017]: 2026-01-23 10:05:12.710 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 05:05:13 np0005593233 nova_compute[222017]: 2026-01-23 10:05:13.574 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162698.5722756, ae2a211d-e923-498b-9ceb-97274a2fd725 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:05:13 np0005593233 nova_compute[222017]: 2026-01-23 10:05:13.574 222021 INFO nova.compute.manager [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Stopped (Lifecycle Event)
Jan 23 05:05:13 np0005593233 nova_compute[222017]: 2026-01-23 10:05:13.608 222021 DEBUG nova.compute.manager [None req-69d8c932-b3dc-405d-b4fb-66c88d085483 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:05:13 np0005593233 nova_compute[222017]: 2026-01-23 10:05:13.617 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:14.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:14.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:15 np0005593233 kernel: tape9cb7ed5-cf (unregistering): left promiscuous mode
Jan 23 05:05:15 np0005593233 NetworkManager[48871]: <info>  [1769162715.3317] device (tape9cb7ed5-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.399 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:15 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:15Z|00502|binding|INFO|Releasing lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb from this chassis (sb_readonly=0)
Jan 23 05:05:15 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:15Z|00503|binding|INFO|Setting lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb down in Southbound
Jan 23 05:05:15 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:15Z|00504|binding|INFO|Removing iface tape9cb7ed5-cf ovn-installed in OVS
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.401 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.415 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:15.460 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:26:7e 10.100.0.11'], port_security=['fa:16:3e:50:26:7e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3fe4245a-5986-4aa5-8762-24329c7e1043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ed922f3-1187-4218-88c3-8aa17da9140a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca1cc631a7d348a5ad176273e81495bb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a17a68ae-d91e-463f-82bb-e1b60b7cf4d3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d7fa43d-e34b-4327-b645-5d2c8b16d3b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:05:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:15.462 140224 INFO neutron.agent.ovn.metadata.agent [-] Port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb in datapath 7ed922f3-1187-4218-88c3-8aa17da9140a unbound from our chassis
Jan 23 05:05:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:15.462 140224 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7ed922f3-1187-4218-88c3-8aa17da9140a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 23 05:05:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:15.464 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ea620f37-2727-4918-83f7-93a616df6b45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:05:15 np0005593233 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 23 05:05:15 np0005593233 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006f.scope: Consumed 15.563s CPU time.
Jan 23 05:05:15 np0005593233 systemd-machined[190954]: Machine qemu-53-instance-0000006f terminated.
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.728 222021 INFO nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance shutdown successfully after 13 seconds.
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.734 222021 INFO nova.virt.libvirt.driver [-] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance destroyed successfully.
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.735 222021 DEBUG nova.objects.instance [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'numa_topology' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.771 222021 INFO nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Attempting rescue
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.772 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.779 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.779 222021 INFO nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Creating image(s)
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.816 222021 DEBUG nova.storage.rbd_utils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.822 222021 DEBUG nova.objects.instance [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.840 222021 DEBUG nova.compute.manager [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-unplugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.841 222021 DEBUG oslo_concurrency.lockutils [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.841 222021 DEBUG oslo_concurrency.lockutils [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.841 222021 DEBUG oslo_concurrency.lockutils [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.841 222021 DEBUG nova.compute.manager [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] No waiting events found dispatching network-vif-unplugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.842 222021 WARNING nova.compute.manager [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received unexpected event network-vif-unplugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb for instance with vm_state active and task_state rescuing.
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.877 222021 DEBUG nova.storage.rbd_utils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.922 222021 DEBUG nova.storage.rbd_utils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:05:15 np0005593233 nova_compute[222017]: 2026-01-23 10:05:15.928 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.024 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.026 222021 DEBUG oslo_concurrency.lockutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.026 222021 DEBUG oslo_concurrency.lockutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.027 222021 DEBUG oslo_concurrency.lockutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:05:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:16.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.068 222021 DEBUG nova.storage.rbd_utils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.074 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:05:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:16.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.109 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.432 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.358s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.433 222021 DEBUG nova.objects.instance [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'migration_context' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.452 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.453 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Start _get_guest_xml network_info=[{"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "vif_mac": "fa:16:3e:50:26:7e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.454 222021 DEBUG nova.objects.instance [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'resources' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.475 222021 WARNING nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.485 222021 DEBUG nova.virt.libvirt.host [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.486 222021 DEBUG nova.virt.libvirt.host [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.490 222021 DEBUG nova.virt.libvirt.host [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.491 222021 DEBUG nova.virt.libvirt.host [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.492 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.493 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.493 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.494 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.494 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.494 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.494 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.494 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.494 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.495 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.495 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.495 222021 DEBUG nova.virt.hardware [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.495 222021 DEBUG nova.objects.instance [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:16 np0005593233 nova_compute[222017]: 2026-01-23 10:05:16.518 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 e268: 3 total, 3 up, 3 in
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #103. Immutable memtables: 0.
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:16.654436) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 63] Flushing memtable with next log file: 103
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162716654506, "job": 63, "event": "flush_started", "num_memtables": 1, "num_entries": 1749, "num_deletes": 258, "total_data_size": 3831305, "memory_usage": 3879936, "flush_reason": "Manual Compaction"}
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 63] Level-0 flush table #104: started
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162716679752, "cf_name": "default", "job": 63, "event": "table_file_creation", "file_number": 104, "file_size": 2506998, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 51629, "largest_seqno": 53373, "table_properties": {"data_size": 2499629, "index_size": 4248, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16261, "raw_average_key_size": 20, "raw_value_size": 2484539, "raw_average_value_size": 3113, "num_data_blocks": 184, "num_entries": 798, "num_filter_entries": 798, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162587, "oldest_key_time": 1769162587, "file_creation_time": 1769162716, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 104, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 63] Flush lasted 25413 microseconds, and 7483 cpu microseconds.
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:16.679843) [db/flush_job.cc:967] [default] [JOB 63] Level-0 flush table #104: 2506998 bytes OK
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:16.679883) [db/memtable_list.cc:519] [default] Level-0 commit table #104 started
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:16.682251) [db/memtable_list.cc:722] [default] Level-0 commit table #104: memtable #1 done
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:16.682277) EVENT_LOG_v1 {"time_micros": 1769162716682270, "job": 63, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:16.682307) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 63] Try to delete WAL files size 3823100, prev total WAL file size 3823100, number of live WAL files 2.
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000100.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:16.683457) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031373535' seq:72057594037927935, type:22 .. '6C6F676D0032303037' seq:0, type:0; will stop at (end)
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 64] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 63 Base level 0, inputs: [104(2448KB)], [102(10MB)]
Jan 23 05:05:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162716683525, "job": 64, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [104], "files_L6": [102], "score": -1, "input_data_size": 13361033, "oldest_snapshot_seqno": -1}
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2533107979' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.036 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.038 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 64] Generated table #105: 7698 keys, 13208864 bytes, temperature: kUnknown
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162717311213, "cf_name": "default", "job": 64, "event": "table_file_creation", "file_number": 105, "file_size": 13208864, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13155884, "index_size": 32644, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19269, "raw_key_size": 198558, "raw_average_key_size": 25, "raw_value_size": 13017038, "raw_average_value_size": 1690, "num_data_blocks": 1296, "num_entries": 7698, "num_filter_entries": 7698, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769162716, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 105, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2035619919' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.527 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.528 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:17.311540) [db/compaction/compaction_job.cc:1663] [default] [JOB 64] Compacted 1@0 + 1@6 files to L6 => 13208864 bytes
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:17.770586) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 21.3 rd, 21.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 10.4 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(10.6) write-amplify(5.3) OK, records in: 8234, records dropped: 536 output_compression: NoCompression
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:17.770691) EVENT_LOG_v1 {"time_micros": 1769162717770666, "job": 64, "event": "compaction_finished", "compaction_time_micros": 627800, "compaction_time_cpu_micros": 31834, "output_level": 6, "num_output_files": 1, "total_output_size": 13208864, "num_input_records": 8234, "num_output_records": 7698, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000104.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162717771898, "job": 64, "event": "table_file_deletion", "file_number": 104}
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000102.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162717777620, "job": 64, "event": "table_file_deletion", "file_number": 102}
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:16.683367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:17.777708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:17.777717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:17.777719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:17.777721) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:05:17.777724) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.974 222021 DEBUG nova.compute.manager [req-f91325a0-f474-4df8-b290-88f772047564 req-b06d55ec-90c9-43b2-a6c3-08e3698a02ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.976 222021 DEBUG oslo_concurrency.lockutils [req-f91325a0-f474-4df8-b290-88f772047564 req-b06d55ec-90c9-43b2-a6c3-08e3698a02ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.977 222021 DEBUG oslo_concurrency.lockutils [req-f91325a0-f474-4df8-b290-88f772047564 req-b06d55ec-90c9-43b2-a6c3-08e3698a02ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.977 222021 DEBUG oslo_concurrency.lockutils [req-f91325a0-f474-4df8-b290-88f772047564 req-b06d55ec-90c9-43b2-a6c3-08e3698a02ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.977 222021 DEBUG nova.compute.manager [req-f91325a0-f474-4df8-b290-88f772047564 req-b06d55ec-90c9-43b2-a6c3-08e3698a02ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] No waiting events found dispatching network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:17 np0005593233 nova_compute[222017]: 2026-01-23 10:05:17.978 222021 WARNING nova.compute.manager [req-f91325a0-f474-4df8-b290-88f772047564 req-b06d55ec-90c9-43b2-a6c3-08e3698a02ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received unexpected event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:05:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:05:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1199736752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.038 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.040 222021 DEBUG nova.virt.libvirt.vif [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:04:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1541258776',display_name='tempest-ServerRescueTestJSONUnderV235-server-1541258776',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1541258776',id=111,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:04:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ca1cc631a7d348a5ad176273e81495bb',ramdisk_id='',reservation_id='r-be45v9br',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw
_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-976372845',owner_user_name='tempest-ServerRescueTestJSONUnderV235-976372845-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:04:55Z,user_data=None,user_id='878babe1fbab428f98092e314b2ae0b1',uuid=3fe4245a-5986-4aa5-8762-24329c7e1043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "vif_mac": "fa:16:3e:50:26:7e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.041 222021 DEBUG nova.network.os_vif_util [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Converting VIF {"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "vif_mac": "fa:16:3e:50:26:7e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.042 222021 DEBUG nova.network.os_vif_util [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:50:26:7e,bridge_name='br-int',has_traffic_filtering=True,id=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb,network=Network(7ed922f3-1187-4218-88c3-8aa17da9140a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9cb7ed5-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.043 222021 DEBUG nova.objects.instance [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'pci_devices' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:18.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:18.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.081 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <uuid>3fe4245a-5986-4aa5-8762-24329c7e1043</uuid>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <name>instance-0000006f</name>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1541258776</nova:name>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:05:16</nova:creationTime>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <nova:user uuid="878babe1fbab428f98092e314b2ae0b1">tempest-ServerRescueTestJSONUnderV235-976372845-project-member</nova:user>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <nova:project uuid="ca1cc631a7d348a5ad176273e81495bb">tempest-ServerRescueTestJSONUnderV235-976372845</nova:project>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <nova:port uuid="e9cb7ed5-cf17-45b5-930e-f69b40b4feeb">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <entry name="serial">3fe4245a-5986-4aa5-8762-24329c7e1043</entry>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <entry name="uuid">3fe4245a-5986-4aa5-8762-24329c7e1043</entry>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3fe4245a-5986-4aa5-8762-24329c7e1043_disk.rescue">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3fe4245a-5986-4aa5-8762-24329c7e1043_disk">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <target dev="vdb" bus="virtio"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config.rescue">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:50:26:7e"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <target dev="tape9cb7ed5-cf"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/console.log" append="off"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:05:18 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:05:18 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:05:18 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:05:18 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.093 222021 INFO nova.virt.libvirt.driver [-] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance destroyed successfully.#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.166 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.166 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.166 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.166 222021 DEBUG nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] No VIF found with MAC fa:16:3e:50:26:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.167 222021 INFO nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Using config drive#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.193 222021 DEBUG nova.storage.rbd_utils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.223 222021 DEBUG nova.objects.instance [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'ec2_ids' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.260 222021 DEBUG nova.objects.instance [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'keypairs' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:18 np0005593233 nova_compute[222017]: 2026-01-23 10:05:18.618 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.107 222021 INFO nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Creating config drive at /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config.rescue#033[00m
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.114 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5l0vr0vd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.262 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5l0vr0vd" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.297 222021 DEBUG nova.storage.rbd_utils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] rbd image 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.303 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config.rescue 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.485 222021 DEBUG oslo_concurrency.processutils [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config.rescue 3fe4245a-5986-4aa5-8762-24329c7e1043_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.486 222021 INFO nova.virt.libvirt.driver [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Deleting local config drive /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043/disk.config.rescue because it was imported into RBD.#033[00m
Jan 23 05:05:19 np0005593233 kernel: tape9cb7ed5-cf: entered promiscuous mode
Jan 23 05:05:19 np0005593233 NetworkManager[48871]: <info>  [1769162719.5591] manager: (tape9cb7ed5-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Jan 23 05:05:19 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:19Z|00505|binding|INFO|Claiming lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb for this chassis.
Jan 23 05:05:19 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:19Z|00506|binding|INFO|e9cb7ed5-cf17-45b5-930e-f69b40b4feeb: Claiming fa:16:3e:50:26:7e 10.100.0.11
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.559 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:19.573 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:26:7e 10.100.0.11'], port_security=['fa:16:3e:50:26:7e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3fe4245a-5986-4aa5-8762-24329c7e1043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ed922f3-1187-4218-88c3-8aa17da9140a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca1cc631a7d348a5ad176273e81495bb', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a17a68ae-d91e-463f-82bb-e1b60b7cf4d3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d7fa43d-e34b-4327-b645-5d2c8b16d3b6, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:05:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:19.574 140224 INFO neutron.agent.ovn.metadata.agent [-] Port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb in datapath 7ed922f3-1187-4218-88c3-8aa17da9140a bound to our chassis#033[00m
Jan 23 05:05:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:19.575 140224 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7ed922f3-1187-4218-88c3-8aa17da9140a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 23 05:05:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:19.577 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ca86e57b-b1cf-4688-b8a3-d0103491ad30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:19 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:19Z|00507|binding|INFO|Setting lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb up in Southbound
Jan 23 05:05:19 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:19Z|00508|binding|INFO|Setting lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb ovn-installed in OVS
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.581 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:19 np0005593233 nova_compute[222017]: 2026-01-23 10:05:19.586 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:19 np0005593233 systemd-udevd[267404]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:05:19 np0005593233 NetworkManager[48871]: <info>  [1769162719.6124] device (tape9cb7ed5-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:05:19 np0005593233 NetworkManager[48871]: <info>  [1769162719.6141] device (tape9cb7ed5-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:05:19 np0005593233 systemd-machined[190954]: New machine qemu-54-instance-0000006f.
Jan 23 05:05:19 np0005593233 systemd[1]: Started Virtual Machine qemu-54-instance-0000006f.
Jan 23 05:05:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:20.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:20.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.114 222021 DEBUG nova.compute.manager [req-df87e497-c63e-483f-81d4-6d958f420ba7 req-d54a81be-10d5-42a0-92be-44dc08ebf410 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.114 222021 DEBUG oslo_concurrency.lockutils [req-df87e497-c63e-483f-81d4-6d958f420ba7 req-d54a81be-10d5-42a0-92be-44dc08ebf410 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.115 222021 DEBUG oslo_concurrency.lockutils [req-df87e497-c63e-483f-81d4-6d958f420ba7 req-d54a81be-10d5-42a0-92be-44dc08ebf410 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.115 222021 DEBUG oslo_concurrency.lockutils [req-df87e497-c63e-483f-81d4-6d958f420ba7 req-d54a81be-10d5-42a0-92be-44dc08ebf410 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.116 222021 DEBUG nova.compute.manager [req-df87e497-c63e-483f-81d4-6d958f420ba7 req-d54a81be-10d5-42a0-92be-44dc08ebf410 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] No waiting events found dispatching network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.116 222021 WARNING nova.compute.manager [req-df87e497-c63e-483f-81d4-6d958f420ba7 req-d54a81be-10d5-42a0-92be-44dc08ebf410 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received unexpected event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.229 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for 3fe4245a-5986-4aa5-8762-24329c7e1043 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.230 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162720.22874, 3fe4245a-5986-4aa5-8762-24329c7e1043 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.230 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.238 222021 DEBUG nova.compute.manager [None req-f2679522-632f-4150-824e-5f5632c1f790 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.266 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.269 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.325 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.325 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162720.2317975, 3fe4245a-5986-4aa5-8762-24329c7e1043 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.326 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] VM Started (Lifecycle Event)#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.367 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:20 np0005593233 nova_compute[222017]: 2026-01-23 10:05:20.375 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:05:21 np0005593233 nova_compute[222017]: 2026-01-23 10:05:21.094 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:21.456 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:22.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:22.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:22 np0005593233 podman[267476]: 2026-01-23 10:05:22.144180001 +0000 UTC m=+0.146577328 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:05:22 np0005593233 nova_compute[222017]: 2026-01-23 10:05:22.223 222021 DEBUG nova.compute.manager [req-744e3307-d85d-4cc8-91c7-1c3b4e9a051f req-f62a7dbe-705a-43c1-84c9-3120eb6eca7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:22 np0005593233 nova_compute[222017]: 2026-01-23 10:05:22.223 222021 DEBUG oslo_concurrency.lockutils [req-744e3307-d85d-4cc8-91c7-1c3b4e9a051f req-f62a7dbe-705a-43c1-84c9-3120eb6eca7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:22 np0005593233 nova_compute[222017]: 2026-01-23 10:05:22.224 222021 DEBUG oslo_concurrency.lockutils [req-744e3307-d85d-4cc8-91c7-1c3b4e9a051f req-f62a7dbe-705a-43c1-84c9-3120eb6eca7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:22 np0005593233 nova_compute[222017]: 2026-01-23 10:05:22.225 222021 DEBUG oslo_concurrency.lockutils [req-744e3307-d85d-4cc8-91c7-1c3b4e9a051f req-f62a7dbe-705a-43c1-84c9-3120eb6eca7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:22 np0005593233 nova_compute[222017]: 2026-01-23 10:05:22.225 222021 DEBUG nova.compute.manager [req-744e3307-d85d-4cc8-91c7-1c3b4e9a051f req-f62a7dbe-705a-43c1-84c9-3120eb6eca7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] No waiting events found dispatching network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:22 np0005593233 nova_compute[222017]: 2026-01-23 10:05:22.225 222021 WARNING nova.compute.manager [req-744e3307-d85d-4cc8-91c7-1c3b4e9a051f req-f62a7dbe-705a-43c1-84c9-3120eb6eca7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received unexpected event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb for instance with vm_state rescued and task_state None.#033[00m
Jan 23 05:05:23 np0005593233 nova_compute[222017]: 2026-01-23 10:05:23.569 222021 DEBUG nova.compute.manager [req-b2cbae60-a0d4-4a70-ba54-15cdedd1cabf req-4a90d238-f289-44d7-8f3e-850e0c0e330f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:23 np0005593233 nova_compute[222017]: 2026-01-23 10:05:23.570 222021 DEBUG nova.compute.manager [req-b2cbae60-a0d4-4a70-ba54-15cdedd1cabf req-4a90d238-f289-44d7-8f3e-850e0c0e330f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing instance network info cache due to event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:05:23 np0005593233 nova_compute[222017]: 2026-01-23 10:05:23.570 222021 DEBUG oslo_concurrency.lockutils [req-b2cbae60-a0d4-4a70-ba54-15cdedd1cabf req-4a90d238-f289-44d7-8f3e-850e0c0e330f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:23 np0005593233 nova_compute[222017]: 2026-01-23 10:05:23.571 222021 DEBUG oslo_concurrency.lockutils [req-b2cbae60-a0d4-4a70-ba54-15cdedd1cabf req-4a90d238-f289-44d7-8f3e-850e0c0e330f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:23 np0005593233 nova_compute[222017]: 2026-01-23 10:05:23.571 222021 DEBUG nova.network.neutron [req-b2cbae60-a0d4-4a70-ba54-15cdedd1cabf req-4a90d238-f289-44d7-8f3e-850e0c0e330f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing network info cache for port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:05:23 np0005593233 nova_compute[222017]: 2026-01-23 10:05:23.620 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:24.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:24.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:24 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:24Z|00509|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:05:24 np0005593233 nova_compute[222017]: 2026-01-23 10:05:24.478 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 05:05:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:26.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 05:05:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:26.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:26 np0005593233 nova_compute[222017]: 2026-01-23 10:05:26.092 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:26 np0005593233 nova_compute[222017]: 2026-01-23 10:05:26.154 222021 DEBUG nova.compute.manager [req-11247228-7e71-4c26-bf9b-bd98c453eb96 req-15ff599b-b330-443e-9b9a-b00a4423c93d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:26 np0005593233 nova_compute[222017]: 2026-01-23 10:05:26.155 222021 DEBUG nova.compute.manager [req-11247228-7e71-4c26-bf9b-bd98c453eb96 req-15ff599b-b330-443e-9b9a-b00a4423c93d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing instance network info cache due to event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:05:26 np0005593233 nova_compute[222017]: 2026-01-23 10:05:26.155 222021 DEBUG oslo_concurrency.lockutils [req-11247228-7e71-4c26-bf9b-bd98c453eb96 req-15ff599b-b330-443e-9b9a-b00a4423c93d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:26 np0005593233 nova_compute[222017]: 2026-01-23 10:05:26.292 222021 DEBUG nova.network.neutron [req-b2cbae60-a0d4-4a70-ba54-15cdedd1cabf req-4a90d238-f289-44d7-8f3e-850e0c0e330f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updated VIF entry in instance network info cache for port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:05:26 np0005593233 nova_compute[222017]: 2026-01-23 10:05:26.293 222021 DEBUG nova.network.neutron [req-b2cbae60-a0d4-4a70-ba54-15cdedd1cabf req-4a90d238-f289-44d7-8f3e-850e0c0e330f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updating instance_info_cache with network_info: [{"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:26 np0005593233 nova_compute[222017]: 2026-01-23 10:05:26.332 222021 DEBUG oslo_concurrency.lockutils [req-b2cbae60-a0d4-4a70-ba54-15cdedd1cabf req-4a90d238-f289-44d7-8f3e-850e0c0e330f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:26 np0005593233 nova_compute[222017]: 2026-01-23 10:05:26.334 222021 DEBUG oslo_concurrency.lockutils [req-11247228-7e71-4c26-bf9b-bd98c453eb96 req-15ff599b-b330-443e-9b9a-b00a4423c93d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:26 np0005593233 nova_compute[222017]: 2026-01-23 10:05:26.335 222021 DEBUG nova.network.neutron [req-11247228-7e71-4c26-bf9b-bd98c453eb96 req-15ff599b-b330-443e-9b9a-b00a4423c93d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing network info cache for port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:05:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:28.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:28.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:28 np0005593233 nova_compute[222017]: 2026-01-23 10:05:28.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:28 np0005593233 nova_compute[222017]: 2026-01-23 10:05:28.623 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:28 np0005593233 nova_compute[222017]: 2026-01-23 10:05:28.688 222021 DEBUG nova.network.neutron [req-11247228-7e71-4c26-bf9b-bd98c453eb96 req-15ff599b-b330-443e-9b9a-b00a4423c93d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updated VIF entry in instance network info cache for port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:05:28 np0005593233 nova_compute[222017]: 2026-01-23 10:05:28.690 222021 DEBUG nova.network.neutron [req-11247228-7e71-4c26-bf9b-bd98c453eb96 req-15ff599b-b330-443e-9b9a-b00a4423c93d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updating instance_info_cache with network_info: [{"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:28 np0005593233 nova_compute[222017]: 2026-01-23 10:05:28.720 222021 DEBUG oslo_concurrency.lockutils [req-11247228-7e71-4c26-bf9b-bd98c453eb96 req-15ff599b-b330-443e-9b9a-b00a4423c93d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:30.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:30.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:30 np0005593233 nova_compute[222017]: 2026-01-23 10:05:30.950 222021 DEBUG nova.compute.manager [req-cc4acae9-a755-46d2-a575-28b2fe3fbe9c req-b1cb5909-37cc-4c36-ac62-6857cfd330a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:30 np0005593233 nova_compute[222017]: 2026-01-23 10:05:30.950 222021 DEBUG nova.compute.manager [req-cc4acae9-a755-46d2-a575-28b2fe3fbe9c req-b1cb5909-37cc-4c36-ac62-6857cfd330a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing instance network info cache due to event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:05:30 np0005593233 nova_compute[222017]: 2026-01-23 10:05:30.950 222021 DEBUG oslo_concurrency.lockutils [req-cc4acae9-a755-46d2-a575-28b2fe3fbe9c req-b1cb5909-37cc-4c36-ac62-6857cfd330a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:30 np0005593233 nova_compute[222017]: 2026-01-23 10:05:30.951 222021 DEBUG oslo_concurrency.lockutils [req-cc4acae9-a755-46d2-a575-28b2fe3fbe9c req-b1cb5909-37cc-4c36-ac62-6857cfd330a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:30 np0005593233 nova_compute[222017]: 2026-01-23 10:05:30.951 222021 DEBUG nova.network.neutron [req-cc4acae9-a755-46d2-a575-28b2fe3fbe9c req-b1cb5909-37cc-4c36-ac62-6857cfd330a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing network info cache for port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:05:31 np0005593233 nova_compute[222017]: 2026-01-23 10:05:31.095 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:32.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:32.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:05:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3509404422' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:05:33 np0005593233 nova_compute[222017]: 2026-01-23 10:05:33.625 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:34.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:34.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:34 np0005593233 nova_compute[222017]: 2026-01-23 10:05:34.457 222021 DEBUG nova.network.neutron [req-cc4acae9-a755-46d2-a575-28b2fe3fbe9c req-b1cb5909-37cc-4c36-ac62-6857cfd330a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updated VIF entry in instance network info cache for port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:05:34 np0005593233 nova_compute[222017]: 2026-01-23 10:05:34.458 222021 DEBUG nova.network.neutron [req-cc4acae9-a755-46d2-a575-28b2fe3fbe9c req-b1cb5909-37cc-4c36-ac62-6857cfd330a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updating instance_info_cache with network_info: [{"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:34 np0005593233 nova_compute[222017]: 2026-01-23 10:05:34.491 222021 DEBUG oslo_concurrency.lockutils [req-cc4acae9-a755-46d2-a575-28b2fe3fbe9c req-b1cb5909-37cc-4c36-ac62-6857cfd330a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:35 np0005593233 podman[267501]: 2026-01-23 10:05:35.074104344 +0000 UTC m=+0.077837207 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:05:35 np0005593233 nova_compute[222017]: 2026-01-23 10:05:35.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:35 np0005593233 nova_compute[222017]: 2026-01-23 10:05:35.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:35 np0005593233 nova_compute[222017]: 2026-01-23 10:05:35.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:35 np0005593233 nova_compute[222017]: 2026-01-23 10:05:35.411 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:35 np0005593233 nova_compute[222017]: 2026-01-23 10:05:35.411 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:35 np0005593233 nova_compute[222017]: 2026-01-23 10:05:35.411 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:35 np0005593233 nova_compute[222017]: 2026-01-23 10:05:35.412 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:05:35 np0005593233 nova_compute[222017]: 2026-01-23 10:05:35.412 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:05:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1821917588' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:05:35 np0005593233 nova_compute[222017]: 2026-01-23 10:05:35.908 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.068 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.069 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.073 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000006f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.074 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000006f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.074 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000006f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:05:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:36.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.096 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:36.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.286 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.288 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4169MB free_disk=20.8001708984375GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.288 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.288 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.638 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 51a66602-3548-4341-add1-988bd6c7aa57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.639 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 3fe4245a-5986-4aa5-8762-24329c7e1043 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.641 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.641 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:05:36 np0005593233 nova_compute[222017]: 2026-01-23 10:05:36.715 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:05:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/942932129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:05:37 np0005593233 nova_compute[222017]: 2026-01-23 10:05:37.196 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:37 np0005593233 nova_compute[222017]: 2026-01-23 10:05:37.203 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:05:37 np0005593233 nova_compute[222017]: 2026-01-23 10:05:37.370 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:05:37 np0005593233 nova_compute[222017]: 2026-01-23 10:05:37.399 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:05:37 np0005593233 nova_compute[222017]: 2026-01-23 10:05:37.400 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:38.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:38.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:38 np0005593233 nova_compute[222017]: 2026-01-23 10:05:38.401 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:38 np0005593233 nova_compute[222017]: 2026-01-23 10:05:38.402 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:38 np0005593233 nova_compute[222017]: 2026-01-23 10:05:38.402 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:38 np0005593233 nova_compute[222017]: 2026-01-23 10:05:38.403 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:05:38 np0005593233 nova_compute[222017]: 2026-01-23 10:05:38.627 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:39 np0005593233 nova_compute[222017]: 2026-01-23 10:05:39.926 222021 DEBUG nova.compute.manager [req-521e8b0b-017c-4aa2-9c39-c9ef9750703e req-ebff0055-f8c8-444e-88ea-dd6e7383a96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:39 np0005593233 nova_compute[222017]: 2026-01-23 10:05:39.926 222021 DEBUG nova.compute.manager [req-521e8b0b-017c-4aa2-9c39-c9ef9750703e req-ebff0055-f8c8-444e-88ea-dd6e7383a96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing instance network info cache due to event network-changed-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:05:39 np0005593233 nova_compute[222017]: 2026-01-23 10:05:39.927 222021 DEBUG oslo_concurrency.lockutils [req-521e8b0b-017c-4aa2-9c39-c9ef9750703e req-ebff0055-f8c8-444e-88ea-dd6e7383a96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:39 np0005593233 nova_compute[222017]: 2026-01-23 10:05:39.927 222021 DEBUG oslo_concurrency.lockutils [req-521e8b0b-017c-4aa2-9c39-c9ef9750703e req-ebff0055-f8c8-444e-88ea-dd6e7383a96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:39 np0005593233 nova_compute[222017]: 2026-01-23 10:05:39.927 222021 DEBUG nova.network.neutron [req-521e8b0b-017c-4aa2-9c39-c9ef9750703e req-ebff0055-f8c8-444e-88ea-dd6e7383a96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Refreshing network info cache for port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:05:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:40.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:40.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:41 np0005593233 nova_compute[222017]: 2026-01-23 10:05:41.099 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:42.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:42.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:42 np0005593233 nova_compute[222017]: 2026-01-23 10:05:42.620 222021 DEBUG nova.network.neutron [req-521e8b0b-017c-4aa2-9c39-c9ef9750703e req-ebff0055-f8c8-444e-88ea-dd6e7383a96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updated VIF entry in instance network info cache for port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:05:42 np0005593233 nova_compute[222017]: 2026-01-23 10:05:42.621 222021 DEBUG nova.network.neutron [req-521e8b0b-017c-4aa2-9c39-c9ef9750703e req-ebff0055-f8c8-444e-88ea-dd6e7383a96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updating instance_info_cache with network_info: [{"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:42 np0005593233 nova_compute[222017]: 2026-01-23 10:05:42.639 222021 DEBUG oslo_concurrency.lockutils [req-521e8b0b-017c-4aa2-9c39-c9ef9750703e req-ebff0055-f8c8-444e-88ea-dd6e7383a96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-3fe4245a-5986-4aa5-8762-24329c7e1043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:42.667 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:42.668 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:42.669 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:43 np0005593233 nova_compute[222017]: 2026-01-23 10:05:43.630 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:44.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:44.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:44 np0005593233 nova_compute[222017]: 2026-01-23 10:05:44.381 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:44 np0005593233 nova_compute[222017]: 2026-01-23 10:05:44.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:44 np0005593233 nova_compute[222017]: 2026-01-23 10:05:44.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:05:44 np0005593233 nova_compute[222017]: 2026-01-23 10:05:44.410 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:05:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:05:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1062943653' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:05:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:05:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1062943653' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:05:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:46.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:46.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.150 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.445 222021 DEBUG nova.compute.manager [req-1d3423d4-8abb-4497-9f14-76068f9772ae req-572bb9d7-6032-4a90-9678-3e202e37bf18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received event network-changed-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.446 222021 DEBUG nova.compute.manager [req-1d3423d4-8abb-4497-9f14-76068f9772ae req-572bb9d7-6032-4a90-9678-3e202e37bf18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Refreshing instance network info cache due to event network-changed-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.446 222021 DEBUG oslo_concurrency.lockutils [req-1d3423d4-8abb-4497-9f14-76068f9772ae req-572bb9d7-6032-4a90-9678-3e202e37bf18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.446 222021 DEBUG oslo_concurrency.lockutils [req-1d3423d4-8abb-4497-9f14-76068f9772ae req-572bb9d7-6032-4a90-9678-3e202e37bf18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.447 222021 DEBUG nova.network.neutron [req-1d3423d4-8abb-4497-9f14-76068f9772ae req-572bb9d7-6032-4a90-9678-3e202e37bf18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Refreshing network info cache for port 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.480 222021 DEBUG oslo_concurrency.lockutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.483 222021 DEBUG oslo_concurrency.lockutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.483 222021 DEBUG oslo_concurrency.lockutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.483 222021 DEBUG oslo_concurrency.lockutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.483 222021 DEBUG oslo_concurrency.lockutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.484 222021 INFO nova.compute.manager [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Terminating instance#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.485 222021 DEBUG nova.compute.manager [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:05:46 np0005593233 kernel: tape9cb7ed5-cf (unregistering): left promiscuous mode
Jan 23 05:05:46 np0005593233 NetworkManager[48871]: <info>  [1769162746.5465] device (tape9cb7ed5-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.552 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:46 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:46Z|00510|binding|INFO|Releasing lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb from this chassis (sb_readonly=0)
Jan 23 05:05:46 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:46Z|00511|binding|INFO|Setting lport e9cb7ed5-cf17-45b5-930e-f69b40b4feeb down in Southbound
Jan 23 05:05:46 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:46Z|00512|binding|INFO|Removing iface tape9cb7ed5-cf ovn-installed in OVS
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.557 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:46.563 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:26:7e 10.100.0.11'], port_security=['fa:16:3e:50:26:7e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3fe4245a-5986-4aa5-8762-24329c7e1043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ed922f3-1187-4218-88c3-8aa17da9140a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca1cc631a7d348a5ad176273e81495bb', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a17a68ae-d91e-463f-82bb-e1b60b7cf4d3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d7fa43d-e34b-4327-b645-5d2c8b16d3b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:05:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:46.564 140224 INFO neutron.agent.ovn.metadata.agent [-] Port e9cb7ed5-cf17-45b5-930e-f69b40b4feeb in datapath 7ed922f3-1187-4218-88c3-8aa17da9140a unbound from our chassis#033[00m
Jan 23 05:05:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:46.565 140224 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7ed922f3-1187-4218-88c3-8aa17da9140a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 23 05:05:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:05:46.567 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[65325330-3a07-4086-a808-83cbe2b38dc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.570 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:46 np0005593233 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 23 05:05:46 np0005593233 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d0000006f.scope: Consumed 15.003s CPU time.
Jan 23 05:05:46 np0005593233 systemd-machined[190954]: Machine qemu-54-instance-0000006f terminated.
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.720 222021 INFO nova.virt.libvirt.driver [-] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Instance destroyed successfully.#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.722 222021 DEBUG nova.objects.instance [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lazy-loading 'resources' on Instance uuid 3fe4245a-5986-4aa5-8762-24329c7e1043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.737 222021 DEBUG nova.virt.libvirt.vif [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:04:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1541258776',display_name='tempest-ServerRescueTestJSONUnderV235-server-1541258776',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1541258776',id=111,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:05:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ca1cc631a7d348a5ad176273e81495bb',ramdisk_id='',reservation_id='r-be45v9br',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-976372845',owner_user_name='tempest-ServerRescueTestJSONUnderV235-976372845-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:05:20Z,user_data=None,user_id='878babe1fbab428f98092e314b2ae0b1',uuid=3fe4245a-5986-4aa5-8762-24329c7e1043,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.737 222021 DEBUG nova.network.os_vif_util [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Converting VIF {"id": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "address": "fa:16:3e:50:26:7e", "network": {"id": "7ed922f3-1187-4218-88c3-8aa17da9140a", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-831983545-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "ca1cc631a7d348a5ad176273e81495bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9cb7ed5-cf", "ovs_interfaceid": "e9cb7ed5-cf17-45b5-930e-f69b40b4feeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.739 222021 DEBUG nova.network.os_vif_util [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:50:26:7e,bridge_name='br-int',has_traffic_filtering=True,id=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb,network=Network(7ed922f3-1187-4218-88c3-8aa17da9140a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9cb7ed5-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.739 222021 DEBUG os_vif [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:50:26:7e,bridge_name='br-int',has_traffic_filtering=True,id=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb,network=Network(7ed922f3-1187-4218-88c3-8aa17da9140a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9cb7ed5-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.742 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.742 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9cb7ed5-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.747 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:46 np0005593233 nova_compute[222017]: 2026-01-23 10:05:46.752 222021 INFO os_vif [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:50:26:7e,bridge_name='br-int',has_traffic_filtering=True,id=e9cb7ed5-cf17-45b5-930e-f69b40b4feeb,network=Network(7ed922f3-1187-4218-88c3-8aa17da9140a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9cb7ed5-cf')#033[00m
Jan 23 05:05:47 np0005593233 nova_compute[222017]: 2026-01-23 10:05:47.730 222021 DEBUG nova.compute.manager [req-c6ed964c-d208-4150-b68d-07b2c04cd0d4 req-d5403ea2-fa14-468e-ae11-55a215792558 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-unplugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:47 np0005593233 nova_compute[222017]: 2026-01-23 10:05:47.730 222021 DEBUG oslo_concurrency.lockutils [req-c6ed964c-d208-4150-b68d-07b2c04cd0d4 req-d5403ea2-fa14-468e-ae11-55a215792558 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:47 np0005593233 nova_compute[222017]: 2026-01-23 10:05:47.731 222021 DEBUG oslo_concurrency.lockutils [req-c6ed964c-d208-4150-b68d-07b2c04cd0d4 req-d5403ea2-fa14-468e-ae11-55a215792558 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:47 np0005593233 nova_compute[222017]: 2026-01-23 10:05:47.731 222021 DEBUG oslo_concurrency.lockutils [req-c6ed964c-d208-4150-b68d-07b2c04cd0d4 req-d5403ea2-fa14-468e-ae11-55a215792558 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:47 np0005593233 nova_compute[222017]: 2026-01-23 10:05:47.731 222021 DEBUG nova.compute.manager [req-c6ed964c-d208-4150-b68d-07b2c04cd0d4 req-d5403ea2-fa14-468e-ae11-55a215792558 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] No waiting events found dispatching network-vif-unplugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:47 np0005593233 nova_compute[222017]: 2026-01-23 10:05:47.731 222021 DEBUG nova.compute.manager [req-c6ed964c-d208-4150-b68d-07b2c04cd0d4 req-d5403ea2-fa14-468e-ae11-55a215792558 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-unplugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:05:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:48.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:48.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:48 np0005593233 nova_compute[222017]: 2026-01-23 10:05:48.225 222021 INFO nova.virt.libvirt.driver [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Deleting instance files /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043_del#033[00m
Jan 23 05:05:48 np0005593233 nova_compute[222017]: 2026-01-23 10:05:48.226 222021 INFO nova.virt.libvirt.driver [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Deletion of /var/lib/nova/instances/3fe4245a-5986-4aa5-8762-24329c7e1043_del complete#033[00m
Jan 23 05:05:48 np0005593233 nova_compute[222017]: 2026-01-23 10:05:48.311 222021 INFO nova.compute.manager [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Took 1.83 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:05:48 np0005593233 nova_compute[222017]: 2026-01-23 10:05:48.311 222021 DEBUG oslo.service.loopingcall [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:05:48 np0005593233 nova_compute[222017]: 2026-01-23 10:05:48.312 222021 DEBUG nova.compute.manager [-] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:05:48 np0005593233 nova_compute[222017]: 2026-01-23 10:05:48.312 222021 DEBUG nova.network.neutron [-] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:05:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:05:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:05:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:05:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:05:48 np0005593233 nova_compute[222017]: 2026-01-23 10:05:48.609 222021 DEBUG nova.network.neutron [req-1d3423d4-8abb-4497-9f14-76068f9772ae req-572bb9d7-6032-4a90-9678-3e202e37bf18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updated VIF entry in instance network info cache for port 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:05:48 np0005593233 nova_compute[222017]: 2026-01-23 10:05:48.609 222021 DEBUG nova.network.neutron [req-1d3423d4-8abb-4497-9f14-76068f9772ae req-572bb9d7-6032-4a90-9678-3e202e37bf18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updating instance_info_cache with network_info: [{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:48 np0005593233 nova_compute[222017]: 2026-01-23 10:05:48.633 222021 DEBUG oslo_concurrency.lockutils [req-1d3423d4-8abb-4497-9f14-76068f9772ae req-572bb9d7-6032-4a90-9678-3e202e37bf18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.622 222021 DEBUG nova.network.neutron [-] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.647 222021 INFO nova.compute.manager [-] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Took 1.33 seconds to deallocate network for instance.#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.706 222021 DEBUG oslo_concurrency.lockutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.707 222021 DEBUG oslo_concurrency.lockutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.789 222021 DEBUG oslo_concurrency.processutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.872 222021 DEBUG nova.compute.manager [req-41c7960e-fd4a-4274-9395-85d6665fa7fd req-6bf5072f-13a2-40c1-b5a0-de209945808f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.873 222021 DEBUG oslo_concurrency.lockutils [req-41c7960e-fd4a-4274-9395-85d6665fa7fd req-6bf5072f-13a2-40c1-b5a0-de209945808f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.873 222021 DEBUG oslo_concurrency.lockutils [req-41c7960e-fd4a-4274-9395-85d6665fa7fd req-6bf5072f-13a2-40c1-b5a0-de209945808f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.873 222021 DEBUG oslo_concurrency.lockutils [req-41c7960e-fd4a-4274-9395-85d6665fa7fd req-6bf5072f-13a2-40c1-b5a0-de209945808f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.874 222021 DEBUG nova.compute.manager [req-41c7960e-fd4a-4274-9395-85d6665fa7fd req-6bf5072f-13a2-40c1-b5a0-de209945808f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] No waiting events found dispatching network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.874 222021 WARNING nova.compute.manager [req-41c7960e-fd4a-4274-9395-85d6665fa7fd req-6bf5072f-13a2-40c1-b5a0-de209945808f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received unexpected event network-vif-plugged-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:05:49 np0005593233 nova_compute[222017]: 2026-01-23 10:05:49.874 222021 DEBUG nova.compute.manager [req-41c7960e-fd4a-4274-9395-85d6665fa7fd req-6bf5072f-13a2-40c1-b5a0-de209945808f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Received event network-vif-deleted-e9cb7ed5-cf17-45b5-930e-f69b40b4feeb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:50.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:50.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:05:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/611670186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:05:50 np0005593233 nova_compute[222017]: 2026-01-23 10:05:50.263 222021 DEBUG oslo_concurrency.processutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:50 np0005593233 nova_compute[222017]: 2026-01-23 10:05:50.272 222021 DEBUG nova.compute.provider_tree [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:05:50 np0005593233 nova_compute[222017]: 2026-01-23 10:05:50.294 222021 DEBUG nova.scheduler.client.report [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:05:50 np0005593233 nova_compute[222017]: 2026-01-23 10:05:50.360 222021 DEBUG oslo_concurrency.lockutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:50 np0005593233 nova_compute[222017]: 2026-01-23 10:05:50.433 222021 INFO nova.scheduler.client.report [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Deleted allocations for instance 3fe4245a-5986-4aa5-8762-24329c7e1043#033[00m
Jan 23 05:05:50 np0005593233 nova_compute[222017]: 2026-01-23 10:05:50.531 222021 DEBUG oslo_concurrency.lockutils [None req-7ca31bb8-c6fe-4787-81c9-6a21c6812eba 878babe1fbab428f98092e314b2ae0b1 ca1cc631a7d348a5ad176273e81495bb - - default default] Lock "3fe4245a-5986-4aa5-8762-24329c7e1043" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:51 np0005593233 nova_compute[222017]: 2026-01-23 10:05:51.152 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:51 np0005593233 nova_compute[222017]: 2026-01-23 10:05:51.772 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:52.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:52.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:53 np0005593233 podman[267756]: 2026-01-23 10:05:53.131133472 +0000 UTC m=+0.127610148 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:05:53 np0005593233 nova_compute[222017]: 2026-01-23 10:05:53.405 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:54.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:54.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:56.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:56.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:56 np0005593233 nova_compute[222017]: 2026-01-23 10:05:56.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:05:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:05:56 np0005593233 nova_compute[222017]: 2026-01-23 10:05:56.819 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:56 np0005593233 ovn_controller[130653]: 2026-01-23T10:05:56Z|00513|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:05:57 np0005593233 nova_compute[222017]: 2026-01-23 10:05:57.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:05:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:58.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:05:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:05:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:58.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:00.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:00.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:01 np0005593233 nova_compute[222017]: 2026-01-23 10:06:01.215 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:01 np0005593233 nova_compute[222017]: 2026-01-23 10:06:01.720 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162746.7180903, 3fe4245a-5986-4aa5-8762-24329c7e1043 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:01 np0005593233 nova_compute[222017]: 2026-01-23 10:06:01.722 222021 INFO nova.compute.manager [-] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:06:01 np0005593233 nova_compute[222017]: 2026-01-23 10:06:01.798 222021 DEBUG nova.compute.manager [None req-2c669eff-dcdc-41e1-ae04-c9b2934f963f - - - - - -] [instance: 3fe4245a-5986-4aa5-8762-24329c7e1043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:01 np0005593233 nova_compute[222017]: 2026-01-23 10:06:01.821 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:02.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:02.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:04.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:04.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:06 np0005593233 podman[267833]: 2026-01-23 10:06:06.055938003 +0000 UTC m=+0.060094161 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:06:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:06.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:06:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:06.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:06:06 np0005593233 nova_compute[222017]: 2026-01-23 10:06:06.218 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:06 np0005593233 nova_compute[222017]: 2026-01-23 10:06:06.824 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:08.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:08.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:10.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:10.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:11 np0005593233 nova_compute[222017]: 2026-01-23 10:06:11.223 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:11 np0005593233 nova_compute[222017]: 2026-01-23 10:06:11.828 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:12.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:12.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:14.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:14.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:16.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:16.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:16 np0005593233 nova_compute[222017]: 2026-01-23 10:06:16.226 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593233 nova_compute[222017]: 2026-01-23 10:06:16.877 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:18.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:18.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:19 np0005593233 nova_compute[222017]: 2026-01-23 10:06:19.453 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:06:19.454 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:06:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:06:19.456 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:06:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:20.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:20.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #106. Immutable memtables: 0.
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.434732) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 65] Flushing memtable with next log file: 106
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780434825, "job": 65, "event": "flush_started", "num_memtables": 1, "num_entries": 903, "num_deletes": 251, "total_data_size": 1699139, "memory_usage": 1732688, "flush_reason": "Manual Compaction"}
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 65] Level-0 flush table #107: started
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780449422, "cf_name": "default", "job": 65, "event": "table_file_creation", "file_number": 107, "file_size": 1121837, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 53378, "largest_seqno": 54276, "table_properties": {"data_size": 1117637, "index_size": 1852, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9647, "raw_average_key_size": 19, "raw_value_size": 1109191, "raw_average_value_size": 2277, "num_data_blocks": 81, "num_entries": 487, "num_filter_entries": 487, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162717, "oldest_key_time": 1769162717, "file_creation_time": 1769162780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 107, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 65] Flush lasted 14819 microseconds, and 5392 cpu microseconds.
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.449548) [db/flush_job.cc:967] [default] [JOB 65] Level-0 flush table #107: 1121837 bytes OK
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.449592) [db/memtable_list.cc:519] [default] Level-0 commit table #107 started
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.453301) [db/memtable_list.cc:722] [default] Level-0 commit table #107: memtable #1 done
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.453376) EVENT_LOG_v1 {"time_micros": 1769162780453360, "job": 65, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.453413) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 65] Try to delete WAL files size 1694475, prev total WAL file size 1694475, number of live WAL files 2.
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000103.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.454363) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034353138' seq:72057594037927935, type:22 .. '7061786F730034373730' seq:0, type:0; will stop at (end)
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 66] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 65 Base level 0, inputs: [107(1095KB)], [105(12MB)]
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780454411, "job": 66, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [107], "files_L6": [105], "score": -1, "input_data_size": 14330701, "oldest_snapshot_seqno": -1}
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 66] Generated table #108: 7671 keys, 12511920 bytes, temperature: kUnknown
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780549124, "cf_name": "default", "job": 66, "event": "table_file_creation", "file_number": 108, "file_size": 12511920, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12459701, "index_size": 31947, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19205, "raw_key_size": 198767, "raw_average_key_size": 25, "raw_value_size": 12321820, "raw_average_value_size": 1606, "num_data_blocks": 1261, "num_entries": 7671, "num_filter_entries": 7671, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769162780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 108, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.549671) [db/compaction/compaction_job.cc:1663] [default] [JOB 66] Compacted 1@0 + 1@6 files to L6 => 12511920 bytes
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.559756) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.9 rd, 131.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 12.6 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(23.9) write-amplify(11.2) OK, records in: 8185, records dropped: 514 output_compression: NoCompression
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.559825) EVENT_LOG_v1 {"time_micros": 1769162780559800, "job": 66, "event": "compaction_finished", "compaction_time_micros": 94978, "compaction_time_cpu_micros": 38598, "output_level": 6, "num_output_files": 1, "total_output_size": 12511920, "num_input_records": 8185, "num_output_records": 7671, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000107.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780560505, "job": 66, "event": "table_file_deletion", "file_number": 107}
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000105.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780563549, "job": 66, "event": "table_file_deletion", "file_number": 105}
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.454289) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.563819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.563830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.563832) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.563835) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:06:20.563846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:21 np0005593233 nova_compute[222017]: 2026-01-23 10:06:21.229 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:21 np0005593233 nova_compute[222017]: 2026-01-23 10:06:21.879 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:22.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:22.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:06:23.460 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:24 np0005593233 podman[267852]: 2026-01-23 10:06:24.135144659 +0000 UTC m=+0.139153571 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:06:24 np0005593233 nova_compute[222017]: 2026-01-23 10:06:24.165 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "0105d5bf-f2c2-4ed5-822c-a2172abb0582" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:24 np0005593233 nova_compute[222017]: 2026-01-23 10:06:24.166 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "0105d5bf-f2c2-4ed5-822c-a2172abb0582" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:24.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:24.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:24 np0005593233 nova_compute[222017]: 2026-01-23 10:06:24.263 222021 DEBUG nova.compute.manager [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:06:24 np0005593233 nova_compute[222017]: 2026-01-23 10:06:24.434 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:24 np0005593233 nova_compute[222017]: 2026-01-23 10:06:24.435 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:24 np0005593233 nova_compute[222017]: 2026-01-23 10:06:24.452 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:06:24 np0005593233 nova_compute[222017]: 2026-01-23 10:06:24.452 222021 INFO nova.compute.claims [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:06:24 np0005593233 nova_compute[222017]: 2026-01-23 10:06:24.785 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:06:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1370664241' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.267 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.273 222021 DEBUG nova.compute.provider_tree [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.342 222021 DEBUG nova.scheduler.client.report [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.467 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.468 222021 DEBUG nova.compute.manager [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.621 222021 DEBUG nova.compute.manager [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.694 222021 INFO nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.732 222021 DEBUG nova.compute.manager [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.991 222021 DEBUG nova.compute.manager [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.992 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:06:25 np0005593233 nova_compute[222017]: 2026-01-23 10:06:25.993 222021 INFO nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Creating image(s)#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.031 222021 DEBUG nova.storage.rbd_utils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.068 222021 DEBUG nova.storage.rbd_utils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.103 222021 DEBUG nova.storage.rbd_utils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.107 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:26.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:26.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.206 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.207 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.209 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.210 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.245 222021 DEBUG nova.storage.rbd_utils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.253 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.284 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.571 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.318s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.660 222021 DEBUG nova.storage.rbd_utils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] resizing rbd image 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.780 222021 DEBUG nova.objects.instance [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'migration_context' on Instance uuid 0105d5bf-f2c2-4ed5-822c-a2172abb0582 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.881 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.926 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.927 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Ensure instance console log exists: /var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.928 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.928 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.929 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.931 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.936 222021 WARNING nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.948 222021 DEBUG nova.virt.libvirt.host [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.950 222021 DEBUG nova.virt.libvirt.host [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.963 222021 DEBUG nova.virt.libvirt.host [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.964 222021 DEBUG nova.virt.libvirt.host [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.966 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.966 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.967 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.967 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.967 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.967 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.968 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.968 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.968 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.968 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.968 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.969 222021 DEBUG nova.virt.hardware [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:06:26 np0005593233 nova_compute[222017]: 2026-01-23 10:06:26.972 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3858484359' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:27 np0005593233 nova_compute[222017]: 2026-01-23 10:06:27.453 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:06:27 np0005593233 nova_compute[222017]: 2026-01-23 10:06:27.488 222021 DEBUG nova.storage.rbd_utils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:06:27 np0005593233 nova_compute[222017]: 2026-01-23 10:06:27.493 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:06:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/992619440' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:28 np0005593233 nova_compute[222017]: 2026-01-23 10:06:28.053 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.559s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:06:28 np0005593233 nova_compute[222017]: 2026-01-23 10:06:28.055 222021 DEBUG nova.objects.instance [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'pci_devices' on Instance uuid 0105d5bf-f2c2-4ed5-822c-a2172abb0582 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:06:28 np0005593233 nova_compute[222017]: 2026-01-23 10:06:28.082 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <uuid>0105d5bf-f2c2-4ed5-822c-a2172abb0582</uuid>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <name>instance-00000075</name>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerShowV247Test-server-29550483</nova:name>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:06:26</nova:creationTime>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <nova:user uuid="e0fe7d252cd04174840bdf8dfefa3510">tempest-ServerShowV247Test-1613897366-project-member</nova:user>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <nova:project uuid="3ecb2c0cafc441fd9457198fe09cc97b">tempest-ServerShowV247Test-1613897366</nova:project>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <nova:ports/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <entry name="serial">0105d5bf-f2c2-4ed5-822c-a2172abb0582</entry>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <entry name="uuid">0105d5bf-f2c2-4ed5-822c-a2172abb0582</entry>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk.config">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582/console.log" append="off"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:06:28 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:06:28 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:06:28 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:06:28 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 05:06:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:28.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:28.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:06:28Z|00514|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 23 05:06:28 np0005593233 nova_compute[222017]: 2026-01-23 10:06:28.481 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:06:28 np0005593233 nova_compute[222017]: 2026-01-23 10:06:28.481 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:06:28 np0005593233 nova_compute[222017]: 2026-01-23 10:06:28.482 222021 INFO nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Using config drive
Jan 23 05:06:28 np0005593233 nova_compute[222017]: 2026-01-23 10:06:28.508 222021 DEBUG nova.storage.rbd_utils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:06:29 np0005593233 nova_compute[222017]: 2026-01-23 10:06:29.436 222021 INFO nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Creating config drive at /var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582/disk.config
Jan 23 05:06:29 np0005593233 nova_compute[222017]: 2026-01-23 10:06:29.443 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ax55fzt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:06:29 np0005593233 nova_compute[222017]: 2026-01-23 10:06:29.597 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ax55fzt" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:06:29 np0005593233 nova_compute[222017]: 2026-01-23 10:06:29.633 222021 DEBUG nova.storage.rbd_utils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:06:29 np0005593233 nova_compute[222017]: 2026-01-23 10:06:29.638 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582/disk.config 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:06:29 np0005593233 nova_compute[222017]: 2026-01-23 10:06:29.946 222021 DEBUG oslo_concurrency.processutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582/disk.config 0105d5bf-f2c2-4ed5-822c-a2172abb0582_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.308s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:06:29 np0005593233 nova_compute[222017]: 2026-01-23 10:06:29.947 222021 INFO nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Deleting local config drive /var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582/disk.config because it was imported into RBD.
Jan 23 05:06:30 np0005593233 systemd-machined[190954]: New machine qemu-55-instance-00000075.
Jan 23 05:06:30 np0005593233 systemd[1]: Started Virtual Machine qemu-55-instance-00000075.
Jan 23 05:06:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:30.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:30.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.659 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162790.6584237, 0105d5bf-f2c2-4ed5-822c-a2172abb0582 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.659 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] VM Resumed (Lifecycle Event)
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.663 222021 DEBUG nova.compute.manager [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.664 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.668 222021 INFO nova.virt.libvirt.driver [-] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Instance spawned successfully.
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.668 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.709 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.714 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.727 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.728 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.729 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.729 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.730 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.730 222021 DEBUG nova.virt.libvirt.driver [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.822 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.823 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162790.6597464, 0105d5bf-f2c2-4ed5-822c-a2172abb0582 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.823 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] VM Started (Lifecycle Event)
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.969 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:06:30 np0005593233 nova_compute[222017]: 2026-01-23 10:06:30.973 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:06:31 np0005593233 nova_compute[222017]: 2026-01-23 10:06:31.014 222021 INFO nova.compute.manager [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Took 5.02 seconds to spawn the instance on the hypervisor.
Jan 23 05:06:31 np0005593233 nova_compute[222017]: 2026-01-23 10:06:31.015 222021 DEBUG nova.compute.manager [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:06:31 np0005593233 nova_compute[222017]: 2026-01-23 10:06:31.025 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:06:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:31 np0005593233 nova_compute[222017]: 2026-01-23 10:06:31.193 222021 INFO nova.compute.manager [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Took 6.81 seconds to build instance.
Jan 23 05:06:31 np0005593233 nova_compute[222017]: 2026-01-23 10:06:31.234 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:06:31 np0005593233 nova_compute[222017]: 2026-01-23 10:06:31.253 222021 DEBUG oslo_concurrency.lockutils [None req-1c8b6cd7-2b18-4ec9-ba8d-57b64cce3e15 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "0105d5bf-f2c2-4ed5-822c-a2172abb0582" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:06:31 np0005593233 nova_compute[222017]: 2026-01-23 10:06:31.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:06:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:32.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:32.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:34.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:34.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:35 np0005593233 nova_compute[222017]: 2026-01-23 10:06:35.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:35 np0005593233 nova_compute[222017]: 2026-01-23 10:06:35.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:36.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:06:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:36.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:06:36 np0005593233 nova_compute[222017]: 2026-01-23 10:06:36.237 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:36 np0005593233 nova_compute[222017]: 2026-01-23 10:06:36.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:36 np0005593233 nova_compute[222017]: 2026-01-23 10:06:36.886 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:37 np0005593233 podman[268245]: 2026-01-23 10:06:37.100576565 +0000 UTC m=+0.092441665 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 05:06:37 np0005593233 nova_compute[222017]: 2026-01-23 10:06:37.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:38.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:38.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:38 np0005593233 nova_compute[222017]: 2026-01-23 10:06:38.445 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:38 np0005593233 nova_compute[222017]: 2026-01-23 10:06:38.446 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:38 np0005593233 nova_compute[222017]: 2026-01-23 10:06:38.446 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:38 np0005593233 nova_compute[222017]: 2026-01-23 10:06:38.446 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:06:38 np0005593233 nova_compute[222017]: 2026-01-23 10:06:38.446 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:06:38 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2701426189' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:06:38 np0005593233 nova_compute[222017]: 2026-01-23 10:06:38.942 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.073 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.074 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000061 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.078 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000075 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.078 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000075 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.263 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.265 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4183MB free_disk=20.80970001220703GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.266 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.266 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.457 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 51a66602-3548-4341-add1-988bd6c7aa57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.458 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0105d5bf-f2c2-4ed5-822c-a2172abb0582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.458 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.458 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:06:39 np0005593233 nova_compute[222017]: 2026-01-23 10:06:39.602 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:06:40 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4286449734' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:06:40 np0005593233 nova_compute[222017]: 2026-01-23 10:06:40.060 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:40 np0005593233 nova_compute[222017]: 2026-01-23 10:06:40.068 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:06:40 np0005593233 nova_compute[222017]: 2026-01-23 10:06:40.088 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:06:40 np0005593233 nova_compute[222017]: 2026-01-23 10:06:40.134 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:06:40 np0005593233 nova_compute[222017]: 2026-01-23 10:06:40.135 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:40.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:40.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:41 np0005593233 nova_compute[222017]: 2026-01-23 10:06:41.136 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:41 np0005593233 nova_compute[222017]: 2026-01-23 10:06:41.137 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:41 np0005593233 nova_compute[222017]: 2026-01-23 10:06:41.137 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:06:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:41 np0005593233 nova_compute[222017]: 2026-01-23 10:06:41.240 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:41 np0005593233 nova_compute[222017]: 2026-01-23 10:06:41.889 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:42.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:42.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:06:42.669 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:06:42.671 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:06:42.672 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:44.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:44.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:44 np0005593233 nova_compute[222017]: 2026-01-23 10:06:44.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:44 np0005593233 nova_compute[222017]: 2026-01-23 10:06:44.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:06:44 np0005593233 nova_compute[222017]: 2026-01-23 10:06:44.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:06:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:06:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/599516109' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:06:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:06:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/599516109' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:06:45 np0005593233 nova_compute[222017]: 2026-01-23 10:06:45.480 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:06:45 np0005593233 nova_compute[222017]: 2026-01-23 10:06:45.480 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:06:45 np0005593233 nova_compute[222017]: 2026-01-23 10:06:45.480 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:06:45 np0005593233 nova_compute[222017]: 2026-01-23 10:06:45.481 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 51a66602-3548-4341-add1-988bd6c7aa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:46.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:46.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:46 np0005593233 nova_compute[222017]: 2026-01-23 10:06:46.242 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:46 np0005593233 nova_compute[222017]: 2026-01-23 10:06:46.899 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:48.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:48.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:49 np0005593233 nova_compute[222017]: 2026-01-23 10:06:49.903 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updating instance_info_cache with network_info: [{"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:06:49 np0005593233 nova_compute[222017]: 2026-01-23 10:06:49.946 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-51a66602-3548-4341-add1-988bd6c7aa57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:06:49 np0005593233 nova_compute[222017]: 2026-01-23 10:06:49.946 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:06:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:50.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:50.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:50 np0005593233 nova_compute[222017]: 2026-01-23 10:06:50.845 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:51 np0005593233 nova_compute[222017]: 2026-01-23 10:06:51.244 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:51 np0005593233 nova_compute[222017]: 2026-01-23 10:06:51.902 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:51 np0005593233 nova_compute[222017]: 2026-01-23 10:06:51.939 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:52.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:52.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:54.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:54.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:55 np0005593233 podman[268310]: 2026-01-23 10:06:55.11201814 +0000 UTC m=+0.114503102 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 05:06:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:56.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:56.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:56 np0005593233 nova_compute[222017]: 2026-01-23 10:06:56.247 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:56 np0005593233 nova_compute[222017]: 2026-01-23 10:06:56.905 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:06:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:06:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:58.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:06:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:06:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:58.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:58 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:06:58 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:06:59 np0005593233 nova_compute[222017]: 2026-01-23 10:06:59.034 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:00.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:00.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:01 np0005593233 nova_compute[222017]: 2026-01-23 10:07:01.249 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:01 np0005593233 nova_compute[222017]: 2026-01-23 10:07:01.972 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:02.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:02.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:07:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:04.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:07:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:04.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:07:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:07:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:06.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:06.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:06 np0005593233 nova_compute[222017]: 2026-01-23 10:07:06.251 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:06 np0005593233 nova_compute[222017]: 2026-01-23 10:07:06.974 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:08 np0005593233 podman[268518]: 2026-01-23 10:07:08.056329642 +0000 UTC m=+0.057748665 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:07:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:08.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:08.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:09 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:09Z|00515|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:07:09 np0005593233 nova_compute[222017]: 2026-01-23 10:07:09.092 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:09 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:09Z|00516|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:07:09 np0005593233 nova_compute[222017]: 2026-01-23 10:07:09.220 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.072 222021 DEBUG oslo_concurrency.lockutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "51a66602-3548-4341-add1-988bd6c7aa57" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.073 222021 DEBUG oslo_concurrency.lockutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.074 222021 DEBUG oslo_concurrency.lockutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "51a66602-3548-4341-add1-988bd6c7aa57-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.074 222021 DEBUG oslo_concurrency.lockutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.074 222021 DEBUG oslo_concurrency.lockutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.077 222021 INFO nova.compute.manager [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Terminating instance#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.078 222021 DEBUG nova.compute.manager [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:07:10 np0005593233 kernel: tap0ef6c9e2-66 (unregistering): left promiscuous mode
Jan 23 05:07:10 np0005593233 NetworkManager[48871]: <info>  [1769162830.1601] device (tap0ef6c9e2-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:07:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:10Z|00517|binding|INFO|Releasing lport 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 from this chassis (sb_readonly=0)
Jan 23 05:07:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:10Z|00518|binding|INFO|Setting lport 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 down in Southbound
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.169 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:10Z|00519|binding|INFO|Removing iface tap0ef6c9e2-66 ovn-installed in OVS
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.171 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.182 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:9c:08 10.100.0.6'], port_security=['fa:16:3e:de:9c:08 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '51a66602-3548-4341-add1-988bd6c7aa57', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9910180-8b38-41b2-8cb3-4e4af7eb2c2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3aed5f-30b8-4c57-808e-87764ab67fc8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=0ef6c9e2-66fb-4264-8cb0-7cd8ba296828) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.184 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 in datapath 8575e824-4be0-4206-873e-2f9a3d1ded0b unbound from our chassis#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.185 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8575e824-4be0-4206-873e-2f9a3d1ded0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.190 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.188 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[da3cd994-2295-4920-897b-23cf6f547486]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.191 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b namespace which is not needed anymore#033[00m
Jan 23 05:07:10 np0005593233 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000061.scope: Deactivated successfully.
Jan 23 05:07:10 np0005593233 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000061.scope: Consumed 29.447s CPU time.
Jan 23 05:07:10 np0005593233 systemd-machined[190954]: Machine qemu-47-instance-00000061 terminated.
Jan 23 05:07:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:10.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:10.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.330 222021 INFO nova.virt.libvirt.driver [-] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Instance destroyed successfully.#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.332 222021 DEBUG nova.objects.instance [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'resources' on Instance uuid 51a66602-3548-4341-add1-988bd6c7aa57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:07:10 np0005593233 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[261636]: [NOTICE]   (261640) : haproxy version is 2.8.14-c23fe91
Jan 23 05:07:10 np0005593233 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[261636]: [NOTICE]   (261640) : path to executable is /usr/sbin/haproxy
Jan 23 05:07:10 np0005593233 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[261636]: [WARNING]  (261640) : Exiting Master process...
Jan 23 05:07:10 np0005593233 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[261636]: [WARNING]  (261640) : Exiting Master process...
Jan 23 05:07:10 np0005593233 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[261636]: [ALERT]    (261640) : Current worker (261642) exited with code 143 (Terminated)
Jan 23 05:07:10 np0005593233 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[261636]: [WARNING]  (261640) : All workers exited. Exiting... (0)
Jan 23 05:07:10 np0005593233 systemd[1]: libpod-93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44.scope: Deactivated successfully.
Jan 23 05:07:10 np0005593233 podman[268563]: 2026-01-23 10:07:10.369166567 +0000 UTC m=+0.061388177 container died 93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.385 222021 DEBUG nova.virt.libvirt.vif [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:01:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-953780140',display_name='tempest-ServerActionsTestOtherA-server-953780140',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-953780140',id=97,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPDPMbCnqcp11s7OR05vsDdiZlZSU5ZbBJSLaqQpawTODCANj+91AmOb6Hdh0FgzlQPvmSu+VYXOLfZik0SA3L4m61/nruOol9dJ9Mz34f8cV2NJKksVR2Ar2t+W5r4M6w==',key_name='tempest-keypair-2078677939',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:01:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-502m022b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:01:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='29710db389c842df836944048225740f',uuid=51a66602-3548-4341-add1-988bd6c7aa57,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.385 222021 DEBUG nova.network.os_vif_util [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "address": "fa:16:3e:de:9c:08", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ef6c9e2-66", "ovs_interfaceid": "0ef6c9e2-66fb-4264-8cb0-7cd8ba296828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.386 222021 DEBUG nova.network.os_vif_util [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:de:9c:08,bridge_name='br-int',has_traffic_filtering=True,id=0ef6c9e2-66fb-4264-8cb0-7cd8ba296828,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ef6c9e2-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.387 222021 DEBUG os_vif [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:9c:08,bridge_name='br-int',has_traffic_filtering=True,id=0ef6c9e2-66fb-4264-8cb0-7cd8ba296828,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ef6c9e2-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.388 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.388 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ef6c9e2-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.392 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.394 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.396 222021 INFO os_vif [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:9c:08,bridge_name='br-int',has_traffic_filtering=True,id=0ef6c9e2-66fb-4264-8cb0-7cd8ba296828,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ef6c9e2-66')#033[00m
Jan 23 05:07:10 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44-userdata-shm.mount: Deactivated successfully.
Jan 23 05:07:10 np0005593233 systemd[1]: var-lib-containers-storage-overlay-458a2ef67bc34737649f0557f0123cb318adab6330bc1c29c9e6f131863e2bd2-merged.mount: Deactivated successfully.
Jan 23 05:07:10 np0005593233 podman[268563]: 2026-01-23 10:07:10.421246193 +0000 UTC m=+0.113467793 container cleanup 93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:07:10 np0005593233 systemd[1]: libpod-conmon-93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44.scope: Deactivated successfully.
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.458 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "0105d5bf-f2c2-4ed5-822c-a2172abb0582" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.459 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "0105d5bf-f2c2-4ed5-822c-a2172abb0582" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.459 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "0105d5bf-f2c2-4ed5-822c-a2172abb0582-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.460 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "0105d5bf-f2c2-4ed5-822c-a2172abb0582-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.460 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "0105d5bf-f2c2-4ed5-822c-a2172abb0582-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.461 222021 INFO nova.compute.manager [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Terminating instance#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.462 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "refresh_cache-0105d5bf-f2c2-4ed5-822c-a2172abb0582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.462 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquired lock "refresh_cache-0105d5bf-f2c2-4ed5-822c-a2172abb0582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.462 222021 DEBUG nova.network.neutron [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:07:10 np0005593233 podman[268621]: 2026-01-23 10:07:10.514311065 +0000 UTC m=+0.060226495 container remove 93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.524 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e6fb5757-91a5-468c-870d-1fdbfc3589e5]: (4, ('Fri Jan 23 10:07:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b (93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44)\n93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44\nFri Jan 23 10:07:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b (93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44)\n93d21500dcdba8b24ce549f9eb6b8327d2f09f6c77a1adc2b7f024928f4d0c44\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.526 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[82f8482d-604d-47c7-8f3b-63c64e6b6e03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.527 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8575e824-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.530 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:10 np0005593233 kernel: tap8575e824-40: left promiscuous mode
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.544 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.549 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfb4907-cc62-4e16-b2a5-249ec694a25e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.566 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b60eccf5-4335-472e-9a7b-0e16a4cc4486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.568 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ee029f8d-2f0f-48a1-b1e9-3dc5fba27756]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.589 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[448edd90-ec96-407f-96cd-2cef653b7ffa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640418, 'reachable_time': 22943, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268639, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:10 np0005593233 systemd[1]: run-netns-ovnmeta\x2d8575e824\x2d4be0\x2d4206\x2d873e\x2d2f9a3d1ded0b.mount: Deactivated successfully.
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.596 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:07:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:10.598 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[7200a1c2-cd6d-4055-9e50-552c08e01e79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.721 222021 DEBUG nova.compute.manager [req-bd6426a1-b42c-44a0-b28a-84304c4d37f1 req-062b6ff0-22df-42e3-a60c-5093bd42d75b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received event network-vif-unplugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.722 222021 DEBUG oslo_concurrency.lockutils [req-bd6426a1-b42c-44a0-b28a-84304c4d37f1 req-062b6ff0-22df-42e3-a60c-5093bd42d75b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "51a66602-3548-4341-add1-988bd6c7aa57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.722 222021 DEBUG oslo_concurrency.lockutils [req-bd6426a1-b42c-44a0-b28a-84304c4d37f1 req-062b6ff0-22df-42e3-a60c-5093bd42d75b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.722 222021 DEBUG oslo_concurrency.lockutils [req-bd6426a1-b42c-44a0-b28a-84304c4d37f1 req-062b6ff0-22df-42e3-a60c-5093bd42d75b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.722 222021 DEBUG nova.compute.manager [req-bd6426a1-b42c-44a0-b28a-84304c4d37f1 req-062b6ff0-22df-42e3-a60c-5093bd42d75b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] No waiting events found dispatching network-vif-unplugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.723 222021 DEBUG nova.compute.manager [req-bd6426a1-b42c-44a0-b28a-84304c4d37f1 req-062b6ff0-22df-42e3-a60c-5093bd42d75b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received event network-vif-unplugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.786 222021 DEBUG nova.network.neutron [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.807 222021 INFO nova.virt.libvirt.driver [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Deleting instance files /var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57_del#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.808 222021 INFO nova.virt.libvirt.driver [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Deletion of /var/lib/nova/instances/51a66602-3548-4341-add1-988bd6c7aa57_del complete#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.933 222021 INFO nova.compute.manager [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Took 0.86 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.934 222021 DEBUG oslo.service.loopingcall [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.935 222021 DEBUG nova.compute.manager [-] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:07:10 np0005593233 nova_compute[222017]: 2026-01-23 10:07:10.935 222021 DEBUG nova.network.neutron [-] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:07:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:11 np0005593233 nova_compute[222017]: 2026-01-23 10:07:11.253 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:11 np0005593233 nova_compute[222017]: 2026-01-23 10:07:11.513 222021 DEBUG nova.network.neutron [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:07:11 np0005593233 nova_compute[222017]: 2026-01-23 10:07:11.530 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Releasing lock "refresh_cache-0105d5bf-f2c2-4ed5-822c-a2172abb0582" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:07:11 np0005593233 nova_compute[222017]: 2026-01-23 10:07:11.531 222021 DEBUG nova.compute.manager [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:07:11 np0005593233 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000075.scope: Deactivated successfully.
Jan 23 05:07:11 np0005593233 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000075.scope: Consumed 15.988s CPU time.
Jan 23 05:07:11 np0005593233 systemd-machined[190954]: Machine qemu-55-instance-00000075 terminated.
Jan 23 05:07:11 np0005593233 nova_compute[222017]: 2026-01-23 10:07:11.761 222021 INFO nova.virt.libvirt.driver [-] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Instance destroyed successfully.#033[00m
Jan 23 05:07:11 np0005593233 nova_compute[222017]: 2026-01-23 10:07:11.762 222021 DEBUG nova.objects.instance [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'resources' on Instance uuid 0105d5bf-f2c2-4ed5-822c-a2172abb0582 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:07:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:12.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:12.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.510 222021 INFO nova.virt.libvirt.driver [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Deleting instance files /var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582_del#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.511 222021 INFO nova.virt.libvirt.driver [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Deletion of /var/lib/nova/instances/0105d5bf-f2c2-4ed5-822c-a2172abb0582_del complete#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.611 222021 INFO nova.compute.manager [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Took 1.08 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.612 222021 DEBUG oslo.service.loopingcall [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.612 222021 DEBUG nova.compute.manager [-] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.612 222021 DEBUG nova.network.neutron [-] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.876 222021 DEBUG nova.compute.manager [req-0c2a9d61-7687-418b-a02d-02e494c71f27 req-d210062b-c809-4e8a-90d3-74e3e7133384 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received event network-vif-plugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.878 222021 DEBUG oslo_concurrency.lockutils [req-0c2a9d61-7687-418b-a02d-02e494c71f27 req-d210062b-c809-4e8a-90d3-74e3e7133384 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "51a66602-3548-4341-add1-988bd6c7aa57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.878 222021 DEBUG oslo_concurrency.lockutils [req-0c2a9d61-7687-418b-a02d-02e494c71f27 req-d210062b-c809-4e8a-90d3-74e3e7133384 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.878 222021 DEBUG oslo_concurrency.lockutils [req-0c2a9d61-7687-418b-a02d-02e494c71f27 req-d210062b-c809-4e8a-90d3-74e3e7133384 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.879 222021 DEBUG nova.compute.manager [req-0c2a9d61-7687-418b-a02d-02e494c71f27 req-d210062b-c809-4e8a-90d3-74e3e7133384 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] No waiting events found dispatching network-vif-plugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.879 222021 WARNING nova.compute.manager [req-0c2a9d61-7687-418b-a02d-02e494c71f27 req-d210062b-c809-4e8a-90d3-74e3e7133384 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received unexpected event network-vif-plugged-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.901 222021 DEBUG nova.network.neutron [-] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.968 222021 DEBUG nova.network.neutron [-] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.993 222021 INFO nova.compute.manager [-] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Took 0.38 seconds to deallocate network for instance.#033[00m
Jan 23 05:07:12 np0005593233 nova_compute[222017]: 2026-01-23 10:07:12.998 222021 DEBUG nova.network.neutron [-] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.032 222021 INFO nova.compute.manager [-] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Took 2.10 seconds to deallocate network for instance.#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.070 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.070 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.098 222021 DEBUG oslo_concurrency.lockutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.259 222021 DEBUG oslo_concurrency.processutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.302 222021 DEBUG nova.compute.manager [req-8849e4a2-07a4-4511-9e1d-1ead8e378856 req-bfa6faab-f4e1-4794-aa1f-d9453a9cc5a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Received event network-vif-deleted-0ef6c9e2-66fb-4264-8cb0-7cd8ba296828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:07:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:13 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3396022956' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.735 222021 DEBUG oslo_concurrency.processutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.742 222021 DEBUG nova.compute.provider_tree [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.779 222021 DEBUG nova.scheduler.client.report [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.818 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.821 222021 DEBUG oslo_concurrency.lockutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.877 222021 INFO nova.scheduler.client.report [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Deleted allocations for instance 0105d5bf-f2c2-4ed5-822c-a2172abb0582#033[00m
Jan 23 05:07:13 np0005593233 nova_compute[222017]: 2026-01-23 10:07:13.945 222021 DEBUG oslo_concurrency.processutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:14 np0005593233 nova_compute[222017]: 2026-01-23 10:07:14.014 222021 DEBUG oslo_concurrency.lockutils [None req-19c5d5f4-175a-46d0-ab36-b691fdcf4627 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "0105d5bf-f2c2-4ed5-822c-a2172abb0582" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:14.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:14.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:14 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/715019760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:14 np0005593233 nova_compute[222017]: 2026-01-23 10:07:14.391 222021 DEBUG oslo_concurrency.processutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:14 np0005593233 nova_compute[222017]: 2026-01-23 10:07:14.397 222021 DEBUG nova.compute.provider_tree [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:07:14 np0005593233 nova_compute[222017]: 2026-01-23 10:07:14.434 222021 DEBUG nova.scheduler.client.report [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:07:14 np0005593233 nova_compute[222017]: 2026-01-23 10:07:14.486 222021 DEBUG oslo_concurrency.lockutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:14 np0005593233 nova_compute[222017]: 2026-01-23 10:07:14.592 222021 INFO nova.scheduler.client.report [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Deleted allocations for instance 51a66602-3548-4341-add1-988bd6c7aa57#033[00m
Jan 23 05:07:14 np0005593233 nova_compute[222017]: 2026-01-23 10:07:14.735 222021 DEBUG oslo_concurrency.lockutils [None req-96e0f8db-91f3-4d32-8e87-02bf0b1fdc78 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "51a66602-3548-4341-add1-988bd6c7aa57" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:15 np0005593233 nova_compute[222017]: 2026-01-23 10:07:15.391 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:16.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:16 np0005593233 nova_compute[222017]: 2026-01-23 10:07:16.256 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:16.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:18.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:18.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:20.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:20.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:20 np0005593233 nova_compute[222017]: 2026-01-23 10:07:20.395 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:21 np0005593233 nova_compute[222017]: 2026-01-23 10:07:21.258 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #109. Immutable memtables: 0.
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.682985) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 67] Flushing memtable with next log file: 109
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841683045, "job": 67, "event": "flush_started", "num_memtables": 1, "num_entries": 893, "num_deletes": 250, "total_data_size": 1663370, "memory_usage": 1685768, "flush_reason": "Manual Compaction"}
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 67] Level-0 flush table #110: started
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841693587, "cf_name": "default", "job": 67, "event": "table_file_creation", "file_number": 110, "file_size": 715406, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54281, "largest_seqno": 55169, "table_properties": {"data_size": 711904, "index_size": 1218, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9673, "raw_average_key_size": 20, "raw_value_size": 704393, "raw_average_value_size": 1524, "num_data_blocks": 53, "num_entries": 462, "num_filter_entries": 462, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162781, "oldest_key_time": 1769162781, "file_creation_time": 1769162841, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 110, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 67] Flush lasted 10692 microseconds, and 5630 cpu microseconds.
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.693641) [db/flush_job.cc:967] [default] [JOB 67] Level-0 flush table #110: 715406 bytes OK
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.693707) [db/memtable_list.cc:519] [default] Level-0 commit table #110 started
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.696475) [db/memtable_list.cc:722] [default] Level-0 commit table #110: memtable #1 done
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.696492) EVENT_LOG_v1 {"time_micros": 1769162841696486, "job": 67, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.696519) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 67] Try to delete WAL files size 1658729, prev total WAL file size 1658729, number of live WAL files 2.
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000106.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.697504) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031373536' seq:72057594037927935, type:22 .. '6D6772737461740032303037' seq:0, type:0; will stop at (end)
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 68] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 67 Base level 0, inputs: [110(698KB)], [108(11MB)]
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841697611, "job": 68, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [110], "files_L6": [108], "score": -1, "input_data_size": 13227326, "oldest_snapshot_seqno": -1}
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 68] Generated table #111: 7645 keys, 9760692 bytes, temperature: kUnknown
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841858560, "cf_name": "default", "job": 68, "event": "table_file_creation", "file_number": 111, "file_size": 9760692, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9712727, "index_size": 27757, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19141, "raw_key_size": 198512, "raw_average_key_size": 25, "raw_value_size": 9579543, "raw_average_value_size": 1253, "num_data_blocks": 1087, "num_entries": 7645, "num_filter_entries": 7645, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769162841, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 111, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.859141) [db/compaction/compaction_job.cc:1663] [default] [JOB 68] Compacted 1@0 + 1@6 files to L6 => 9760692 bytes
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.906095) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 82.1 rd, 60.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 11.9 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(32.1) write-amplify(13.6) OK, records in: 8133, records dropped: 488 output_compression: NoCompression
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.906155) EVENT_LOG_v1 {"time_micros": 1769162841906133, "job": 68, "event": "compaction_finished", "compaction_time_micros": 161104, "compaction_time_cpu_micros": 29048, "output_level": 6, "num_output_files": 1, "total_output_size": 9760692, "num_input_records": 8133, "num_output_records": 7645, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000110.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841906613, "job": 68, "event": "table_file_deletion", "file_number": 110}
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000108.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841909333, "job": 68, "event": "table_file_deletion", "file_number": 108}
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.697358) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.909406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.909417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.909419) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.909420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:07:21.909422) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:22.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:22.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:24.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:24.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:25 np0005593233 nova_compute[222017]: 2026-01-23 10:07:25.077 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:25 np0005593233 nova_compute[222017]: 2026-01-23 10:07:25.328 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162830.3272295, 51a66602-3548-4341-add1-988bd6c7aa57 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:25 np0005593233 nova_compute[222017]: 2026-01-23 10:07:25.329 222021 INFO nova.compute.manager [-] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:07:25 np0005593233 nova_compute[222017]: 2026-01-23 10:07:25.384 222021 DEBUG nova.compute.manager [None req-3d2465c0-0d2c-40ae-86aa-f5a92525d4cb - - - - - -] [instance: 51a66602-3548-4341-add1-988bd6c7aa57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:25 np0005593233 nova_compute[222017]: 2026-01-23 10:07:25.397 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:25 np0005593233 podman[268706]: 2026-01-23 10:07:25.826059511 +0000 UTC m=+0.089112402 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:07:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 05:07:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:26.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:26.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:26 np0005593233 nova_compute[222017]: 2026-01-23 10:07:26.298 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:26.490 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:07:26 np0005593233 nova_compute[222017]: 2026-01-23 10:07:26.491 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:26.491 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:07:26 np0005593233 nova_compute[222017]: 2026-01-23 10:07:26.759 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162831.7580843, 0105d5bf-f2c2-4ed5-822c-a2172abb0582 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:26 np0005593233 nova_compute[222017]: 2026-01-23 10:07:26.760 222021 INFO nova.compute.manager [-] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:07:26 np0005593233 nova_compute[222017]: 2026-01-23 10:07:26.858 222021 DEBUG nova.compute.manager [None req-d75ffcfe-4c1c-42ef-abab-bff7413e8115 - - - - - -] [instance: 0105d5bf-f2c2-4ed5-822c-a2172abb0582] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 05:07:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:28.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:28.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:29.494 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:30.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:30.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:30 np0005593233 nova_compute[222017]: 2026-01-23 10:07:30.400 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:31 np0005593233 nova_compute[222017]: 2026-01-23 10:07:31.300 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:31 np0005593233 nova_compute[222017]: 2026-01-23 10:07:31.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:32.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:32.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:34.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:34.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:35 np0005593233 nova_compute[222017]: 2026-01-23 10:07:35.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:35 np0005593233 nova_compute[222017]: 2026-01-23 10:07:35.404 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 05:07:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:36.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 05:07:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:36.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:36 np0005593233 nova_compute[222017]: 2026-01-23 10:07:36.335 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:37 np0005593233 nova_compute[222017]: 2026-01-23 10:07:37.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:37 np0005593233 nova_compute[222017]: 2026-01-23 10:07:37.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:37 np0005593233 nova_compute[222017]: 2026-01-23 10:07:37.429 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:37 np0005593233 nova_compute[222017]: 2026-01-23 10:07:37.429 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:37 np0005593233 nova_compute[222017]: 2026-01-23 10:07:37.430 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:37 np0005593233 nova_compute[222017]: 2026-01-23 10:07:37.430 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:07:37 np0005593233 nova_compute[222017]: 2026-01-23 10:07:37.430 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3110263584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:37 np0005593233 nova_compute[222017]: 2026-01-23 10:07:37.923 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.115 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.117 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4572MB free_disk=20.963062286376953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.117 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.118 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.267 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.268 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:07:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:38.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:38.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.345 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:38 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4051747367' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.809 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.816 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.871 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.941 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:07:38 np0005593233 nova_compute[222017]: 2026-01-23 10:07:38.941 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.824s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:39 np0005593233 podman[268777]: 2026-01-23 10:07:39.105004541 +0000 UTC m=+0.100989564 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 05:07:39 np0005593233 nova_compute[222017]: 2026-01-23 10:07:39.943 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:39 np0005593233 nova_compute[222017]: 2026-01-23 10:07:39.944 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:40.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:40.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:40 np0005593233 nova_compute[222017]: 2026-01-23 10:07:40.409 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:41 np0005593233 nova_compute[222017]: 2026-01-23 10:07:41.337 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:41 np0005593233 nova_compute[222017]: 2026-01-23 10:07:41.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:41 np0005593233 nova_compute[222017]: 2026-01-23 10:07:41.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:07:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:42.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:42.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:42.670 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:42.671 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:42.671 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:44.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:44.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:44 np0005593233 nova_compute[222017]: 2026-01-23 10:07:44.642 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:44 np0005593233 nova_compute[222017]: 2026-01-23 10:07:44.642 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:44 np0005593233 nova_compute[222017]: 2026-01-23 10:07:44.671 222021 DEBUG nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:07:44 np0005593233 nova_compute[222017]: 2026-01-23 10:07:44.835 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:44 np0005593233 nova_compute[222017]: 2026-01-23 10:07:44.836 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:44 np0005593233 nova_compute[222017]: 2026-01-23 10:07:44.847 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:07:44 np0005593233 nova_compute[222017]: 2026-01-23 10:07:44.848 222021 INFO nova.compute.claims [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.018 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.388 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.412 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.437 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:07:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3179285486' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.512 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.518 222021 DEBUG nova.compute.provider_tree [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.554 222021 DEBUG nova.scheduler.client.report [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.609 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.610 222021 DEBUG nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.708 222021 DEBUG nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.709 222021 DEBUG nova.network.neutron [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.765 222021 INFO nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.791 222021 DEBUG nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.935 222021 DEBUG nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.937 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.938 222021 INFO nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Creating image(s)#033[00m
Jan 23 05:07:45 np0005593233 nova_compute[222017]: 2026-01-23 10:07:45.982 222021 DEBUG nova.storage.rbd_utils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.016 222021 DEBUG nova.storage.rbd_utils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.046 222021 DEBUG nova.storage.rbd_utils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.050 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.120 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.121 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.122 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.123 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.151 222021 DEBUG nova.storage.rbd_utils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.155 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 91cc1048-141a-4a20-b148-991a883adfa9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:46.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:46.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.339 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.725 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 91cc1048-141a-4a20-b148-991a883adfa9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.570s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.787 222021 DEBUG nova.storage.rbd_utils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] resizing rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.895 222021 DEBUG nova.objects.instance [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'migration_context' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.913 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.914 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Ensure instance console log exists: /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.914 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.915 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.915 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:46 np0005593233 nova_compute[222017]: 2026-01-23 10:07:46.948 222021 DEBUG nova.policy [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cde472cc8af0464992006a69d047d0d4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '746ea02b745c4e21ace4cb49c193899d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:07:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:48.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:48.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:48 np0005593233 nova_compute[222017]: 2026-01-23 10:07:48.430 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:48 np0005593233 nova_compute[222017]: 2026-01-23 10:07:48.561 222021 DEBUG nova.network.neutron [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Successfully created port: 8bb3c318-ff77-47e8-a160-22e4c278fc88 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:07:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:50.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:50.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:50 np0005593233 nova_compute[222017]: 2026-01-23 10:07:50.417 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:50 np0005593233 nova_compute[222017]: 2026-01-23 10:07:50.441 222021 DEBUG nova.network.neutron [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Successfully updated port: 8bb3c318-ff77-47e8-a160-22e4c278fc88 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:07:50 np0005593233 nova_compute[222017]: 2026-01-23 10:07:50.951 222021 DEBUG nova.compute.manager [req-b33accf0-ef0d-4fc6-9f66-61a50ea8e6a2 req-b15e63df-0428-4def-8ec6-d9af017e7770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-changed-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:07:50 np0005593233 nova_compute[222017]: 2026-01-23 10:07:50.951 222021 DEBUG nova.compute.manager [req-b33accf0-ef0d-4fc6-9f66-61a50ea8e6a2 req-b15e63df-0428-4def-8ec6-d9af017e7770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Refreshing instance network info cache due to event network-changed-8bb3c318-ff77-47e8-a160-22e4c278fc88. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:07:50 np0005593233 nova_compute[222017]: 2026-01-23 10:07:50.952 222021 DEBUG oslo_concurrency.lockutils [req-b33accf0-ef0d-4fc6-9f66-61a50ea8e6a2 req-b15e63df-0428-4def-8ec6-d9af017e7770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:07:50 np0005593233 nova_compute[222017]: 2026-01-23 10:07:50.952 222021 DEBUG oslo_concurrency.lockutils [req-b33accf0-ef0d-4fc6-9f66-61a50ea8e6a2 req-b15e63df-0428-4def-8ec6-d9af017e7770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:07:50 np0005593233 nova_compute[222017]: 2026-01-23 10:07:50.952 222021 DEBUG nova.network.neutron [req-b33accf0-ef0d-4fc6-9f66-61a50ea8e6a2 req-b15e63df-0428-4def-8ec6-d9af017e7770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Refreshing network info cache for port 8bb3c318-ff77-47e8-a160-22e4c278fc88 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:07:50 np0005593233 nova_compute[222017]: 2026-01-23 10:07:50.954 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:07:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:51 np0005593233 nova_compute[222017]: 2026-01-23 10:07:51.343 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:51 np0005593233 nova_compute[222017]: 2026-01-23 10:07:51.704 222021 DEBUG nova.network.neutron [req-b33accf0-ef0d-4fc6-9f66-61a50ea8e6a2 req-b15e63df-0428-4def-8ec6-d9af017e7770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:07:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:52.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:52.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:52 np0005593233 nova_compute[222017]: 2026-01-23 10:07:52.559 222021 DEBUG nova.network.neutron [req-b33accf0-ef0d-4fc6-9f66-61a50ea8e6a2 req-b15e63df-0428-4def-8ec6-d9af017e7770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:07:52 np0005593233 nova_compute[222017]: 2026-01-23 10:07:52.594 222021 DEBUG oslo_concurrency.lockutils [req-b33accf0-ef0d-4fc6-9f66-61a50ea8e6a2 req-b15e63df-0428-4def-8ec6-d9af017e7770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:07:52 np0005593233 nova_compute[222017]: 2026-01-23 10:07:52.595 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:07:52 np0005593233 nova_compute[222017]: 2026-01-23 10:07:52.596 222021 DEBUG nova.network.neutron [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:07:52 np0005593233 nova_compute[222017]: 2026-01-23 10:07:52.898 222021 DEBUG nova.network.neutron [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:07:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:54.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:54.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.633 222021 DEBUG nova.network.neutron [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.688 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.689 222021 DEBUG nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance network_info: |[{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.691 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Start _get_guest_xml network_info=[{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.696 222021 WARNING nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.702 222021 DEBUG nova.virt.libvirt.host [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.702 222021 DEBUG nova.virt.libvirt.host [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.705 222021 DEBUG nova.virt.libvirt.host [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.706 222021 DEBUG nova.virt.libvirt.host [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.707 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.707 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.708 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.708 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.708 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.709 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.709 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.709 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.709 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.710 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.710 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.710 222021 DEBUG nova.virt.hardware [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:07:54 np0005593233 nova_compute[222017]: 2026-01-23 10:07:54.713 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:07:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/94224182' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.178 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.215 222021 DEBUG nova.storage.rbd_utils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.221 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.420 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:07:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1392609054' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.721 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.725 222021 DEBUG nova.virt.libvirt.vif [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:07:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-980180187',display_name='tempest-ServersNegativeTestJSON-server-980180187',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-980180187',id=121,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-np6t6vpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-ServersNegativ
eTestJSON-623507515-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:07:45Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=91cc1048-141a-4a20-b148-991a883adfa9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.726 222021 DEBUG nova.network.os_vif_util [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.727 222021 DEBUG nova.network.os_vif_util [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.729 222021 DEBUG nova.objects.instance [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'pci_devices' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.768 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <uuid>91cc1048-141a-4a20-b148-991a883adfa9</uuid>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <name>instance-00000079</name>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersNegativeTestJSON-server-980180187</nova:name>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:07:54</nova:creationTime>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <nova:user uuid="cde472cc8af0464992006a69d047d0d4">tempest-ServersNegativeTestJSON-623507515-project-member</nova:user>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <nova:project uuid="746ea02b745c4e21ace4cb49c193899d">tempest-ServersNegativeTestJSON-623507515</nova:project>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <nova:port uuid="8bb3c318-ff77-47e8-a160-22e4c278fc88">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <entry name="serial">91cc1048-141a-4a20-b148-991a883adfa9</entry>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <entry name="uuid">91cc1048-141a-4a20-b148-991a883adfa9</entry>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/91cc1048-141a-4a20-b148-991a883adfa9_disk">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/91cc1048-141a-4a20-b148-991a883adfa9_disk.config">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:5b:ad:46"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <target dev="tap8bb3c318-ff"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/console.log" append="off"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:07:55 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:07:55 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:07:55 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:07:55 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.770 222021 DEBUG nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Preparing to wait for external event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.771 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.771 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.771 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.772 222021 DEBUG nova.virt.libvirt.vif [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:07:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-980180187',display_name='tempest-ServersNegativeTestJSON-server-980180187',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-980180187',id=121,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-np6t6vpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-Serv
ersNegativeTestJSON-623507515-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:07:45Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=91cc1048-141a-4a20-b148-991a883adfa9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.772 222021 DEBUG nova.network.os_vif_util [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.773 222021 DEBUG nova.network.os_vif_util [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.773 222021 DEBUG os_vif [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.774 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.774 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.775 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.779 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8bb3c318-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.779 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8bb3c318-ff, col_values=(('external_ids', {'iface-id': '8bb3c318-ff77-47e8-a160-22e4c278fc88', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:ad:46', 'vm-uuid': '91cc1048-141a-4a20-b148-991a883adfa9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593233 NetworkManager[48871]: <info>  [1769162875.7822] manager: (tap8bb3c318-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.784 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.806 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.807 222021 INFO os_vif [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff')#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.898 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.898 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.898 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No VIF found with MAC fa:16:3e:5b:ad:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.900 222021 INFO nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Using config drive#033[00m
Jan 23 05:07:55 np0005593233 nova_compute[222017]: 2026-01-23 10:07:55.932 222021 DEBUG nova.storage.rbd_utils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:56 np0005593233 podman[269069]: 2026-01-23 10:07:56.152830378 +0000 UTC m=+0.144423328 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:07:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:56.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:56 np0005593233 nova_compute[222017]: 2026-01-23 10:07:56.346 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:56.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:56 np0005593233 nova_compute[222017]: 2026-01-23 10:07:56.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:56 np0005593233 nova_compute[222017]: 2026-01-23 10:07:56.722 222021 INFO nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Creating config drive at /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config#033[00m
Jan 23 05:07:56 np0005593233 nova_compute[222017]: 2026-01-23 10:07:56.732 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoz6fgsfa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:56 np0005593233 nova_compute[222017]: 2026-01-23 10:07:56.893 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoz6fgsfa" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:56 np0005593233 nova_compute[222017]: 2026-01-23 10:07:56.949 222021 DEBUG nova.storage.rbd_utils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:56 np0005593233 nova_compute[222017]: 2026-01-23 10:07:56.954 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config 91cc1048-141a-4a20-b148-991a883adfa9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:57 np0005593233 nova_compute[222017]: 2026-01-23 10:07:57.946 222021 DEBUG oslo_concurrency.processutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config 91cc1048-141a-4a20-b148-991a883adfa9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.991s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:57 np0005593233 nova_compute[222017]: 2026-01-23 10:07:57.947 222021 INFO nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Deleting local config drive /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config because it was imported into RBD.#033[00m
Jan 23 05:07:58 np0005593233 kernel: tap8bb3c318-ff: entered promiscuous mode
Jan 23 05:07:58 np0005593233 NetworkManager[48871]: <info>  [1769162878.0193] manager: (tap8bb3c318-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/243)
Jan 23 05:07:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:58Z|00520|binding|INFO|Claiming lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 for this chassis.
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.021 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:58Z|00521|binding|INFO|8bb3c318-ff77-47e8-a160-22e4c278fc88: Claiming fa:16:3e:5b:ad:46 10.100.0.12
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.038 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:ad:46 10.100.0.12'], port_security=['fa:16:3e:5b:ad:46 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91cc1048-141a-4a20-b148-991a883adfa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63877f45-8244-4c80-903a-80901a7d83cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746ea02b745c4e21ace4cb49c193899d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69bbd8fc-29e5-4515-a6ba-c1c319d113ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb65718-6c8e-4bc0-9249-e791b1bad19c, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8bb3c318-ff77-47e8-a160-22e4c278fc88) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.039 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb3c318-ff77-47e8-a160-22e4c278fc88 in datapath 63877f45-8244-4c80-903a-80901a7d83cb bound to our chassis#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.041 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 63877f45-8244-4c80-903a-80901a7d83cb#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.058 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a5d1874d-4aa5-484e-8c7a-4ddd2fc85244]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.059 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap63877f45-81 in ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.061 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap63877f45-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.061 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2e050523-8b62-4e73-99de-08aaa27c60f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.062 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1e23db00-c869-4ce3-9753-c72991a5f24c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 systemd-machined[190954]: New machine qemu-56-instance-00000079.
Jan 23 05:07:58 np0005593233 systemd-udevd[269149]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.081 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[784d4041-90ac-43fb-9e77-17fd26e91eea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 NetworkManager[48871]: <info>  [1769162878.0869] device (tap8bb3c318-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:07:58 np0005593233 NetworkManager[48871]: <info>  [1769162878.0887] device (tap8bb3c318-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:07:58 np0005593233 systemd[1]: Started Virtual Machine qemu-56-instance-00000079.
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.105 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a7fc9d27-6372-43a3-99b8-52bca42083fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.116 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:58Z|00522|binding|INFO|Setting lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 ovn-installed in OVS
Jan 23 05:07:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:58Z|00523|binding|INFO|Setting lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 up in Southbound
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.126 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.158 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4aaa2c-eab4-479c-a710-8683d9a977dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.165 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9bc94b47-6ab2-47f8-b7ee-42d01325d8f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 systemd-udevd[269153]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:07:58 np0005593233 NetworkManager[48871]: <info>  [1769162878.1667] manager: (tap63877f45-80): new Veth device (/org/freedesktop/NetworkManager/Devices/244)
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.208 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[eac95e16-53fc-4f7e-ac2a-15f3375bdf44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.213 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd4e97d-8d3e-4e23-bf7d-9c66a754e5a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 NetworkManager[48871]: <info>  [1769162878.2481] device (tap63877f45-80): carrier: link connected
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.255 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7e3d7fa2-ea8d-4e71-9e75-c8574097a0a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.276 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a71e656a-d8bc-44f9-a21c-a6c363659f33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63877f45-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:59:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 679087, 'reachable_time': 32067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269181, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.300 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[582d28bd-0f49-458a-b199-1e96a4219f57]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:59a0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 679087, 'tstamp': 679087}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269182, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.322 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd18cec-cabf-4822-bf5d-5b9ce8eb451a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63877f45-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:59:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 679087, 'reachable_time': 32067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 269183, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
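The privsep reply above is a pyroute2-style serialization of an rtnetlink `RTM_NEWLINK` message: interface attributes arrive as an `attrs` list of `[NAME, value]` pairs rather than a flat dict. A minimal sketch of pulling values out of that shape (the sample message below is a hypothetical, heavily trimmed stand-in for the logged reply, and `get_attr` is an illustrative helper, not the library API):

```python
# Sketch: extracting values from a pyroute2-style rtnetlink message dict.
# The shape mirrors the RTM_NEWLINK reply logged above; this trimmed sample
# message and the get_attr() helper are illustrative only.
msg = {
    'index': 2,
    'attrs': [
        ['IFLA_IFNAME', 'tap63877f45-81'],
        ['IFLA_MTU', 1500],
        ['IFLA_OPERSTATE', 'UP'],
        ['IFLA_ADDRESS', 'fa:16:3e:25:59:a0'],
    ],
}

def get_attr(message, name, default=None):
    """Return the first value stored under `name` in the attrs list."""
    for key, value in message.get('attrs', []):
        if key == name:
            return value
    return default

ifname = get_attr(msg, 'IFLA_IFNAME')
mtu = get_attr(msg, 'IFLA_MTU')
```

The linear scan matters: netlink attributes are ordered and may repeat, so a dict conversion can silently drop duplicates.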
Jan 23 05:07:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:07:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:58.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:07:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:07:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:58.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
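The radosgw "beast" access lines above follow a fixed layout (request pointer, client IP, user, timestamp, request line, HTTP status, byte count, trailing latency). A hypothetical parser for that layout, with the field positions inferred only from the samples in this log:

```python
import re

# Hypothetical regex for the radosgw "beast" access-log lines shown above.
# Field layout inferred from the log samples, not from radosgw source.
BEAST_RE = re.compile(
    r'beast: (?P<ptr>0x[0-9a-f]+): (?P<client>\S+) - (?P<user>\S+) '
    r'\[(?P<ts>[^\]]+)\] "(?P<request>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
    r'.*latency=(?P<latency>[\d.]+)s'
)

line = ('beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous '
        '[23/Jan/2026:10:07:58.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
        'latency=0.001000028s')
m = BEAST_RE.search(line)
```

Note the anonymous `HEAD /` probes every ~2 seconds from 192.168.122.100 and .102 are load-balancer health checks, which is why they log zero bytes and near-zero latency.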
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.365 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[821fe543-37c2-4afa-9468-d984bf4e7e0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.443 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[db43f174-f6d2-448b-ada5-9efa545e27f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.445 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63877f45-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.446 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.446 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63877f45-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:58 np0005593233 NetworkManager[48871]: <info>  [1769162878.5311] manager: (tap63877f45-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.530 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:58 np0005593233 kernel: tap63877f45-80: entered promiscuous mode
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.533 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap63877f45-80, col_values=(('external_ids', {'iface-id': 'c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.533 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:07:58Z|00524|binding|INFO|Releasing lport c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23 from this chassis (sb_readonly=0)
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.535 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.550 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.550 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/63877f45-8244-4c80-903a-80901a7d83cb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/63877f45-8244-4c80-903a-80901a7d83cb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.551 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4400c2a7-05fa-40f3-92db-7e75a84c749a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.552 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-63877f45-8244-4c80-903a-80901a7d83cb
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/63877f45-8244-4c80-903a-80901a7d83cb.pid.haproxy
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 63877f45-8244-4c80-903a-80901a7d83cb
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
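The agent logs the fully rendered haproxy config before spawning the per-network metadata proxy. How such a config could be produced from a network ID can be sketched as below; the template is a simplified reconstruction of what appears in the log above (the real rendering lives in `neutron.agent.ovn.metadata.driver`, and the default paths here are assumptions taken from the logged output):

```python
# Illustrative only: render a metadata-proxy haproxy config resembling the
# one logged above. Simplified template; defaults copied from the log, not
# from the neutron source.
CFG_TEMPLATE = """\
global
    log         /dev/log local0 debug
    log-tag     haproxy-metadata-proxy-{network_id}
    user        root
    group       root
    maxconn     1024
    pidfile     {pid_dir}/{network_id}.pid.haproxy
    daemon

listen listener
    bind 169.254.169.254:80
    server metadata {socket_path}
    http-request add-header X-OVN-Network-ID {network_id}
"""

def render_cfg(network_id,
               pid_dir='/var/lib/neutron/external/pids',
               socket_path='/var/lib/neutron/metadata_proxy'):
    """Fill the template for one network's metadata proxy."""
    return CFG_TEMPLATE.format(network_id=network_id,
                               pid_dir=pid_dir,
                               socket_path=socket_path)

cfg = render_cfg('63877f45-8244-4c80-903a-80901a7d83cb')
```

The `X-OVN-Network-ID` header is what lets the shared metadata backend (reached via the unix socket) attribute each 169.254.169.254 request to the right tenant network.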
Jan 23 05:07:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:07:58.553 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'env', 'PROCESS_TAG=haproxy-63877f45-8244-4c80-903a-80901a7d83cb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/63877f45-8244-4c80-903a-80901a7d83cb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.753 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162878.7521563, 91cc1048-141a-4a20-b148-991a883adfa9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.754 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Started (Lifecycle Event)#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.789 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.798 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162878.7526052, 91cc1048-141a-4a20-b148-991a883adfa9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.799 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.827 222021 DEBUG nova.compute.manager [req-8628cd4b-b25c-4f89-92db-e4c9759b42fe req-c44aabfb-3849-4265-aa3a-88924972683f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.827 222021 DEBUG oslo_concurrency.lockutils [req-8628cd4b-b25c-4f89-92db-e4c9759b42fe req-c44aabfb-3849-4265-aa3a-88924972683f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.827 222021 DEBUG oslo_concurrency.lockutils [req-8628cd4b-b25c-4f89-92db-e4c9759b42fe req-c44aabfb-3849-4265-aa3a-88924972683f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.828 222021 DEBUG oslo_concurrency.lockutils [req-8628cd4b-b25c-4f89-92db-e4c9759b42fe req-c44aabfb-3849-4265-aa3a-88924972683f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.828 222021 DEBUG nova.compute.manager [req-8628cd4b-b25c-4f89-92db-e4c9759b42fe req-c44aabfb-3849-4265-aa3a-88924972683f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Processing event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.829 222021 DEBUG nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.829 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.835 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.839 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162878.8343165, 91cc1048-141a-4a20-b148-991a883adfa9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.840 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.843 222021 INFO nova.virt.libvirt.driver [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance spawned successfully.#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.844 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.865 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.871 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.875 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.876 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.876 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.876 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.877 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.877 222021 DEBUG nova.virt.libvirt.driver [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:58 np0005593233 nova_compute[222017]: 2026-01-23 10:07:58.951 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:07:58 np0005593233 podman[269256]: 2026-01-23 10:07:58.97815642 +0000 UTC m=+0.063315631 container create 869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:07:59 np0005593233 nova_compute[222017]: 2026-01-23 10:07:59.022 222021 INFO nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Took 13.09 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:07:59 np0005593233 nova_compute[222017]: 2026-01-23 10:07:59.023 222021 DEBUG nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:59 np0005593233 systemd[1]: Started libpod-conmon-869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51.scope.
Jan 23 05:07:59 np0005593233 podman[269256]: 2026-01-23 10:07:58.945857417 +0000 UTC m=+0.031016658 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:07:59 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:07:59 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aafc31ebbfe0c62a91440757444d561240cfa96eb1ead8e69e07c040a314acd2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:07:59 np0005593233 podman[269256]: 2026-01-23 10:07:59.077051655 +0000 UTC m=+0.162210886 container init 869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:07:59 np0005593233 podman[269256]: 2026-01-23 10:07:59.083411922 +0000 UTC m=+0.168571133 container start 869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:07:59 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[269271]: [NOTICE]   (269275) : New worker (269277) forked
Jan 23 05:07:59 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[269271]: [NOTICE]   (269275) : Loading success.
Jan 23 05:07:59 np0005593233 nova_compute[222017]: 2026-01-23 10:07:59.142 222021 INFO nova.compute.manager [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Took 14.36 seconds to build instance.#033[00m
Jan 23 05:07:59 np0005593233 nova_compute[222017]: 2026-01-23 10:07:59.163 222021 DEBUG oslo_concurrency.lockutils [None req-99ba368f-5831-46de-b92c-2542dd6ab3f0 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:00.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:00.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:00 np0005593233 nova_compute[222017]: 2026-01-23 10:08:00.784 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:00 np0005593233 nova_compute[222017]: 2026-01-23 10:08:00.923 222021 DEBUG nova.compute.manager [req-a4bc11b3-963b-431b-8649-3335e8b256f9 req-c65cfa96-801b-4d21-81e5-ffbdecc0eb56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:00 np0005593233 nova_compute[222017]: 2026-01-23 10:08:00.924 222021 DEBUG oslo_concurrency.lockutils [req-a4bc11b3-963b-431b-8649-3335e8b256f9 req-c65cfa96-801b-4d21-81e5-ffbdecc0eb56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:00 np0005593233 nova_compute[222017]: 2026-01-23 10:08:00.924 222021 DEBUG oslo_concurrency.lockutils [req-a4bc11b3-963b-431b-8649-3335e8b256f9 req-c65cfa96-801b-4d21-81e5-ffbdecc0eb56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:00 np0005593233 nova_compute[222017]: 2026-01-23 10:08:00.924 222021 DEBUG oslo_concurrency.lockutils [req-a4bc11b3-963b-431b-8649-3335e8b256f9 req-c65cfa96-801b-4d21-81e5-ffbdecc0eb56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:00 np0005593233 nova_compute[222017]: 2026-01-23 10:08:00.924 222021 DEBUG nova.compute.manager [req-a4bc11b3-963b-431b-8649-3335e8b256f9 req-c65cfa96-801b-4d21-81e5-ffbdecc0eb56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:08:00 np0005593233 nova_compute[222017]: 2026-01-23 10:08:00.924 222021 WARNING nova.compute.manager [req-a4bc11b3-963b-431b-8649-3335e8b256f9 req-c65cfa96-801b-4d21-81e5-ffbdecc0eb56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received unexpected event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:08:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:01 np0005593233 nova_compute[222017]: 2026-01-23 10:08:01.350 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:02.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:02.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:04.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:04.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:05 np0005593233 nova_compute[222017]: 2026-01-23 10:08:05.553 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "8629dfed-aab4-4e09-aea4-060f986e8c25" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:05 np0005593233 nova_compute[222017]: 2026-01-23 10:08:05.556 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:05 np0005593233 nova_compute[222017]: 2026-01-23 10:08:05.580 222021 DEBUG nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:08:05 np0005593233 podman[269456]: 2026-01-23 10:08:05.640765882 +0000 UTC m=+0.076327265 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True)
Jan 23 05:08:05 np0005593233 nova_compute[222017]: 2026-01-23 10:08:05.695 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:05 np0005593233 nova_compute[222017]: 2026-01-23 10:08:05.697 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:05 np0005593233 nova_compute[222017]: 2026-01-23 10:08:05.708 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:08:05 np0005593233 nova_compute[222017]: 2026-01-23 10:08:05.708 222021 INFO nova.compute.claims [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:08:05 np0005593233 podman[269456]: 2026-01-23 10:08:05.765439757 +0000 UTC m=+0.201001050 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:08:05 np0005593233 nova_compute[222017]: 2026-01-23 10:08:05.791 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:05 np0005593233 nova_compute[222017]: 2026-01-23 10:08:05.892 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:08:06 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2410421172' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:08:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:06.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.355 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:06.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.378 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.385 222021 DEBUG nova.compute.provider_tree [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.403 222021 DEBUG nova.scheduler.client.report [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.439 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.441 222021 DEBUG nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.530 222021 DEBUG nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.531 222021 DEBUG nova.network.neutron [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.569 222021 INFO nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.597 222021 DEBUG nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.757 222021 DEBUG nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.759 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.760 222021 INFO nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Creating image(s)#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.791 222021 DEBUG nova.storage.rbd_utils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 8629dfed-aab4-4e09-aea4-060f986e8c25_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.833 222021 DEBUG nova.storage.rbd_utils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 8629dfed-aab4-4e09-aea4-060f986e8c25_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.868 222021 DEBUG nova.storage.rbd_utils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 8629dfed-aab4-4e09-aea4-060f986e8c25_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.877 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.945 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.946 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.947 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.948 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.979 222021 DEBUG nova.storage.rbd_utils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 8629dfed-aab4-4e09-aea4-060f986e8c25_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:06 np0005593233 nova_compute[222017]: 2026-01-23 10:08:06.983 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 8629dfed-aab4-4e09-aea4-060f986e8c25_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:08:07 np0005593233 nova_compute[222017]: 2026-01-23 10:08:07.332 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 8629dfed-aab4-4e09-aea4-060f986e8c25_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.349s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:07 np0005593233 nova_compute[222017]: 2026-01-23 10:08:07.417 222021 DEBUG nova.storage.rbd_utils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] resizing rbd image 8629dfed-aab4-4e09-aea4-060f986e8c25_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:08:07 np0005593233 nova_compute[222017]: 2026-01-23 10:08:07.543 222021 DEBUG nova.objects.instance [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'migration_context' on Instance uuid 8629dfed-aab4-4e09-aea4-060f986e8c25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:08:07 np0005593233 nova_compute[222017]: 2026-01-23 10:08:07.562 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:08:07 np0005593233 nova_compute[222017]: 2026-01-23 10:08:07.563 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Ensure instance console log exists: /var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:08:07 np0005593233 nova_compute[222017]: 2026-01-23 10:08:07.564 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:07 np0005593233 nova_compute[222017]: 2026-01-23 10:08:07.564 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:07 np0005593233 nova_compute[222017]: 2026-01-23 10:08:07.565 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:07 np0005593233 nova_compute[222017]: 2026-01-23 10:08:07.726 222021 DEBUG nova.policy [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cde472cc8af0464992006a69d047d0d4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '746ea02b745c4e21ace4cb49c193899d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:08:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:08:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:08.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:08.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:09 np0005593233 nova_compute[222017]: 2026-01-23 10:08:09.190 222021 DEBUG nova.network.neutron [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Successfully created port: 66259f0e-be52-4a83-90a2-42e1c92392fc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:08:10 np0005593233 podman[269896]: 2026-01-23 10:08:10.103665882 +0000 UTC m=+0.094518894 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 23 05:08:10 np0005593233 nova_compute[222017]: 2026-01-23 10:08:10.147 222021 DEBUG nova.network.neutron [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Successfully updated port: 66259f0e-be52-4a83-90a2-42e1c92392fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:08:10 np0005593233 nova_compute[222017]: 2026-01-23 10:08:10.167 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "refresh_cache-8629dfed-aab4-4e09-aea4-060f986e8c25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:08:10 np0005593233 nova_compute[222017]: 2026-01-23 10:08:10.168 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquired lock "refresh_cache-8629dfed-aab4-4e09-aea4-060f986e8c25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:08:10 np0005593233 nova_compute[222017]: 2026-01-23 10:08:10.168 222021 DEBUG nova.network.neutron [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:08:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:10.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:10.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:10 np0005593233 nova_compute[222017]: 2026-01-23 10:08:10.438 222021 DEBUG nova.compute.manager [req-f9f017b6-3bb0-483e-b570-9a64ab407207 req-2492d04d-aecf-4aca-a8f3-c390c051d93a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Received event network-changed-66259f0e-be52-4a83-90a2-42e1c92392fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:10 np0005593233 nova_compute[222017]: 2026-01-23 10:08:10.440 222021 DEBUG nova.compute.manager [req-f9f017b6-3bb0-483e-b570-9a64ab407207 req-2492d04d-aecf-4aca-a8f3-c390c051d93a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Refreshing instance network info cache due to event network-changed-66259f0e-be52-4a83-90a2-42e1c92392fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:08:10 np0005593233 nova_compute[222017]: 2026-01-23 10:08:10.441 222021 DEBUG oslo_concurrency.lockutils [req-f9f017b6-3bb0-483e-b570-9a64ab407207 req-2492d04d-aecf-4aca-a8f3-c390c051d93a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-8629dfed-aab4-4e09-aea4-060f986e8c25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:08:10 np0005593233 nova_compute[222017]: 2026-01-23 10:08:10.453 222021 DEBUG nova.network.neutron [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:08:10 np0005593233 nova_compute[222017]: 2026-01-23 10:08:10.832 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:11 np0005593233 nova_compute[222017]: 2026-01-23 10:08:11.356 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:12.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:12.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.880 222021 DEBUG nova.network.neutron [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Updating instance_info_cache with network_info: [{"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.909 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Releasing lock "refresh_cache-8629dfed-aab4-4e09-aea4-060f986e8c25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.910 222021 DEBUG nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Instance network_info: |[{"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.911 222021 DEBUG oslo_concurrency.lockutils [req-f9f017b6-3bb0-483e-b570-9a64ab407207 req-2492d04d-aecf-4aca-a8f3-c390c051d93a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-8629dfed-aab4-4e09-aea4-060f986e8c25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.911 222021 DEBUG nova.network.neutron [req-f9f017b6-3bb0-483e-b570-9a64ab407207 req-2492d04d-aecf-4aca-a8f3-c390c051d93a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Refreshing network info cache for port 66259f0e-be52-4a83-90a2-42e1c92392fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.916 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Start _get_guest_xml network_info=[{"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.922 222021 WARNING nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.932 222021 DEBUG nova.virt.libvirt.host [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.934 222021 DEBUG nova.virt.libvirt.host [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.938 222021 DEBUG nova.virt.libvirt.host [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.938 222021 DEBUG nova.virt.libvirt.host [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.940 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.941 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.942 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.942 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.942 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.943 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.943 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.943 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.944 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.944 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.944 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.945 222021 DEBUG nova.virt.hardware [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:08:12 np0005593233 nova_compute[222017]: 2026-01-23 10:08:12.950 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:08:13 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3481856771' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:08:13 np0005593233 nova_compute[222017]: 2026-01-23 10:08:13.474 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:13 np0005593233 ovn_controller[130653]: 2026-01-23T10:08:13Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:ad:46 10.100.0.12
Jan 23 05:08:13 np0005593233 ovn_controller[130653]: 2026-01-23T10:08:13Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:ad:46 10.100.0.12
Jan 23 05:08:13 np0005593233 nova_compute[222017]: 2026-01-23 10:08:13.507 222021 DEBUG nova.storage.rbd_utils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 8629dfed-aab4-4e09-aea4-060f986e8c25_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:13 np0005593233 nova_compute[222017]: 2026-01-23 10:08:13.513 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:08:13 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3391141298' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:08:13 np0005593233 nova_compute[222017]: 2026-01-23 10:08:13.982 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:13 np0005593233 nova_compute[222017]: 2026-01-23 10:08:13.985 222021 DEBUG nova.virt.libvirt.vif [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:08:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-892445505',display_name='tempest-ServersNegativeTestJSON-server-892445505',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-892445505',id=124,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-ietlz504',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-ServersNegativ
eTestJSON-623507515-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:08:06Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=8629dfed-aab4-4e09-aea4-060f986e8c25,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:08:13 np0005593233 nova_compute[222017]: 2026-01-23 10:08:13.986 222021 DEBUG nova.network.os_vif_util [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:08:13 np0005593233 nova_compute[222017]: 2026-01-23 10:08:13.987 222021 DEBUG nova.network.os_vif_util [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:5a:9a,bridge_name='br-int',has_traffic_filtering=True,id=66259f0e-be52-4a83-90a2-42e1c92392fc,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66259f0e-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:08:13 np0005593233 nova_compute[222017]: 2026-01-23 10:08:13.988 222021 DEBUG nova.objects.instance [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'pci_devices' on Instance uuid 8629dfed-aab4-4e09-aea4-060f986e8c25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.021 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <uuid>8629dfed-aab4-4e09-aea4-060f986e8c25</uuid>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <name>instance-0000007c</name>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersNegativeTestJSON-server-892445505</nova:name>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:08:12</nova:creationTime>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <nova:user uuid="cde472cc8af0464992006a69d047d0d4">tempest-ServersNegativeTestJSON-623507515-project-member</nova:user>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <nova:project uuid="746ea02b745c4e21ace4cb49c193899d">tempest-ServersNegativeTestJSON-623507515</nova:project>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <nova:port uuid="66259f0e-be52-4a83-90a2-42e1c92392fc">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <entry name="serial">8629dfed-aab4-4e09-aea4-060f986e8c25</entry>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <entry name="uuid">8629dfed-aab4-4e09-aea4-060f986e8c25</entry>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/8629dfed-aab4-4e09-aea4-060f986e8c25_disk">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/8629dfed-aab4-4e09-aea4-060f986e8c25_disk.config">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:47:5a:9a"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <target dev="tap66259f0e-be"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25/console.log" append="off"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:08:14 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:08:14 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:08:14 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:08:14 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.023 222021 DEBUG nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Preparing to wait for external event network-vif-plugged-66259f0e-be52-4a83-90a2-42e1c92392fc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.024 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.024 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.024 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.025 222021 DEBUG nova.virt.libvirt.vif [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:08:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-892445505',display_name='tempest-ServersNegativeTestJSON-server-892445505',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-892445505',id=124,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-ietlz504',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-ServersNegativeTestJSON-623507515-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:08:06Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=8629dfed-aab4-4e09-aea4-060f986e8c25,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.025 222021 DEBUG nova.network.os_vif_util [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.026 222021 DEBUG nova.network.os_vif_util [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:5a:9a,bridge_name='br-int',has_traffic_filtering=True,id=66259f0e-be52-4a83-90a2-42e1c92392fc,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66259f0e-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.027 222021 DEBUG os_vif [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:5a:9a,bridge_name='br-int',has_traffic_filtering=True,id=66259f0e-be52-4a83-90a2-42e1c92392fc,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66259f0e-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.027 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.028 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.028 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.032 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.032 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66259f0e-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.033 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66259f0e-be, col_values=(('external_ids', {'iface-id': '66259f0e-be52-4a83-90a2-42e1c92392fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:5a:9a', 'vm-uuid': '8629dfed-aab4-4e09-aea4-060f986e8c25'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.034 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:14 np0005593233 NetworkManager[48871]: <info>  [1769162894.0357] manager: (tap66259f0e-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.037 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.043 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.045 222021 INFO os_vif [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:5a:9a,bridge_name='br-int',has_traffic_filtering=True,id=66259f0e-be52-4a83-90a2-42e1c92392fc,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66259f0e-be')#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.112 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.113 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.113 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No VIF found with MAC fa:16:3e:47:5a:9a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.114 222021 INFO nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Using config drive#033[00m
Jan 23 05:08:14 np0005593233 nova_compute[222017]: 2026-01-23 10:08:14.142 222021 DEBUG nova.storage.rbd_utils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 8629dfed-aab4-4e09-aea4-060f986e8c25_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:14.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:14.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.027 222021 INFO nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Creating config drive at /var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25/disk.config#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.032 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyl_vaa64 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.174 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyl_vaa64" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.205 222021 DEBUG nova.storage.rbd_utils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 8629dfed-aab4-4e09-aea4-060f986e8c25_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.209 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25/disk.config 8629dfed-aab4-4e09-aea4-060f986e8c25_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.396 222021 DEBUG oslo_concurrency.processutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25/disk.config 8629dfed-aab4-4e09-aea4-060f986e8c25_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.187s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.397 222021 INFO nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Deleting local config drive /var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25/disk.config because it was imported into RBD.#033[00m
Jan 23 05:08:15 np0005593233 kernel: tap66259f0e-be: entered promiscuous mode
Jan 23 05:08:15 np0005593233 NetworkManager[48871]: <info>  [1769162895.4591] manager: (tap66259f0e-be): new Tun device (/org/freedesktop/NetworkManager/Devices/247)
Jan 23 05:08:15 np0005593233 ovn_controller[130653]: 2026-01-23T10:08:15Z|00525|binding|INFO|Claiming lport 66259f0e-be52-4a83-90a2-42e1c92392fc for this chassis.
Jan 23 05:08:15 np0005593233 ovn_controller[130653]: 2026-01-23T10:08:15Z|00526|binding|INFO|66259f0e-be52-4a83-90a2-42e1c92392fc: Claiming fa:16:3e:47:5a:9a 10.100.0.3
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.460 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:15 np0005593233 ovn_controller[130653]: 2026-01-23T10:08:15Z|00527|binding|INFO|Setting lport 66259f0e-be52-4a83-90a2-42e1c92392fc ovn-installed in OVS
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.485 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.490 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:15 np0005593233 systemd-udevd[270099]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:08:15 np0005593233 ovn_controller[130653]: 2026-01-23T10:08:15Z|00528|binding|INFO|Setting lport 66259f0e-be52-4a83-90a2-42e1c92392fc up in Southbound
Jan 23 05:08:15 np0005593233 NetworkManager[48871]: <info>  [1769162895.5081] device (tap66259f0e-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.507 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:5a:9a 10.100.0.3'], port_security=['fa:16:3e:47:5a:9a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8629dfed-aab4-4e09-aea4-060f986e8c25', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63877f45-8244-4c80-903a-80901a7d83cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746ea02b745c4e21ace4cb49c193899d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69bbd8fc-29e5-4515-a6ba-c1c319d113ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb65718-6c8e-4bc0-9249-e791b1bad19c, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=66259f0e-be52-4a83-90a2-42e1c92392fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:08:15 np0005593233 NetworkManager[48871]: <info>  [1769162895.5091] device (tap66259f0e-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.508 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 66259f0e-be52-4a83-90a2-42e1c92392fc in datapath 63877f45-8244-4c80-903a-80901a7d83cb bound to our chassis#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.510 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 63877f45-8244-4c80-903a-80901a7d83cb#033[00m
Jan 23 05:08:15 np0005593233 systemd-machined[190954]: New machine qemu-57-instance-0000007c.
Jan 23 05:08:15 np0005593233 systemd[1]: Started Virtual Machine qemu-57-instance-0000007c.
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.531 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2a5dcb-40e7-4212-9fe5-c09335f45554]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.569 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5b11e54b-14ed-4a47-9592-624bba5dcaa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.573 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7d12390e-2eed-4b9d-94f1-0794c31f89a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.610 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3b9436-50d1-4543-bf07-905a4d2ceff6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.630 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9a95e171-09e5-45e2-9006-e9e0b53cc56c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63877f45-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:59:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 679087, 'reachable_time': 32067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270115, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.645 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb33a04-9358-4d8a-b126-3b0a14417315]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap63877f45-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 679102, 'tstamp': 679102}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270116, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap63877f45-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 679106, 'tstamp': 679106}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270116, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.647 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63877f45-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.648 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.650 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.650 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63877f45-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.650 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.651 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap63877f45-80, col_values=(('external_ids', {'iface-id': 'c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:15.651 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.881 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162895.8808618, 8629dfed-aab4-4e09-aea4-060f986e8c25 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:08:15 np0005593233 nova_compute[222017]: 2026-01-23 10:08:15.882 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] VM Started (Lifecycle Event)#033[00m
Jan 23 05:08:16 np0005593233 nova_compute[222017]: 2026-01-23 10:08:16.021 222021 DEBUG nova.network.neutron [req-f9f017b6-3bb0-483e-b570-9a64ab407207 req-2492d04d-aecf-4aca-a8f3-c390c051d93a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Updated VIF entry in instance network info cache for port 66259f0e-be52-4a83-90a2-42e1c92392fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:08:16 np0005593233 nova_compute[222017]: 2026-01-23 10:08:16.022 222021 DEBUG nova.network.neutron [req-f9f017b6-3bb0-483e-b570-9a64ab407207 req-2492d04d-aecf-4aca-a8f3-c390c051d93a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Updating instance_info_cache with network_info: [{"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:08:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:16 np0005593233 nova_compute[222017]: 2026-01-23 10:08:16.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:16.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:08:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:16.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:08:16 np0005593233 nova_compute[222017]: 2026-01-23 10:08:16.987 222021 DEBUG oslo_concurrency.lockutils [req-f9f017b6-3bb0-483e-b570-9a64ab407207 req-2492d04d-aecf-4aca-a8f3-c390c051d93a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-8629dfed-aab4-4e09-aea4-060f986e8c25" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:08:16 np0005593233 nova_compute[222017]: 2026-01-23 10:08:16.991 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:16 np0005593233 nova_compute[222017]: 2026-01-23 10:08:16.996 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162895.8843029, 8629dfed-aab4-4e09-aea4-060f986e8c25 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:08:16 np0005593233 nova_compute[222017]: 2026-01-23 10:08:16.997 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.025 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.028 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.050 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.420 222021 DEBUG nova.compute.manager [req-68e6f4c8-f37c-471f-bc3f-65119fef5f98 req-49cf963a-f235-40c2-ac23-039dfdf818b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Received event network-vif-plugged-66259f0e-be52-4a83-90a2-42e1c92392fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.421 222021 DEBUG oslo_concurrency.lockutils [req-68e6f4c8-f37c-471f-bc3f-65119fef5f98 req-49cf963a-f235-40c2-ac23-039dfdf818b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.422 222021 DEBUG oslo_concurrency.lockutils [req-68e6f4c8-f37c-471f-bc3f-65119fef5f98 req-49cf963a-f235-40c2-ac23-039dfdf818b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.422 222021 DEBUG oslo_concurrency.lockutils [req-68e6f4c8-f37c-471f-bc3f-65119fef5f98 req-49cf963a-f235-40c2-ac23-039dfdf818b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.422 222021 DEBUG nova.compute.manager [req-68e6f4c8-f37c-471f-bc3f-65119fef5f98 req-49cf963a-f235-40c2-ac23-039dfdf818b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Processing event network-vif-plugged-66259f0e-be52-4a83-90a2-42e1c92392fc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.423 222021 DEBUG nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.427 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162897.4272125, 8629dfed-aab4-4e09-aea4-060f986e8c25 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.428 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.431 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.434 222021 INFO nova.virt.libvirt.driver [-] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Instance spawned successfully.#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.434 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.475 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.475 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.476 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.476 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.477 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.477 222021 DEBUG nova.virt.libvirt.driver [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.481 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.484 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.515 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.585 222021 INFO nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Took 10.83 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.585 222021 DEBUG nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.667 222021 INFO nova.compute.manager [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Took 12.02 seconds to build instance.#033[00m
Jan 23 05:08:17 np0005593233 nova_compute[222017]: 2026-01-23 10:08:17.714 222021 DEBUG oslo_concurrency.lockutils [None req-a571bb83-2e55-4d0b-a2dd-d9556e3cf629 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:18.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:18.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:19 np0005593233 nova_compute[222017]: 2026-01-23 10:08:19.035 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:20.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:08:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:20.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:08:21 np0005593233 nova_compute[222017]: 2026-01-23 10:08:21.154 222021 DEBUG nova.compute.manager [req-8a0c48ca-d94d-4e12-bcdb-c2c7cb0148c7 req-4d1f37f2-efb5-471f-a62a-d861015304fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Received event network-vif-plugged-66259f0e-be52-4a83-90a2-42e1c92392fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:21 np0005593233 nova_compute[222017]: 2026-01-23 10:08:21.155 222021 DEBUG oslo_concurrency.lockutils [req-8a0c48ca-d94d-4e12-bcdb-c2c7cb0148c7 req-4d1f37f2-efb5-471f-a62a-d861015304fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:21 np0005593233 nova_compute[222017]: 2026-01-23 10:08:21.156 222021 DEBUG oslo_concurrency.lockutils [req-8a0c48ca-d94d-4e12-bcdb-c2c7cb0148c7 req-4d1f37f2-efb5-471f-a62a-d861015304fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:21 np0005593233 nova_compute[222017]: 2026-01-23 10:08:21.157 222021 DEBUG oslo_concurrency.lockutils [req-8a0c48ca-d94d-4e12-bcdb-c2c7cb0148c7 req-4d1f37f2-efb5-471f-a62a-d861015304fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:21 np0005593233 nova_compute[222017]: 2026-01-23 10:08:21.158 222021 DEBUG nova.compute.manager [req-8a0c48ca-d94d-4e12-bcdb-c2c7cb0148c7 req-4d1f37f2-efb5-471f-a62a-d861015304fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] No waiting events found dispatching network-vif-plugged-66259f0e-be52-4a83-90a2-42e1c92392fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:08:21 np0005593233 nova_compute[222017]: 2026-01-23 10:08:21.158 222021 WARNING nova.compute.manager [req-8a0c48ca-d94d-4e12-bcdb-c2c7cb0148c7 req-4d1f37f2-efb5-471f-a62a-d861015304fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Received unexpected event network-vif-plugged-66259f0e-be52-4a83-90a2-42e1c92392fc for instance with vm_state active and task_state None.#033[00m
Jan 23 05:08:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:21 np0005593233 nova_compute[222017]: 2026-01-23 10:08:21.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 05:08:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:22.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 05:08:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:22.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:24 np0005593233 nova_compute[222017]: 2026-01-23 10:08:24.037 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:24.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:24.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:26 np0005593233 nova_compute[222017]: 2026-01-23 10:08:26.363 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:26.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:26.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:27 np0005593233 podman[270159]: 2026-01-23 10:08:27.163801373 +0000 UTC m=+0.168065019 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.330 222021 DEBUG oslo_concurrency.lockutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "8629dfed-aab4-4e09-aea4-060f986e8c25" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.330 222021 DEBUG oslo_concurrency.lockutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.331 222021 DEBUG oslo_concurrency.lockutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.331 222021 DEBUG oslo_concurrency.lockutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.331 222021 DEBUG oslo_concurrency.lockutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.332 222021 INFO nova.compute.manager [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Terminating instance#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.333 222021 DEBUG nova.compute.manager [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:08:27 np0005593233 kernel: tap66259f0e-be (unregistering): left promiscuous mode
Jan 23 05:08:27 np0005593233 NetworkManager[48871]: <info>  [1769162907.3846] device (tap66259f0e-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.390 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:27 np0005593233 ovn_controller[130653]: 2026-01-23T10:08:27Z|00529|binding|INFO|Releasing lport 66259f0e-be52-4a83-90a2-42e1c92392fc from this chassis (sb_readonly=0)
Jan 23 05:08:27 np0005593233 ovn_controller[130653]: 2026-01-23T10:08:27Z|00530|binding|INFO|Setting lport 66259f0e-be52-4a83-90a2-42e1c92392fc down in Southbound
Jan 23 05:08:27 np0005593233 ovn_controller[130653]: 2026-01-23T10:08:27Z|00531|binding|INFO|Removing iface tap66259f0e-be ovn-installed in OVS
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.394 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.398 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:5a:9a 10.100.0.3'], port_security=['fa:16:3e:47:5a:9a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8629dfed-aab4-4e09-aea4-060f986e8c25', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63877f45-8244-4c80-903a-80901a7d83cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746ea02b745c4e21ace4cb49c193899d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69bbd8fc-29e5-4515-a6ba-c1c319d113ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb65718-6c8e-4bc0-9249-e791b1bad19c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=66259f0e-be52-4a83-90a2-42e1c92392fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.400 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 66259f0e-be52-4a83-90a2-42e1c92392fc in datapath 63877f45-8244-4c80-903a-80901a7d83cb unbound from our chassis#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.401 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 63877f45-8244-4c80-903a-80901a7d83cb#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.413 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.420 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3160ac-0fa4-447f-9e1e-9d473110a3d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:27 np0005593233 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Jan 23 05:08:27 np0005593233 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007c.scope: Consumed 10.510s CPU time.
Jan 23 05:08:27 np0005593233 systemd-machined[190954]: Machine qemu-57-instance-0000007c terminated.
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.459 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3df97791-8e14-4aae-bab6-3197a79161c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.469 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fec97797-b455-4ba1-a4fe-49b85f69a123]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.508 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ab92e8-b180-421e-828f-c36198a6dab4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.532 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5f54075f-a4b2-40d2-adbc-dc647f392ab1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63877f45-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:59:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 679087, 'reachable_time': 32067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270200, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.552 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d8485e6f-8780-4301-9fa4-e81c1d5d77bd]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap63877f45-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 679102, 'tstamp': 679102}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270201, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap63877f45-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 679106, 'tstamp': 679106}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270201, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.554 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63877f45-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.556 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.561 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.563 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63877f45-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.563 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.564 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap63877f45-80, col_values=(('external_ids', {'iface-id': 'c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:27.565 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.587 222021 INFO nova.virt.libvirt.driver [-] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Instance destroyed successfully.#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.588 222021 DEBUG nova.objects.instance [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'resources' on Instance uuid 8629dfed-aab4-4e09-aea4-060f986e8c25 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.621 222021 DEBUG nova.virt.libvirt.vif [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:08:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-892445505',display_name='tempest-ServersNegativeTestJSON-server-892445505',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-892445505',id=124,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:08:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-ietlz504',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-ServersNegativeTestJSON-623507515-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:08:17Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=8629dfed-aab4-4e09-aea4-060f986e8c25,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.622 222021 DEBUG nova.network.os_vif_util [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "66259f0e-be52-4a83-90a2-42e1c92392fc", "address": "fa:16:3e:47:5a:9a", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66259f0e-be", "ovs_interfaceid": "66259f0e-be52-4a83-90a2-42e1c92392fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.623 222021 DEBUG nova.network.os_vif_util [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:5a:9a,bridge_name='br-int',has_traffic_filtering=True,id=66259f0e-be52-4a83-90a2-42e1c92392fc,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66259f0e-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.623 222021 DEBUG os_vif [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:5a:9a,bridge_name='br-int',has_traffic_filtering=True,id=66259f0e-be52-4a83-90a2-42e1c92392fc,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66259f0e-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.625 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.626 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66259f0e-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.628 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.630 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.633 222021 INFO os_vif [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:5a:9a,bridge_name='br-int',has_traffic_filtering=True,id=66259f0e-be52-4a83-90a2-42e1c92392fc,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66259f0e-be')#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.882 222021 DEBUG nova.compute.manager [req-c3354df2-cb12-4824-97c9-f323bb93d7a7 req-2807ca26-6dea-4600-816b-695943534518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Received event network-vif-unplugged-66259f0e-be52-4a83-90a2-42e1c92392fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.883 222021 DEBUG oslo_concurrency.lockutils [req-c3354df2-cb12-4824-97c9-f323bb93d7a7 req-2807ca26-6dea-4600-816b-695943534518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.883 222021 DEBUG oslo_concurrency.lockutils [req-c3354df2-cb12-4824-97c9-f323bb93d7a7 req-2807ca26-6dea-4600-816b-695943534518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.884 222021 DEBUG oslo_concurrency.lockutils [req-c3354df2-cb12-4824-97c9-f323bb93d7a7 req-2807ca26-6dea-4600-816b-695943534518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.884 222021 DEBUG nova.compute.manager [req-c3354df2-cb12-4824-97c9-f323bb93d7a7 req-2807ca26-6dea-4600-816b-695943534518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] No waiting events found dispatching network-vif-unplugged-66259f0e-be52-4a83-90a2-42e1c92392fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:08:27 np0005593233 nova_compute[222017]: 2026-01-23 10:08:27.885 222021 DEBUG nova.compute.manager [req-c3354df2-cb12-4824-97c9-f323bb93d7a7 req-2807ca26-6dea-4600-816b-695943534518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Received event network-vif-unplugged-66259f0e-be52-4a83-90a2-42e1c92392fc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:08:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:28.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:28.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:28 np0005593233 nova_compute[222017]: 2026-01-23 10:08:28.720 222021 INFO nova.virt.libvirt.driver [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Deleting instance files /var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25_del#033[00m
Jan 23 05:08:28 np0005593233 nova_compute[222017]: 2026-01-23 10:08:28.721 222021 INFO nova.virt.libvirt.driver [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Deletion of /var/lib/nova/instances/8629dfed-aab4-4e09-aea4-060f986e8c25_del complete#033[00m
Jan 23 05:08:29 np0005593233 nova_compute[222017]: 2026-01-23 10:08:29.690 222021 INFO nova.compute.manager [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Took 2.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:08:29 np0005593233 nova_compute[222017]: 2026-01-23 10:08:29.691 222021 DEBUG oslo.service.loopingcall [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:08:29 np0005593233 nova_compute[222017]: 2026-01-23 10:08:29.692 222021 DEBUG nova.compute.manager [-] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:08:29 np0005593233 nova_compute[222017]: 2026-01-23 10:08:29.692 222021 DEBUG nova.network.neutron [-] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:08:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:30.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:30.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:30 np0005593233 nova_compute[222017]: 2026-01-23 10:08:30.812 222021 DEBUG nova.compute.manager [req-b0540057-8c97-4fd2-8723-fa51e95cb630 req-7869ccff-afb6-46bc-94df-f7128cbf1059 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Received event network-vif-plugged-66259f0e-be52-4a83-90a2-42e1c92392fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:30 np0005593233 nova_compute[222017]: 2026-01-23 10:08:30.813 222021 DEBUG oslo_concurrency.lockutils [req-b0540057-8c97-4fd2-8723-fa51e95cb630 req-7869ccff-afb6-46bc-94df-f7128cbf1059 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:30 np0005593233 nova_compute[222017]: 2026-01-23 10:08:30.813 222021 DEBUG oslo_concurrency.lockutils [req-b0540057-8c97-4fd2-8723-fa51e95cb630 req-7869ccff-afb6-46bc-94df-f7128cbf1059 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:30 np0005593233 nova_compute[222017]: 2026-01-23 10:08:30.813 222021 DEBUG oslo_concurrency.lockutils [req-b0540057-8c97-4fd2-8723-fa51e95cb630 req-7869ccff-afb6-46bc-94df-f7128cbf1059 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:30 np0005593233 nova_compute[222017]: 2026-01-23 10:08:30.814 222021 DEBUG nova.compute.manager [req-b0540057-8c97-4fd2-8723-fa51e95cb630 req-7869ccff-afb6-46bc-94df-f7128cbf1059 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] No waiting events found dispatching network-vif-plugged-66259f0e-be52-4a83-90a2-42e1c92392fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:08:30 np0005593233 nova_compute[222017]: 2026-01-23 10:08:30.814 222021 WARNING nova.compute.manager [req-b0540057-8c97-4fd2-8723-fa51e95cb630 req-7869ccff-afb6-46bc-94df-f7128cbf1059 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Received unexpected event network-vif-plugged-66259f0e-be52-4a83-90a2-42e1c92392fc for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:08:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:31 np0005593233 nova_compute[222017]: 2026-01-23 10:08:31.365 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:32 np0005593233 nova_compute[222017]: 2026-01-23 10:08:32.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:32.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:32.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:32 np0005593233 nova_compute[222017]: 2026-01-23 10:08:32.628 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:08:33 np0005593233 nova_compute[222017]: 2026-01-23 10:08:33.466 222021 DEBUG nova.network.neutron [-] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:08:33 np0005593233 nova_compute[222017]: 2026-01-23 10:08:33.497 222021 INFO nova.compute.manager [-] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Took 3.81 seconds to deallocate network for instance.
Jan 23 05:08:33 np0005593233 nova_compute[222017]: 2026-01-23 10:08:33.564 222021 DEBUG oslo_concurrency.lockutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:08:33 np0005593233 nova_compute[222017]: 2026-01-23 10:08:33.565 222021 DEBUG oslo_concurrency.lockutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:08:33 np0005593233 nova_compute[222017]: 2026-01-23 10:08:33.679 222021 DEBUG nova.compute.manager [req-232fe43b-8680-4ff8-934f-ad5709da6d6f req-a39851e5-d5fa-4b3b-9795-57df40e7fb49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Received event network-vif-deleted-66259f0e-be52-4a83-90a2-42e1c92392fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:08:33 np0005593233 nova_compute[222017]: 2026-01-23 10:08:33.980 222021 DEBUG oslo_concurrency.processutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:08:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:34.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:34.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:08:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/817189493' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:08:34 np0005593233 nova_compute[222017]: 2026-01-23 10:08:34.444 222021 DEBUG oslo_concurrency.processutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:08:34 np0005593233 nova_compute[222017]: 2026-01-23 10:08:34.451 222021 DEBUG nova.compute.provider_tree [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:08:35 np0005593233 nova_compute[222017]: 2026-01-23 10:08:35.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:08:35 np0005593233 nova_compute[222017]: 2026-01-23 10:08:35.978 222021 DEBUG nova.scheduler.client.report [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:08:36 np0005593233 nova_compute[222017]: 2026-01-23 10:08:36.018 222021 DEBUG oslo_concurrency.lockutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:08:36 np0005593233 nova_compute[222017]: 2026-01-23 10:08:36.167 222021 INFO nova.scheduler.client.report [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Deleted allocations for instance 8629dfed-aab4-4e09-aea4-060f986e8c25
Jan 23 05:08:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:36 np0005593233 nova_compute[222017]: 2026-01-23 10:08:36.323 222021 DEBUG oslo_concurrency.lockutils [None req-6cad410d-057f-45bd-bd69-3cc00b699452 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "8629dfed-aab4-4e09-aea4-060f986e8c25" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:08:36 np0005593233 nova_compute[222017]: 2026-01-23 10:08:36.369 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:08:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:36.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:36.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:37 np0005593233 nova_compute[222017]: 2026-01-23 10:08:37.630 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:08:38 np0005593233 nova_compute[222017]: 2026-01-23 10:08:38.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:08:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:38.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:38.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:39 np0005593233 nova_compute[222017]: 2026-01-23 10:08:39.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:08:39 np0005593233 nova_compute[222017]: 2026-01-23 10:08:39.434 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:08:39 np0005593233 nova_compute[222017]: 2026-01-23 10:08:39.434 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:08:39 np0005593233 nova_compute[222017]: 2026-01-23 10:08:39.434 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:08:39 np0005593233 nova_compute[222017]: 2026-01-23 10:08:39.435 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 05:08:39 np0005593233 nova_compute[222017]: 2026-01-23 10:08:39.435 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:08:39 np0005593233 nova_compute[222017]: 2026-01-23 10:08:39.893 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:08:39 np0005593233 nova_compute[222017]: 2026-01-23 10:08:39.981 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:08:39 np0005593233 nova_compute[222017]: 2026-01-23 10:08:39.982 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:08:40 np0005593233 nova_compute[222017]: 2026-01-23 10:08:40.143 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:08:40 np0005593233 nova_compute[222017]: 2026-01-23 10:08:40.144 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4266MB free_disk=20.806034088134766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:08:40 np0005593233 nova_compute[222017]: 2026-01-23 10:08:40.144 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:40 np0005593233 nova_compute[222017]: 2026-01-23 10:08:40.144 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:40.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:40.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e269 e269: 3 total, 3 up, 3 in
Jan 23 05:08:41 np0005593233 podman[270278]: 2026-01-23 10:08:41.106723644 +0000 UTC m=+0.089540574 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:08:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:41 np0005593233 nova_compute[222017]: 2026-01-23 10:08:41.373 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:08:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e270 e270: 3 total, 3 up, 3 in
Jan 23 05:08:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:42.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:42.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:42 np0005593233 nova_compute[222017]: 2026-01-23 10:08:42.449 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 91cc1048-141a-4a20-b148-991a883adfa9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:08:42 np0005593233 nova_compute[222017]: 2026-01-23 10:08:42.449 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:08:42 np0005593233 nova_compute[222017]: 2026-01-23 10:08:42.450 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:08:42 np0005593233 nova_compute[222017]: 2026-01-23 10:08:42.528 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:08:42 np0005593233 nova_compute[222017]: 2026-01-23 10:08:42.585 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162907.583866, 8629dfed-aab4-4e09-aea4-060f986e8c25 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:08:42 np0005593233 nova_compute[222017]: 2026-01-23 10:08:42.586 222021 INFO nova.compute.manager [-] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] VM Stopped (Lifecycle Event)
Jan 23 05:08:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e271 e271: 3 total, 3 up, 3 in
Jan 23 05:08:42 np0005593233 nova_compute[222017]: 2026-01-23 10:08:42.625 222021 DEBUG nova.compute.manager [None req-190ffad9-4c50-4b6a-bd09-ffff399acd7b - - - - - -] [instance: 8629dfed-aab4-4e09-aea4-060f986e8c25] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:08:42 np0005593233 nova_compute[222017]: 2026-01-23 10:08:42.632 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:08:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:42.671 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:08:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:42.671 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:08:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:42.672 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:08:43 np0005593233 nova_compute[222017]: 2026-01-23 10:08:43.029 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:08:43 np0005593233 nova_compute[222017]: 2026-01-23 10:08:43.037 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:08:43 np0005593233 nova_compute[222017]: 2026-01-23 10:08:43.151 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:08:43 np0005593233 nova_compute[222017]: 2026-01-23 10:08:43.188 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:08:43 np0005593233 nova_compute[222017]: 2026-01-23 10:08:43.189 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:08:44 np0005593233 nova_compute[222017]: 2026-01-23 10:08:44.190 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:08:44 np0005593233 nova_compute[222017]: 2026-01-23 10:08:44.190 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:08:44 np0005593233 nova_compute[222017]: 2026-01-23 10:08:44.191 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:08:44 np0005593233 nova_compute[222017]: 2026-01-23 10:08:44.191 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:08:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:44.391 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:08:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:44.392 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:08:44 np0005593233 nova_compute[222017]: 2026-01-23 10:08:44.393 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:08:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:44.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:44.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:08:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/558863384' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:08:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:08:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/558863384' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:08:45 np0005593233 nova_compute[222017]: 2026-01-23 10:08:45.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:08:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:46 np0005593233 nova_compute[222017]: 2026-01-23 10:08:46.376 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:46.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:46.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:47 np0005593233 nova_compute[222017]: 2026-01-23 10:08:47.449 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:47 np0005593233 nova_compute[222017]: 2026-01-23 10:08:47.449 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:08:47 np0005593233 nova_compute[222017]: 2026-01-23 10:08:47.450 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:08:47 np0005593233 nova_compute[222017]: 2026-01-23 10:08:47.635 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:47 np0005593233 nova_compute[222017]: 2026-01-23 10:08:47.951 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:08:47 np0005593233 nova_compute[222017]: 2026-01-23 10:08:47.952 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:08:47 np0005593233 nova_compute[222017]: 2026-01-23 10:08:47.953 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:08:47 np0005593233 nova_compute[222017]: 2026-01-23 10:08:47.953 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:08:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:48.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:48.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e272 e272: 3 total, 3 up, 3 in
Jan 23 05:08:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:50.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:50.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:51 np0005593233 nova_compute[222017]: 2026-01-23 10:08:51.378 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:51 np0005593233 nova_compute[222017]: 2026-01-23 10:08:51.400 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:08:51 np0005593233 nova_compute[222017]: 2026-01-23 10:08:51.435 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:08:51 np0005593233 nova_compute[222017]: 2026-01-23 10:08:51.435 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:08:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e273 e273: 3 total, 3 up, 3 in
Jan 23 05:08:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e274 e274: 3 total, 3 up, 3 in
Jan 23 05:08:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:52.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:52.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:52 np0005593233 nova_compute[222017]: 2026-01-23 10:08:52.637 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:53 np0005593233 nova_compute[222017]: 2026-01-23 10:08:53.366 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:08:53.394 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:54.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:54.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:55 np0005593233 nova_compute[222017]: 2026-01-23 10:08:55.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:55 np0005593233 nova_compute[222017]: 2026-01-23 10:08:55.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:08:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:56 np0005593233 nova_compute[222017]: 2026-01-23 10:08:56.413 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:08:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:56.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:08:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:56.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:57 np0005593233 nova_compute[222017]: 2026-01-23 10:08:57.639 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:58 np0005593233 podman[270322]: 2026-01-23 10:08:58.100198743 +0000 UTC m=+0.110761394 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:08:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:58.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:08:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:58.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:00.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:00.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e275 e275: 3 total, 3 up, 3 in
Jan 23 05:09:01 np0005593233 nova_compute[222017]: 2026-01-23 10:09:01.419 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e276 e276: 3 total, 3 up, 3 in
Jan 23 05:09:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:02.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:02.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:02 np0005593233 nova_compute[222017]: 2026-01-23 10:09:02.641 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e277 e277: 3 total, 3 up, 3 in
Jan 23 05:09:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e278 e278: 3 total, 3 up, 3 in
Jan 23 05:09:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:04.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:04.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:06.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:06.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:06 np0005593233 nova_compute[222017]: 2026-01-23 10:09:06.461 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:07 np0005593233 nova_compute[222017]: 2026-01-23 10:09:07.665 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:08 np0005593233 nova_compute[222017]: 2026-01-23 10:09:08.407 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:08 np0005593233 nova_compute[222017]: 2026-01-23 10:09:08.408 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:09:08 np0005593233 nova_compute[222017]: 2026-01-23 10:09:08.437 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:09:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:08.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:08.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e279 e279: 3 total, 3 up, 3 in
Jan 23 05:09:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:10.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:10.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:11 np0005593233 nova_compute[222017]: 2026-01-23 10:09:11.464 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e280 e280: 3 total, 3 up, 3 in
Jan 23 05:09:12 np0005593233 podman[270349]: 2026-01-23 10:09:12.065013119 +0000 UTC m=+0.059945467 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 05:09:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:12.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:12.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:12 np0005593233 nova_compute[222017]: 2026-01-23 10:09:12.718 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:14.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:09:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:14.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:09:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e281 e281: 3 total, 3 up, 3 in
Jan 23 05:09:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:16.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:16 np0005593233 nova_compute[222017]: 2026-01-23 10:09:16.466 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:16.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e282 e282: 3 total, 3 up, 3 in
Jan 23 05:09:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:09:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:09:17 np0005593233 nova_compute[222017]: 2026-01-23 10:09:17.721 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:18.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:18.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:20.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:20.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:21 np0005593233 nova_compute[222017]: 2026-01-23 10:09:21.468 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e283 e283: 3 total, 3 up, 3 in
Jan 23 05:09:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:22.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:22.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:22 np0005593233 nova_compute[222017]: 2026-01-23 10:09:22.723 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:24.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:24.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:26.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:26.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:26 np0005593233 nova_compute[222017]: 2026-01-23 10:09:26.517 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 e284: 3 total, 3 up, 3 in
Jan 23 05:09:27 np0005593233 nova_compute[222017]: 2026-01-23 10:09:27.726 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:28.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:28.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:29 np0005593233 podman[270549]: 2026-01-23 10:09:29.123486366 +0000 UTC m=+0.128326811 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 23 05:09:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:30.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:30.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:31 np0005593233 nova_compute[222017]: 2026-01-23 10:09:31.519 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:32.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:32.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:32 np0005593233 nova_compute[222017]: 2026-01-23 10:09:32.729 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #112. Immutable memtables: 0.
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:33.947602) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 69] Flushing memtable with next log file: 112
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162973947680, "job": 69, "event": "flush_started", "num_memtables": 1, "num_entries": 1804, "num_deletes": 258, "total_data_size": 3976394, "memory_usage": 4036112, "flush_reason": "Manual Compaction"}
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 69] Level-0 flush table #113: started
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162973975760, "cf_name": "default", "job": 69, "event": "table_file_creation", "file_number": 113, "file_size": 2600978, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55174, "largest_seqno": 56973, "table_properties": {"data_size": 2593315, "index_size": 4541, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16662, "raw_average_key_size": 20, "raw_value_size": 2577779, "raw_average_value_size": 3218, "num_data_blocks": 198, "num_entries": 801, "num_filter_entries": 801, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162842, "oldest_key_time": 1769162842, "file_creation_time": 1769162973, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 113, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 69] Flush lasted 28235 microseconds, and 15149 cpu microseconds.
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:33.975837) [db/flush_job.cc:967] [default] [JOB 69] Level-0 flush table #113: 2600978 bytes OK
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:33.975867) [db/memtable_list.cc:519] [default] Level-0 commit table #113 started
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:33.977965) [db/memtable_list.cc:722] [default] Level-0 commit table #113: memtable #1 done
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:33.977983) EVENT_LOG_v1 {"time_micros": 1769162973977977, "job": 69, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:33.978011) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 69] Try to delete WAL files size 3967984, prev total WAL file size 3967984, number of live WAL files 2.
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000109.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:33.979423) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034373639' seq:72057594037927935, type:22 .. '7061786F730035303231' seq:0, type:0; will stop at (end)
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 70] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 69 Base level 0, inputs: [113(2540KB)], [111(9531KB)]
Jan 23 05:09:33 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162973979527, "job": 70, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [113], "files_L6": [111], "score": -1, "input_data_size": 12361670, "oldest_snapshot_seqno": -1}
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 70] Generated table #114: 7916 keys, 10458340 bytes, temperature: kUnknown
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162974083673, "cf_name": "default", "job": 70, "event": "table_file_creation", "file_number": 114, "file_size": 10458340, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10407680, "index_size": 29748, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19845, "raw_key_size": 205073, "raw_average_key_size": 25, "raw_value_size": 10268834, "raw_average_value_size": 1297, "num_data_blocks": 1165, "num_entries": 7916, "num_filter_entries": 7916, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769162973, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 114, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:34.084163) [db/compaction/compaction_job.cc:1663] [default] [JOB 70] Compacted 1@0 + 1@6 files to L6 => 10458340 bytes
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:34.085797) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 118.6 rd, 100.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 9.3 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(8.8) write-amplify(4.0) OK, records in: 8446, records dropped: 530 output_compression: NoCompression
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:34.085859) EVENT_LOG_v1 {"time_micros": 1769162974085837, "job": 70, "event": "compaction_finished", "compaction_time_micros": 104260, "compaction_time_cpu_micros": 28386, "output_level": 6, "num_output_files": 1, "total_output_size": 10458340, "num_input_records": 8446, "num_output_records": 7916, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000113.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162974086838, "job": 70, "event": "table_file_deletion", "file_number": 113}
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000111.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162974088822, "job": 70, "event": "table_file_deletion", "file_number": 111}
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:33.979292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:34.088903) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:34.088939) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:34.088941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:34.088943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:09:34.088946) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593233 nova_compute[222017]: 2026-01-23 10:09:34.415 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:09:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:34.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:09:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:34.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:35 np0005593233 nova_compute[222017]: 2026-01-23 10:09:35.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:36.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:36.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:36 np0005593233 nova_compute[222017]: 2026-01-23 10:09:36.563 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:37 np0005593233 nova_compute[222017]: 2026-01-23 10:09:37.731 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:38.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:38.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:39 np0005593233 nova_compute[222017]: 2026-01-23 10:09:39.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:40.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:40.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:41 np0005593233 nova_compute[222017]: 2026-01-23 10:09:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:41 np0005593233 nova_compute[222017]: 2026-01-23 10:09:41.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:41 np0005593233 nova_compute[222017]: 2026-01-23 10:09:41.439 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:41 np0005593233 nova_compute[222017]: 2026-01-23 10:09:41.440 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:41 np0005593233 nova_compute[222017]: 2026-01-23 10:09:41.440 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:41 np0005593233 nova_compute[222017]: 2026-01-23 10:09:41.441 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:09:41 np0005593233 nova_compute[222017]: 2026-01-23 10:09:41.442 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:41 np0005593233 nova_compute[222017]: 2026-01-23 10:09:41.603 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:09:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1808218231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.009 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.567s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.080 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.080 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.270 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.272 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4319MB free_disk=20.84278106689453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.272 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.273 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:42.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:42.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.559 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 91cc1048-141a-4a20-b148-991a883adfa9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.560 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.560 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.626 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.649 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.649 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:09:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:09:42.672 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:09:42.673 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.673 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:09:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:09:42.674 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.704 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.733 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:42 np0005593233 nova_compute[222017]: 2026-01-23 10:09:42.746 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:43 np0005593233 podman[270614]: 2026-01-23 10:09:43.048491973 +0000 UTC m=+0.061209132 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:09:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:09:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/420221121' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:09:43 np0005593233 nova_compute[222017]: 2026-01-23 10:09:43.220 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:43 np0005593233 nova_compute[222017]: 2026-01-23 10:09:43.225 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:09:43 np0005593233 nova_compute[222017]: 2026-01-23 10:09:43.246 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:09:43 np0005593233 nova_compute[222017]: 2026-01-23 10:09:43.247 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:09:43 np0005593233 nova_compute[222017]: 2026-01-23 10:09:43.248 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:09:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/841356999' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:09:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:09:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/841356999' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:09:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:44.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:44.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:45 np0005593233 nova_compute[222017]: 2026-01-23 10:09:45.248 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:45 np0005593233 nova_compute[222017]: 2026-01-23 10:09:45.248 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:45 np0005593233 nova_compute[222017]: 2026-01-23 10:09:45.249 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:09:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:46.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:46.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:46 np0005593233 nova_compute[222017]: 2026-01-23 10:09:46.605 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:47 np0005593233 nova_compute[222017]: 2026-01-23 10:09:47.024 222021 INFO nova.compute.manager [None req-37b225c8-e881-4b8a-ada9-58b25db44b27 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Pausing#033[00m
Jan 23 05:09:47 np0005593233 nova_compute[222017]: 2026-01-23 10:09:47.026 222021 DEBUG nova.objects.instance [None req-37b225c8-e881-4b8a-ada9-58b25db44b27 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'flavor' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:09:47 np0005593233 nova_compute[222017]: 2026-01-23 10:09:47.065 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162987.0640914, 91cc1048-141a-4a20-b148-991a883adfa9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:09:47 np0005593233 nova_compute[222017]: 2026-01-23 10:09:47.065 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:09:47 np0005593233 nova_compute[222017]: 2026-01-23 10:09:47.068 222021 DEBUG nova.compute.manager [None req-37b225c8-e881-4b8a-ada9-58b25db44b27 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:47 np0005593233 nova_compute[222017]: 2026-01-23 10:09:47.117 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:47 np0005593233 nova_compute[222017]: 2026-01-23 10:09:47.121 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:09:47 np0005593233 nova_compute[222017]: 2026-01-23 10:09:47.736 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:47 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #49. Immutable memtables: 6.
Jan 23 05:09:48 np0005593233 nova_compute[222017]: 2026-01-23 10:09:48.125 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:09:48.132 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:09:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:09:48.133 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:09:48 np0005593233 nova_compute[222017]: 2026-01-23 10:09:48.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:48 np0005593233 nova_compute[222017]: 2026-01-23 10:09:48.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:09:48 np0005593233 nova_compute[222017]: 2026-01-23 10:09:48.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:09:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:48.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:48.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:48 np0005593233 nova_compute[222017]: 2026-01-23 10:09:48.697 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:09:48 np0005593233 nova_compute[222017]: 2026-01-23 10:09:48.698 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:09:48 np0005593233 nova_compute[222017]: 2026-01-23 10:09:48.698 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:09:48 np0005593233 nova_compute[222017]: 2026-01-23 10:09:48.699 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:09:49 np0005593233 nova_compute[222017]: 2026-01-23 10:09:49.720 222021 INFO nova.compute.manager [None req-322e3617-9cac-4f0d-800c-a2f9465c91ca cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Unpausing#033[00m
Jan 23 05:09:49 np0005593233 nova_compute[222017]: 2026-01-23 10:09:49.722 222021 DEBUG nova.objects.instance [None req-322e3617-9cac-4f0d-800c-a2f9465c91ca cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'flavor' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:09:49 np0005593233 nova_compute[222017]: 2026-01-23 10:09:49.758 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769162989.7580857, 91cc1048-141a-4a20-b148-991a883adfa9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:09:49 np0005593233 nova_compute[222017]: 2026-01-23 10:09:49.759 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:09:49 np0005593233 virtqemud[221325]: argument unsupported: QEMU guest agent is not configured
Jan 23 05:09:49 np0005593233 nova_compute[222017]: 2026-01-23 10:09:49.765 222021 DEBUG nova.virt.libvirt.guest [None req-322e3617-9cac-4f0d-800c-a2f9465c91ca cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 05:09:49 np0005593233 nova_compute[222017]: 2026-01-23 10:09:49.766 222021 DEBUG nova.compute.manager [None req-322e3617-9cac-4f0d-800c-a2f9465c91ca cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:49 np0005593233 nova_compute[222017]: 2026-01-23 10:09:49.813 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:49 np0005593233 nova_compute[222017]: 2026-01-23 10:09:49.817 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:09:49 np0005593233 nova_compute[222017]: 2026-01-23 10:09:49.851 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Jan 23 05:09:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:50.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:50.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:50 np0005593233 nova_compute[222017]: 2026-01-23 10:09:50.717 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:09:50 np0005593233 nova_compute[222017]: 2026-01-23 10:09:50.758 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:09:50 np0005593233 nova_compute[222017]: 2026-01-23 10:09:50.759 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:09:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:51 np0005593233 nova_compute[222017]: 2026-01-23 10:09:51.608 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:52.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:52.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:52 np0005593233 nova_compute[222017]: 2026-01-23 10:09:52.739 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:52 np0005593233 nova_compute[222017]: 2026-01-23 10:09:52.751 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:54.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:54.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:56.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:09:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:56.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:09:56 np0005593233 nova_compute[222017]: 2026-01-23 10:09:56.611 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:09:57.135 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:57 np0005593233 nova_compute[222017]: 2026-01-23 10:09:57.746 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:09:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:58.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:09:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:09:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:58.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:00 np0005593233 podman[270637]: 2026-01-23 10:10:00.118046352 +0000 UTC m=+0.128442704 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:10:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:00.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:00.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 05:10:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:01 np0005593233 nova_compute[222017]: 2026-01-23 10:10:01.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:01 np0005593233 nova_compute[222017]: 2026-01-23 10:10:01.614 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:02.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:10:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:02.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:10:02 np0005593233 nova_compute[222017]: 2026-01-23 10:10:02.747 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:02 np0005593233 nova_compute[222017]: 2026-01-23 10:10:02.919 222021 DEBUG nova.compute.manager [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Stashing vm_state: stopped _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.039 222021 DEBUG oslo_concurrency.lockutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.040 222021 DEBUG oslo_concurrency.lockutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.094 222021 DEBUG nova.objects.instance [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'pci_requests' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.129 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.130 222021 INFO nova.compute.claims [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.130 222021 DEBUG nova.objects.instance [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'resources' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.162 222021 DEBUG nova.objects.instance [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'pci_devices' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.236 222021 INFO nova.compute.resource_tracker [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Updating resource usage from migration fef9a971-4d26-4fce-accb-849c814745fe#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.237 222021 DEBUG nova.compute.resource_tracker [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Starting to track incoming migration fef9a971-4d26-4fce-accb-849c814745fe with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.383 222021 DEBUG oslo_concurrency.processutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:03 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1483641124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.945 222021 DEBUG oslo_concurrency.processutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.957 222021 DEBUG nova.compute.provider_tree [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:10:03 np0005593233 nova_compute[222017]: 2026-01-23 10:10:03.979 222021 DEBUG nova.scheduler.client.report [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:10:04 np0005593233 nova_compute[222017]: 2026-01-23 10:10:04.010 222021 DEBUG oslo_concurrency.lockutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.970s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:04 np0005593233 nova_compute[222017]: 2026-01-23 10:10:04.010 222021 INFO nova.compute.manager [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Migrating#033[00m
Jan 23 05:10:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:04.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:04.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:06.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:06.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:06 np0005593233 nova_compute[222017]: 2026-01-23 10:10:06.617 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:07 np0005593233 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 05:10:07 np0005593233 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 05:10:07 np0005593233 systemd-logind[804]: New session 60 of user nova.
Jan 23 05:10:07 np0005593233 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 05:10:07 np0005593233 systemd[1]: Starting User Manager for UID 42436...
Jan 23 05:10:07 np0005593233 systemd[270690]: Queued start job for default target Main User Target.
Jan 23 05:10:07 np0005593233 systemd[270690]: Created slice User Application Slice.
Jan 23 05:10:07 np0005593233 systemd[270690]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:10:07 np0005593233 systemd[270690]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 05:10:07 np0005593233 systemd[270690]: Reached target Paths.
Jan 23 05:10:07 np0005593233 systemd[270690]: Reached target Timers.
Jan 23 05:10:07 np0005593233 systemd[270690]: Starting D-Bus User Message Bus Socket...
Jan 23 05:10:07 np0005593233 systemd[270690]: Starting Create User's Volatile Files and Directories...
Jan 23 05:10:07 np0005593233 systemd[270690]: Listening on D-Bus User Message Bus Socket.
Jan 23 05:10:07 np0005593233 systemd[270690]: Finished Create User's Volatile Files and Directories.
Jan 23 05:10:07 np0005593233 systemd[270690]: Reached target Sockets.
Jan 23 05:10:07 np0005593233 systemd[270690]: Reached target Basic System.
Jan 23 05:10:07 np0005593233 systemd[270690]: Reached target Main User Target.
Jan 23 05:10:07 np0005593233 systemd[270690]: Startup finished in 147ms.
Jan 23 05:10:07 np0005593233 systemd[1]: Started User Manager for UID 42436.
Jan 23 05:10:07 np0005593233 systemd[1]: Started Session 60 of User nova.
Jan 23 05:10:07 np0005593233 systemd[1]: session-60.scope: Deactivated successfully.
Jan 23 05:10:07 np0005593233 systemd-logind[804]: Session 60 logged out. Waiting for processes to exit.
Jan 23 05:10:07 np0005593233 systemd-logind[804]: Removed session 60.
Jan 23 05:10:07 np0005593233 nova_compute[222017]: 2026-01-23 10:10:07.750 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:07 np0005593233 systemd-logind[804]: New session 62 of user nova.
Jan 23 05:10:07 np0005593233 systemd[1]: Started Session 62 of User nova.
Jan 23 05:10:07 np0005593233 systemd[1]: session-62.scope: Deactivated successfully.
Jan 23 05:10:07 np0005593233 systemd-logind[804]: Session 62 logged out. Waiting for processes to exit.
Jan 23 05:10:07 np0005593233 systemd-logind[804]: Removed session 62.
Jan 23 05:10:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:08.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:10:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:08.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:10:08 np0005593233 nova_compute[222017]: 2026-01-23 10:10:08.733 222021 INFO nova.network.neutron [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Updating port 8ad4c021-5d44-41aa-adad-f593da5206c1 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 05:10:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:10.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:10.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:11 np0005593233 nova_compute[222017]: 2026-01-23 10:10:11.012 222021 DEBUG oslo_concurrency.lockutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "refresh_cache-81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:11 np0005593233 nova_compute[222017]: 2026-01-23 10:10:11.012 222021 DEBUG oslo_concurrency.lockutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquired lock "refresh_cache-81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:11 np0005593233 nova_compute[222017]: 2026-01-23 10:10:11.013 222021 DEBUG nova.network.neutron [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:10:11 np0005593233 nova_compute[222017]: 2026-01-23 10:10:11.182 222021 DEBUG nova.compute.manager [req-8db6933e-79be-4b10-bffb-58c161d1fd2b req-d241e5a8-bd60-45f6-a71a-632722eb6189 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received event network-changed-8ad4c021-5d44-41aa-adad-f593da5206c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:11 np0005593233 nova_compute[222017]: 2026-01-23 10:10:11.183 222021 DEBUG nova.compute.manager [req-8db6933e-79be-4b10-bffb-58c161d1fd2b req-d241e5a8-bd60-45f6-a71a-632722eb6189 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Refreshing instance network info cache due to event network-changed-8ad4c021-5d44-41aa-adad-f593da5206c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:10:11 np0005593233 nova_compute[222017]: 2026-01-23 10:10:11.183 222021 DEBUG oslo_concurrency.lockutils [req-8db6933e-79be-4b10-bffb-58c161d1fd2b req-d241e5a8-bd60-45f6-a71a-632722eb6189 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:11 np0005593233 nova_compute[222017]: 2026-01-23 10:10:11.661 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:12.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:12.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:12 np0005593233 nova_compute[222017]: 2026-01-23 10:10:12.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:13 np0005593233 nova_compute[222017]: 2026-01-23 10:10:13.448 222021 DEBUG nova.network.neutron [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Updating instance_info_cache with network_info: [{"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:13 np0005593233 nova_compute[222017]: 2026-01-23 10:10:13.476 222021 DEBUG oslo_concurrency.lockutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Releasing lock "refresh_cache-81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:13 np0005593233 nova_compute[222017]: 2026-01-23 10:10:13.485 222021 DEBUG oslo_concurrency.lockutils [req-8db6933e-79be-4b10-bffb-58c161d1fd2b req-d241e5a8-bd60-45f6-a71a-632722eb6189 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:13 np0005593233 nova_compute[222017]: 2026-01-23 10:10:13.486 222021 DEBUG nova.network.neutron [req-8db6933e-79be-4b10-bffb-58c161d1fd2b req-d241e5a8-bd60-45f6-a71a-632722eb6189 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Refreshing network info cache for port 8ad4c021-5d44-41aa-adad-f593da5206c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:10:13 np0005593233 nova_compute[222017]: 2026-01-23 10:10:13.604 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 05:10:13 np0005593233 nova_compute[222017]: 2026-01-23 10:10:13.607 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:10:13 np0005593233 nova_compute[222017]: 2026-01-23 10:10:13.607 222021 INFO nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Creating image(s)#033[00m
Jan 23 05:10:13 np0005593233 nova_compute[222017]: 2026-01-23 10:10:13.664 222021 DEBUG nova.storage.rbd_utils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] creating snapshot(nova-resize) on rbd image(81a8be01-ddd9-4fd2-91a1-886e7f47bfa3_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:10:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e285 e285: 3 total, 3 up, 3 in
Jan 23 05:10:13 np0005593233 nova_compute[222017]: 2026-01-23 10:10:13.812 222021 DEBUG nova.objects.instance [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:14 np0005593233 podman[270787]: 2026-01-23 10:10:14.077970538 +0000 UTC m=+0.087839065 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.087 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.088 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Ensure instance console log exists: /var/lib/nova/instances/81a8be01-ddd9-4fd2-91a1-886e7f47bfa3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.088 222021 DEBUG oslo_concurrency.lockutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.089 222021 DEBUG oslo_concurrency.lockutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.089 222021 DEBUG oslo_concurrency.lockutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.092 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Start _get_guest_xml network_info=[{"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1325714374-network", "vif_mac": "fa:16:3e:46:50:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.099 222021 WARNING nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.107 222021 DEBUG nova.virt.libvirt.host [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.109 222021 DEBUG nova.virt.libvirt.host [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.117 222021 DEBUG nova.virt.libvirt.host [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.117 222021 DEBUG nova.virt.libvirt.host [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.119 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.119 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.120 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.120 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.120 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.120 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.121 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.121 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.121 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.122 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.122 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.122 222021 DEBUG nova.virt.hardware [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.123 222021 DEBUG nova.objects.instance [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.158 222021 DEBUG oslo_concurrency.processutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:14.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:14.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:10:14 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4277358725' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.718 222021 DEBUG oslo_concurrency.processutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:14 np0005593233 nova_compute[222017]: 2026-01-23 10:10:14.768 222021 DEBUG oslo_concurrency.processutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:10:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/95713151' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.255 222021 DEBUG oslo_concurrency.processutils [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.258 222021 DEBUG nova.virt.libvirt.vif [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-8628670',display_name='tempest-ServerActionsTestOtherB-server-8628670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-8628670',id=122,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:08:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-05boc59s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=81a8be01-ddd9-4fd2-91a1-886e7f47bfa3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1325714374-network", "vif_mac": "fa:16:3e:46:50:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.259 222021 DEBUG nova.network.os_vif_util [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1325714374-network", "vif_mac": "fa:16:3e:46:50:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.260 222021 DEBUG nova.network.os_vif_util [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.265 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <uuid>81a8be01-ddd9-4fd2-91a1-886e7f47bfa3</uuid>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <name>instance-0000007a</name>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <memory>196608</memory>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerActionsTestOtherB-server-8628670</nova:name>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:10:14</nova:creationTime>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.micro">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <nova:memory>192</nova:memory>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <nova:user uuid="aca3cab576d641d3b89e7dddf155d467">tempest-ServerActionsTestOtherB-1052932467-project-member</nova:user>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <nova:project uuid="9dd869ce76e44fc8a82b8bbee1654d33">tempest-ServerActionsTestOtherB-1052932467</nova:project>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <nova:port uuid="8ad4c021-5d44-41aa-adad-f593da5206c1">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <entry name="serial">81a8be01-ddd9-4fd2-91a1-886e7f47bfa3</entry>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <entry name="uuid">81a8be01-ddd9-4fd2-91a1-886e7f47bfa3</entry>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/81a8be01-ddd9-4fd2-91a1-886e7f47bfa3_disk">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/81a8be01-ddd9-4fd2-91a1-886e7f47bfa3_disk.config">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:46:50:89"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <target dev="tap8ad4c021-5d"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/81a8be01-ddd9-4fd2-91a1-886e7f47bfa3/console.log" append="off"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:10:15 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:10:15 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:10:15 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:10:15 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.267 222021 DEBUG nova.virt.libvirt.vif [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-8628670',display_name='tempest-ServerActionsTestOtherB-server-8628670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-8628670',id=122,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:08:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-05boc59s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=81a8be01-ddd9-4fd2-91a1-886e7f47bfa3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1325714374-network", "vif_mac": "fa:16:3e:46:50:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.268 222021 DEBUG nova.network.os_vif_util [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1325714374-network", "vif_mac": "fa:16:3e:46:50:89"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.270 222021 DEBUG nova.network.os_vif_util [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.270 222021 DEBUG os_vif [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.271 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.272 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.273 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.278 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.278 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ad4c021-5d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.279 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8ad4c021-5d, col_values=(('external_ids', {'iface-id': '8ad4c021-5d44-41aa-adad-f593da5206c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:50:89', 'vm-uuid': '81a8be01-ddd9-4fd2-91a1-886e7f47bfa3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.282 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:15 np0005593233 NetworkManager[48871]: <info>  [1769163015.2834] manager: (tap8ad4c021-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.286 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.293 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.295 222021 INFO os_vif [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d')#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.380 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.380 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.381 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No VIF found with MAC fa:16:3e:46:50:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.381 222021 INFO nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Using config drive#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.423 222021 DEBUG nova.compute.manager [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:10:15 np0005593233 nova_compute[222017]: 2026-01-23 10:10:15.424 222021 DEBUG nova.virt.libvirt.driver [None req-892060a3-9c28-4ce5-8cf0-479dd3439d7f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 05:10:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:16 np0005593233 nova_compute[222017]: 2026-01-23 10:10:16.476 222021 DEBUG nova.network.neutron [req-8db6933e-79be-4b10-bffb-58c161d1fd2b req-d241e5a8-bd60-45f6-a71a-632722eb6189 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Updated VIF entry in instance network info cache for port 8ad4c021-5d44-41aa-adad-f593da5206c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:10:16 np0005593233 nova_compute[222017]: 2026-01-23 10:10:16.477 222021 DEBUG nova.network.neutron [req-8db6933e-79be-4b10-bffb-58c161d1fd2b req-d241e5a8-bd60-45f6-a71a-632722eb6189 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Updating instance_info_cache with network_info: [{"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:16.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:16.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:16 np0005593233 nova_compute[222017]: 2026-01-23 10:10:16.663 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:16 np0005593233 nova_compute[222017]: 2026-01-23 10:10:16.725 222021 DEBUG oslo_concurrency.lockutils [req-8db6933e-79be-4b10-bffb-58c161d1fd2b req-d241e5a8-bd60-45f6-a71a-632722eb6189 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:16 np0005593233 nova_compute[222017]: 2026-01-23 10:10:16.883 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:16 np0005593233 nova_compute[222017]: 2026-01-23 10:10:16.884 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:16 np0005593233 nova_compute[222017]: 2026-01-23 10:10:16.961 222021 DEBUG nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:10:17 np0005593233 nova_compute[222017]: 2026-01-23 10:10:17.259 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:17 np0005593233 nova_compute[222017]: 2026-01-23 10:10:17.260 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:17 np0005593233 nova_compute[222017]: 2026-01-23 10:10:17.266 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:10:17 np0005593233 nova_compute[222017]: 2026-01-23 10:10:17.266 222021 INFO nova.compute.claims [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:10:17 np0005593233 nova_compute[222017]: 2026-01-23 10:10:17.583 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3543631138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:18 np0005593233 nova_compute[222017]: 2026-01-23 10:10:18.092 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:18 np0005593233 nova_compute[222017]: 2026-01-23 10:10:18.099 222021 DEBUG nova.compute.provider_tree [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:10:18 np0005593233 nova_compute[222017]: 2026-01-23 10:10:18.130 222021 DEBUG nova.scheduler.client.report [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:10:18 np0005593233 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 05:10:18 np0005593233 systemd[270690]: Activating special unit Exit the Session...
Jan 23 05:10:18 np0005593233 systemd[270690]: Stopped target Main User Target.
Jan 23 05:10:18 np0005593233 systemd[270690]: Stopped target Basic System.
Jan 23 05:10:18 np0005593233 systemd[270690]: Stopped target Paths.
Jan 23 05:10:18 np0005593233 systemd[270690]: Stopped target Sockets.
Jan 23 05:10:18 np0005593233 systemd[270690]: Stopped target Timers.
Jan 23 05:10:18 np0005593233 systemd[270690]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:10:18 np0005593233 systemd[270690]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 05:10:18 np0005593233 systemd[270690]: Closed D-Bus User Message Bus Socket.
Jan 23 05:10:18 np0005593233 systemd[270690]: Stopped Create User's Volatile Files and Directories.
Jan 23 05:10:18 np0005593233 systemd[270690]: Removed slice User Application Slice.
Jan 23 05:10:18 np0005593233 systemd[270690]: Reached target Shutdown.
Jan 23 05:10:18 np0005593233 systemd[270690]: Finished Exit the Session.
Jan 23 05:10:18 np0005593233 systemd[270690]: Reached target Exit the Session.
Jan 23 05:10:18 np0005593233 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 05:10:18 np0005593233 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 05:10:18 np0005593233 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 05:10:18 np0005593233 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 05:10:18 np0005593233 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 05:10:18 np0005593233 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 05:10:18 np0005593233 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 05:10:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:18.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:18.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:18 np0005593233 nova_compute[222017]: 2026-01-23 10:10:18.958 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:18 np0005593233 nova_compute[222017]: 2026-01-23 10:10:18.960 222021 DEBUG nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.122 222021 DEBUG nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.122 222021 DEBUG nova.network.neutron [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.257 222021 INFO nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.279 222021 DEBUG nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.375 222021 DEBUG nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.376 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.377 222021 INFO nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Creating image(s)#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.404 222021 DEBUG nova.storage.rbd_utils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.433 222021 DEBUG nova.storage.rbd_utils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.460 222021 DEBUG nova.storage.rbd_utils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.465 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.544 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.546 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.548 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.549 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.587 222021 DEBUG nova.storage.rbd_utils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.592 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.839 222021 DEBUG nova.policy [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c99d09acd2e849a69846a6ccda1e0bc7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '924f976bcbb74ec195730b68eebe1f2a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:10:19 np0005593233 nova_compute[222017]: 2026-01-23 10:10:19.923 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.331s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.007 222021 DEBUG nova.storage.rbd_utils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] resizing rbd image 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.133 222021 DEBUG nova.objects.instance [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lazy-loading 'migration_context' on Instance uuid 1b7661cc-4a60-4a80-967f-f9243a031c9f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.179 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.180 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Ensure instance console log exists: /var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.180 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.181 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.181 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.283 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:20.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:20.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.653 222021 DEBUG oslo_concurrency.lockutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.653 222021 DEBUG oslo_concurrency.lockutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.654 222021 INFO nova.compute.manager [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Shelving#033[00m
Jan 23 05:10:20 np0005593233 nova_compute[222017]: 2026-01-23 10:10:20.681 222021 DEBUG nova.virt.libvirt.driver [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:10:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:21 np0005593233 nova_compute[222017]: 2026-01-23 10:10:21.666 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:21 np0005593233 nova_compute[222017]: 2026-01-23 10:10:21.927 222021 DEBUG nova.network.neutron [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Successfully created port: 283da12a-be97-4dbe-9ecf-fd4e5aae8289 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:10:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:22.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:22.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:23 np0005593233 kernel: tap8bb3c318-ff (unregistering): left promiscuous mode
Jan 23 05:10:23 np0005593233 NetworkManager[48871]: <info>  [1769163023.0257] device (tap8bb3c318-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:10:23 np0005593233 nova_compute[222017]: 2026-01-23 10:10:23.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:23Z|00532|binding|INFO|Releasing lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 from this chassis (sb_readonly=0)
Jan 23 05:10:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:23Z|00533|binding|INFO|Setting lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 down in Southbound
Jan 23 05:10:23 np0005593233 nova_compute[222017]: 2026-01-23 10:10:23.040 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:23Z|00534|binding|INFO|Removing iface tap8bb3c318-ff ovn-installed in OVS
Jan 23 05:10:23 np0005593233 nova_compute[222017]: 2026-01-23 10:10:23.043 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.049 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:ad:46 10.100.0.12'], port_security=['fa:16:3e:5b:ad:46 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91cc1048-141a-4a20-b148-991a883adfa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63877f45-8244-4c80-903a-80901a7d83cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746ea02b745c4e21ace4cb49c193899d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69bbd8fc-29e5-4515-a6ba-c1c319d113ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb65718-6c8e-4bc0-9249-e791b1bad19c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8bb3c318-ff77-47e8-a160-22e4c278fc88) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.051 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb3c318-ff77-47e8-a160-22e4c278fc88 in datapath 63877f45-8244-4c80-903a-80901a7d83cb unbound from our chassis#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.053 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 63877f45-8244-4c80-903a-80901a7d83cb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:10:23 np0005593233 nova_compute[222017]: 2026-01-23 10:10:23.056 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.056 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4fd7eb3a-ce02-4171-81c3-da8a9f65e54a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.058 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb namespace which is not needed anymore#033[00m
Jan 23 05:10:23 np0005593233 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000079.scope: Deactivated successfully.
Jan 23 05:10:23 np0005593233 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000079.scope: Consumed 20.739s CPU time.
Jan 23 05:10:23 np0005593233 systemd-machined[190954]: Machine qemu-56-instance-00000079 terminated.
Jan 23 05:10:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[269271]: [NOTICE]   (269275) : haproxy version is 2.8.14-c23fe91
Jan 23 05:10:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[269271]: [NOTICE]   (269275) : path to executable is /usr/sbin/haproxy
Jan 23 05:10:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[269271]: [WARNING]  (269275) : Exiting Master process...
Jan 23 05:10:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[269271]: [WARNING]  (269275) : Exiting Master process...
Jan 23 05:10:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[269271]: [ALERT]    (269275) : Current worker (269277) exited with code 143 (Terminated)
Jan 23 05:10:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[269271]: [WARNING]  (269275) : All workers exited. Exiting... (0)
Jan 23 05:10:23 np0005593233 systemd[1]: libpod-869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51.scope: Deactivated successfully.
Jan 23 05:10:23 np0005593233 podman[271224]: 2026-01-23 10:10:23.246820519 +0000 UTC m=+0.066204883 container died 869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 05:10:23 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51-userdata-shm.mount: Deactivated successfully.
Jan 23 05:10:23 np0005593233 systemd[1]: var-lib-containers-storage-overlay-aafc31ebbfe0c62a91440757444d561240cfa96eb1ead8e69e07c040a314acd2-merged.mount: Deactivated successfully.
Jan 23 05:10:23 np0005593233 podman[271224]: 2026-01-23 10:10:23.302438142 +0000 UTC m=+0.121822506 container cleanup 869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 05:10:23 np0005593233 systemd[1]: libpod-conmon-869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51.scope: Deactivated successfully.
Jan 23 05:10:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e286 e286: 3 total, 3 up, 3 in
Jan 23 05:10:23 np0005593233 podman[271276]: 2026-01-23 10:10:23.381759845 +0000 UTC m=+0.048830112 container remove 869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.389 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5d7f0f71-4e7c-4c66-9875-a1d6e95ccf05]: (4, ('Fri Jan 23 10:10:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb (869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51)\n869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51\nFri Jan 23 10:10:23 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb (869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51)\n869b53350ee2b3432659556ef40b86621a665c1225aa99432bc8ccdd3049ed51\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.392 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc9fd91-f50d-488a-bf60-39600d2d2a81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.393 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63877f45-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:23 np0005593233 nova_compute[222017]: 2026-01-23 10:10:23.395 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:23 np0005593233 kernel: tap63877f45-80: left promiscuous mode
Jan 23 05:10:23 np0005593233 nova_compute[222017]: 2026-01-23 10:10:23.425 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.433 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c59cd1-2ee7-47e3-b111-0d41509ab4e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.446 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9583f7c7-f647-4006-84ea-39ab1dd026b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.447 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[205ea971-a76b-4052-b49c-8f9d8f257084]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.474 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2da68911-a669-4e8c-923e-e6cde714cd6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 679077, 'reachable_time': 30438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271299, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:23 np0005593233 systemd[1]: run-netns-ovnmeta\x2d63877f45\x2d8244\x2d4c80\x2d903a\x2d80901a7d83cb.mount: Deactivated successfully.
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.478 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:10:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:23.479 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[52ea00f2-04be-4e2c-8690-ed64cfe77f72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:23 np0005593233 nova_compute[222017]: 2026-01-23 10:10:23.701 222021 INFO nova.virt.libvirt.driver [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:10:23 np0005593233 nova_compute[222017]: 2026-01-23 10:10:23.712 222021 INFO nova.virt.libvirt.driver [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance destroyed successfully.#033[00m
Jan 23 05:10:23 np0005593233 nova_compute[222017]: 2026-01-23 10:10:23.713 222021 DEBUG nova.objects.instance [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'numa_topology' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.072 222021 DEBUG nova.network.neutron [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Successfully updated port: 283da12a-be97-4dbe-9ecf-fd4e5aae8289 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.102 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.102 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquired lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.102 222021 DEBUG nova.network.neutron [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.169 222021 INFO nova.virt.libvirt.driver [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Beginning cold snapshot process#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.201 222021 DEBUG nova.compute.manager [req-7247c94f-5270-434b-968e-f5ad20814b52 req-abc77399-07d9-4c85-8d72-d0cf24646b0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received event network-changed-283da12a-be97-4dbe-9ecf-fd4e5aae8289 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.202 222021 DEBUG nova.compute.manager [req-7247c94f-5270-434b-968e-f5ad20814b52 req-abc77399-07d9-4c85-8d72-d0cf24646b0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Refreshing instance network info cache due to event network-changed-283da12a-be97-4dbe-9ecf-fd4e5aae8289. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.202 222021 DEBUG oslo_concurrency.lockutils [req-7247c94f-5270-434b-968e-f5ad20814b52 req-abc77399-07d9-4c85-8d72-d0cf24646b0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.220 222021 DEBUG nova.compute.manager [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-unplugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.221 222021 DEBUG oslo_concurrency.lockutils [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.221 222021 DEBUG oslo_concurrency.lockutils [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.221 222021 DEBUG oslo_concurrency.lockutils [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.221 222021 DEBUG nova.compute.manager [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-unplugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.222 222021 WARNING nova.compute.manager [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received unexpected event network-vif-unplugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with vm_state active and task_state shelving.#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.222 222021 DEBUG nova.compute.manager [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.222 222021 DEBUG oslo_concurrency.lockutils [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.222 222021 DEBUG oslo_concurrency.lockutils [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.222 222021 DEBUG oslo_concurrency.lockutils [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.223 222021 DEBUG nova.compute.manager [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.223 222021 WARNING nova.compute.manager [req-da55bff0-d571-4d3e-9f6d-6f589ee69227 req-9cba7200-a67f-4d93-a5e3-042e60e7131f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received unexpected event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with vm_state active and task_state shelving.#033[00m
Jan 23 05:10:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:10:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:10:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.377 222021 DEBUG nova.network.neutron [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.387 222021 DEBUG nova.virt.libvirt.imagebackend [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 05:10:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:24.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:24.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:24 np0005593233 nova_compute[222017]: 2026-01-23 10:10:24.730 222021 DEBUG nova.storage.rbd_utils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] creating snapshot(e83ac3639b7d45399e706c8e363a756a) on rbd image(91cc1048-141a-4a20-b148-991a883adfa9_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:10:25 np0005593233 nova_compute[222017]: 2026-01-23 10:10:25.287 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e287 e287: 3 total, 3 up, 3 in
Jan 23 05:10:25 np0005593233 nova_compute[222017]: 2026-01-23 10:10:25.457 222021 DEBUG nova.storage.rbd_utils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] cloning vms/91cc1048-141a-4a20-b148-991a883adfa9_disk@e83ac3639b7d45399e706c8e363a756a to images/034d6719-9097-4197-9373-c0b4b83dfc98 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:10:25 np0005593233 nova_compute[222017]: 2026-01-23 10:10:25.594 222021 DEBUG nova.storage.rbd_utils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] flattening images/034d6719-9097-4197-9373-c0b4b83dfc98 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:10:25 np0005593233 nova_compute[222017]: 2026-01-23 10:10:25.800 222021 DEBUG nova.network.neutron [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Updating instance_info_cache with network_info: [{"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.092 222021 DEBUG nova.storage.rbd_utils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] removing snapshot(e83ac3639b7d45399e706c8e363a756a) on rbd image(91cc1048-141a-4a20-b148-991a883adfa9_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:10:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e287 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:26.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:26.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.645 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Releasing lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.646 222021 DEBUG nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Instance network_info: |[{"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.648 222021 DEBUG oslo_concurrency.lockutils [req-7247c94f-5270-434b-968e-f5ad20814b52 req-abc77399-07d9-4c85-8d72-d0cf24646b0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.648 222021 DEBUG nova.network.neutron [req-7247c94f-5270-434b-968e-f5ad20814b52 req-abc77399-07d9-4c85-8d72-d0cf24646b0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Refreshing network info cache for port 283da12a-be97-4dbe-9ecf-fd4e5aae8289 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.652 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Start _get_guest_xml network_info=[{"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.659 222021 WARNING nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.671 222021 DEBUG nova.virt.libvirt.host [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.672 222021 DEBUG nova.virt.libvirt.host [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:10:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e288 e288: 3 total, 3 up, 3 in
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.685 222021 DEBUG nova.virt.libvirt.host [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.686 222021 DEBUG nova.virt.libvirt.host [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.687 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.688 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.688 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.688 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.689 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.689 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.689 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.689 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.689 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.690 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.690 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.690 222021 DEBUG nova.virt.hardware [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.692 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:26 np0005593233 nova_compute[222017]: 2026-01-23 10:10:26.763 222021 DEBUG nova.storage.rbd_utils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] creating snapshot(snap) on rbd image(034d6719-9097-4197-9373-c0b4b83dfc98) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:10:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:10:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3730976270' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:10:27 np0005593233 nova_compute[222017]: 2026-01-23 10:10:27.192 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:27 np0005593233 nova_compute[222017]: 2026-01-23 10:10:27.240 222021 DEBUG nova.storage.rbd_utils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:27 np0005593233 nova_compute[222017]: 2026-01-23 10:10:27.246 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:10:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2672614943' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:10:27 np0005593233 nova_compute[222017]: 2026-01-23 10:10:27.872 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.625s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:27 np0005593233 nova_compute[222017]: 2026-01-23 10:10:27.874 222021 DEBUG nova.virt.libvirt.vif [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:10:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1738552703',display_name='tempest-AttachVolumeNegativeTest-server-1738552703',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1738552703',id=129,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLOCDc+hB1zgCPdlKOUnZEX+Rl7ewtvKqeeImFpXurwnY4SYrfZFBkwZIE3g5r9nA9h2+pYvShYhnh7AlXCp7hzc3PTeL5rqvcKdXNZAyMR1hX9qOLVJ6T8cdqLx8wglSA==',key_name='tempest-keypair-1358246388',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='924f976bcbb74ec195730b68eebe1f2a',ramdisk_id='',reservation_id='r-qxl4yoot',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1470050886',owner_user_name='tempest-AttachVolumeNegativeTest-1470050886-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c99d09acd2e849a69846a6ccda1e0bc7',uuid=1b7661cc-4a60-4a80-967f-f9243a031c9f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:10:27 np0005593233 nova_compute[222017]: 2026-01-23 10:10:27.875 222021 DEBUG nova.network.os_vif_util [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converting VIF {"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:10:27 np0005593233 nova_compute[222017]: 2026-01-23 10:10:27.876 222021 DEBUG nova.network.os_vif_util [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:d8:4b,bridge_name='br-int',has_traffic_filtering=True,id=283da12a-be97-4dbe-9ecf-fd4e5aae8289,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap283da12a-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:10:27 np0005593233 nova_compute[222017]: 2026-01-23 10:10:27.877 222021 DEBUG nova.objects.instance [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b7661cc-4a60-4a80-967f-f9243a031c9f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e289 e289: 3 total, 3 up, 3 in
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.125 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <uuid>1b7661cc-4a60-4a80-967f-f9243a031c9f</uuid>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <name>instance-00000081</name>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <nova:name>tempest-AttachVolumeNegativeTest-server-1738552703</nova:name>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:10:26</nova:creationTime>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <nova:user uuid="c99d09acd2e849a69846a6ccda1e0bc7">tempest-AttachVolumeNegativeTest-1470050886-project-member</nova:user>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <nova:project uuid="924f976bcbb74ec195730b68eebe1f2a">tempest-AttachVolumeNegativeTest-1470050886</nova:project>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <nova:port uuid="283da12a-be97-4dbe-9ecf-fd4e5aae8289">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <entry name="serial">1b7661cc-4a60-4a80-967f-f9243a031c9f</entry>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <entry name="uuid">1b7661cc-4a60-4a80-967f-f9243a031c9f</entry>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/1b7661cc-4a60-4a80-967f-f9243a031c9f_disk">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/1b7661cc-4a60-4a80-967f-f9243a031c9f_disk.config">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:b1:d8:4b"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <target dev="tap283da12a-be"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f/console.log" append="off"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:10:28 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:10:28 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:10:28 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:10:28 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.127 222021 DEBUG nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Preparing to wait for external event network-vif-plugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.128 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.129 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.129 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.131 222021 DEBUG nova.virt.libvirt.vif [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:10:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1738552703',display_name='tempest-AttachVolumeNegativeTest-server-1738552703',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1738552703',id=129,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLOCDc+hB1zgCPdlKOUnZEX+Rl7ewtvKqeeImFpXurwnY4SYrfZFBkwZIE3g5r9nA9h2+pYvShYhnh7AlXCp7hzc3PTeL5rqvcKdXNZAyMR1hX9qOLVJ6T8cdqLx8wglSA==',key_name='tempest-keypair-1358246388',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='924f976bcbb74ec195730b68eebe1f2a',ramdisk_id='',reservation_id='r-qxl4yoot',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1470050886',owner_user_name='tempest-AttachVolumeNegativeTest-1470050886-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c99d09acd2e849a69846a6ccda1e0bc7',uuid=1b7661cc-4a60-4a80-967f-f9243a031c9f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.131 222021 DEBUG nova.network.os_vif_util [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converting VIF {"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.133 222021 DEBUG nova.network.os_vif_util [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:d8:4b,bridge_name='br-int',has_traffic_filtering=True,id=283da12a-be97-4dbe-9ecf-fd4e5aae8289,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap283da12a-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.134 222021 DEBUG os_vif [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:d8:4b,bridge_name='br-int',has_traffic_filtering=True,id=283da12a-be97-4dbe-9ecf-fd4e5aae8289,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap283da12a-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.135 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.136 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.137 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.142 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.143 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap283da12a-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.144 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap283da12a-be, col_values=(('external_ids', {'iface-id': '283da12a-be97-4dbe-9ecf-fd4e5aae8289', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:d8:4b', 'vm-uuid': '1b7661cc-4a60-4a80-967f-f9243a031c9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.146 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:28 np0005593233 NetworkManager[48871]: <info>  [1769163028.1475] manager: (tap283da12a-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.150 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.160 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.163 222021 INFO os_vif [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:d8:4b,bridge_name='br-int',has_traffic_filtering=True,id=283da12a-be97-4dbe-9ecf-fd4e5aae8289,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap283da12a-be')#033[00m
Jan 23 05:10:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:28.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:28.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.743 222021 DEBUG nova.objects.instance [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'flavor' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.757 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.758 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.758 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No VIF found with MAC fa:16:3e:b1:d8:4b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.759 222021 INFO nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Using config drive#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.795 222021 DEBUG nova.storage.rbd_utils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.805 222021 DEBUG oslo_concurrency.lockutils [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "refresh_cache-81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.805 222021 DEBUG oslo_concurrency.lockutils [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquired lock "refresh_cache-81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.805 222021 DEBUG nova.network.neutron [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.806 222021 DEBUG nova.objects.instance [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'info_cache' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.838 222021 DEBUG nova.network.neutron [req-7247c94f-5270-434b-968e-f5ad20814b52 req-abc77399-07d9-4c85-8d72-d0cf24646b0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Updated VIF entry in instance network info cache for port 283da12a-be97-4dbe-9ecf-fd4e5aae8289. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.840 222021 DEBUG nova.network.neutron [req-7247c94f-5270-434b-968e-f5ad20814b52 req-abc77399-07d9-4c85-8d72-d0cf24646b0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Updating instance_info_cache with network_info: [{"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:28 np0005593233 nova_compute[222017]: 2026-01-23 10:10:28.939 222021 DEBUG oslo_concurrency.lockutils [req-7247c94f-5270-434b-968e-f5ad20814b52 req-abc77399-07d9-4c85-8d72-d0cf24646b0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.278 222021 INFO nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Creating config drive at /var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f/disk.config#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.286 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq4_5kvae execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.446 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq4_5kvae" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.488 222021 DEBUG nova.storage.rbd_utils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.493 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f/disk.config 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.674 222021 DEBUG oslo_concurrency.processutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f/disk.config 1b7661cc-4a60-4a80-967f-f9243a031c9f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.675 222021 INFO nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Deleting local config drive /var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f/disk.config because it was imported into RBD.#033[00m
Jan 23 05:10:29 np0005593233 NetworkManager[48871]: <info>  [1769163029.7520] manager: (tap283da12a-be): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Jan 23 05:10:29 np0005593233 kernel: tap283da12a-be: entered promiscuous mode
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.762 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:29Z|00535|binding|INFO|Claiming lport 283da12a-be97-4dbe-9ecf-fd4e5aae8289 for this chassis.
Jan 23 05:10:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:29Z|00536|binding|INFO|283da12a-be97-4dbe-9ecf-fd4e5aae8289: Claiming fa:16:3e:b1:d8:4b 10.100.0.5
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.779 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:d8:4b 10.100.0.5'], port_security=['fa:16:3e:b1:d8:4b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1b7661cc-4a60-4a80-967f-f9243a031c9f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93735878-f62d-4a5f-96df-bf97f85d787a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '924f976bcbb74ec195730b68eebe1f2a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '394c0c6a-bef6-491c-899b-f47fff4f799d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1f72e5c-e22f-424b-b6ed-0c502ff13aa3, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=283da12a-be97-4dbe-9ecf-fd4e5aae8289) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.781 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 283da12a-be97-4dbe-9ecf-fd4e5aae8289 in datapath 93735878-f62d-4a5f-96df-bf97f85d787a bound to our chassis#033[00m
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.782 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 93735878-f62d-4a5f-96df-bf97f85d787a#033[00m
Jan 23 05:10:29 np0005593233 systemd-machined[190954]: New machine qemu-58-instance-00000081.
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.802 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aca4c62c-07b8-4c67-8e21-78aaeced764f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.803 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap93735878-f1 in ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.805 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap93735878-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.805 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5edff65e-3071-494f-a7c3-6caa03829a16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.806 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a4f15c-4705-428d-93a3-4d6ce6c2b6a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.812 222021 INFO nova.virt.libvirt.driver [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Snapshot image upload complete#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.812 222021 DEBUG nova.compute.manager [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:29 np0005593233 systemd[1]: Started Virtual Machine qemu-58-instance-00000081.
Jan 23 05:10:29 np0005593233 systemd-udevd[271585]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.828 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[d84125c9-e5d0-422d-8aab-14f5e47efa2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:29 np0005593233 NetworkManager[48871]: <info>  [1769163029.8433] device (tap283da12a-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:10:29 np0005593233 NetworkManager[48871]: <info>  [1769163029.8443] device (tap283da12a-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.858 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[44222019-86c2-48f9-886f-4893f5684f0a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.871 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:29Z|00537|binding|INFO|Setting lport 283da12a-be97-4dbe-9ecf-fd4e5aae8289 ovn-installed in OVS
Jan 23 05:10:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:29Z|00538|binding|INFO|Setting lport 283da12a-be97-4dbe-9ecf-fd4e5aae8289 up in Southbound
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.886 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.894 222021 INFO nova.compute.manager [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Shelve offloading#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.905 222021 INFO nova.virt.libvirt.driver [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance destroyed successfully.#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.905 222021 DEBUG nova.compute.manager [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.908 222021 DEBUG oslo_concurrency.lockutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.909 222021 DEBUG oslo_concurrency.lockutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:29 np0005593233 nova_compute[222017]: 2026-01-23 10:10:29.909 222021 DEBUG nova.network.neutron [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.909 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0f7bc9-a7a3-4c7e-8c78-ebd14caf93e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.914 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[26f03356-6d59-46f9-95b0-5a3440a59ad1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:29 np0005593233 NetworkManager[48871]: <info>  [1769163029.9164] manager: (tap93735878-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/251)
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.958 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5329c827-16f9-4252-a2b6-723f6d8ebeee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:29.965 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf4225a-77fd-400b-aa6c-499afefd8d8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:29 np0005593233 NetworkManager[48871]: <info>  [1769163029.9955] device (tap93735878-f0): carrier: link connected
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.006 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee26006-9b78-49c3-9593-7c4a0a52d4b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.027 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb50330-8ca5-47f1-9ac4-6ad39115a168]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap93735878-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:41:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694262, 'reachable_time': 24667, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271651, 'error': None, 'target': 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.050 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2efb2347-afbc-4ba9-a3ae-9b25d0fb760c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:41c8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694262, 'tstamp': 694262}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271660, 'error': None, 'target': 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.075 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[28ddeed5-cd86-4b53-8682-e6262f5454f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap93735878-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:41:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694262, 'reachable_time': 24667, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271676, 'error': None, 'target': 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.131 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2210d1a9-3d07-401e-80cd-9544ae74eddc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.207 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[779b14c9-3359-4404-a371-5e510705b427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.209 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93735878-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.210 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.210 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93735878-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:30 np0005593233 NetworkManager[48871]: <info>  [1769163030.2147] manager: (tap93735878-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.215 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:30 np0005593233 kernel: tap93735878-f0: entered promiscuous mode
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.228 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.229 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap93735878-f0, col_values=(('external_ids', {'iface-id': 'c75eef02-aabe-4477-9239-97f7fb86cd02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.231 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:30 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:30Z|00539|binding|INFO|Releasing lport c75eef02-aabe-4477-9239-97f7fb86cd02 from this chassis (sb_readonly=0)
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.254 222021 DEBUG nova.compute.manager [req-672f04d5-b8b4-48b5-b7a3-6540f227fd31 req-c86fcb3e-3905-467d-9a3c-75bd156a0334 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received event network-vif-plugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.255 222021 DEBUG oslo_concurrency.lockutils [req-672f04d5-b8b4-48b5-b7a3-6540f227fd31 req-c86fcb3e-3905-467d-9a3c-75bd156a0334 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.255 222021 DEBUG oslo_concurrency.lockutils [req-672f04d5-b8b4-48b5-b7a3-6540f227fd31 req-c86fcb3e-3905-467d-9a3c-75bd156a0334 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.256 222021 DEBUG oslo_concurrency.lockutils [req-672f04d5-b8b4-48b5-b7a3-6540f227fd31 req-c86fcb3e-3905-467d-9a3c-75bd156a0334 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.256 222021 DEBUG nova.compute.manager [req-672f04d5-b8b4-48b5-b7a3-6540f227fd31 req-c86fcb3e-3905-467d-9a3c-75bd156a0334 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Processing event network-vif-plugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.265 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.274 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.275 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/93735878-f62d-4a5f-96df-bf97f85d787a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/93735878-f62d-4a5f-96df-bf97f85d787a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.277 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b64a2c4a-e4b1-4af5-a1de-49b8f5c10ff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.277 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-93735878-f62d-4a5f-96df-bf97f85d787a
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/93735878-f62d-4a5f-96df-bf97f85d787a.pid.haproxy
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 93735878-f62d-4a5f-96df-bf97f85d787a
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:10:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:30.278 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'env', 'PROCESS_TAG=haproxy-93735878-f62d-4a5f-96df-bf97f85d787a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/93735878-f62d-4a5f-96df-bf97f85d787a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:10:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:30.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:30.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:30 np0005593233 podman[271716]: 2026-01-23 10:10:30.732325245 +0000 UTC m=+0.080691943 container create 2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:10:30 np0005593233 podman[271716]: 2026-01-23 10:10:30.687783886 +0000 UTC m=+0.036150654 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:10:30 np0005593233 systemd[1]: Started libpod-conmon-2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9.scope.
Jan 23 05:10:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:10:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:10:30 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:10:30 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f2f1ebd8fcae54ea9d26fa88ab9d5ff44a2653b35435a2263d0cb0c4d342c41/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:30 np0005593233 podman[271716]: 2026-01-23 10:10:30.836729209 +0000 UTC m=+0.185095907 container init 2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:10:30 np0005593233 podman[271716]: 2026-01-23 10:10:30.845779115 +0000 UTC m=+0.194145803 container start 2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 05:10:30 np0005593233 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[271780]: [NOTICE]   (271793) : New worker (271798) forked
Jan 23 05:10:30 np0005593233 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[271780]: [NOTICE]   (271793) : Loading success.
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.893 222021 DEBUG nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.895 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163030.8931484, 1b7661cc-4a60-4a80-967f-f9243a031c9f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.895 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] VM Started (Lifecycle Event)#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.899 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.904 222021 INFO nova.virt.libvirt.driver [-] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Instance spawned successfully.#033[00m
Jan 23 05:10:30 np0005593233 nova_compute[222017]: 2026-01-23 10:10:30.905 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:10:30 np0005593233 podman[271763]: 2026-01-23 10:10:30.934986258 +0000 UTC m=+0.155722466 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.023 222021 DEBUG nova.network.neutron [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Updating instance_info_cache with network_info: [{"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.253 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.259 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.285 222021 DEBUG oslo_concurrency.lockutils [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Releasing lock "refresh_cache-81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.293 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.294 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.295 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.295 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.296 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.297 222021 DEBUG nova.virt.libvirt.driver [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.342 222021 DEBUG nova.network.neutron [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.411 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.412 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163030.8944776, 1b7661cc-4a60-4a80-967f-f9243a031c9f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.413 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.426 222021 INFO nova.virt.libvirt.driver [-] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Instance destroyed successfully.#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.427 222021 DEBUG nova.objects.instance [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'numa_topology' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.451 222021 DEBUG oslo_concurrency.lockutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.478 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.484 222021 DEBUG nova.objects.instance [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'resources' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.487 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163030.8980346, 1b7661cc-4a60-4a80-967f-f9243a031c9f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.488 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.511 222021 INFO nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Took 12.13 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.512 222021 DEBUG nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.513 222021 DEBUG nova.virt.libvirt.vif [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-8628670',display_name='tempest-ServerActionsTestOtherB-server-8628670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-8628670',id=122,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:10:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-05boc59s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:10:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=81a8be01-ddd9-4fd2-91a1-886e7f47bfa3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.514 222021 DEBUG nova.network.os_vif_util [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.515 222021 DEBUG nova.network.os_vif_util [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.515 222021 DEBUG os_vif [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.518 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.518 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ad4c021-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.521 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.522 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.523 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.531 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.535 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.538 222021 INFO os_vif [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d')#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.549 222021 DEBUG nova.virt.libvirt.driver [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Start _get_guest_xml network_info=[{"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.555 222021 WARNING nova.virt.libvirt.driver [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.563 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.564 222021 DEBUG nova.virt.libvirt.host [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.565 222021 DEBUG nova.virt.libvirt.host [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.569 222021 DEBUG nova.virt.libvirt.host [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.570 222021 DEBUG nova.virt.libvirt.host [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.571 222021 DEBUG nova.virt.libvirt.driver [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.572 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.572 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.573 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.573 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.574 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.574 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.574 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.575 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.575 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.576 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.576 222021 DEBUG nova.virt.hardware [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.577 222021 DEBUG nova.objects.instance [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.599 222021 DEBUG oslo_concurrency.processutils [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.647 222021 INFO nova.compute.manager [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Took 14.41 seconds to build instance.#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.668 222021 DEBUG oslo_concurrency.lockutils [None req-eff4f962-8afb-4e99-b46e-eb20cd732ea9 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:31 np0005593233 nova_compute[222017]: 2026-01-23 10:10:31.670 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e290 e290: 3 total, 3 up, 3 in
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.105 222021 DEBUG oslo_concurrency.processutils [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.147 222021 DEBUG oslo_concurrency.processutils [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.363 222021 DEBUG nova.compute.manager [req-f82ea47d-f216-4917-8b02-1d198e03cf9a req-3413a1e3-b170-4260-addf-b32df1fc69ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received event network-vif-plugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.365 222021 DEBUG oslo_concurrency.lockutils [req-f82ea47d-f216-4917-8b02-1d198e03cf9a req-3413a1e3-b170-4260-addf-b32df1fc69ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.365 222021 DEBUG oslo_concurrency.lockutils [req-f82ea47d-f216-4917-8b02-1d198e03cf9a req-3413a1e3-b170-4260-addf-b32df1fc69ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.366 222021 DEBUG oslo_concurrency.lockutils [req-f82ea47d-f216-4917-8b02-1d198e03cf9a req-3413a1e3-b170-4260-addf-b32df1fc69ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.366 222021 DEBUG nova.compute.manager [req-f82ea47d-f216-4917-8b02-1d198e03cf9a req-3413a1e3-b170-4260-addf-b32df1fc69ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] No waiting events found dispatching network-vif-plugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.366 222021 WARNING nova.compute.manager [req-f82ea47d-f216-4917-8b02-1d198e03cf9a req-3413a1e3-b170-4260-addf-b32df1fc69ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received unexpected event network-vif-plugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.472 222021 INFO nova.virt.libvirt.driver [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance destroyed successfully.#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.473 222021 DEBUG nova.objects.instance [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'resources' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.491 222021 DEBUG nova.virt.libvirt.vif [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:07:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-980180187',display_name='tempest-ServersNegativeTestJSON-server-980180187',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-980180187',id=121,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:07:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-np6t6vpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-ServersNegativeTestJSON-623507515-project-member',shelved_at='2026-01-23T10:10:29.812600',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='034d6719-9097-4197-9373-c0b4b83dfc98'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:10:24Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=91cc1048-141a-4a20-b148-991a883adfa9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.492 222021 DEBUG nova.network.os_vif_util [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.492 222021 DEBUG nova.network.os_vif_util [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.493 222021 DEBUG os_vif [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.495 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.496 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bb3c318-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.498 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.499 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.502 222021 INFO os_vif [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff')#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.574 222021 DEBUG nova.compute.manager [req-d879c50a-79be-4ee8-b3ff-28b8b0abdfe2 req-d2204006-351f-467b-a150-1c813d0325d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-changed-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.575 222021 DEBUG nova.compute.manager [req-d879c50a-79be-4ee8-b3ff-28b8b0abdfe2 req-d2204006-351f-467b-a150-1c813d0325d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Refreshing instance network info cache due to event network-changed-8bb3c318-ff77-47e8-a160-22e4c278fc88. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.575 222021 DEBUG oslo_concurrency.lockutils [req-d879c50a-79be-4ee8-b3ff-28b8b0abdfe2 req-d2204006-351f-467b-a150-1c813d0325d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.576 222021 DEBUG oslo_concurrency.lockutils [req-d879c50a-79be-4ee8-b3ff-28b8b0abdfe2 req-d2204006-351f-467b-a150-1c813d0325d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.576 222021 DEBUG nova.network.neutron [req-d879c50a-79be-4ee8-b3ff-28b8b0abdfe2 req-d2204006-351f-467b-a150-1c813d0325d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Refreshing network info cache for port 8bb3c318-ff77-47e8-a160-22e4c278fc88 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:10:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:32.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:10:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2700813794' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:10:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:32.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.624 222021 DEBUG oslo_concurrency.processutils [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.629 222021 DEBUG nova.virt.libvirt.vif [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-8628670',display_name='tempest-ServerActionsTestOtherB-server-8628670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-8628670',id=122,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:10:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-05boc59s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:10:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=81a8be01-ddd9-4fd2-91a1-886e7f47bfa3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.630 222021 DEBUG nova.network.os_vif_util [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.630 222021 DEBUG nova.network.os_vif_util [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.632 222021 DEBUG nova.objects.instance [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'pci_devices' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.651 222021 DEBUG nova.virt.libvirt.driver [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <uuid>81a8be01-ddd9-4fd2-91a1-886e7f47bfa3</uuid>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <name>instance-0000007a</name>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <memory>196608</memory>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerActionsTestOtherB-server-8628670</nova:name>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:10:31</nova:creationTime>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.micro">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <nova:memory>192</nova:memory>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <nova:user uuid="aca3cab576d641d3b89e7dddf155d467">tempest-ServerActionsTestOtherB-1052932467-project-member</nova:user>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <nova:project uuid="9dd869ce76e44fc8a82b8bbee1654d33">tempest-ServerActionsTestOtherB-1052932467</nova:project>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <nova:port uuid="8ad4c021-5d44-41aa-adad-f593da5206c1">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <entry name="serial">81a8be01-ddd9-4fd2-91a1-886e7f47bfa3</entry>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <entry name="uuid">81a8be01-ddd9-4fd2-91a1-886e7f47bfa3</entry>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/81a8be01-ddd9-4fd2-91a1-886e7f47bfa3_disk">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/81a8be01-ddd9-4fd2-91a1-886e7f47bfa3_disk.config">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:46:50:89"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <target dev="tap8ad4c021-5d"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/81a8be01-ddd9-4fd2-91a1-886e7f47bfa3/console.log" append="off"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <input type="keyboard" bus="usb"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:10:32 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:10:32 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:10:32 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:10:32 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.658 222021 DEBUG nova.virt.libvirt.driver [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.658 222021 DEBUG nova.virt.libvirt.driver [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] skipping disk for instance-0000007a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.659 222021 DEBUG nova.virt.libvirt.vif [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-8628670',display_name='tempest-ServerActionsTestOtherB-server-8628670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-8628670',id=122,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:10:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-05boc59s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:10:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=81a8be01-ddd9-4fd2-91a1-886e7f47bfa3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.660 222021 DEBUG nova.network.os_vif_util [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.660 222021 DEBUG nova.network.os_vif_util [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.661 222021 DEBUG os_vif [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.661 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.662 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.662 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.665 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.665 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ad4c021-5d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.666 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8ad4c021-5d, col_values=(('external_ids', {'iface-id': '8ad4c021-5d44-41aa-adad-f593da5206c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:50:89', 'vm-uuid': '81a8be01-ddd9-4fd2-91a1-886e7f47bfa3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.667 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 NetworkManager[48871]: <info>  [1769163032.6691] manager: (tap8ad4c021-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.670 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.674 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.675 222021 INFO os_vif [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d')#033[00m
Jan 23 05:10:32 np0005593233 NetworkManager[48871]: <info>  [1769163032.7637] manager: (tap8ad4c021-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/254)
Jan 23 05:10:32 np0005593233 kernel: tap8ad4c021-5d: entered promiscuous mode
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.771 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:32Z|00540|binding|INFO|Claiming lport 8ad4c021-5d44-41aa-adad-f593da5206c1 for this chassis.
Jan 23 05:10:32 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:32Z|00541|binding|INFO|8ad4c021-5d44-41aa-adad-f593da5206c1: Claiming fa:16:3e:46:50:89 10.100.0.7
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.785 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 NetworkManager[48871]: <info>  [1769163032.7874] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Jan 23 05:10:32 np0005593233 NetworkManager[48871]: <info>  [1769163032.7888] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Jan 23 05:10:32 np0005593233 systemd-udevd[271909]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:10:32 np0005593233 systemd-machined[190954]: New machine qemu-59-instance-0000007a.
Jan 23 05:10:32 np0005593233 NetworkManager[48871]: <info>  [1769163032.8477] device (tap8ad4c021-5d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:10:32 np0005593233 NetworkManager[48871]: <info>  [1769163032.8495] device (tap8ad4c021-5d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:10:32 np0005593233 systemd[1]: Started Virtual Machine qemu-59-instance-0000007a.
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.922 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:32 np0005593233 nova_compute[222017]: 2026-01-23 10:10:32.933 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:33 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:33Z|00542|binding|INFO|Releasing lport c75eef02-aabe-4477-9239-97f7fb86cd02 from this chassis (sb_readonly=0)
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.042 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:50:89 10.100.0.7'], port_security=['fa:16:3e:46:50:89 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '81a8be01-ddd9-4fd2-91a1-886e7f47bfa3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'cf3e0bf9-33c6-483b-a880-c8297a0be71f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8ad4c021-5d44-41aa-adad-f593da5206c1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.045 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8ad4c021-5d44-41aa-adad-f593da5206c1 in datapath 8d9599b4-8855-4310-af02-cdd058438f7d bound to our chassis#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.047 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d9599b4-8855-4310-af02-cdd058438f7d#033[00m
Jan 23 05:10:33 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:33Z|00543|binding|INFO|Setting lport 8ad4c021-5d44-41aa-adad-f593da5206c1 ovn-installed in OVS
Jan 23 05:10:33 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:33Z|00544|binding|INFO|Setting lport 8ad4c021-5d44-41aa-adad-f593da5206c1 up in Southbound
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.064 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.069 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c87c7d7d-5a0a-40e1-8e74-6cb463e491ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.071 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8d9599b4-81 in ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.075 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8d9599b4-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.075 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0f429454-1f4d-4985-bb6b-b435b3eaaa8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.077 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b0039920-84e8-49eb-aa12-3ff84836845c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.099 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[c7176609-dad9-42d6-834c-b332c84fbed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.125 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe5dd66-57da-4a33-9adf-94bdc64505aa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.174 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b341d710-9b3d-48cf-bf75-2c5c28feb2d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.187 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0b7db8-76a5-48cd-ba68-61163790dc23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 NetworkManager[48871]: <info>  [1769163033.1886] manager: (tap8d9599b4-80): new Veth device (/org/freedesktop/NetworkManager/Devices/257)
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.248 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[4a318c8e-bf2b-40e7-87b1-cb2aa0d0f1cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.253 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ed7e76bf-9f56-48f3-b4f8-7485a4bec843]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.262 222021 INFO nova.virt.libvirt.driver [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Deleting instance files /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9_del#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.263 222021 INFO nova.virt.libvirt.driver [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Deletion of /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9_del complete#033[00m
Jan 23 05:10:33 np0005593233 NetworkManager[48871]: <info>  [1769163033.2902] device (tap8d9599b4-80): carrier: link connected
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.297 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b56090b9-52a6-47bb-91b0-b1127603f012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.316 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4726ba-1820-4087-9d5b-46adf28a69f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694592, 'reachable_time': 43396, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271962, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.342 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6629f9b7-ea94-4f63-a5b3-40baefb5799e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe55:a12b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694592, 'tstamp': 694592}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271970, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.363 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8bdb98-756b-4727-ba98-f46858d44167]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694592, 'reachable_time': 43396, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271980, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.417 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[73079fe3-5bd1-45c4-95ea-971473f7de7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.501 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163033.4999037, 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.502 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.506 222021 DEBUG nova.compute.manager [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.510 222021 INFO nova.virt.libvirt.driver [-] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Instance rebooted successfully.#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.511 222021 DEBUG nova.compute.manager [None req-456fe123-ded5-4dfe-bb7c-38bbad6555a0 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.577 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7e40b146-93d6-4aeb-8297-df25cf070c27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.579 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.580 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.581 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d9599b4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.584 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:33 np0005593233 NetworkManager[48871]: <info>  [1769163033.5850] manager: (tap8d9599b4-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Jan 23 05:10:33 np0005593233 kernel: tap8d9599b4-80: entered promiscuous mode
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.587 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.589 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d9599b4-80, col_values=(('external_ids', {'iface-id': 'b57bd565-3bb1-4ecc-8df0-a7c439ac84a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.591 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:33 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:33Z|00545|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.619 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.621 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8d9599b4-8855-4310-af02-cdd058438f7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8d9599b4-8855-4310-af02-cdd058438f7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.623 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bea665a8-5485-4988-812c-604df2c3ef47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:33 np0005593233 nova_compute[222017]: 2026-01-23 10:10:33.624 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.625 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-8d9599b4-8855-4310-af02-cdd058438f7d
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/8d9599b4-8855-4310-af02-cdd058438f7d.pid.haproxy
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 8d9599b4-8855-4310-af02-cdd058438f7d
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:10:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:33.628 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'env', 'PROCESS_TAG=haproxy-8d9599b4-8855-4310-af02-cdd058438f7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8d9599b4-8855-4310-af02-cdd058438f7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:10:34 np0005593233 podman[272020]: 2026-01-23 10:10:34.077801362 +0000 UTC m=+0.050098858 container create b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:10:34 np0005593233 podman[272020]: 2026-01-23 10:10:34.049231304 +0000 UTC m=+0.021528830 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:10:34 np0005593233 systemd[1]: Started libpod-conmon-b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4.scope.
Jan 23 05:10:34 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:10:34 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acb67dfd02ae408927279953fcfc7059000dc4f52f68e63b6d1babb8e473d87f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:34 np0005593233 podman[272020]: 2026-01-23 10:10:34.203716383 +0000 UTC m=+0.176013889 container init b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 05:10:34 np0005593233 podman[272020]: 2026-01-23 10:10:34.212521352 +0000 UTC m=+0.184818848 container start b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:10:34 np0005593233 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[272036]: [NOTICE]   (272040) : New worker (272042) forked
Jan 23 05:10:34 np0005593233 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[272036]: [NOTICE]   (272040) : Loading success.
Jan 23 05:10:34 np0005593233 nova_compute[222017]: 2026-01-23 10:10:34.290 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:34 np0005593233 nova_compute[222017]: 2026-01-23 10:10:34.295 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:10:34 np0005593233 nova_compute[222017]: 2026-01-23 10:10:34.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:34.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:10:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:34.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:10:35 np0005593233 nova_compute[222017]: 2026-01-23 10:10:35.440 222021 DEBUG nova.network.neutron [req-d879c50a-79be-4ee8-b3ff-28b8b0abdfe2 req-d2204006-351f-467b-a150-1c813d0325d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updated VIF entry in instance network info cache for port 8bb3c318-ff77-47e8-a160-22e4c278fc88. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:10:35 np0005593233 nova_compute[222017]: 2026-01-23 10:10:35.440 222021 DEBUG nova.network.neutron [req-d879c50a-79be-4ee8-b3ff-28b8b0abdfe2 req-d2204006-351f-467b-a150-1c813d0325d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": null, "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:35 np0005593233 nova_compute[222017]: 2026-01-23 10:10:35.574 222021 INFO nova.scheduler.client.report [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Deleted allocations for instance 91cc1048-141a-4a20-b148-991a883adfa9#033[00m
Jan 23 05:10:35 np0005593233 nova_compute[222017]: 2026-01-23 10:10:35.781 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163033.5063362, 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:10:35 np0005593233 nova_compute[222017]: 2026-01-23 10:10:35.782 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] VM Started (Lifecycle Event)#033[00m
Jan 23 05:10:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.264 222021 DEBUG nova.compute.manager [req-5dc7b9b5-9ca1-4bbc-87c9-14c982647b98 req-3bba0a12-0d12-4498-9aa9-5af252e3bd9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received event network-vif-plugged-8ad4c021-5d44-41aa-adad-f593da5206c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.266 222021 DEBUG oslo_concurrency.lockutils [req-5dc7b9b5-9ca1-4bbc-87c9-14c982647b98 req-3bba0a12-0d12-4498-9aa9-5af252e3bd9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.267 222021 DEBUG oslo_concurrency.lockutils [req-5dc7b9b5-9ca1-4bbc-87c9-14c982647b98 req-3bba0a12-0d12-4498-9aa9-5af252e3bd9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.268 222021 DEBUG oslo_concurrency.lockutils [req-5dc7b9b5-9ca1-4bbc-87c9-14c982647b98 req-3bba0a12-0d12-4498-9aa9-5af252e3bd9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.269 222021 DEBUG nova.compute.manager [req-5dc7b9b5-9ca1-4bbc-87c9-14c982647b98 req-3bba0a12-0d12-4498-9aa9-5af252e3bd9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] No waiting events found dispatching network-vif-plugged-8ad4c021-5d44-41aa-adad-f593da5206c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.269 222021 WARNING nova.compute.manager [req-5dc7b9b5-9ca1-4bbc-87c9-14c982647b98 req-3bba0a12-0d12-4498-9aa9-5af252e3bd9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received unexpected event network-vif-plugged-8ad4c021-5d44-41aa-adad-f593da5206c1 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.293 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.300 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.555 222021 DEBUG oslo_concurrency.lockutils [req-d879c50a-79be-4ee8-b3ff-28b8b0abdfe2 req-d2204006-351f-467b-a150-1c813d0325d8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:36.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:36.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.719 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e291 e291: 3 total, 3 up, 3 in
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.818 222021 DEBUG oslo_concurrency.lockutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.820 222021 DEBUG oslo_concurrency.lockutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:36 np0005593233 nova_compute[222017]: 2026-01-23 10:10:36.891 222021 DEBUG oslo_concurrency.processutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:37 np0005593233 nova_compute[222017]: 2026-01-23 10:10:37.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:37 np0005593233 nova_compute[222017]: 2026-01-23 10:10:37.398 222021 DEBUG oslo_concurrency.processutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:37 np0005593233 nova_compute[222017]: 2026-01-23 10:10:37.406 222021 DEBUG nova.compute.provider_tree [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:10:37 np0005593233 nova_compute[222017]: 2026-01-23 10:10:37.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:38 np0005593233 nova_compute[222017]: 2026-01-23 10:10:38.293 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163023.2921088, 91cc1048-141a-4a20-b148-991a883adfa9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:10:38 np0005593233 nova_compute[222017]: 2026-01-23 10:10:38.294 222021 INFO nova.compute.manager [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:10:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:10:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:38.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:10:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:38.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:39 np0005593233 nova_compute[222017]: 2026-01-23 10:10:39.925 222021 DEBUG nova.compute.manager [req-df77ea8b-562d-4d53-bd0e-a2c14bef323a req-89040775-ab4a-41ce-961b-c746e576739b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received event network-vif-plugged-8ad4c021-5d44-41aa-adad-f593da5206c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:39 np0005593233 nova_compute[222017]: 2026-01-23 10:10:39.927 222021 DEBUG oslo_concurrency.lockutils [req-df77ea8b-562d-4d53-bd0e-a2c14bef323a req-89040775-ab4a-41ce-961b-c746e576739b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:39 np0005593233 nova_compute[222017]: 2026-01-23 10:10:39.927 222021 DEBUG oslo_concurrency.lockutils [req-df77ea8b-562d-4d53-bd0e-a2c14bef323a req-89040775-ab4a-41ce-961b-c746e576739b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:39 np0005593233 nova_compute[222017]: 2026-01-23 10:10:39.927 222021 DEBUG oslo_concurrency.lockutils [req-df77ea8b-562d-4d53-bd0e-a2c14bef323a req-89040775-ab4a-41ce-961b-c746e576739b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:39 np0005593233 nova_compute[222017]: 2026-01-23 10:10:39.928 222021 DEBUG nova.compute.manager [req-df77ea8b-562d-4d53-bd0e-a2c14bef323a req-89040775-ab4a-41ce-961b-c746e576739b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] No waiting events found dispatching network-vif-plugged-8ad4c021-5d44-41aa-adad-f593da5206c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:10:39 np0005593233 nova_compute[222017]: 2026-01-23 10:10:39.928 222021 WARNING nova.compute.manager [req-df77ea8b-562d-4d53-bd0e-a2c14bef323a req-89040775-ab4a-41ce-961b-c746e576739b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received unexpected event network-vif-plugged-8ad4c021-5d44-41aa-adad-f593da5206c1 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:10:39 np0005593233 nova_compute[222017]: 2026-01-23 10:10:39.962 222021 DEBUG nova.scheduler.client.report [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:10:39 np0005593233 nova_compute[222017]: 2026-01-23 10:10:39.982 222021 DEBUG nova.compute.manager [None req-e488f842-27d8-4e5c-800e-f15e573ce3b4 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.191 222021 DEBUG oslo_concurrency.lockutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 3.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.222 222021 DEBUG oslo_concurrency.lockutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.223 222021 DEBUG oslo_concurrency.lockutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.224 222021 DEBUG oslo_concurrency.lockutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.224 222021 DEBUG oslo_concurrency.lockutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.225 222021 DEBUG oslo_concurrency.lockutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.226 222021 INFO nova.compute.manager [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Terminating instance#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.227 222021 DEBUG nova.compute.manager [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:10:40 np0005593233 kernel: tap8ad4c021-5d (unregistering): left promiscuous mode
Jan 23 05:10:40 np0005593233 NetworkManager[48871]: <info>  [1769163040.2839] device (tap8ad4c021-5d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:10:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:40Z|00546|binding|INFO|Releasing lport 8ad4c021-5d44-41aa-adad-f593da5206c1 from this chassis (sb_readonly=0)
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.300 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:40Z|00547|binding|INFO|Setting lport 8ad4c021-5d44-41aa-adad-f593da5206c1 down in Southbound
Jan 23 05:10:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:40Z|00548|binding|INFO|Removing iface tap8ad4c021-5d ovn-installed in OVS
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.306 222021 DEBUG oslo_concurrency.lockutils [None req-250a55d3-1640-4e7b-bec0-4ce408dc8af3 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.322 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:50:89 10.100.0.7'], port_security=['fa:16:3e:46:50:89 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '81a8be01-ddd9-4fd2-91a1-886e7f47bfa3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'cf3e0bf9-33c6-483b-a880-c8297a0be71f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.199', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8ad4c021-5d44-41aa-adad-f593da5206c1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.324 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8ad4c021-5d44-41aa-adad-f593da5206c1 in datapath 8d9599b4-8855-4310-af02-cdd058438f7d unbound from our chassis#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.325 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8d9599b4-8855-4310-af02-cdd058438f7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.326 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e41d9594-c492-47ad-b052-9c2f9afa4d48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.327 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d namespace which is not needed anymore#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.344 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:40 np0005593233 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Jan 23 05:10:40 np0005593233 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007a.scope: Consumed 7.845s CPU time.
Jan 23 05:10:40 np0005593233 systemd-machined[190954]: Machine qemu-59-instance-0000007a terminated.
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.456 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.464 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.474 222021 INFO nova.virt.libvirt.driver [-] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Instance destroyed successfully.#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.475 222021 DEBUG nova.objects.instance [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'resources' on Instance uuid 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:40 np0005593233 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[272036]: [NOTICE]   (272040) : haproxy version is 2.8.14-c23fe91
Jan 23 05:10:40 np0005593233 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[272036]: [NOTICE]   (272040) : path to executable is /usr/sbin/haproxy
Jan 23 05:10:40 np0005593233 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[272036]: [WARNING]  (272040) : Exiting Master process...
Jan 23 05:10:40 np0005593233 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[272036]: [ALERT]    (272040) : Current worker (272042) exited with code 143 (Terminated)
Jan 23 05:10:40 np0005593233 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[272036]: [WARNING]  (272040) : All workers exited. Exiting... (0)
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.515 222021 DEBUG nova.virt.libvirt.vif [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-8628670',display_name='tempest-ServerActionsTestOtherB-server-8628670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-8628670',id=122,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:10:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-05boc59s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:10:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=81a8be01-ddd9-4fd2-91a1-886e7f47bfa3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:10:40 np0005593233 systemd[1]: libpod-b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4.scope: Deactivated successfully.
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.518 222021 DEBUG nova.network.os_vif_util [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "8ad4c021-5d44-41aa-adad-f593da5206c1", "address": "fa:16:3e:46:50:89", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ad4c021-5d", "ovs_interfaceid": "8ad4c021-5d44-41aa-adad-f593da5206c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.519 222021 DEBUG nova.network.os_vif_util [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.520 222021 DEBUG os_vif [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.522 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.522 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ad4c021-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:40 np0005593233 podman[272098]: 2026-01-23 10:10:40.522786299 +0000 UTC m=+0.055018137 container died b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.525 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.530 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.534 222021 INFO os_vif [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:50:89,bridge_name='br-int',has_traffic_filtering=True,id=8ad4c021-5d44-41aa-adad-f593da5206c1,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ad4c021-5d')#033[00m
Jan 23 05:10:40 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4-userdata-shm.mount: Deactivated successfully.
Jan 23 05:10:40 np0005593233 systemd[1]: var-lib-containers-storage-overlay-acb67dfd02ae408927279953fcfc7059000dc4f52f68e63b6d1babb8e473d87f-merged.mount: Deactivated successfully.
Jan 23 05:10:40 np0005593233 podman[272098]: 2026-01-23 10:10:40.579322249 +0000 UTC m=+0.111554077 container cleanup b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:10:40 np0005593233 systemd[1]: libpod-conmon-b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4.scope: Deactivated successfully.
Jan 23 05:10:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:10:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:40.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:10:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:40.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:40 np0005593233 podman[272152]: 2026-01-23 10:10:40.67552081 +0000 UTC m=+0.062055257 container remove b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.685 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[89ce325e-5846-43fa-8d54-d7b22698ce81]: (4, ('Fri Jan 23 10:10:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d (b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4)\nb9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4\nFri Jan 23 10:10:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d (b9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4)\nb9a1617f8d8b01693d9c10a427c9bac6cd9c4e6fd75a0f2bbc52d4ff10e32cc4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.689 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[692dbe33-0ac5-4ea0-bd88-ef5d89eebd4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.690 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.691 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:40 np0005593233 kernel: tap8d9599b4-80: left promiscuous mode
Jan 23 05:10:40 np0005593233 nova_compute[222017]: 2026-01-23 10:10:40.720 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.719 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ba0cb50a-0ee2-4bbd-9637-72183102c665]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.742 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2130202e-d59c-4411-8919-49998fdf2185]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.743 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8a41cc-f170-4225-8ab8-d3aef4fef14b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.764 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[29202c1f-19da-43ab-99c7-71b98bd254c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694579, 'reachable_time': 41069, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272170, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.768 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:10:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:40.768 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[1aba653f-d0a3-4fd8-9b36-9b9e6e8a608e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:10:40 np0005593233 systemd[1]: run-netns-ovnmeta\x2d8d9599b4\x2d8855\x2d4310\x2daf02\x2dcdd058438f7d.mount: Deactivated successfully.
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.082 222021 INFO nova.virt.libvirt.driver [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Deleting instance files /var/lib/nova/instances/81a8be01-ddd9-4fd2-91a1-886e7f47bfa3_del#033[00m
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.085 222021 INFO nova.virt.libvirt.driver [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Deletion of /var/lib/nova/instances/81a8be01-ddd9-4fd2-91a1-886e7f47bfa3_del complete#033[00m
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.210 222021 INFO nova.compute.manager [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Took 0.98 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.212 222021 DEBUG oslo.service.loopingcall [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.213 222021 DEBUG nova.compute.manager [-] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.213 222021 DEBUG nova.network.neutron [-] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:10:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:41 np0005593233 nova_compute[222017]: 2026-01-23 10:10:41.761 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:42.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:42.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:42.674 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:42.675 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:42.675 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:43 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:10:43 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:10:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:10:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:44.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:10:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:44.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.026 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.028 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.029 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.029 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.030 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:45 np0005593233 podman[272173]: 2026-01-23 10:10:45.087761238 +0000 UTC m=+0.083187634 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.147 222021 DEBUG nova.compute.manager [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received event network-vif-unplugged-8ad4c021-5d44-41aa-adad-f593da5206c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.147 222021 DEBUG oslo_concurrency.lockutils [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.148 222021 DEBUG oslo_concurrency.lockutils [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.148 222021 DEBUG oslo_concurrency.lockutils [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.148 222021 DEBUG nova.compute.manager [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] No waiting events found dispatching network-vif-unplugged-8ad4c021-5d44-41aa-adad-f593da5206c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.148 222021 DEBUG nova.compute.manager [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received event network-vif-unplugged-8ad4c021-5d44-41aa-adad-f593da5206c1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.149 222021 DEBUG nova.compute.manager [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received event network-changed-283da12a-be97-4dbe-9ecf-fd4e5aae8289 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.149 222021 DEBUG nova.compute.manager [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Refreshing instance network info cache due to event network-changed-283da12a-be97-4dbe-9ecf-fd4e5aae8289. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.149 222021 DEBUG oslo_concurrency.lockutils [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.149 222021 DEBUG oslo_concurrency.lockutils [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.150 222021 DEBUG nova.network.neutron [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Refreshing network info cache for port 283da12a-be97-4dbe-9ecf-fd4e5aae8289 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:10:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4187247524' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.513 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.526 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:45Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:d8:4b 10.100.0.5
Jan 23 05:10:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:10:45Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:d8:4b 10.100.0.5
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.706 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.707 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000081 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.911 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.913 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4203MB free_disk=20.855579376220703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.913 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:45 np0005593233 nova_compute[222017]: 2026-01-23 10:10:45.913 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.047 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.047 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 1b7661cc-4a60-4a80-967f-f9243a031c9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.048 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.049 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.125 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3282501143' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:46.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:46.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.627 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.634 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.663 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.699 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.700 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:46 np0005593233 nova_compute[222017]: 2026-01-23 10:10:46.764 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.076 222021 DEBUG nova.network.neutron [-] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.105 222021 INFO nova.compute.manager [-] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Took 5.89 seconds to deallocate network for instance.#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.180 222021 DEBUG oslo_concurrency.lockutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.181 222021 DEBUG oslo_concurrency.lockutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.232 222021 DEBUG nova.compute.manager [req-1f546289-84dd-4e01-9ae1-ac7af95f776b req-33e3d0cb-aad4-4098-952d-59b99d01cb5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received event network-vif-deleted-8ad4c021-5d44-41aa-adad-f593da5206c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.265 222021 DEBUG oslo_concurrency.processutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1232314708' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.738 222021 DEBUG oslo_concurrency.processutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.745 222021 DEBUG nova.compute.provider_tree [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.774 222021 DEBUG nova.scheduler.client.report [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.798 222021 DEBUG oslo_concurrency.lockutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:47 np0005593233 nova_compute[222017]: 2026-01-23 10:10:47.881 222021 INFO nova.scheduler.client.report [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Deleted allocations for instance 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3#033[00m
Jan 23 05:10:48 np0005593233 nova_compute[222017]: 2026-01-23 10:10:48.040 222021 DEBUG oslo_concurrency.lockutils [None req-9286ee43-6994-4e80-ac61-c76a95c0f03f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:48.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:48.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:48 np0005593233 nova_compute[222017]: 2026-01-23 10:10:48.699 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:48 np0005593233 nova_compute[222017]: 2026-01-23 10:10:48.700 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:10:48 np0005593233 nova_compute[222017]: 2026-01-23 10:10:48.700 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:10:48 np0005593233 nova_compute[222017]: 2026-01-23 10:10:48.962 222021 DEBUG nova.network.neutron [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Updated VIF entry in instance network info cache for port 283da12a-be97-4dbe-9ecf-fd4e5aae8289. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:10:48 np0005593233 nova_compute[222017]: 2026-01-23 10:10:48.963 222021 DEBUG nova.network.neutron [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Updating instance_info_cache with network_info: [{"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.108 222021 DEBUG oslo_concurrency.lockutils [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.109 222021 DEBUG nova.compute.manager [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received event network-vif-plugged-8ad4c021-5d44-41aa-adad-f593da5206c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.109 222021 DEBUG oslo_concurrency.lockutils [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.109 222021 DEBUG oslo_concurrency.lockutils [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.110 222021 DEBUG oslo_concurrency.lockutils [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "81a8be01-ddd9-4fd2-91a1-886e7f47bfa3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.110 222021 DEBUG nova.compute.manager [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] No waiting events found dispatching network-vif-plugged-8ad4c021-5d44-41aa-adad-f593da5206c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.110 222021 WARNING nova.compute.manager [req-fa37aa5b-45f6-4717-a8b3-5c136cf9de35 req-8885d53d-66a9-416d-aea5-3133586c31eb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Received unexpected event network-vif-plugged-8ad4c021-5d44-41aa-adad-f593da5206c1 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.131 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.132 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.132 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.132 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1b7661cc-4a60-4a80-967f-f9243a031c9f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.919 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.920 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:49 np0005593233 nova_compute[222017]: 2026-01-23 10:10:49.920 222021 INFO nova.compute.manager [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Unshelving#033[00m
Jan 23 05:10:50 np0005593233 nova_compute[222017]: 2026-01-23 10:10:50.349 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:50 np0005593233 nova_compute[222017]: 2026-01-23 10:10:50.350 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:50 np0005593233 nova_compute[222017]: 2026-01-23 10:10:50.357 222021 DEBUG nova.objects.instance [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'pci_requests' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:50 np0005593233 nova_compute[222017]: 2026-01-23 10:10:50.405 222021 DEBUG nova.objects.instance [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'numa_topology' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:50 np0005593233 nova_compute[222017]: 2026-01-23 10:10:50.490 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:10:50 np0005593233 nova_compute[222017]: 2026-01-23 10:10:50.490 222021 INFO nova.compute.claims [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:10:50 np0005593233 nova_compute[222017]: 2026-01-23 10:10:50.529 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:50 np0005593233 nova_compute[222017]: 2026-01-23 10:10:50.616 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:10:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:50.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:10:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:50.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2244789577' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:51 np0005593233 nova_compute[222017]: 2026-01-23 10:10:51.108 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:51 np0005593233 nova_compute[222017]: 2026-01-23 10:10:51.117 222021 DEBUG nova.compute.provider_tree [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:10:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:51 np0005593233 nova_compute[222017]: 2026-01-23 10:10:51.767 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:52.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:52.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:52 np0005593233 nova_compute[222017]: 2026-01-23 10:10:52.663 222021 DEBUG nova.scheduler.client.report [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:10:52 np0005593233 nova_compute[222017]: 2026-01-23 10:10:52.705 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:10:52 np0005593233 nova_compute[222017]: 2026-01-23 10:10:52.966 222021 INFO nova.network.neutron [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating port 8bb3c318-ff77-47e8-a160-22e4c278fc88 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 23 05:10:53 np0005593233 nova_compute[222017]: 2026-01-23 10:10:53.881 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Updating instance_info_cache with network_info: [{"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:10:54 np0005593233 nova_compute[222017]: 2026-01-23 10:10:54.055 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-1b7661cc-4a60-4a80-967f-f9243a031c9f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:10:54 np0005593233 nova_compute[222017]: 2026-01-23 10:10:54.056 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 05:10:54 np0005593233 nova_compute[222017]: 2026-01-23 10:10:54.058 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:10:54 np0005593233 nova_compute[222017]: 2026-01-23 10:10:54.059 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:10:54 np0005593233 nova_compute[222017]: 2026-01-23 10:10:54.059 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:10:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:54.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:54.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:54 np0005593233 nova_compute[222017]: 2026-01-23 10:10:54.778 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:10:54 np0005593233 nova_compute[222017]: 2026-01-23 10:10:54.778 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:10:54 np0005593233 nova_compute[222017]: 2026-01-23 10:10:54.779 222021 DEBUG nova.network.neutron [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:10:55 np0005593233 nova_compute[222017]: 2026-01-23 10:10:55.072 222021 DEBUG nova.compute.manager [req-fedf9cb8-ea5a-48b3-8452-df63b7773000 req-813c6104-f58b-437c-9cc6-fea3b87ccd72 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-changed-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:10:55 np0005593233 nova_compute[222017]: 2026-01-23 10:10:55.072 222021 DEBUG nova.compute.manager [req-fedf9cb8-ea5a-48b3-8452-df63b7773000 req-813c6104-f58b-437c-9cc6-fea3b87ccd72 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Refreshing instance network info cache due to event network-changed-8bb3c318-ff77-47e8-a160-22e4c278fc88. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:10:55 np0005593233 nova_compute[222017]: 2026-01-23 10:10:55.072 222021 DEBUG oslo_concurrency.lockutils [req-fedf9cb8-ea5a-48b3-8452-df63b7773000 req-813c6104-f58b-437c-9cc6-fea3b87ccd72 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:10:55 np0005593233 nova_compute[222017]: 2026-01-23 10:10:55.475 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163040.4737573, 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:10:55 np0005593233 nova_compute[222017]: 2026-01-23 10:10:55.476 222021 INFO nova.compute.manager [-] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] VM Stopped (Lifecycle Event)
Jan 23 05:10:55 np0005593233 nova_compute[222017]: 2026-01-23 10:10:55.495 222021 DEBUG nova.compute.manager [None req-7e0563e7-9f21-45b3-ac28-718307044d9c - - - - - -] [instance: 81a8be01-ddd9-4fd2-91a1-886e7f47bfa3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:10:55 np0005593233 nova_compute[222017]: 2026-01-23 10:10:55.531 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:10:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:10:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:56.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:10:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:56.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:56 np0005593233 nova_compute[222017]: 2026-01-23 10:10:56.740 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:10:56 np0005593233 nova_compute[222017]: 2026-01-23 10:10:56.769 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:10:57 np0005593233 nova_compute[222017]: 2026-01-23 10:10:57.798 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:10:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:57.799 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:10:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:10:57.800 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.606 222021 DEBUG oslo_concurrency.lockutils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.606 222021 DEBUG oslo_concurrency.lockutils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:10:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.636 222021 DEBUG nova.network.neutron [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:10:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:10:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:58.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:10:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:10:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:10:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:58.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.709 222021 DEBUG nova.objects.instance [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lazy-loading 'flavor' on Instance uuid 1b7661cc-4a60-4a80-967f-f9243a031c9f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.715 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.716 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.716 222021 INFO nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Creating image(s)
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.751 222021 DEBUG nova.storage.rbd_utils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.756 222021 DEBUG nova.objects.instance [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.759 222021 DEBUG oslo_concurrency.lockutils [req-fedf9cb8-ea5a-48b3-8452-df63b7773000 req-813c6104-f58b-437c-9cc6-fea3b87ccd72 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.759 222021 DEBUG nova.network.neutron [req-fedf9cb8-ea5a-48b3-8452-df63b7773000 req-813c6104-f58b-437c-9cc6-fea3b87ccd72 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Refreshing network info cache for port 8bb3c318-ff77-47e8-a160-22e4c278fc88 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.827 222021 DEBUG nova.storage.rbd_utils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.864 222021 DEBUG nova.storage.rbd_utils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.870 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "bd110828d2b53231ac91e9ff379cc95b201d4baf" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.871 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "bd110828d2b53231ac91e9ff379cc95b201d4baf" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:10:58 np0005593233 nova_compute[222017]: 2026-01-23 10:10:58.882 222021 DEBUG oslo_concurrency.lockutils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.200 222021 DEBUG nova.virt.libvirt.imagebackend [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/034d6719-9097-4197-9373-c0b4b83dfc98/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/034d6719-9097-4197-9373-c0b4b83dfc98/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.272 222021 DEBUG nova.virt.libvirt.imagebackend [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/034d6719-9097-4197-9373-c0b4b83dfc98/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.273 222021 DEBUG nova.storage.rbd_utils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] cloning images/034d6719-9097-4197-9373-c0b4b83dfc98@snap to None/91cc1048-141a-4a20-b148-991a883adfa9_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.318 222021 DEBUG oslo_concurrency.lockutils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.319 222021 DEBUG oslo_concurrency.lockutils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.319 222021 INFO nova.compute.manager [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Attaching volume 809049fe-43c3-4fd2-a846-ff41c8b09a79 to /dev/vdb
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.579 222021 DEBUG os_brick.utils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.581 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.598 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.598 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[f45c5949-72b2-4540-ac66-ca96965a5e00]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.599 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.611 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.611 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[8e35cdb7-d232-4bb0-980b-b0adeb2e4988]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.614 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.626 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.627 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7e72a1-ed25-403d-9add-219b5ad51e4c]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.629 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[e69a8be7-65af-4e57-a89c-a9409d32cae7]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.629 222021 DEBUG oslo_concurrency.processutils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.668 222021 DEBUG oslo_concurrency.processutils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "nvme version" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.671 222021 DEBUG os_brick.initiator.connectors.lightos [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.672 222021 DEBUG os_brick.initiator.connectors.lightos [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.672 222021 DEBUG os_brick.initiator.connectors.lightos [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.672 222021 DEBUG os_brick.utils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] <== get_connector_properties: return (93ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 23 05:10:59 np0005593233 nova_compute[222017]: 2026-01-23 10:10:59.673 222021 DEBUG nova.virt.block_device [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Updating existing volume attachment record: 383ed606-3757-4f5e-bb69-b98b5ca72de4 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.367 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "bd110828d2b53231ac91e9ff379cc95b201d4baf" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.534 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.541 222021 DEBUG nova.objects.instance [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'migration_context' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.638 222021 DEBUG nova.storage.rbd_utils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] flattening vms/91cc1048-141a-4a20-b148-991a883adfa9_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Jan 23 05:11:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:00.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:00.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.706 222021 DEBUG nova.objects.instance [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lazy-loading 'flavor' on Instance uuid 1b7661cc-4a60-4a80-967f-f9243a031c9f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.733 222021 DEBUG nova.virt.libvirt.driver [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Attempting to attach volume 809049fe-43c3-4fd2-a846-ff41c8b09a79 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.736 222021 DEBUG nova.virt.libvirt.guest [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:11:00 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:11:00 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-809049fe-43c3-4fd2-a846-ff41c8b09a79">
Jan 23 05:11:00 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:00 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:00 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:00 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:11:00 np0005593233 nova_compute[222017]:  <auth username="openstack">
Jan 23 05:11:00 np0005593233 nova_compute[222017]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:00 np0005593233 nova_compute[222017]:  </auth>
Jan 23 05:11:00 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:11:00 np0005593233 nova_compute[222017]:  <serial>809049fe-43c3-4fd2-a846-ff41c8b09a79</serial>
Jan 23 05:11:00 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:11:00 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.919 222021 DEBUG nova.virt.libvirt.driver [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.920 222021 DEBUG nova.virt.libvirt.driver [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.920 222021 DEBUG nova.virt.libvirt.driver [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:00 np0005593233 nova_compute[222017]: 2026-01-23 10:11:00.920 222021 DEBUG nova.virt.libvirt.driver [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No VIF found with MAC fa:16:3e:b1:d8:4b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:11:01 np0005593233 podman[272524]: 2026-01-23 10:11:01.092696137 +0000 UTC m=+0.103417987 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 23 05:11:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:01 np0005593233 nova_compute[222017]: 2026-01-23 10:11:01.772 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:01 np0005593233 nova_compute[222017]: 2026-01-23 10:11:01.841 222021 DEBUG nova.network.neutron [req-fedf9cb8-ea5a-48b3-8452-df63b7773000 req-813c6104-f58b-437c-9cc6-fea3b87ccd72 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updated VIF entry in instance network info cache for port 8bb3c318-ff77-47e8-a160-22e4c278fc88. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:11:01 np0005593233 nova_compute[222017]: 2026-01-23 10:11:01.841 222021 DEBUG nova.network.neutron [req-fedf9cb8-ea5a-48b3-8452-df63b7773000 req-813c6104-f58b-437c-9cc6-fea3b87ccd72 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:01 np0005593233 nova_compute[222017]: 2026-01-23 10:11:01.887 222021 DEBUG oslo_concurrency.lockutils [req-fedf9cb8-ea5a-48b3-8452-df63b7773000 req-813c6104-f58b-437c-9cc6-fea3b87ccd72 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:02 np0005593233 nova_compute[222017]: 2026-01-23 10:11:02.111 222021 DEBUG oslo_concurrency.lockutils [None req-6528a530-2893-40a3-9622-e893ee6d767d c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.793s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:02.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:02.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.200 222021 DEBUG oslo_concurrency.lockutils [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.201 222021 DEBUG oslo_concurrency.lockutils [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.217 222021 INFO nova.compute.manager [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Detaching volume 809049fe-43c3-4fd2-a846-ff41c8b09a79#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.408 222021 INFO nova.virt.block_device [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Attempting to driver detach volume 809049fe-43c3-4fd2-a846-ff41c8b09a79 from mountpoint /dev/vdb#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.417 222021 DEBUG nova.virt.libvirt.driver [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Attempting to detach device vdb from instance 1b7661cc-4a60-4a80-967f-f9243a031c9f from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.418 222021 DEBUG nova.virt.libvirt.guest [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-809049fe-43c3-4fd2-a846-ff41c8b09a79">
Jan 23 05:11:04 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <serial>809049fe-43c3-4fd2-a846-ff41c8b09a79</serial>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:11:04 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.428 222021 INFO nova.virt.libvirt.driver [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Successfully detached device vdb from instance 1b7661cc-4a60-4a80-967f-f9243a031c9f from the persistent domain config.#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.428 222021 DEBUG nova.virt.libvirt.driver [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 1b7661cc-4a60-4a80-967f-f9243a031c9f from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.429 222021 DEBUG nova.virt.libvirt.guest [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-809049fe-43c3-4fd2-a846-ff41c8b09a79">
Jan 23 05:11:04 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <serial>809049fe-43c3-4fd2-a846-ff41c8b09a79</serial>
Jan 23 05:11:04 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:11:04 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:11:04 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.493 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Received event <DeviceRemovedEvent: 1769163064.4928112, 1b7661cc-4a60-4a80-967f-f9243a031c9f => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.494 222021 DEBUG nova.virt.libvirt.driver [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 1b7661cc-4a60-4a80-967f-f9243a031c9f _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.497 222021 INFO nova.virt.libvirt.driver [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Successfully detached device vdb from instance 1b7661cc-4a60-4a80-967f-f9243a031c9f from the live domain config.#033[00m
Jan 23 05:11:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:04.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:04.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.710 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Image rbd:vms/91cc1048-141a-4a20-b148-991a883adfa9_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.711 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.711 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Ensure instance console log exists: /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.712 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.712 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.712 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.715 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Start _get_guest_xml network_info=[{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T10:10:20Z,direct_url=<?>,disk_format='raw',id=034d6719-9097-4197-9373-c0b4b83dfc98,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-980180187-shelved',owner='746ea02b745c4e21ace4cb49c193899d',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T10:10:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.720 222021 WARNING nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.727 222021 DEBUG nova.virt.libvirt.host [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.728 222021 DEBUG nova.virt.libvirt.host [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.732 222021 DEBUG nova.virt.libvirt.host [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.733 222021 DEBUG nova.virt.libvirt.host [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.734 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.735 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T10:10:20Z,direct_url=<?>,disk_format='raw',id=034d6719-9097-4197-9373-c0b4b83dfc98,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-980180187-shelved',owner='746ea02b745c4e21ace4cb49c193899d',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T10:10:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.735 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.735 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.736 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.736 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.736 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.736 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.737 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.737 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.737 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.738 222021 DEBUG nova.virt.hardware [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.738 222021 DEBUG nova.objects.instance [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.761 222021 DEBUG nova.objects.instance [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lazy-loading 'flavor' on Instance uuid 1b7661cc-4a60-4a80-967f-f9243a031c9f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.772 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:04 np0005593233 nova_compute[222017]: 2026-01-23 10:11:04.842 222021 DEBUG oslo_concurrency.lockutils [None req-2fcbfc05-65c5-4168-88d4-cdb10089c279 c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3203496744' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.411 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.639s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.445 222021 DEBUG nova.storage.rbd_utils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.450 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.539 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.830 222021 DEBUG oslo_concurrency.lockutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.831 222021 DEBUG oslo_concurrency.lockutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.832 222021 DEBUG oslo_concurrency.lockutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.832 222021 DEBUG oslo_concurrency.lockutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.832 222021 DEBUG oslo_concurrency.lockutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.834 222021 INFO nova.compute.manager [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Terminating instance#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.835 222021 DEBUG nova.compute.manager [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:11:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2842520245' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.931 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.933 222021 DEBUG nova.virt.libvirt.vif [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:07:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-980180187',display_name='tempest-ServersNegativeTestJSON-server-980180187',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-980180187',id=121,image_ref='034d6719-9097-4197-9373-c0b4b83dfc98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:07:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-np6t6vpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='vir
tio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-ServersNegativeTestJSON-623507515-project-member',shelved_at='2026-01-23T10:10:29.812600',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='034d6719-9097-4197-9373-c0b4b83dfc98'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:50Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=91cc1048-141a-4a20-b148-991a883adfa9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.933 222021 DEBUG nova.network.os_vif_util [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.934 222021 DEBUG nova.network.os_vif_util [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.935 222021 DEBUG nova.objects.instance [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'pci_devices' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.956 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <uuid>91cc1048-141a-4a20-b148-991a883adfa9</uuid>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <name>instance-00000079</name>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersNegativeTestJSON-server-980180187</nova:name>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:11:04</nova:creationTime>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <nova:user uuid="cde472cc8af0464992006a69d047d0d4">tempest-ServersNegativeTestJSON-623507515-project-member</nova:user>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <nova:project uuid="746ea02b745c4e21ace4cb49c193899d">tempest-ServersNegativeTestJSON-623507515</nova:project>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="034d6719-9097-4197-9373-c0b4b83dfc98"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <nova:port uuid="8bb3c318-ff77-47e8-a160-22e4c278fc88">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <entry name="serial">91cc1048-141a-4a20-b148-991a883adfa9</entry>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <entry name="uuid">91cc1048-141a-4a20-b148-991a883adfa9</entry>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/91cc1048-141a-4a20-b148-991a883adfa9_disk">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/91cc1048-141a-4a20-b148-991a883adfa9_disk.config">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:5b:ad:46"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <target dev="tap8bb3c318-ff"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/console.log" append="off"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <input type="keyboard" bus="usb"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:11:05 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:11:05 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:11:05 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:11:05 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.957 222021 DEBUG nova.compute.manager [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Preparing to wait for external event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.958 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.958 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.958 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.959 222021 DEBUG nova.virt.libvirt.vif [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:07:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-980180187',display_name='tempest-ServersNegativeTestJSON-server-980180187',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-980180187',id=121,image_ref='034d6719-9097-4197-9373-c0b4b83dfc98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:07:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-np6t6vpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_
model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-ServersNegativeTestJSON-623507515-project-member',shelved_at='2026-01-23T10:10:29.812600',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='034d6719-9097-4197-9373-c0b4b83dfc98'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:50Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=91cc1048-141a-4a20-b148-991a883adfa9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.959 222021 DEBUG nova.network.os_vif_util [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.959 222021 DEBUG nova.network.os_vif_util [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.960 222021 DEBUG os_vif [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.960 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.961 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.961 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.965 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.965 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8bb3c318-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.966 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8bb3c318-ff, col_values=(('external_ids', {'iface-id': '8bb3c318-ff77-47e8-a160-22e4c278fc88', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:ad:46', 'vm-uuid': '91cc1048-141a-4a20-b148-991a883adfa9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.967 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593233 NetworkManager[48871]: <info>  [1769163065.9688] manager: (tap8bb3c318-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.969 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.976 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593233 nova_compute[222017]: 2026-01-23 10:11:05.979 222021 INFO os_vif [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff')#033[00m
Jan 23 05:11:06 np0005593233 kernel: tap283da12a-be (unregistering): left promiscuous mode
Jan 23 05:11:06 np0005593233 NetworkManager[48871]: <info>  [1769163066.0663] device (tap283da12a-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.067 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:06Z|00549|binding|INFO|Releasing lport 283da12a-be97-4dbe-9ecf-fd4e5aae8289 from this chassis (sb_readonly=0)
Jan 23 05:11:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:06Z|00550|binding|INFO|Setting lport 283da12a-be97-4dbe-9ecf-fd4e5aae8289 down in Southbound
Jan 23 05:11:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:06Z|00551|binding|INFO|Removing iface tap283da12a-be ovn-installed in OVS
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.083 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.098 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000081.scope: Deactivated successfully.
Jan 23 05:11:06 np0005593233 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000081.scope: Consumed 15.743s CPU time.
Jan 23 05:11:06 np0005593233 systemd-machined[190954]: Machine qemu-58-instance-00000081 terminated.
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.161 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:d8:4b 10.100.0.5'], port_security=['fa:16:3e:b1:d8:4b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1b7661cc-4a60-4a80-967f-f9243a031c9f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93735878-f62d-4a5f-96df-bf97f85d787a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '924f976bcbb74ec195730b68eebe1f2a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '394c0c6a-bef6-491c-899b-f47fff4f799d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.250'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1f72e5c-e22f-424b-b6ed-0c502ff13aa3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=283da12a-be97-4dbe-9ecf-fd4e5aae8289) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.162 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 283da12a-be97-4dbe-9ecf-fd4e5aae8289 in datapath 93735878-f62d-4a5f-96df-bf97f85d787a unbound from our chassis#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.164 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 93735878-f62d-4a5f-96df-bf97f85d787a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.165 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c5503b-872b-4819-8d0b-4e318846c402]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.166 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a namespace which is not needed anymore#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.189 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.190 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.190 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] No VIF found with MAC fa:16:3e:5b:ad:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.191 222021 INFO nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Using config drive#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.224 222021 DEBUG nova.storage.rbd_utils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.277 222021 DEBUG nova.objects.instance [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'ec2_ids' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.286 222021 INFO nova.virt.libvirt.driver [-] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Instance destroyed successfully.#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.287 222021 DEBUG nova.objects.instance [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lazy-loading 'resources' on Instance uuid 1b7661cc-4a60-4a80-967f-f9243a031c9f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:06 np0005593233 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[271780]: [NOTICE]   (271793) : haproxy version is 2.8.14-c23fe91
Jan 23 05:11:06 np0005593233 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[271780]: [NOTICE]   (271793) : path to executable is /usr/sbin/haproxy
Jan 23 05:11:06 np0005593233 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[271780]: [WARNING]  (271793) : Exiting Master process...
Jan 23 05:11:06 np0005593233 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[271780]: [WARNING]  (271793) : Exiting Master process...
Jan 23 05:11:06 np0005593233 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[271780]: [ALERT]    (271793) : Current worker (271798) exited with code 143 (Terminated)
Jan 23 05:11:06 np0005593233 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[271780]: [WARNING]  (271793) : All workers exited. Exiting... (0)
Jan 23 05:11:06 np0005593233 systemd[1]: libpod-2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9.scope: Deactivated successfully.
Jan 23 05:11:06 np0005593233 podman[272660]: 2026-01-23 10:11:06.343897646 +0000 UTC m=+0.066360298 container died 2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.355 222021 DEBUG nova.virt.libvirt.vif [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:10:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1738552703',display_name='tempest-AttachVolumeNegativeTest-server-1738552703',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1738552703',id=129,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLOCDc+hB1zgCPdlKOUnZEX+Rl7ewtvKqeeImFpXurwnY4SYrfZFBkwZIE3g5r9nA9h2+pYvShYhnh7AlXCp7hzc3PTeL5rqvcKdXNZAyMR1hX9qOLVJ6T8cdqLx8wglSA==',key_name='tempest-keypair-1358246388',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:10:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='924f976bcbb74ec195730b68eebe1f2a',ramdisk_id='',reservation_id='r-qxl4yoot',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeNegativeTest-1470050886',owner_user_name='tempest-AttachVolumeNegativeTest-1470050886-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:10:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c99d09acd2e849a69846a6ccda1e0bc7',uuid=1b7661cc-4a60-4a80-967f-f9243a031c9f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.357 222021 DEBUG nova.network.os_vif_util [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converting VIF {"id": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "address": "fa:16:3e:b1:d8:4b", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap283da12a-be", "ovs_interfaceid": "283da12a-be97-4dbe-9ecf-fd4e5aae8289", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.358 222021 DEBUG nova.network.os_vif_util [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:d8:4b,bridge_name='br-int',has_traffic_filtering=True,id=283da12a-be97-4dbe-9ecf-fd4e5aae8289,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap283da12a-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.358 222021 DEBUG os_vif [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:d8:4b,bridge_name='br-int',has_traffic_filtering=True,id=283da12a-be97-4dbe-9ecf-fd4e5aae8289,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap283da12a-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.362 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap283da12a-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.386 222021 DEBUG nova.objects.instance [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'keypairs' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.405 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9-userdata-shm.mount: Deactivated successfully.
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.421 222021 INFO os_vif [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:d8:4b,bridge_name='br-int',has_traffic_filtering=True,id=283da12a-be97-4dbe-9ecf-fd4e5aae8289,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap283da12a-be')#033[00m
Jan 23 05:11:06 np0005593233 systemd[1]: var-lib-containers-storage-overlay-0f2f1ebd8fcae54ea9d26fa88ab9d5ff44a2653b35435a2263d0cb0c4d342c41-merged.mount: Deactivated successfully.
Jan 23 05:11:06 np0005593233 podman[272660]: 2026-01-23 10:11:06.432830631 +0000 UTC m=+0.155293273 container cleanup 2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 05:11:06 np0005593233 systemd[1]: libpod-conmon-2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9.scope: Deactivated successfully.
Jan 23 05:11:06 np0005593233 podman[272714]: 2026-01-23 10:11:06.511669211 +0000 UTC m=+0.051312712 container remove 2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.518 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e4b5f6-0ea0-4a43-9522-ce4468b566db]: (4, ('Fri Jan 23 10:11:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a (2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9)\n2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9\nFri Jan 23 10:11:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a (2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9)\n2aa05beef2619b0f769c65eed2f997a81193771e9e9192a73f9910062c7eddd9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.520 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f4018323-0fc8-43cb-8390-7d801f56d4dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.521 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93735878-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.523 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 kernel: tap93735878-f0: left promiscuous mode
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.541 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.545 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a27cabd5-b60a-46b0-9372-8fe393e37331]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.559 222021 DEBUG nova.compute.manager [req-2cc7576a-dc5d-4ed7-a95a-0ccfb0b61949 req-0b16eac5-dfad-4dcf-81c7-723f0ff30e83 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received event network-vif-unplugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.560 222021 DEBUG oslo_concurrency.lockutils [req-2cc7576a-dc5d-4ed7-a95a-0ccfb0b61949 req-0b16eac5-dfad-4dcf-81c7-723f0ff30e83 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.560 222021 DEBUG oslo_concurrency.lockutils [req-2cc7576a-dc5d-4ed7-a95a-0ccfb0b61949 req-0b16eac5-dfad-4dcf-81c7-723f0ff30e83 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.561 222021 DEBUG oslo_concurrency.lockutils [req-2cc7576a-dc5d-4ed7-a95a-0ccfb0b61949 req-0b16eac5-dfad-4dcf-81c7-723f0ff30e83 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.561 222021 DEBUG nova.compute.manager [req-2cc7576a-dc5d-4ed7-a95a-0ccfb0b61949 req-0b16eac5-dfad-4dcf-81c7-723f0ff30e83 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] No waiting events found dispatching network-vif-unplugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.562 222021 DEBUG nova.compute.manager [req-2cc7576a-dc5d-4ed7-a95a-0ccfb0b61949 req-0b16eac5-dfad-4dcf-81c7-723f0ff30e83 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received event network-vif-unplugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.562 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[56b03bfa-ee49-4c5a-9ca1-9cb2c9405116]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.564 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[164ea8f4-14e1-4400-8675-7d92280f0999]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.584 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[72ca968a-a289-4cdd-8ee3-9ffcf50bf57c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694253, 'reachable_time': 34261, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272736, 'error': None, 'target': 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:06 np0005593233 systemd[1]: run-netns-ovnmeta\x2d93735878\x2df62d\x2d4a5f\x2d96df\x2dbf97f85d787a.mount: Deactivated successfully.
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.589 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.589 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[71d5c390-9588-4217-89db-362d851c9f97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:06.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:06.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:06 np0005593233 nova_compute[222017]: 2026-01-23 10:11:06.774 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:06.803 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.117 222021 INFO nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Creating config drive at /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.123 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9_aesxv3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.270 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9_aesxv3" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.309 222021 DEBUG nova.storage.rbd_utils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] rbd image 91cc1048-141a-4a20-b148-991a883adfa9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.314 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config 91cc1048-141a-4a20-b148-991a883adfa9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.363 222021 INFO nova.virt.libvirt.driver [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Deleting instance files /var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f_del#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.365 222021 INFO nova.virt.libvirt.driver [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Deletion of /var/lib/nova/instances/1b7661cc-4a60-4a80-967f-f9243a031c9f_del complete#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.428 222021 INFO nova.compute.manager [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Took 1.59 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.431 222021 DEBUG oslo.service.loopingcall [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.431 222021 DEBUG nova.compute.manager [-] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.431 222021 DEBUG nova.network.neutron [-] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.515 222021 DEBUG oslo_concurrency.processutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config 91cc1048-141a-4a20-b148-991a883adfa9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.516 222021 INFO nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Deleting local config drive /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9/disk.config because it was imported into RBD.#033[00m
Jan 23 05:11:07 np0005593233 kernel: tap8bb3c318-ff: entered promiscuous mode
Jan 23 05:11:07 np0005593233 NetworkManager[48871]: <info>  [1769163067.5830] manager: (tap8bb3c318-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/260)
Jan 23 05:11:07 np0005593233 systemd-udevd[272621]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.584 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:07Z|00552|binding|INFO|Claiming lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 for this chassis.
Jan 23 05:11:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:07Z|00553|binding|INFO|8bb3c318-ff77-47e8-a160-22e4c278fc88: Claiming fa:16:3e:5b:ad:46 10.100.0.12
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.591 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:ad:46 10.100.0.12'], port_security=['fa:16:3e:5b:ad:46 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91cc1048-141a-4a20-b148-991a883adfa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63877f45-8244-4c80-903a-80901a7d83cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746ea02b745c4e21ace4cb49c193899d', 'neutron:revision_number': '7', 'neutron:security_group_ids': '69bbd8fc-29e5-4515-a6ba-c1c319d113ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb65718-6c8e-4bc0-9249-e791b1bad19c, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8bb3c318-ff77-47e8-a160-22e4c278fc88) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.593 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb3c318-ff77-47e8-a160-22e4c278fc88 in datapath 63877f45-8244-4c80-903a-80901a7d83cb bound to our chassis#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.595 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 63877f45-8244-4c80-903a-80901a7d83cb#033[00m
Jan 23 05:11:07 np0005593233 NetworkManager[48871]: <info>  [1769163067.6011] device (tap8bb3c318-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:11:07 np0005593233 NetworkManager[48871]: <info>  [1769163067.6020] device (tap8bb3c318-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:11:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:07Z|00554|binding|INFO|Setting lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 ovn-installed in OVS
Jan 23 05:11:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:07Z|00555|binding|INFO|Setting lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 up in Southbound
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.603 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.608 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.611 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ed939336-7806-44f5-b722-c7648a717e5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.612 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap63877f45-81 in ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.614 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap63877f45-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.615 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1314047c-a05b-409e-8c77-bec0fd9ae4da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.616 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd6f7f1-061f-47cf-b7ab-43e10b130192]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 systemd-machined[190954]: New machine qemu-60-instance-00000079.
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.630 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfb3a72-23ee-4c98-9086-62fc89b8b17b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 systemd[1]: Started Virtual Machine qemu-60-instance-00000079.
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.661 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[15311d64-8d04-40be-8dc3-633473d27f4a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.705 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5457da5f-7951-463b-a0ff-344329e59624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.713 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e76265e4-b136-4e09-866a-4385cced0075]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 NetworkManager[48871]: <info>  [1769163067.7154] manager: (tap63877f45-80): new Veth device (/org/freedesktop/NetworkManager/Devices/261)
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.757 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f3f699-2494-4853-9475-ecfa15509092]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.762 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0a704fc2-1ba6-4780-b066-eb3756611794]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 NetworkManager[48871]: <info>  [1769163067.7927] device (tap63877f45-80): carrier: link connected
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.802 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[26edfe86-1d7e-4e05-9563-061630fb5c39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.828 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7f962072-9b81-40db-849e-bf2429c0fda9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63877f45-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:59:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698042, 'reachable_time': 15328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272822, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.851 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[78379a50-bc0c-451f-962b-8411fecdbb8d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:59a0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 698042, 'tstamp': 698042}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272830, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.876 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[61867965-72b8-47f3-92de-3f9ac8783e5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63877f45-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:59:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698042, 'reachable_time': 15328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 272840, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.919 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9173b540-de15-405b-bb9b-f2843d1a3ef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.982 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[52a05f64-8829-4710-b985-b1737e6cd840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.983 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63877f45-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.984 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.984 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63877f45-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.986 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:07 np0005593233 NetworkManager[48871]: <info>  [1769163067.9868] manager: (tap63877f45-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 23 05:11:07 np0005593233 kernel: tap63877f45-80: entered promiscuous mode
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.989 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap63877f45-80, col_values=(('external_ids', {'iface-id': 'c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:07Z|00556|binding|INFO|Releasing lport c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23 from this chassis (sb_readonly=0)
Jan 23 05:11:07 np0005593233 nova_compute[222017]: 2026-01-23 10:11:07.992 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.994 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/63877f45-8244-4c80-903a-80901a7d83cb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/63877f45-8244-4c80-903a-80901a7d83cb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.995 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1a939e-a918-4d42-aed3-dcb2a654d145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.997 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-63877f45-8244-4c80-903a-80901a7d83cb
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/63877f45-8244-4c80-903a-80901a7d83cb.pid.haproxy
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 63877f45-8244-4c80-903a-80901a7d83cb
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 05:11:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:07.997 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'env', 'PROCESS_TAG=haproxy-63877f45-8244-4c80-903a-80901a7d83cb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/63877f45-8244-4c80-903a-80901a7d83cb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.009 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.027 222021 DEBUG nova.compute.manager [req-58ed8932-9e5a-4714-80e5-f9a006036c99 req-d27b982b-05f0-42cc-8a2c-926c86dc8a70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.028 222021 DEBUG oslo_concurrency.lockutils [req-58ed8932-9e5a-4714-80e5-f9a006036c99 req-d27b982b-05f0-42cc-8a2c-926c86dc8a70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.028 222021 DEBUG oslo_concurrency.lockutils [req-58ed8932-9e5a-4714-80e5-f9a006036c99 req-d27b982b-05f0-42cc-8a2c-926c86dc8a70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.028 222021 DEBUG oslo_concurrency.lockutils [req-58ed8932-9e5a-4714-80e5-f9a006036c99 req-d27b982b-05f0-42cc-8a2c-926c86dc8a70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.028 222021 DEBUG nova.compute.manager [req-58ed8932-9e5a-4714-80e5-f9a006036c99 req-d27b982b-05f0-42cc-8a2c-926c86dc8a70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Processing event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.063 222021 DEBUG nova.compute.manager [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.065 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163068.0623035, 91cc1048-141a-4a20-b148-991a883adfa9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.065 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Started (Lifecycle Event)
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.070 222021 DEBUG nova.virt.libvirt.driver [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.079 222021 INFO nova.virt.libvirt.driver [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance spawned successfully.
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.105 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.111 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.200 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.201 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163068.0643337, 91cc1048-141a-4a20-b148-991a883adfa9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.201 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Paused (Lifecycle Event)
Jan 23 05:11:08 np0005593233 podman[272896]: 2026-01-23 10:11:08.445419398 +0000 UTC m=+0.047601477 container create 76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 05:11:08 np0005593233 systemd[1]: Started libpod-conmon-76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0.scope.
Jan 23 05:11:08 np0005593233 podman[272896]: 2026-01-23 10:11:08.422121659 +0000 UTC m=+0.024303778 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:11:08 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:11:08 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06cf8b9d13305ed91935a90e9602dbe4a0e72d4717f4062dd56939f1b576a01d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:11:08 np0005593233 podman[272896]: 2026-01-23 10:11:08.579031607 +0000 UTC m=+0.181213726 container init 76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:11:08 np0005593233 podman[272896]: 2026-01-23 10:11:08.586672523 +0000 UTC m=+0.188854602 container start 76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:11:08 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[272913]: [NOTICE]   (272917) : New worker (272919) forked
Jan 23 05:11:08 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[272913]: [NOTICE]   (272917) : Loading success.
Jan 23 05:11:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:08.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:08.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.671 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.676 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163068.0682964, 91cc1048-141a-4a20-b148-991a883adfa9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.676 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Resumed (Lifecycle Event)
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.706 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.711 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:11:08 np0005593233 nova_compute[222017]: 2026-01-23 10:11:08.742 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:11:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e292 e292: 3 total, 3 up, 3 in
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.446 222021 DEBUG nova.compute.manager [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received event network-vif-plugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.447 222021 DEBUG oslo_concurrency.lockutils [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.447 222021 DEBUG oslo_concurrency.lockutils [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.447 222021 DEBUG oslo_concurrency.lockutils [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.448 222021 DEBUG nova.compute.manager [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] No waiting events found dispatching network-vif-plugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.448 222021 WARNING nova.compute.manager [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received unexpected event network-vif-plugged-283da12a-be97-4dbe-9ecf-fd4e5aae8289 for instance with vm_state active and task_state deleting.
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.540 222021 DEBUG nova.network.neutron [-] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.569 222021 INFO nova.compute.manager [-] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Took 2.14 seconds to deallocate network for instance.
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.666 222021 DEBUG oslo_concurrency.lockutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.667 222021 DEBUG oslo_concurrency.lockutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:09 np0005593233 nova_compute[222017]: 2026-01-23 10:11:09.775 222021 DEBUG oslo_concurrency.processutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:11:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3684257645' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.314 222021 DEBUG oslo_concurrency.processutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.539s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.322 222021 DEBUG nova.compute.provider_tree [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:11:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:10.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:10.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.813 222021 DEBUG nova.compute.manager [req-f1c77b49-4b45-4848-967e-7930aeeba774 req-efd6b9e2-2586-4fe3-85d9-6a7073c27fe8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.814 222021 DEBUG oslo_concurrency.lockutils [req-f1c77b49-4b45-4848-967e-7930aeeba774 req-efd6b9e2-2586-4fe3-85d9-6a7073c27fe8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.814 222021 DEBUG oslo_concurrency.lockutils [req-f1c77b49-4b45-4848-967e-7930aeeba774 req-efd6b9e2-2586-4fe3-85d9-6a7073c27fe8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.814 222021 DEBUG oslo_concurrency.lockutils [req-f1c77b49-4b45-4848-967e-7930aeeba774 req-efd6b9e2-2586-4fe3-85d9-6a7073c27fe8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.815 222021 DEBUG nova.compute.manager [req-f1c77b49-4b45-4848-967e-7930aeeba774 req-efd6b9e2-2586-4fe3-85d9-6a7073c27fe8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.815 222021 WARNING nova.compute.manager [req-f1c77b49-4b45-4848-967e-7930aeeba774 req-efd6b9e2-2586-4fe3-85d9-6a7073c27fe8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received unexpected event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with vm_state shelved_offloaded and task_state spawning.
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.823 222021 DEBUG nova.compute.manager [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.840 222021 DEBUG nova.scheduler.client.report [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:11:10 np0005593233 nova_compute[222017]: 2026-01-23 10:11:10.959 222021 DEBUG oslo_concurrency.lockutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:11 np0005593233 nova_compute[222017]: 2026-01-23 10:11:11.352 222021 DEBUG oslo_concurrency.lockutils [None req-c3497107-5a93-4add-ac4c-11d17d67afc6 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 21.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:11 np0005593233 nova_compute[222017]: 2026-01-23 10:11:11.360 222021 INFO nova.scheduler.client.report [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Deleted allocations for instance 1b7661cc-4a60-4a80-967f-f9243a031c9f
Jan 23 05:11:11 np0005593233 nova_compute[222017]: 2026-01-23 10:11:11.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:11 np0005593233 nova_compute[222017]: 2026-01-23 10:11:11.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:11 np0005593233 nova_compute[222017]: 2026-01-23 10:11:11.932 222021 DEBUG nova.compute.manager [req-0091cb2c-b8fe-456b-b79b-e511562acc45 req-44417744-0bc9-4cc8-974d-33e946a43b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Received event network-vif-deleted-283da12a-be97-4dbe-9ecf-fd4e5aae8289 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:11:12 np0005593233 nova_compute[222017]: 2026-01-23 10:11:12.017 222021 DEBUG oslo_concurrency.lockutils [None req-6ac73979-6ede-4f60-a2a4-86769037df7c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "1b7661cc-4a60-4a80-967f-f9243a031c9f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:12.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:12.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:14.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:14.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:16 np0005593233 podman[272950]: 2026-01-23 10:11:16.079139509 +0000 UTC m=+0.069123167 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:11:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:16 np0005593233 nova_compute[222017]: 2026-01-23 10:11:16.410 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:16.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:16.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:16 np0005593233 nova_compute[222017]: 2026-01-23 10:11:16.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e293 e293: 3 total, 3 up, 3 in
Jan 23 05:11:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:11:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.8 total, 600.0 interval#012Cumulative writes: 38K writes, 151K keys, 38K commit groups, 1.0 writes per commit group, ingest: 0.14 GB, 0.04 MB/s#012Cumulative WAL: 38K writes, 14K syncs, 2.76 writes per sync, written: 0.14 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8449 writes, 34K keys, 8449 commit groups, 1.0 writes per commit group, ingest: 35.60 MB, 0.06 MB/s#012Interval WAL: 8449 writes, 3327 syncs, 2.54 writes per sync, written: 0.03 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:11:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:18.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:18.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:20.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:20.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:21 np0005593233 nova_compute[222017]: 2026-01-23 10:11:21.285 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163066.2838972, 1b7661cc-4a60-4a80-967f-f9243a031c9f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:21 np0005593233 nova_compute[222017]: 2026-01-23 10:11:21.286 222021 INFO nova.compute.manager [-] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:11:21 np0005593233 nova_compute[222017]: 2026-01-23 10:11:21.397 222021 DEBUG nova.compute.manager [None req-a9491484-7a93-48aa-bfe4-f9fcaef69498 - - - - - -] [instance: 1b7661cc-4a60-4a80-967f-f9243a031c9f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:21 np0005593233 nova_compute[222017]: 2026-01-23 10:11:21.415 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:21 np0005593233 nova_compute[222017]: 2026-01-23 10:11:21.785 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:22 np0005593233 nova_compute[222017]: 2026-01-23 10:11:22.219 222021 DEBUG nova.objects.instance [None req-a40465e1-2227-4507-8c86-9f49c51487e2 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'pci_devices' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:22 np0005593233 nova_compute[222017]: 2026-01-23 10:11:22.247 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163082.2469022, 91cc1048-141a-4a20-b148-991a883adfa9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:22 np0005593233 nova_compute[222017]: 2026-01-23 10:11:22.248 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:11:22 np0005593233 nova_compute[222017]: 2026-01-23 10:11:22.343 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:22 np0005593233 nova_compute[222017]: 2026-01-23 10:11:22.349 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:11:22 np0005593233 nova_compute[222017]: 2026-01-23 10:11:22.490 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 23 05:11:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:22.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:22.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:23 np0005593233 kernel: tap8bb3c318-ff (unregistering): left promiscuous mode
Jan 23 05:11:23 np0005593233 NetworkManager[48871]: <info>  [1769163083.2547] device (tap8bb3c318-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:11:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:23Z|00557|binding|INFO|Releasing lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 from this chassis (sb_readonly=0)
Jan 23 05:11:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:23Z|00558|binding|INFO|Setting lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 down in Southbound
Jan 23 05:11:23 np0005593233 nova_compute[222017]: 2026-01-23 10:11:23.269 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:23Z|00559|binding|INFO|Removing iface tap8bb3c318-ff ovn-installed in OVS
Jan 23 05:11:23 np0005593233 nova_compute[222017]: 2026-01-23 10:11:23.271 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:23 np0005593233 nova_compute[222017]: 2026-01-23 10:11:23.291 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:23 np0005593233 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000079.scope: Deactivated successfully.
Jan 23 05:11:23 np0005593233 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000079.scope: Consumed 13.820s CPU time.
Jan 23 05:11:23 np0005593233 systemd-machined[190954]: Machine qemu-60-instance-00000079 terminated.
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.403 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:ad:46 10.100.0.12'], port_security=['fa:16:3e:5b:ad:46 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91cc1048-141a-4a20-b148-991a883adfa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63877f45-8244-4c80-903a-80901a7d83cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746ea02b745c4e21ace4cb49c193899d', 'neutron:revision_number': '9', 'neutron:security_group_ids': '69bbd8fc-29e5-4515-a6ba-c1c319d113ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb65718-6c8e-4bc0-9249-e791b1bad19c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8bb3c318-ff77-47e8-a160-22e4c278fc88) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.404 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb3c318-ff77-47e8-a160-22e4c278fc88 in datapath 63877f45-8244-4c80-903a-80901a7d83cb unbound from our chassis#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.406 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 63877f45-8244-4c80-903a-80901a7d83cb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.407 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3e47596e-7411-4105-9a85-b7c72045d705]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.408 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb namespace which is not needed anymore#033[00m
Jan 23 05:11:23 np0005593233 nova_compute[222017]: 2026-01-23 10:11:23.428 222021 DEBUG nova.compute.manager [None req-a40465e1-2227-4507-8c86-9f49c51487e2 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[272913]: [NOTICE]   (272917) : haproxy version is 2.8.14-c23fe91
Jan 23 05:11:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[272913]: [NOTICE]   (272917) : path to executable is /usr/sbin/haproxy
Jan 23 05:11:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[272913]: [WARNING]  (272917) : Exiting Master process...
Jan 23 05:11:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[272913]: [ALERT]    (272917) : Current worker (272919) exited with code 143 (Terminated)
Jan 23 05:11:23 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[272913]: [WARNING]  (272917) : All workers exited. Exiting... (0)
Jan 23 05:11:23 np0005593233 systemd[1]: libpod-76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0.scope: Deactivated successfully.
Jan 23 05:11:23 np0005593233 podman[273009]: 2026-01-23 10:11:23.557016928 +0000 UTC m=+0.054260996 container died 76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:11:23 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0-userdata-shm.mount: Deactivated successfully.
Jan 23 05:11:23 np0005593233 systemd[1]: var-lib-containers-storage-overlay-06cf8b9d13305ed91935a90e9602dbe4a0e72d4717f4062dd56939f1b576a01d-merged.mount: Deactivated successfully.
Jan 23 05:11:23 np0005593233 podman[273009]: 2026-01-23 10:11:23.6026837 +0000 UTC m=+0.099927768 container cleanup 76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:11:23 np0005593233 systemd[1]: libpod-conmon-76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0.scope: Deactivated successfully.
Jan 23 05:11:23 np0005593233 podman[273040]: 2026-01-23 10:11:23.681752906 +0000 UTC m=+0.051017604 container remove 76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.690 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ca2092b7-dde8-4619-8863-6acf58ece905]: (4, ('Fri Jan 23 10:11:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb (76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0)\n76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0\nFri Jan 23 10:11:23 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb (76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0)\n76681321f1e3ccfe8ff30ddd788873203a28eec0f4be8cd3f0b4184ffb7508e0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.692 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[47d5474a-7e5f-4a19-b744-47560afeeb9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.693 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63877f45-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:23 np0005593233 nova_compute[222017]: 2026-01-23 10:11:23.696 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:23 np0005593233 kernel: tap63877f45-80: left promiscuous mode
Jan 23 05:11:23 np0005593233 nova_compute[222017]: 2026-01-23 10:11:23.719 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.724 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[91bf2d78-fb86-455c-8fcb-283d7adf16f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.744 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a66f27-1283-4f1a-a031-9131fc0c4274]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.746 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[60e045e4-603e-4b1c-b8af-5c6d98ff87d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.767 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eeaa4202-6a05-4868-a9ed-f9b295d75359]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 698032, 'reachable_time': 20934, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273059, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.771 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:11:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:23.771 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[464219a2-ac68-4dbd-8578-7b38180f7787]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:23 np0005593233 systemd[1]: run-netns-ovnmeta\x2d63877f45\x2d8244\x2d4c80\x2d903a\x2d80901a7d83cb.mount: Deactivated successfully.
Jan 23 05:11:24 np0005593233 nova_compute[222017]: 2026-01-23 10:11:24.067 222021 DEBUG nova.compute.manager [req-9c0d762d-8ee1-430d-b412-6ce7594cca0d req-9bdc30a4-9657-4f87-b1d0-3550e2073854 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-unplugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:24 np0005593233 nova_compute[222017]: 2026-01-23 10:11:24.068 222021 DEBUG oslo_concurrency.lockutils [req-9c0d762d-8ee1-430d-b412-6ce7594cca0d req-9bdc30a4-9657-4f87-b1d0-3550e2073854 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:24 np0005593233 nova_compute[222017]: 2026-01-23 10:11:24.068 222021 DEBUG oslo_concurrency.lockutils [req-9c0d762d-8ee1-430d-b412-6ce7594cca0d req-9bdc30a4-9657-4f87-b1d0-3550e2073854 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:24 np0005593233 nova_compute[222017]: 2026-01-23 10:11:24.068 222021 DEBUG oslo_concurrency.lockutils [req-9c0d762d-8ee1-430d-b412-6ce7594cca0d req-9bdc30a4-9657-4f87-b1d0-3550e2073854 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:24 np0005593233 nova_compute[222017]: 2026-01-23 10:11:24.068 222021 DEBUG nova.compute.manager [req-9c0d762d-8ee1-430d-b412-6ce7594cca0d req-9bdc30a4-9657-4f87-b1d0-3550e2073854 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-unplugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:24 np0005593233 nova_compute[222017]: 2026-01-23 10:11:24.068 222021 WARNING nova.compute.manager [req-9c0d762d-8ee1-430d-b412-6ce7594cca0d req-9bdc30a4-9657-4f87-b1d0-3550e2073854 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received unexpected event network-vif-unplugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with vm_state suspended and task_state None.#033[00m
Jan 23 05:11:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:24.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:24.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:25 np0005593233 nova_compute[222017]: 2026-01-23 10:11:25.345 222021 INFO nova.compute.manager [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Resuming#033[00m
Jan 23 05:11:25 np0005593233 nova_compute[222017]: 2026-01-23 10:11:25.346 222021 DEBUG nova.objects.instance [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'flavor' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:25 np0005593233 nova_compute[222017]: 2026-01-23 10:11:25.453 222021 DEBUG oslo_concurrency.lockutils [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:11:25 np0005593233 nova_compute[222017]: 2026-01-23 10:11:25.454 222021 DEBUG oslo_concurrency.lockutils [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:11:25 np0005593233 nova_compute[222017]: 2026-01-23 10:11:25.454 222021 DEBUG nova.network.neutron [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:11:26 np0005593233 nova_compute[222017]: 2026-01-23 10:11:26.222 222021 DEBUG nova.compute.manager [req-24cdec11-82bf-4027-ba28-623fbb73366b req-b5766fc5-ff72-48d2-bb03-a44d8ab41cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:26 np0005593233 nova_compute[222017]: 2026-01-23 10:11:26.222 222021 DEBUG oslo_concurrency.lockutils [req-24cdec11-82bf-4027-ba28-623fbb73366b req-b5766fc5-ff72-48d2-bb03-a44d8ab41cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:26 np0005593233 nova_compute[222017]: 2026-01-23 10:11:26.223 222021 DEBUG oslo_concurrency.lockutils [req-24cdec11-82bf-4027-ba28-623fbb73366b req-b5766fc5-ff72-48d2-bb03-a44d8ab41cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:26 np0005593233 nova_compute[222017]: 2026-01-23 10:11:26.223 222021 DEBUG oslo_concurrency.lockutils [req-24cdec11-82bf-4027-ba28-623fbb73366b req-b5766fc5-ff72-48d2-bb03-a44d8ab41cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:26 np0005593233 nova_compute[222017]: 2026-01-23 10:11:26.223 222021 DEBUG nova.compute.manager [req-24cdec11-82bf-4027-ba28-623fbb73366b req-b5766fc5-ff72-48d2-bb03-a44d8ab41cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:26 np0005593233 nova_compute[222017]: 2026-01-23 10:11:26.223 222021 WARNING nova.compute.manager [req-24cdec11-82bf-4027-ba28-623fbb73366b req-b5766fc5-ff72-48d2-bb03-a44d8ab41cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received unexpected event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 23 05:11:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:26 np0005593233 nova_compute[222017]: 2026-01-23 10:11:26.464 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:26.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:26.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:26 np0005593233 nova_compute[222017]: 2026-01-23 10:11:26.785 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.257 222021 DEBUG nova.network.neutron [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.308 222021 DEBUG oslo_concurrency.lockutils [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.315 222021 DEBUG nova.virt.libvirt.vif [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:07:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-980180187',display_name='tempest-ServersNegativeTestJSON-server-980180187',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-980180187',id=121,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:11:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-np6t6vpa',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-ServersNegativeTestJSON-623507515-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:11:23Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=91cc1048-141a-4a20-b148-991a883adfa9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.316 222021 DEBUG nova.network.os_vif_util [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.317 222021 DEBUG nova.network.os_vif_util [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.317 222021 DEBUG os_vif [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.318 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.319 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.319 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.323 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.323 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8bb3c318-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.323 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8bb3c318-ff, col_values=(('external_ids', {'iface-id': '8bb3c318-ff77-47e8-a160-22e4c278fc88', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:ad:46', 'vm-uuid': '91cc1048-141a-4a20-b148-991a883adfa9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.324 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.325 222021 INFO os_vif [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff')#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.394 222021 DEBUG nova.objects.instance [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'numa_topology' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:28 np0005593233 kernel: tap8bb3c318-ff: entered promiscuous mode
Jan 23 05:11:28 np0005593233 NetworkManager[48871]: <info>  [1769163088.5138] manager: (tap8bb3c318-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/263)
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.514 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:28Z|00560|binding|INFO|Claiming lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 for this chassis.
Jan 23 05:11:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:28Z|00561|binding|INFO|8bb3c318-ff77-47e8-a160-22e4c278fc88: Claiming fa:16:3e:5b:ad:46 10.100.0.12
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.524 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:ad:46 10.100.0.12'], port_security=['fa:16:3e:5b:ad:46 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91cc1048-141a-4a20-b148-991a883adfa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63877f45-8244-4c80-903a-80901a7d83cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746ea02b745c4e21ace4cb49c193899d', 'neutron:revision_number': '10', 'neutron:security_group_ids': '69bbd8fc-29e5-4515-a6ba-c1c319d113ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb65718-6c8e-4bc0-9249-e791b1bad19c, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8bb3c318-ff77-47e8-a160-22e4c278fc88) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.525 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb3c318-ff77-47e8-a160-22e4c278fc88 in datapath 63877f45-8244-4c80-903a-80901a7d83cb bound to our chassis#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.527 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 63877f45-8244-4c80-903a-80901a7d83cb#033[00m
Jan 23 05:11:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:28Z|00562|binding|INFO|Setting lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 up in Southbound
Jan 23 05:11:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:28Z|00563|binding|INFO|Setting lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 ovn-installed in OVS
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.531 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.535 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.543 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1ec357-69e3-4a47-99c8-66e9d905156d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.545 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap63877f45-81 in ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.547 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap63877f45-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.547 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a18d352f-1031-40a0-9d02-d37b86f62e57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.548 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[868457bd-f0ce-4daa-9647-70209cdee674]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 systemd-udevd[273074]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.564 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[4e2d4747-a75b-4b59-8a57-5ab24d9afd7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 NetworkManager[48871]: <info>  [1769163088.5673] device (tap8bb3c318-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:11:28 np0005593233 NetworkManager[48871]: <info>  [1769163088.5684] device (tap8bb3c318-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:11:28 np0005593233 systemd-machined[190954]: New machine qemu-61-instance-00000079.
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.581 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4a231980-ba42-4e2c-8702-7488f910c9f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 systemd[1]: Started Virtual Machine qemu-61-instance-00000079.
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.625 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[187e98d4-260a-4024-a091-94f28c74e836]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.631 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8993e320-c979-482f-b60b-6da6a7104ffe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 NetworkManager[48871]: <info>  [1769163088.6328] manager: (tap63877f45-80): new Veth device (/org/freedesktop/NetworkManager/Devices/264)
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.676 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[685dd2ce-72ed-4d87-82f4-d2f6fbb226d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.682 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[877946cd-0101-40b6-8673-33cce198c70a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:28.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:28.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:28 np0005593233 NetworkManager[48871]: <info>  [1769163088.7163] device (tap63877f45-80): carrier: link connected
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.727 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[27fd7a93-024c-43c8-8a73-ae6544bf7a14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.752 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3083aa3a-e5e4-4435-a8b7-d6b2d9552f7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63877f45-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:59:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700134, 'reachable_time': 35869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273107, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.776 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[722925f8-48c9-4590-ab1c-fdaf0dcb4f37]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:59a0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 700134, 'tstamp': 700134}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273108, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.800 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e954e26c-f98a-4a50-9f3a-5f47e874956b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63877f45-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:59:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700134, 'reachable_time': 35869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 273109, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.840 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[daad766d-6e3e-4084-94d4-dc3278dbbb6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.938 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f7eb50-3403-4de9-9ec2-c1d5fb269bf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.940 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63877f45-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.940 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.941 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63877f45-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:28 np0005593233 NetworkManager[48871]: <info>  [1769163088.9437] manager: (tap63877f45-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Jan 23 05:11:28 np0005593233 kernel: tap63877f45-80: entered promiscuous mode
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.943 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.946 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap63877f45-80, col_values=(('external_ids', {'iface-id': 'c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:28Z|00564|binding|INFO|Releasing lport c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23 from this chassis (sb_readonly=0)
Jan 23 05:11:28 np0005593233 nova_compute[222017]: 2026-01-23 10:11:28.962 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.963 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/63877f45-8244-4c80-903a-80901a7d83cb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/63877f45-8244-4c80-903a-80901a7d83cb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.964 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[38e80271-2df0-4ca4-b859-8cf7e5ddedc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.966 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-63877f45-8244-4c80-903a-80901a7d83cb
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/63877f45-8244-4c80-903a-80901a7d83cb.pid.haproxy
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 63877f45-8244-4c80-903a-80901a7d83cb
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:11:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:28.966 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'env', 'PROCESS_TAG=haproxy-63877f45-8244-4c80-903a-80901a7d83cb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/63877f45-8244-4c80-903a-80901a7d83cb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.114 222021 DEBUG nova.compute.manager [req-d4aa0407-2e0f-45e6-a3c3-5cee2e51a6e5 req-cf905ad2-461e-406c-9b98-a10d10bd8e94 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.115 222021 DEBUG oslo_concurrency.lockutils [req-d4aa0407-2e0f-45e6-a3c3-5cee2e51a6e5 req-cf905ad2-461e-406c-9b98-a10d10bd8e94 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.115 222021 DEBUG oslo_concurrency.lockutils [req-d4aa0407-2e0f-45e6-a3c3-5cee2e51a6e5 req-cf905ad2-461e-406c-9b98-a10d10bd8e94 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.115 222021 DEBUG oslo_concurrency.lockutils [req-d4aa0407-2e0f-45e6-a3c3-5cee2e51a6e5 req-cf905ad2-461e-406c-9b98-a10d10bd8e94 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.116 222021 DEBUG nova.compute.manager [req-d4aa0407-2e0f-45e6-a3c3-5cee2e51a6e5 req-cf905ad2-461e-406c-9b98-a10d10bd8e94 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.116 222021 WARNING nova.compute.manager [req-d4aa0407-2e0f-45e6-a3c3-5cee2e51a6e5 req-cf905ad2-461e-406c-9b98-a10d10bd8e94 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received unexpected event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.341 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for 91cc1048-141a-4a20-b148-991a883adfa9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.342 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163089.3406768, 91cc1048-141a-4a20-b148-991a883adfa9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.343 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Started (Lifecycle Event)#033[00m
Jan 23 05:11:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:29Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:ad:46 10.100.0.12
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.390 222021 DEBUG nova.compute.manager [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.391 222021 DEBUG nova.objects.instance [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'pci_devices' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:29 np0005593233 podman[273183]: 2026-01-23 10:11:29.444247097 +0000 UTC m=+0.069800555 container create 14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:11:29 np0005593233 systemd[1]: Started libpod-conmon-14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1.scope.
Jan 23 05:11:29 np0005593233 podman[273183]: 2026-01-23 10:11:29.409956738 +0000 UTC m=+0.035510196 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:11:29 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:11:29 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07d5a211c04a22ba5c7fa84f02d7cd9c9abfe2c81f50df204fd74a6aac8dd5a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:11:29 np0005593233 podman[273183]: 2026-01-23 10:11:29.535773576 +0000 UTC m=+0.161327044 container init 14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:11:29 np0005593233 podman[273183]: 2026-01-23 10:11:29.542644521 +0000 UTC m=+0.168197969 container start 14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 05:11:29 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[273198]: [NOTICE]   (273202) : New worker (273204) forked
Jan 23 05:11:29 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[273198]: [NOTICE]   (273202) : Loading success.
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.712 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.718 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.906 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.907 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163089.3456655, 91cc1048-141a-4a20-b148-991a883adfa9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.908 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.915 222021 INFO nova.virt.libvirt.driver [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance running successfully.#033[00m
Jan 23 05:11:29 np0005593233 virtqemud[221325]: argument unsupported: QEMU guest agent is not configured
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.919 222021 DEBUG nova.virt.libvirt.guest [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.920 222021 DEBUG nova.compute.manager [None req-49698f6b-cd0c-41ab-873e-52242bcf38de cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.961 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:29 np0005593233 nova_compute[222017]: 2026-01-23 10:11:29.967 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:11:30 np0005593233 nova_compute[222017]: 2026-01-23 10:11:30.003 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 23 05:11:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:30.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:30.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:31Z|00565|binding|INFO|Releasing lport c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23 from this chassis (sb_readonly=0)
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.070 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.226 222021 DEBUG nova.compute.manager [req-334f40ab-7631-4994-a706-dc508005b83c req-69f05d98-2198-4ff7-9894-0dd92e82ce57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.227 222021 DEBUG oslo_concurrency.lockutils [req-334f40ab-7631-4994-a706-dc508005b83c req-69f05d98-2198-4ff7-9894-0dd92e82ce57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.228 222021 DEBUG oslo_concurrency.lockutils [req-334f40ab-7631-4994-a706-dc508005b83c req-69f05d98-2198-4ff7-9894-0dd92e82ce57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.228 222021 DEBUG oslo_concurrency.lockutils [req-334f40ab-7631-4994-a706-dc508005b83c req-69f05d98-2198-4ff7-9894-0dd92e82ce57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.229 222021 DEBUG nova.compute.manager [req-334f40ab-7631-4994-a706-dc508005b83c req-69f05d98-2198-4ff7-9894-0dd92e82ce57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.229 222021 WARNING nova.compute.manager [req-334f40ab-7631-4994-a706-dc508005b83c req-69f05d98-2198-4ff7-9894-0dd92e82ce57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received unexpected event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:11:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:31Z|00566|binding|INFO|Releasing lport c5d1fee0-5e4c-46f0-81b5-41a5f0b71a23 from this chassis (sb_readonly=0)
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.247 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.466 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:31 np0005593233 nova_compute[222017]: 2026-01-23 10:11:31.788 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:32 np0005593233 podman[273345]: 2026-01-23 10:11:32.135591802 +0000 UTC m=+0.135901235 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:11:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:32.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:32.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:11:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:11:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:34.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:34.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:36 np0005593233 nova_compute[222017]: 2026-01-23 10:11:36.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:36 np0005593233 nova_compute[222017]: 2026-01-23 10:11:36.470 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:36.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:36.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:36 np0005593233 nova_compute[222017]: 2026-01-23 10:11:36.790 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:38 np0005593233 nova_compute[222017]: 2026-01-23 10:11:38.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:38.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:38.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:40.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:40.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.449 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.449 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.450 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.450 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.451 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.506 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:41 np0005593233 nova_compute[222017]: 2026-01-23 10:11:41.792 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #115. Immutable memtables: 0.
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:41.885960) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 71] Flushing memtable with next log file: 115
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163101886074, "job": 71, "event": "flush_started", "num_memtables": 1, "num_entries": 1690, "num_deletes": 261, "total_data_size": 3684214, "memory_usage": 3751000, "flush_reason": "Manual Compaction"}
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 71] Level-0 flush table #116: started
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163101907187, "cf_name": "default", "job": 71, "event": "table_file_creation", "file_number": 116, "file_size": 2420708, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 56978, "largest_seqno": 58663, "table_properties": {"data_size": 2413520, "index_size": 4131, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15848, "raw_average_key_size": 20, "raw_value_size": 2398786, "raw_average_value_size": 3087, "num_data_blocks": 180, "num_entries": 777, "num_filter_entries": 777, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162974, "oldest_key_time": 1769162974, "file_creation_time": 1769163101, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 116, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 71] Flush lasted 21289 microseconds, and 11102 cpu microseconds.
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:41.907265) [db/flush_job.cc:967] [default] [JOB 71] Level-0 flush table #116: 2420708 bytes OK
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:41.907293) [db/memtable_list.cc:519] [default] Level-0 commit table #116 started
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:41.911270) [db/memtable_list.cc:722] [default] Level-0 commit table #116: memtable #1 done
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:41.911304) EVENT_LOG_v1 {"time_micros": 1769163101911294, "job": 71, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:41.911334) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 71] Try to delete WAL files size 3676239, prev total WAL file size 3676239, number of live WAL files 2.
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000112.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:41.912561) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032303036' seq:72057594037927935, type:22 .. '6C6F676D0032323630' seq:0, type:0; will stop at (end)
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 72] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 71 Base level 0, inputs: [116(2363KB)], [114(10213KB)]
Jan 23 05:11:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163101912626, "job": 72, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [116], "files_L6": [114], "score": -1, "input_data_size": 12879048, "oldest_snapshot_seqno": -1}
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 72] Generated table #117: 8153 keys, 12737217 bytes, temperature: kUnknown
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163102006649, "cf_name": "default", "job": 72, "event": "table_file_creation", "file_number": 117, "file_size": 12737217, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12682430, "index_size": 33299, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20421, "raw_key_size": 211223, "raw_average_key_size": 25, "raw_value_size": 12536958, "raw_average_value_size": 1537, "num_data_blocks": 1314, "num_entries": 8153, "num_filter_entries": 8153, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163101, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 117, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3099429844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:42.007191) [db/compaction/compaction_job.cc:1663] [default] [JOB 72] Compacted 1@0 + 1@6 files to L6 => 12737217 bytes
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:42.021731) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.8 rd, 135.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 10.0 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(10.6) write-amplify(5.3) OK, records in: 8693, records dropped: 540 output_compression: NoCompression
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:42.021780) EVENT_LOG_v1 {"time_micros": 1769163102021760, "job": 72, "event": "compaction_finished", "compaction_time_micros": 94153, "compaction_time_cpu_micros": 30425, "output_level": 6, "num_output_files": 1, "total_output_size": 12737217, "num_input_records": 8693, "num_output_records": 8153, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000116.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163102022707, "job": 72, "event": "table_file_deletion", "file_number": 116}
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000114.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163102026609, "job": 72, "event": "table_file_deletion", "file_number": 114}
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:41.912422) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:42.026704) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:42.026713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:42.026715) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:42.026716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:11:42.026718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.033 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.582s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.135 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.136 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000079 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.339 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.341 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4210MB free_disk=20.718894958496094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.342 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.342 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.453 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 91cc1048-141a-4a20-b148-991a883adfa9 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.455 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.455 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:11:42 np0005593233 nova_compute[222017]: 2026-01-23 10:11:42.517 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:42.675 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:42.676 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:42.677 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:42.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:42.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:11:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4106043368' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:11:43 np0005593233 nova_compute[222017]: 2026-01-23 10:11:43.006 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:43 np0005593233 nova_compute[222017]: 2026-01-23 10:11:43.016 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:11:43 np0005593233 nova_compute[222017]: 2026-01-23 10:11:43.039 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:11:43 np0005593233 nova_compute[222017]: 2026-01-23 10:11:43.077 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:11:43 np0005593233 nova_compute[222017]: 2026-01-23 10:11:43.078 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.736s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:44.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:44.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:45 np0005593233 nova_compute[222017]: 2026-01-23 10:11:45.079 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:45 np0005593233 nova_compute[222017]: 2026-01-23 10:11:45.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:45 np0005593233 nova_compute[222017]: 2026-01-23 10:11:45.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:11:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:46 np0005593233 nova_compute[222017]: 2026-01-23 10:11:46.510 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:46.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:46.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:46 np0005593233 nova_compute[222017]: 2026-01-23 10:11:46.795 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:47 np0005593233 podman[273468]: 2026-01-23 10:11:47.072052616 +0000 UTC m=+0.075301491 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:11:48 np0005593233 nova_compute[222017]: 2026-01-23 10:11:48.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:48 np0005593233 nova_compute[222017]: 2026-01-23 10:11:48.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:11:48 np0005593233 nova_compute[222017]: 2026-01-23 10:11:48.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:11:48 np0005593233 nova_compute[222017]: 2026-01-23 10:11:48.646 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:11:48 np0005593233 nova_compute[222017]: 2026-01-23 10:11:48.646 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:11:48 np0005593233 nova_compute[222017]: 2026-01-23 10:11:48.646 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:11:48 np0005593233 nova_compute[222017]: 2026-01-23 10:11:48.646 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:48.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:48.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:50.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:50.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:51 np0005593233 nova_compute[222017]: 2026-01-23 10:11:51.102 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [{"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:51 np0005593233 nova_compute[222017]: 2026-01-23 10:11:51.132 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-91cc1048-141a-4a20-b148-991a883adfa9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:51 np0005593233 nova_compute[222017]: 2026-01-23 10:11:51.133 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:11:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:51 np0005593233 nova_compute[222017]: 2026-01-23 10:11:51.512 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:51 np0005593233 nova_compute[222017]: 2026-01-23 10:11:51.796 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:11:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:52.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:11:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:52.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:54 np0005593233 nova_compute[222017]: 2026-01-23 10:11:54.127 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:54.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:56 np0005593233 nova_compute[222017]: 2026-01-23 10:11:56.516 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:56.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:11:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.0 total, 600.0 interval#012Cumulative writes: 11K writes, 58K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.12 GB, 0.03 MB/s#012Cumulative WAL: 11K writes, 11K syncs, 1.00 writes per sync, written: 0.12 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1609 writes, 8209 keys, 1609 commit groups, 1.0 writes per commit group, ingest: 16.51 MB, 0.03 MB/s#012Interval WAL: 1609 writes, 1609 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     46.5      1.59              0.34        36    0.044       0      0       0.0       0.0#012  L6      1/0   12.15 MB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   4.5     79.0     66.7      5.00              1.09        35    0.143    220K    19K       0.0       0.0#012 Sum      1/0   12.15 MB   0.0      0.4     0.1      0.3       0.4      0.1       0.0   5.5     60.0     61.8      6.60              1.43        71    0.093    220K    19K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.3     49.8     52.1      1.51              0.26        12    0.126     49K   3127       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   0.0     79.0     66.7      5.00              1.09        35    0.143    220K    19K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     46.5      1.59              0.34        35    0.045       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4200.0 total, 600.0 interval#012Flush(GB): cumulative 0.072, interval 0.012#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.40 GB write, 0.10 MB/s write, 0.39 GB read, 0.09 MB/s read, 6.6 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.13 MB/s read, 1.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 43.45 MB table_size: 0 occupancy: 18446744073709551615 collections: 8 last_copies: 0 last_secs: 0.000322 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2501,41.80 MB,13.7501%) FilterBlock(71,621.55 KB,0.199664%) IndexBlock(71,1.05 MB,0.343885%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:11:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:11:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:56.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:11:56 np0005593233 nova_compute[222017]: 2026-01-23 10:11:56.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.700 222021 DEBUG oslo_concurrency.lockutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.700 222021 DEBUG oslo_concurrency.lockutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.701 222021 DEBUG oslo_concurrency.lockutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.701 222021 DEBUG oslo_concurrency.lockutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.701 222021 DEBUG oslo_concurrency.lockutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.702 222021 INFO nova.compute.manager [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Terminating instance#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.703 222021 DEBUG nova.compute.manager [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:11:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:58.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:11:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:58.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:58 np0005593233 kernel: tap8bb3c318-ff (unregistering): left promiscuous mode
Jan 23 05:11:58 np0005593233 NetworkManager[48871]: <info>  [1769163118.7656] device (tap8bb3c318-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:11:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:58Z|00567|binding|INFO|Releasing lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 from this chassis (sb_readonly=0)
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.816 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:58Z|00568|binding|INFO|Setting lport 8bb3c318-ff77-47e8-a160-22e4c278fc88 down in Southbound
Jan 23 05:11:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:11:58Z|00569|binding|INFO|Removing iface tap8bb3c318-ff ovn-installed in OVS
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.818 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.833 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:58.856 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:ad:46 10.100.0.12'], port_security=['fa:16:3e:5b:ad:46 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91cc1048-141a-4a20-b148-991a883adfa9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63877f45-8244-4c80-903a-80901a7d83cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746ea02b745c4e21ace4cb49c193899d', 'neutron:revision_number': '11', 'neutron:security_group_ids': '69bbd8fc-29e5-4515-a6ba-c1c319d113ea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fb65718-6c8e-4bc0-9249-e791b1bad19c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=8bb3c318-ff77-47e8-a160-22e4c278fc88) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:58.859 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 8bb3c318-ff77-47e8-a160-22e4c278fc88 in datapath 63877f45-8244-4c80-903a-80901a7d83cb unbound from our chassis#033[00m
Jan 23 05:11:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:58.860 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 63877f45-8244-4c80-903a-80901a7d83cb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:11:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:58.862 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[25413aaa-1775-413b-ba08-0fdc863cb370]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:58.863 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb namespace which is not needed anymore#033[00m
Jan 23 05:11:58 np0005593233 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000079.scope: Deactivated successfully.
Jan 23 05:11:58 np0005593233 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000079.scope: Consumed 2.339s CPU time.
Jan 23 05:11:58 np0005593233 systemd-machined[190954]: Machine qemu-61-instance-00000079 terminated.
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.928 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.932 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.942 222021 INFO nova.virt.libvirt.driver [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Instance destroyed successfully.#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.943 222021 DEBUG nova.objects.instance [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lazy-loading 'resources' on Instance uuid 91cc1048-141a-4a20-b148-991a883adfa9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.968 222021 DEBUG nova.virt.libvirt.vif [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:07:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-980180187',display_name='tempest-ServersNegativeTestJSON-server-980180187',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-980180187',id=121,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:11:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='746ea02b745c4e21ace4cb49c193899d',ramdisk_id='',reservation_id='r-np6t6vpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-623507515',owner_user_name='tempest-ServersNegativeTestJSON-623507515-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:11:29Z,user_data=None,user_id='cde472cc8af0464992006a69d047d0d4',uuid=91cc1048-141a-4a20-b148-991a883adfa9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.969 222021 DEBUG nova.network.os_vif_util [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converting VIF {"id": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "address": "fa:16:3e:5b:ad:46", "network": {"id": "63877f45-8244-4c80-903a-80901a7d83cb", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1029972925-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746ea02b745c4e21ace4cb49c193899d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8bb3c318-ff", "ovs_interfaceid": "8bb3c318-ff77-47e8-a160-22e4c278fc88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.970 222021 DEBUG nova.network.os_vif_util [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.970 222021 DEBUG os_vif [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.973 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.973 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bb3c318-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.975 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.976 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:58 np0005593233 nova_compute[222017]: 2026-01-23 10:11:58.979 222021 INFO os_vif [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:ad:46,bridge_name='br-int',has_traffic_filtering=True,id=8bb3c318-ff77-47e8-a160-22e4c278fc88,network=Network(63877f45-8244-4c80-903a-80901a7d83cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8bb3c318-ff')#033[00m
Jan 23 05:11:59 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[273198]: [NOTICE]   (273202) : haproxy version is 2.8.14-c23fe91
Jan 23 05:11:59 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[273198]: [NOTICE]   (273202) : path to executable is /usr/sbin/haproxy
Jan 23 05:11:59 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[273198]: [WARNING]  (273202) : Exiting Master process...
Jan 23 05:11:59 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[273198]: [WARNING]  (273202) : Exiting Master process...
Jan 23 05:11:59 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[273198]: [ALERT]    (273202) : Current worker (273204) exited with code 143 (Terminated)
Jan 23 05:11:59 np0005593233 neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb[273198]: [WARNING]  (273202) : All workers exited. Exiting... (0)
Jan 23 05:11:59 np0005593233 systemd[1]: libpod-14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1.scope: Deactivated successfully.
Jan 23 05:11:59 np0005593233 podman[273519]: 2026-01-23 10:11:59.028606129 +0000 UTC m=+0.056778837 container died 14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:11:59 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1-userdata-shm.mount: Deactivated successfully.
Jan 23 05:11:59 np0005593233 systemd[1]: var-lib-containers-storage-overlay-07d5a211c04a22ba5c7fa84f02d7cd9c9abfe2c81f50df204fd74a6aac8dd5a2-merged.mount: Deactivated successfully.
Jan 23 05:11:59 np0005593233 podman[273519]: 2026-01-23 10:11:59.065181473 +0000 UTC m=+0.093354181 container cleanup 14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:11:59 np0005593233 systemd[1]: libpod-conmon-14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1.scope: Deactivated successfully.
Jan 23 05:11:59 np0005593233 nova_compute[222017]: 2026-01-23 10:11:59.136 222021 DEBUG nova.compute.manager [req-d3bdbe13-1286-4e56-abcb-44d1756ea279 req-19c8b9d5-a32b-4620-bac8-574276cfbd55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-unplugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:59 np0005593233 nova_compute[222017]: 2026-01-23 10:11:59.137 222021 DEBUG oslo_concurrency.lockutils [req-d3bdbe13-1286-4e56-abcb-44d1756ea279 req-19c8b9d5-a32b-4620-bac8-574276cfbd55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:59 np0005593233 nova_compute[222017]: 2026-01-23 10:11:59.137 222021 DEBUG oslo_concurrency.lockutils [req-d3bdbe13-1286-4e56-abcb-44d1756ea279 req-19c8b9d5-a32b-4620-bac8-574276cfbd55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:59 np0005593233 nova_compute[222017]: 2026-01-23 10:11:59.137 222021 DEBUG oslo_concurrency.lockutils [req-d3bdbe13-1286-4e56-abcb-44d1756ea279 req-19c8b9d5-a32b-4620-bac8-574276cfbd55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:59 np0005593233 nova_compute[222017]: 2026-01-23 10:11:59.137 222021 DEBUG nova.compute.manager [req-d3bdbe13-1286-4e56-abcb-44d1756ea279 req-19c8b9d5-a32b-4620-bac8-574276cfbd55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-unplugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:59 np0005593233 nova_compute[222017]: 2026-01-23 10:11:59.137 222021 DEBUG nova.compute.manager [req-d3bdbe13-1286-4e56-abcb-44d1756ea279 req-19c8b9d5-a32b-4620-bac8-574276cfbd55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-unplugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:11:59 np0005593233 podman[273566]: 2026-01-23 10:11:59.138167308 +0000 UTC m=+0.046563398 container remove 14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:11:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:59.146 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[93f1e5c0-5dc7-47f6-9c62-6d91a30d4dca]: (4, ('Fri Jan 23 10:11:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb (14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1)\n14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1\nFri Jan 23 10:11:59 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb (14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1)\n14822a1ba01bb115e6880fbd908f73b66d5b971ab70363e3ce8c2fbd2c471be1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:59.148 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e5699321-2510-45c1-a4fe-89c227b56684]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:59.149 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63877f45-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:59 np0005593233 nova_compute[222017]: 2026-01-23 10:11:59.151 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:59 np0005593233 kernel: tap63877f45-80: left promiscuous mode
Jan 23 05:11:59 np0005593233 nova_compute[222017]: 2026-01-23 10:11:59.166 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:59.172 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fefc7778-06df-4f63-a61c-e655c57c1c6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:59.185 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e2181131-d569-4b88-aa5f-9de63a4e59a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:59.187 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d4dc7994-31db-48a0-954c-d77f07c78b59]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:59.206 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[39907c32-7cc6-4f25-a0cd-435975359eb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 700124, 'reachable_time': 43641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273581, 'error': None, 'target': 'ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:59.209 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-63877f45-8244-4c80-903a-80901a7d83cb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:11:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:11:59.210 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[07188fa2-a89f-4d7a-91d9-8a143c2badde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:59 np0005593233 systemd[1]: run-netns-ovnmeta\x2d63877f45\x2d8244\x2d4c80\x2d903a\x2d80901a7d83cb.mount: Deactivated successfully.
Jan 23 05:11:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4113730414' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:12:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e294 e294: 3 total, 3 up, 3 in
Jan 23 05:12:00 np0005593233 nova_compute[222017]: 2026-01-23 10:12:00.088 222021 INFO nova.virt.libvirt.driver [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Deleting instance files /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9_del#033[00m
Jan 23 05:12:00 np0005593233 nova_compute[222017]: 2026-01-23 10:12:00.089 222021 INFO nova.virt.libvirt.driver [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Deletion of /var/lib/nova/instances/91cc1048-141a-4a20-b148-991a883adfa9_del complete#033[00m
Jan 23 05:12:00 np0005593233 nova_compute[222017]: 2026-01-23 10:12:00.195 222021 INFO nova.compute.manager [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Took 1.49 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:12:00 np0005593233 nova_compute[222017]: 2026-01-23 10:12:00.195 222021 DEBUG oslo.service.loopingcall [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:12:00 np0005593233 nova_compute[222017]: 2026-01-23 10:12:00.196 222021 DEBUG nova.compute.manager [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:12:00 np0005593233 nova_compute[222017]: 2026-01-23 10:12:00.197 222021 DEBUG nova.network.neutron [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:12:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:00.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:00.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:00.957 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:12:00 np0005593233 nova_compute[222017]: 2026-01-23 10:12:00.958 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:00.959 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:12:00 np0005593233 nova_compute[222017]: 2026-01-23 10:12:00.978 222021 DEBUG nova.network.neutron [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.030 222021 INFO nova.compute.manager [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Took 0.83 seconds to deallocate network for instance.#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.117 222021 DEBUG nova.compute.manager [req-f770126f-e4ad-4cb7-9e7b-64832e01d458 req-0e035d39-465e-486d-9e45-57bb43674e14 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-deleted-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.119 222021 DEBUG oslo_concurrency.lockutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.120 222021 DEBUG oslo_concurrency.lockutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.198 222021 DEBUG oslo_concurrency.processutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.292 222021 DEBUG nova.compute.manager [req-fad78440-1d45-4a44-950d-40d79a6353a7 req-3ce02086-e6f0-4983-9dc0-c0bf4c95c9cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.293 222021 DEBUG oslo_concurrency.lockutils [req-fad78440-1d45-4a44-950d-40d79a6353a7 req-3ce02086-e6f0-4983-9dc0-c0bf4c95c9cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "91cc1048-141a-4a20-b148-991a883adfa9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.294 222021 DEBUG oslo_concurrency.lockutils [req-fad78440-1d45-4a44-950d-40d79a6353a7 req-3ce02086-e6f0-4983-9dc0-c0bf4c95c9cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.294 222021 DEBUG oslo_concurrency.lockutils [req-fad78440-1d45-4a44-950d-40d79a6353a7 req-3ce02086-e6f0-4983-9dc0-c0bf4c95c9cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.294 222021 DEBUG nova.compute.manager [req-fad78440-1d45-4a44-950d-40d79a6353a7 req-3ce02086-e6f0-4983-9dc0-c0bf4c95c9cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] No waiting events found dispatching network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.294 222021 WARNING nova.compute.manager [req-fad78440-1d45-4a44-950d-40d79a6353a7 req-3ce02086-e6f0-4983-9dc0-c0bf4c95c9cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Received unexpected event network-vif-plugged-8bb3c318-ff77-47e8-a160-22e4c278fc88 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:12:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/147695870' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.643 222021 DEBUG oslo_concurrency.processutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.650 222021 DEBUG nova.compute.provider_tree [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.673 222021 DEBUG nova.scheduler.client.report [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.712 222021 DEBUG oslo_concurrency.lockutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.743 222021 INFO nova.scheduler.client.report [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Deleted allocations for instance 91cc1048-141a-4a20-b148-991a883adfa9#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.803 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:01 np0005593233 nova_compute[222017]: 2026-01-23 10:12:01.822 222021 DEBUG oslo_concurrency.lockutils [None req-f2f957a7-3830-4f27-8df0-2f0e15daa988 cde472cc8af0464992006a69d047d0d4 746ea02b745c4e21ace4cb49c193899d - - default default] Lock "91cc1048-141a-4a20-b148-991a883adfa9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:02.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:02.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:03 np0005593233 podman[273605]: 2026-01-23 10:12:03.112403869 +0000 UTC m=+0.113817430 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:12:04 np0005593233 nova_compute[222017]: 2026-01-23 10:12:04.038 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:04.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:04.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:04.962 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:05 np0005593233 nova_compute[222017]: 2026-01-23 10:12:05.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:06.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:06.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:06 np0005593233 nova_compute[222017]: 2026-01-23 10:12:06.806 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:07 np0005593233 nova_compute[222017]: 2026-01-23 10:12:07.132 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:08.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:08.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:09 np0005593233 nova_compute[222017]: 2026-01-23 10:12:09.049 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:10.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 05:12:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:10.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:11 np0005593233 nova_compute[222017]: 2026-01-23 10:12:11.807 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 05:12:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:12.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:12.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:13 np0005593233 nova_compute[222017]: 2026-01-23 10:12:13.940 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163118.9384413, 91cc1048-141a-4a20-b148-991a883adfa9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:13 np0005593233 nova_compute[222017]: 2026-01-23 10:12:13.941 222021 INFO nova.compute.manager [-] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:12:14 np0005593233 nova_compute[222017]: 2026-01-23 10:12:14.053 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:14 np0005593233 nova_compute[222017]: 2026-01-23 10:12:14.253 222021 DEBUG nova.compute.manager [None req-d51aea62-cb59-45b3-a23c-94524d6b0c1b - - - - - -] [instance: 91cc1048-141a-4a20-b148-991a883adfa9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c68d6f0 =====
Jan 23 05:12:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:14.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c68d6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c68d6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:14.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:16.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:16.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:16 np0005593233 nova_compute[222017]: 2026-01-23 10:12:16.809 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e295 e295: 3 total, 3 up, 3 in
Jan 23 05:12:18 np0005593233 podman[273631]: 2026-01-23 10:12:18.10034897 +0000 UTC m=+0.095702168 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:12:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:18.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:18.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:19 np0005593233 nova_compute[222017]: 2026-01-23 10:12:19.057 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.026 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "0c274757-9612-49ca-b1fa-8ae80aa5f510" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.028 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.051 222021 DEBUG nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.187 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.188 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.196 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.197 222021 INFO nova.compute.claims [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.406 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:20.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:20.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1335025545' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.957 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.550s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:20 np0005593233 nova_compute[222017]: 2026-01-23 10:12:20.966 222021 DEBUG nova.compute.provider_tree [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.077 222021 DEBUG nova.scheduler.client.report [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.206 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.208 222021 DEBUG nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:12:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.282 222021 DEBUG nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.283 222021 DEBUG nova.network.neutron [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.333 222021 INFO nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.362 222021 DEBUG nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.482 222021 DEBUG nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.484 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.484 222021 INFO nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Creating image(s)#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.518 222021 DEBUG nova.storage.rbd_utils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.564 222021 DEBUG nova.storage.rbd_utils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.603 222021 DEBUG nova.storage.rbd_utils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.609 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.701 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.703 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.704 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.705 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.747 222021 DEBUG nova.storage.rbd_utils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.752 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:21 np0005593233 nova_compute[222017]: 2026-01-23 10:12:21.845 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593233 nova_compute[222017]: 2026-01-23 10:12:22.099 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.347s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:22 np0005593233 nova_compute[222017]: 2026-01-23 10:12:22.188 222021 DEBUG nova.policy [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:12:22 np0005593233 nova_compute[222017]: 2026-01-23 10:12:22.195 222021 DEBUG nova.storage.rbd_utils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] resizing rbd image 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:12:22 np0005593233 nova_compute[222017]: 2026-01-23 10:12:22.360 222021 DEBUG nova.objects.instance [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 0c274757-9612-49ca-b1fa-8ae80aa5f510 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:22 np0005593233 nova_compute[222017]: 2026-01-23 10:12:22.396 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:12:22 np0005593233 nova_compute[222017]: 2026-01-23 10:12:22.397 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Ensure instance console log exists: /var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:12:22 np0005593233 nova_compute[222017]: 2026-01-23 10:12:22.398 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:22 np0005593233 nova_compute[222017]: 2026-01-23 10:12:22.399 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:22 np0005593233 nova_compute[222017]: 2026-01-23 10:12:22.400 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:22.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:22.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:24 np0005593233 nova_compute[222017]: 2026-01-23 10:12:24.100 222021 DEBUG nova.network.neutron [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Successfully created port: 6096923e-378a-47e1-96e5-11c98a23abfa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:12:24 np0005593233 nova_compute[222017]: 2026-01-23 10:12:24.117 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:24.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:24.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 e296: 3 total, 3 up, 3 in
Jan 23 05:12:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:26.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:26.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:26 np0005593233 nova_compute[222017]: 2026-01-23 10:12:26.847 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:27 np0005593233 nova_compute[222017]: 2026-01-23 10:12:27.238 222021 DEBUG nova.network.neutron [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Successfully updated port: 6096923e-378a-47e1-96e5-11c98a23abfa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:12:27 np0005593233 nova_compute[222017]: 2026-01-23 10:12:27.275 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:27 np0005593233 nova_compute[222017]: 2026-01-23 10:12:27.276 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:27 np0005593233 nova_compute[222017]: 2026-01-23 10:12:27.276 222021 DEBUG nova.network.neutron [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:12:27 np0005593233 nova_compute[222017]: 2026-01-23 10:12:27.571 222021 DEBUG nova.compute.manager [req-7a51c0f2-537b-4d19-b330-ec5b7387e631 req-7ead70ee-40c0-4980-98d1-e6a9c3ad923c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received event network-changed-6096923e-378a-47e1-96e5-11c98a23abfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:27 np0005593233 nova_compute[222017]: 2026-01-23 10:12:27.572 222021 DEBUG nova.compute.manager [req-7a51c0f2-537b-4d19-b330-ec5b7387e631 req-7ead70ee-40c0-4980-98d1-e6a9c3ad923c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Refreshing instance network info cache due to event network-changed-6096923e-378a-47e1-96e5-11c98a23abfa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:12:27 np0005593233 nova_compute[222017]: 2026-01-23 10:12:27.573 222021 DEBUG oslo_concurrency.lockutils [req-7a51c0f2-537b-4d19-b330-ec5b7387e631 req-7ead70ee-40c0-4980-98d1-e6a9c3ad923c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:28 np0005593233 nova_compute[222017]: 2026-01-23 10:12:28.102 222021 DEBUG nova.network.neutron [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:12:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:28.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:28.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:29 np0005593233 nova_compute[222017]: 2026-01-23 10:12:29.160 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:29 np0005593233 nova_compute[222017]: 2026-01-23 10:12:29.963 222021 DEBUG nova.network.neutron [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updating instance_info_cache with network_info: [{"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.039 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.040 222021 DEBUG nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Instance network_info: |[{"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.041 222021 DEBUG oslo_concurrency.lockutils [req-7a51c0f2-537b-4d19-b330-ec5b7387e631 req-7ead70ee-40c0-4980-98d1-e6a9c3ad923c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.041 222021 DEBUG nova.network.neutron [req-7a51c0f2-537b-4d19-b330-ec5b7387e631 req-7ead70ee-40c0-4980-98d1-e6a9c3ad923c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Refreshing network info cache for port 6096923e-378a-47e1-96e5-11c98a23abfa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.044 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Start _get_guest_xml network_info=[{"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.050 222021 WARNING nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.056 222021 DEBUG nova.virt.libvirt.host [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.057 222021 DEBUG nova.virt.libvirt.host [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.070 222021 DEBUG nova.virt.libvirt.host [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.071 222021 DEBUG nova.virt.libvirt.host [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.072 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.073 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.073 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.073 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.073 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.074 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.074 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.074 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.074 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.075 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.075 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.075 222021 DEBUG nova.virt.hardware [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.078 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:12:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/216472534' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.545 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.577 222021 DEBUG nova.storage.rbd_utils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:30 np0005593233 nova_compute[222017]: 2026-01-23 10:12:30.584 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:30.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:30.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:12:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3442717590' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.128 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.131 222021 DEBUG nova.virt.libvirt.vif [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:12:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1143807706',display_name='tempest-TestNetworkBasicOps-server-1143807706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1143807706',id=132,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCimIZk2pOzEd7S89dRSmUHOOjnboCQFRq00t4KqTPGCkTIF8AjIYlOUd1UqUpNzeDj0eIsu4Yppl0TNbbIpwzFXQsRBfBhSqF/JqKHD5mgeWA8E9Qvf/RqS4B++mXDB+Q==',key_name='tempest-TestNetworkBasicOps-1966834096',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-f3zugqgz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:12:21Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=0c274757-9612-49ca-b1fa-8ae80aa5f510,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.132 222021 DEBUG nova.network.os_vif_util [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.135 222021 DEBUG nova.network.os_vif_util [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:7a,bridge_name='br-int',has_traffic_filtering=True,id=6096923e-378a-47e1-96e5-11c98a23abfa,network=Network(b909e0f1-3092-44cc-b25b-a0acc5a5cd0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6096923e-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.138 222021 DEBUG nova.objects.instance [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0c274757-9612-49ca-b1fa-8ae80aa5f510 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.849 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.941 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <uuid>0c274757-9612-49ca-b1fa-8ae80aa5f510</uuid>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <name>instance-00000084</name>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestNetworkBasicOps-server-1143807706</nova:name>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:12:30</nova:creationTime>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <nova:port uuid="6096923e-378a-47e1-96e5-11c98a23abfa">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <entry name="serial">0c274757-9612-49ca-b1fa-8ae80aa5f510</entry>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <entry name="uuid">0c274757-9612-49ca-b1fa-8ae80aa5f510</entry>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0c274757-9612-49ca-b1fa-8ae80aa5f510_disk">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0c274757-9612-49ca-b1fa-8ae80aa5f510_disk.config">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:5f:9d:7a"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <target dev="tap6096923e-37"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510/console.log" append="off"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:12:31 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:12:31 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:12:31 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:12:31 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.943 222021 DEBUG nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Preparing to wait for external event network-vif-plugged-6096923e-378a-47e1-96e5-11c98a23abfa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.944 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.944 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.945 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.946 222021 DEBUG nova.virt.libvirt.vif [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:12:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1143807706',display_name='tempest-TestNetworkBasicOps-server-1143807706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1143807706',id=132,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCimIZk2pOzEd7S89dRSmUHOOjnboCQFRq00t4KqTPGCkTIF8AjIYlOUd1UqUpNzeDj0eIsu4Yppl0TNbbIpwzFXQsRBfBhSqF/JqKHD5mgeWA8E9Qvf/RqS4B++mXDB+Q==',key_name='tempest-TestNetworkBasicOps-1966834096',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-f3zugqgz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:12:21Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=0c274757-9612-49ca-b1fa-8ae80aa5f510,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.947 222021 DEBUG nova.network.os_vif_util [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.948 222021 DEBUG nova.network.os_vif_util [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:7a,bridge_name='br-int',has_traffic_filtering=True,id=6096923e-378a-47e1-96e5-11c98a23abfa,network=Network(b909e0f1-3092-44cc-b25b-a0acc5a5cd0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6096923e-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.949 222021 DEBUG os_vif [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:7a,bridge_name='br-int',has_traffic_filtering=True,id=6096923e-378a-47e1-96e5-11c98a23abfa,network=Network(b909e0f1-3092-44cc-b25b-a0acc5a5cd0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6096923e-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.950 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.950 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.951 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.957 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.957 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6096923e-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.958 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6096923e-37, col_values=(('external_ids', {'iface-id': '6096923e-378a-47e1-96e5-11c98a23abfa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:9d:7a', 'vm-uuid': '0c274757-9612-49ca-b1fa-8ae80aa5f510'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.960 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:31 np0005593233 NetworkManager[48871]: <info>  [1769163151.9617] manager: (tap6096923e-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.963 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.969 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:31 np0005593233 nova_compute[222017]: 2026-01-23 10:12:31.970 222021 INFO os_vif [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:7a,bridge_name='br-int',has_traffic_filtering=True,id=6096923e-378a-47e1-96e5-11c98a23abfa,network=Network(b909e0f1-3092-44cc-b25b-a0acc5a5cd0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6096923e-37')#033[00m
Jan 23 05:12:32 np0005593233 nova_compute[222017]: 2026-01-23 10:12:32.588 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:12:32 np0005593233 nova_compute[222017]: 2026-01-23 10:12:32.589 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:12:32 np0005593233 nova_compute[222017]: 2026-01-23 10:12:32.589 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:5f:9d:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:12:32 np0005593233 nova_compute[222017]: 2026-01-23 10:12:32.590 222021 INFO nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Using config drive#033[00m
Jan 23 05:12:32 np0005593233 nova_compute[222017]: 2026-01-23 10:12:32.620 222021 DEBUG nova.storage.rbd_utils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:32.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:32.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.372 222021 INFO nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Creating config drive at /var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510/disk.config#033[00m
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.378 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52fm1u5x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.438 222021 DEBUG nova.network.neutron [req-7a51c0f2-537b-4d19-b330-ec5b7387e631 req-7ead70ee-40c0-4980-98d1-e6a9c3ad923c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updated VIF entry in instance network info cache for port 6096923e-378a-47e1-96e5-11c98a23abfa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.439 222021 DEBUG nova.network.neutron [req-7a51c0f2-537b-4d19-b330-ec5b7387e631 req-7ead70ee-40c0-4980-98d1-e6a9c3ad923c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updating instance_info_cache with network_info: [{"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.467 222021 DEBUG oslo_concurrency.lockutils [req-7a51c0f2-537b-4d19-b330-ec5b7387e631 req-7ead70ee-40c0-4980-98d1-e6a9c3ad923c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.551 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52fm1u5x" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.600 222021 DEBUG nova.storage.rbd_utils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.606 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510/disk.config 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.840 222021 DEBUG oslo_concurrency.processutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510/disk.config 0c274757-9612-49ca-b1fa-8ae80aa5f510_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:33 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.842 222021 INFO nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Deleting local config drive /var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510/disk.config because it was imported into RBD.#033[00m
Jan 23 05:12:33 np0005593233 kernel: tap6096923e-37: entered promiscuous mode
Jan 23 05:12:33 np0005593233 NetworkManager[48871]: <info>  [1769163153.9383] manager: (tap6096923e-37): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:33.999 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:12:33Z|00570|binding|INFO|Claiming lport 6096923e-378a-47e1-96e5-11c98a23abfa for this chassis.
Jan 23 05:12:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:12:33Z|00571|binding|INFO|6096923e-378a-47e1-96e5-11c98a23abfa: Claiming fa:16:3e:5f:9d:7a 10.100.0.6
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.018 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:9d:7a 10.100.0.6'], port_security=['fa:16:3e:5f:9d:7a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0c274757-9612-49ca-b1fa-8ae80aa5f510', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b179a4bf-7ae4-400d-ac79-526f3efe132c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=159e112b-eac7-4007-8b5d-563cdf20827d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=6096923e-378a-47e1-96e5-11c98a23abfa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.021 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 6096923e-378a-47e1-96e5-11c98a23abfa in datapath b909e0f1-3092-44cc-b25b-a0acc5a5cd0c bound to our chassis#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.023 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b909e0f1-3092-44cc-b25b-a0acc5a5cd0c#033[00m
Jan 23 05:12:34 np0005593233 systemd-udevd[273988]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.055 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[79ee3f9b-d8f6-4c81-b288-c9656e729a9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.057 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb909e0f1-31 in ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:12:34 np0005593233 systemd-machined[190954]: New machine qemu-62-instance-00000084.
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.063 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb909e0f1-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.063 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a2cf445c-6771-4aa0-86b4-d7ead35691a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 NetworkManager[48871]: <info>  [1769163154.0657] device (tap6096923e-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.066 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[78d80ed4-352c-44f8-bab3-4f6b27ad6b02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 NetworkManager[48871]: <info>  [1769163154.0673] device (tap6096923e-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.083 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.084 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb1fd0f-8b9d-4b66-834f-531d62b48d33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 systemd[1]: Started Virtual Machine qemu-62-instance-00000084.
Jan 23 05:12:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:12:34Z|00572|binding|INFO|Setting lport 6096923e-378a-47e1-96e5-11c98a23abfa ovn-installed in OVS
Jan 23 05:12:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:12:34Z|00573|binding|INFO|Setting lport 6096923e-378a-47e1-96e5-11c98a23abfa up in Southbound
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.093 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:34 np0005593233 podman[273962]: 2026-01-23 10:12:34.11093879 +0000 UTC m=+0.210768953 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.110 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[24513089-f720-4e6c-831f-fd35bf3838cf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.149 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[89c1d80e-84f6-47ed-adc2-2f755bf62c95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.155 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4c3e0c09-b307-4b5c-890e-ca4cfad723a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 NetworkManager[48871]: <info>  [1769163154.1597] manager: (tapb909e0f1-30): new Veth device (/org/freedesktop/NetworkManager/Devices/268)
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.194 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9d4e5c04-c1fb-4312-84a7-028b81e7eb08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.198 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7c845793-a550-46ec-9f0a-2e978db8b735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 NetworkManager[48871]: <info>  [1769163154.2269] device (tapb909e0f1-30): carrier: link connected
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.232 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f9c776cd-db19-405e-9fff-ded14c682766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.252 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4c7839b3-efec-45a1-9991-493d01450307]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb909e0f1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:70:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706685, 'reachable_time': 44877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274031, 'error': None, 'target': 'ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.271 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[94116662-b628-4474-854e-83bd607b18c3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:7059'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706685, 'tstamp': 706685}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274032, 'error': None, 'target': 'ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.295 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[50f184d2-e962-48cb-9713-4b6604cdd010]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb909e0f1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:70:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706685, 'reachable_time': 44877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 274033, 'error': None, 'target': 'ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.332 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f38c5056-7c3f-43f2-87e1-9dacda9fc828]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.406 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[200a0d22-9625-455b-b86c-0df032e5eb85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.408 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb909e0f1-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.409 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.410 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb909e0f1-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.412 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:34 np0005593233 NetworkManager[48871]: <info>  [1769163154.4134] manager: (tapb909e0f1-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Jan 23 05:12:34 np0005593233 kernel: tapb909e0f1-30: entered promiscuous mode
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.417 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb909e0f1-30, col_values=(('external_ids', {'iface-id': '2f33e608-79d7-4774-b3de-9f2488a517e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.418 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:12:34Z|00574|binding|INFO|Releasing lport 2f33e608-79d7-4774-b3de-9f2488a517e6 from this chassis (sb_readonly=0)
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.438 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.440 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b909e0f1-3092-44cc-b25b-a0acc5a5cd0c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b909e0f1-3092-44cc-b25b-a0acc5a5cd0c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.442 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[83241e71-182c-41df-86f2-d742efdaceee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.443 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/b909e0f1-3092-44cc-b25b-a0acc5a5cd0c.pid.haproxy
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID b909e0f1-3092-44cc-b25b-a0acc5a5cd0c
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:12:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:34.444 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c', 'env', 'PROCESS_TAG=haproxy-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b909e0f1-3092-44cc-b25b-a0acc5a5cd0c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:12:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:34.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:34 np0005593233 podman[274065]: 2026-01-23 10:12:34.812419301 +0000 UTC m=+0.054217184 container create 3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 05:12:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:34.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:34 np0005593233 systemd[1]: Started libpod-conmon-3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646.scope.
Jan 23 05:12:34 np0005593233 podman[274065]: 2026-01-23 10:12:34.784212314 +0000 UTC m=+0.026010237 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:12:34 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:12:34 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6370f98d07b9d51896abc3dcae65f51297ea8a8d80e432c367c0487b2f20354/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:12:34 np0005593233 podman[274065]: 2026-01-23 10:12:34.907709217 +0000 UTC m=+0.149507190 container init 3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:12:34 np0005593233 podman[274065]: 2026-01-23 10:12:34.918379849 +0000 UTC m=+0.160177762 container start 3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 05:12:34 np0005593233 neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c[274114]: [NOTICE]   (274125) : New worker (274127) forked
Jan 23 05:12:34 np0005593233 neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c[274114]: [NOTICE]   (274125) : Loading success.
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.993 222021 DEBUG nova.compute.manager [req-0c9baaab-9115-461e-a8b2-778247f4bc28 req-746e8646-c627-46e2-9f28-7020a6278c88 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received event network-vif-plugged-6096923e-378a-47e1-96e5-11c98a23abfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.994 222021 DEBUG oslo_concurrency.lockutils [req-0c9baaab-9115-461e-a8b2-778247f4bc28 req-746e8646-c627-46e2-9f28-7020a6278c88 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.994 222021 DEBUG oslo_concurrency.lockutils [req-0c9baaab-9115-461e-a8b2-778247f4bc28 req-746e8646-c627-46e2-9f28-7020a6278c88 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.994 222021 DEBUG oslo_concurrency.lockutils [req-0c9baaab-9115-461e-a8b2-778247f4bc28 req-746e8646-c627-46e2-9f28-7020a6278c88 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:34 np0005593233 nova_compute[222017]: 2026-01-23 10:12:34.995 222021 DEBUG nova.compute.manager [req-0c9baaab-9115-461e-a8b2-778247f4bc28 req-746e8646-c627-46e2-9f28-7020a6278c88 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Processing event network-vif-plugged-6096923e-378a-47e1-96e5-11c98a23abfa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.010 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163155.0096262, 0c274757-9612-49ca-b1fa-8ae80aa5f510 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.010 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] VM Started (Lifecycle Event)#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.013 222021 DEBUG nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.017 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.022 222021 INFO nova.virt.libvirt.driver [-] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Instance spawned successfully.#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.022 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.037 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.045 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.049 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.049 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.050 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.050 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.051 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.052 222021 DEBUG nova.virt.libvirt.driver [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.082 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.083 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163155.0098798, 0c274757-9612-49ca-b1fa-8ae80aa5f510 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.084 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.123 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.127 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163155.016567, 0c274757-9612-49ca-b1fa-8ae80aa5f510 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.127 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.148 222021 INFO nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Took 13.67 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.149 222021 DEBUG nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.193 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.196 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.217 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.231 222021 INFO nova.compute.manager [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Took 15.09 seconds to build instance.#033[00m
Jan 23 05:12:35 np0005593233 nova_compute[222017]: 2026-01-23 10:12:35.264 222021 DEBUG oslo_concurrency.lockutils [None req-01536fe2-ac36-49fc-8dc5-ecdf0979d52b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:36 np0005593233 nova_compute[222017]: 2026-01-23 10:12:36.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:36.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:36.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:36 np0005593233 nova_compute[222017]: 2026-01-23 10:12:36.852 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:36 np0005593233 nova_compute[222017]: 2026-01-23 10:12:36.961 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:37 np0005593233 nova_compute[222017]: 2026-01-23 10:12:37.104 222021 DEBUG nova.compute.manager [req-9f7b0f2c-20f4-4f2a-b24b-94fb83c9276e req-21641c61-b039-4a62-9d27-235e87de02d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received event network-vif-plugged-6096923e-378a-47e1-96e5-11c98a23abfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:37 np0005593233 nova_compute[222017]: 2026-01-23 10:12:37.105 222021 DEBUG oslo_concurrency.lockutils [req-9f7b0f2c-20f4-4f2a-b24b-94fb83c9276e req-21641c61-b039-4a62-9d27-235e87de02d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:37 np0005593233 nova_compute[222017]: 2026-01-23 10:12:37.106 222021 DEBUG oslo_concurrency.lockutils [req-9f7b0f2c-20f4-4f2a-b24b-94fb83c9276e req-21641c61-b039-4a62-9d27-235e87de02d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:37 np0005593233 nova_compute[222017]: 2026-01-23 10:12:37.107 222021 DEBUG oslo_concurrency.lockutils [req-9f7b0f2c-20f4-4f2a-b24b-94fb83c9276e req-21641c61-b039-4a62-9d27-235e87de02d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:37 np0005593233 nova_compute[222017]: 2026-01-23 10:12:37.107 222021 DEBUG nova.compute.manager [req-9f7b0f2c-20f4-4f2a-b24b-94fb83c9276e req-21641c61-b039-4a62-9d27-235e87de02d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] No waiting events found dispatching network-vif-plugged-6096923e-378a-47e1-96e5-11c98a23abfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:12:37 np0005593233 nova_compute[222017]: 2026-01-23 10:12:37.108 222021 WARNING nova.compute.manager [req-9f7b0f2c-20f4-4f2a-b24b-94fb83c9276e req-21641c61-b039-4a62-9d27-235e87de02d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received unexpected event network-vif-plugged-6096923e-378a-47e1-96e5-11c98a23abfa for instance with vm_state active and task_state None.#033[00m
Jan 23 05:12:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:38.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:38.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:39 np0005593233 NetworkManager[48871]: <info>  [1769163159.8642] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Jan 23 05:12:39 np0005593233 NetworkManager[48871]: <info>  [1769163159.8650] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Jan 23 05:12:39 np0005593233 nova_compute[222017]: 2026-01-23 10:12:39.863 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:40 np0005593233 nova_compute[222017]: 2026-01-23 10:12:40.060 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:12:40Z|00575|binding|INFO|Releasing lport 2f33e608-79d7-4774-b3de-9f2488a517e6 from this chassis (sb_readonly=0)
Jan 23 05:12:40 np0005593233 nova_compute[222017]: 2026-01-23 10:12:40.088 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:40 np0005593233 nova_compute[222017]: 2026-01-23 10:12:40.163 222021 DEBUG nova.compute.manager [req-dd7b824d-c85a-4691-9877-e165da6ad71b req-d918f664-e9e0-4646-8ec5-efaccbf6f725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received event network-changed-6096923e-378a-47e1-96e5-11c98a23abfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:40 np0005593233 nova_compute[222017]: 2026-01-23 10:12:40.163 222021 DEBUG nova.compute.manager [req-dd7b824d-c85a-4691-9877-e165da6ad71b req-d918f664-e9e0-4646-8ec5-efaccbf6f725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Refreshing instance network info cache due to event network-changed-6096923e-378a-47e1-96e5-11c98a23abfa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:12:40 np0005593233 nova_compute[222017]: 2026-01-23 10:12:40.164 222021 DEBUG oslo_concurrency.lockutils [req-dd7b824d-c85a-4691-9877-e165da6ad71b req-d918f664-e9e0-4646-8ec5-efaccbf6f725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:40 np0005593233 nova_compute[222017]: 2026-01-23 10:12:40.164 222021 DEBUG oslo_concurrency.lockutils [req-dd7b824d-c85a-4691-9877-e165da6ad71b req-d918f664-e9e0-4646-8ec5-efaccbf6f725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:40 np0005593233 nova_compute[222017]: 2026-01-23 10:12:40.164 222021 DEBUG nova.network.neutron [req-dd7b824d-c85a-4691-9877-e165da6ad71b req-d918f664-e9e0-4646-8ec5-efaccbf6f725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Refreshing network info cache for port 6096923e-378a-47e1-96e5-11c98a23abfa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:12:40 np0005593233 nova_compute[222017]: 2026-01-23 10:12:40.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:40.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:40.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:41 np0005593233 nova_compute[222017]: 2026-01-23 10:12:41.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:41 np0005593233 nova_compute[222017]: 2026-01-23 10:12:41.421 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:41 np0005593233 nova_compute[222017]: 2026-01-23 10:12:41.422 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:41 np0005593233 nova_compute[222017]: 2026-01-23 10:12:41.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:41 np0005593233 nova_compute[222017]: 2026-01-23 10:12:41.423 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:12:41 np0005593233 nova_compute[222017]: 2026-01-23 10:12:41.424 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:41 np0005593233 nova_compute[222017]: 2026-01-23 10:12:41.855 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/914597307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:41 np0005593233 nova_compute[222017]: 2026-01-23 10:12:41.936 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:41 np0005593233 nova_compute[222017]: 2026-01-23 10:12:41.964 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.455 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000084 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.457 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000084 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.650 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.652 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4322MB free_disk=20.8909912109375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.652 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.652 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:42.675 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:42.676 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:12:42.677 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.781 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0c274757-9612-49ca-b1fa-8ae80aa5f510 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.782 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.782 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.805 222021 DEBUG nova.network.neutron [req-dd7b824d-c85a-4691-9877-e165da6ad71b req-d918f664-e9e0-4646-8ec5-efaccbf6f725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updated VIF entry in instance network info cache for port 6096923e-378a-47e1-96e5-11c98a23abfa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.806 222021 DEBUG nova.network.neutron [req-dd7b824d-c85a-4691-9877-e165da6ad71b req-d918f664-e9e0-4646-8ec5-efaccbf6f725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updating instance_info_cache with network_info: [{"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:42.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:42.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.842 222021 DEBUG oslo_concurrency.lockutils [req-dd7b824d-c85a-4691-9877-e165da6ad71b req-d918f664-e9e0-4646-8ec5-efaccbf6f725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:42 np0005593233 nova_compute[222017]: 2026-01-23 10:12:42.845 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:12:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:12:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1433679818' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:43 np0005593233 nova_compute[222017]: 2026-01-23 10:12:43.320 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:43 np0005593233 nova_compute[222017]: 2026-01-23 10:12:43.327 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:12:43 np0005593233 nova_compute[222017]: 2026-01-23 10:12:43.357 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:12:43 np0005593233 nova_compute[222017]: 2026-01-23 10:12:43.387 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:12:43 np0005593233 nova_compute[222017]: 2026-01-23 10:12:43.388 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:44 np0005593233 nova_compute[222017]: 2026-01-23 10:12:44.389 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:44 np0005593233 nova_compute[222017]: 2026-01-23 10:12:44.389 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:44.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:44.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:45 np0005593233 nova_compute[222017]: 2026-01-23 10:12:45.170 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593233 nova_compute[222017]: 2026-01-23 10:12:45.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:45 np0005593233 nova_compute[222017]: 2026-01-23 10:12:45.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:45 np0005593233 nova_compute[222017]: 2026-01-23 10:12:45.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:12:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:46.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:12:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:46.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:12:46 np0005593233 nova_compute[222017]: 2026-01-23 10:12:46.858 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:46 np0005593233 nova_compute[222017]: 2026-01-23 10:12:46.966 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:48 np0005593233 nova_compute[222017]: 2026-01-23 10:12:48.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:48 np0005593233 nova_compute[222017]: 2026-01-23 10:12:48.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:12:48 np0005593233 nova_compute[222017]: 2026-01-23 10:12:48.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:12:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:48.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:48.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:49 np0005593233 nova_compute[222017]: 2026-01-23 10:12:49.119 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:49 np0005593233 nova_compute[222017]: 2026-01-23 10:12:49.119 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:49 np0005593233 nova_compute[222017]: 2026-01-23 10:12:49.120 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:12:49 np0005593233 nova_compute[222017]: 2026-01-23 10:12:49.120 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0c274757-9612-49ca-b1fa-8ae80aa5f510 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:49 np0005593233 podman[274316]: 2026-01-23 10:12:49.126153786 +0000 UTC m=+0.125755888 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:12:50 np0005593233 ovn_controller[130653]: 2026-01-23T10:12:50Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:9d:7a 10.100.0.6
Jan 23 05:12:50 np0005593233 ovn_controller[130653]: 2026-01-23T10:12:50Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:9d:7a 10.100.0.6
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #118. Immutable memtables: 0.
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.172081) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 73] Flushing memtable with next log file: 118
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170172143, "job": 73, "event": "flush_started", "num_memtables": 1, "num_entries": 1010, "num_deletes": 252, "total_data_size": 1903715, "memory_usage": 1937504, "flush_reason": "Manual Compaction"}
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 73] Level-0 flush table #119: started
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170184159, "cf_name": "default", "job": 73, "event": "table_file_creation", "file_number": 119, "file_size": 1255303, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 58668, "largest_seqno": 59673, "table_properties": {"data_size": 1250773, "index_size": 2118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10853, "raw_average_key_size": 20, "raw_value_size": 1241281, "raw_average_value_size": 2311, "num_data_blocks": 93, "num_entries": 537, "num_filter_entries": 537, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163102, "oldest_key_time": 1769163102, "file_creation_time": 1769163170, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 119, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 73] Flush lasted 12163 microseconds, and 5093 cpu microseconds.
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.184234) [db/flush_job.cc:967] [default] [JOB 73] Level-0 flush table #119: 1255303 bytes OK
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.184269) [db/memtable_list.cc:519] [default] Level-0 commit table #119 started
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.186800) [db/memtable_list.cc:722] [default] Level-0 commit table #119: memtable #1 done
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.186881) EVENT_LOG_v1 {"time_micros": 1769163170186863, "job": 73, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.186982) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 73] Try to delete WAL files size 1898565, prev total WAL file size 1919362, number of live WAL files 2.
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000115.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.188410) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035303230' seq:72057594037927935, type:22 .. '7061786F730035323732' seq:0, type:0; will stop at (end)
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 74] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 73 Base level 0, inputs: [119(1225KB)], [117(12MB)]
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170188505, "job": 74, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [119], "files_L6": [117], "score": -1, "input_data_size": 13992520, "oldest_snapshot_seqno": -1}
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 74] Generated table #120: 8169 keys, 12128405 bytes, temperature: kUnknown
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170294361, "cf_name": "default", "job": 74, "event": "table_file_creation", "file_number": 120, "file_size": 12128405, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12074021, "index_size": 32832, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20485, "raw_key_size": 212412, "raw_average_key_size": 26, "raw_value_size": 11928811, "raw_average_value_size": 1460, "num_data_blocks": 1287, "num_entries": 8169, "num_filter_entries": 8169, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163170, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 120, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.295150) [db/compaction/compaction_job.cc:1663] [default] [JOB 74] Compacted 1@0 + 1@6 files to L6 => 12128405 bytes
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.297009) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 131.7 rd, 114.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 12.1 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(20.8) write-amplify(9.7) OK, records in: 8690, records dropped: 521 output_compression: NoCompression
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.297051) EVENT_LOG_v1 {"time_micros": 1769163170297035, "job": 74, "event": "compaction_finished", "compaction_time_micros": 106220, "compaction_time_cpu_micros": 35073, "output_level": 6, "num_output_files": 1, "total_output_size": 12128405, "num_input_records": 8690, "num_output_records": 8169, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000119.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170297463, "job": 74, "event": "table_file_deletion", "file_number": 119}
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000117.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170299867, "job": 74, "event": "table_file_deletion", "file_number": 117}
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.188296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.299944) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.299950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.299952) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.299954) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:12:50.299955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593233 nova_compute[222017]: 2026-01-23 10:12:50.678 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquiring lock "26fc3a7b-7f23-4938-9544-1bf54912f642" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:50 np0005593233 nova_compute[222017]: 2026-01-23 10:12:50.679 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:50 np0005593233 nova_compute[222017]: 2026-01-23 10:12:50.699 222021 DEBUG nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:12:50 np0005593233 nova_compute[222017]: 2026-01-23 10:12:50.821 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:50 np0005593233 nova_compute[222017]: 2026-01-23 10:12:50.822 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:50.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:50 np0005593233 nova_compute[222017]: 2026-01-23 10:12:50.829 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:12:50 np0005593233 nova_compute[222017]: 2026-01-23 10:12:50.830 222021 INFO nova.compute.claims [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:12:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:50.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:50.999 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:51 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:51 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/579676088' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.499 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.508 222021 DEBUG nova.compute.provider_tree [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.551 222021 DEBUG nova.scheduler.client.report [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.592 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.593 222021 DEBUG nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.735 222021 DEBUG nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.736 222021 DEBUG nova.network.neutron [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.797 222021 INFO nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.831 222021 DEBUG nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.859 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:51 np0005593233 nova_compute[222017]: 2026-01-23 10:12:51.968 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.019 222021 DEBUG nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.021 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.022 222021 INFO nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Creating image(s)#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.067 222021 DEBUG nova.storage.rbd_utils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] rbd image 26fc3a7b-7f23-4938-9544-1bf54912f642_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.112 222021 DEBUG nova.storage.rbd_utils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] rbd image 26fc3a7b-7f23-4938-9544-1bf54912f642_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.157 222021 DEBUG nova.storage.rbd_utils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] rbd image 26fc3a7b-7f23-4938-9544-1bf54912f642_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.163 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.224 222021 DEBUG nova.policy [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1d2dc288e284cfdb76abe73a933efa5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2353af0cf8d5454eb7611f611cec5a05', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.241 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.242 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.243 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.244 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.277 222021 DEBUG nova.storage.rbd_utils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] rbd image 26fc3a7b-7f23-4938-9544-1bf54912f642_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.283 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 26fc3a7b-7f23-4938-9544-1bf54912f642_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.325 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updating instance_info_cache with network_info: [{"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.349 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.350 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.681 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 26fc3a7b-7f23-4938-9544-1bf54912f642_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.782 222021 DEBUG nova.storage.rbd_utils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] resizing rbd image 26fc3a7b-7f23-4938-9544-1bf54912f642_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:12:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:52.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:52.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:52 np0005593233 nova_compute[222017]: 2026-01-23 10:12:52.912 222021 DEBUG nova.objects.instance [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lazy-loading 'migration_context' on Instance uuid 26fc3a7b-7f23-4938-9544-1bf54912f642 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:54.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:54.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:55 np0005593233 nova_compute[222017]: 2026-01-23 10:12:55.083 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:12:55 np0005593233 nova_compute[222017]: 2026-01-23 10:12:55.083 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Ensure instance console log exists: /var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:12:55 np0005593233 nova_compute[222017]: 2026-01-23 10:12:55.084 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:55 np0005593233 nova_compute[222017]: 2026-01-23 10:12:55.084 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:55 np0005593233 nova_compute[222017]: 2026-01-23 10:12:55.084 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:55 np0005593233 nova_compute[222017]: 2026-01-23 10:12:55.343 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:56.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:12:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:56.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:12:56 np0005593233 nova_compute[222017]: 2026-01-23 10:12:56.862 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:56 np0005593233 nova_compute[222017]: 2026-01-23 10:12:56.970 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:57 np0005593233 nova_compute[222017]: 2026-01-23 10:12:57.135 222021 DEBUG nova.network.neutron [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Successfully created port: 412f682b-b5ed-4018-bce1-5f9fd55baa78 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:12:57 np0005593233 nova_compute[222017]: 2026-01-23 10:12:57.387 222021 INFO nova.compute.manager [None req-0e79c1e9-45ce-42a1-bd8a-4a70f4b510b0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Get console output#033[00m
Jan 23 05:12:57 np0005593233 nova_compute[222017]: 2026-01-23 10:12:57.526 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:12:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:58.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:12:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:58.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:00 np0005593233 nova_compute[222017]: 2026-01-23 10:13:00.261 222021 DEBUG nova.network.neutron [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Successfully updated port: 412f682b-b5ed-4018-bce1-5f9fd55baa78 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:13:00 np0005593233 nova_compute[222017]: 2026-01-23 10:13:00.562 222021 DEBUG nova.compute.manager [req-f2e32e88-25a5-412e-8fa7-bc142f463e3a req-fa137415-7370-4a39-b7e7-1e8b643f5882 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Received event network-changed-412f682b-b5ed-4018-bce1-5f9fd55baa78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:00 np0005593233 nova_compute[222017]: 2026-01-23 10:13:00.562 222021 DEBUG nova.compute.manager [req-f2e32e88-25a5-412e-8fa7-bc142f463e3a req-fa137415-7370-4a39-b7e7-1e8b643f5882 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Refreshing instance network info cache due to event network-changed-412f682b-b5ed-4018-bce1-5f9fd55baa78. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:13:00 np0005593233 nova_compute[222017]: 2026-01-23 10:13:00.562 222021 DEBUG oslo_concurrency.lockutils [req-f2e32e88-25a5-412e-8fa7-bc142f463e3a req-fa137415-7370-4a39-b7e7-1e8b643f5882 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-26fc3a7b-7f23-4938-9544-1bf54912f642" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:13:00 np0005593233 nova_compute[222017]: 2026-01-23 10:13:00.563 222021 DEBUG oslo_concurrency.lockutils [req-f2e32e88-25a5-412e-8fa7-bc142f463e3a req-fa137415-7370-4a39-b7e7-1e8b643f5882 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-26fc3a7b-7f23-4938-9544-1bf54912f642" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:13:00 np0005593233 nova_compute[222017]: 2026-01-23 10:13:00.563 222021 DEBUG nova.network.neutron [req-f2e32e88-25a5-412e-8fa7-bc142f463e3a req-fa137415-7370-4a39-b7e7-1e8b643f5882 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Refreshing network info cache for port 412f682b-b5ed-4018-bce1-5f9fd55baa78 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:13:00 np0005593233 nova_compute[222017]: 2026-01-23 10:13:00.629 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquiring lock "refresh_cache-26fc3a7b-7f23-4938-9544-1bf54912f642" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:13:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:00.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:13:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:00.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:13:01 np0005593233 nova_compute[222017]: 2026-01-23 10:13:01.153 222021 DEBUG nova.network.neutron [req-f2e32e88-25a5-412e-8fa7-bc142f463e3a req-fa137415-7370-4a39-b7e7-1e8b643f5882 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:13:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:01 np0005593233 nova_compute[222017]: 2026-01-23 10:13:01.866 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:01 np0005593233 nova_compute[222017]: 2026-01-23 10:13:01.971 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:02 np0005593233 nova_compute[222017]: 2026-01-23 10:13:02.578 222021 DEBUG nova.network.neutron [req-f2e32e88-25a5-412e-8fa7-bc142f463e3a req-fa137415-7370-4a39-b7e7-1e8b643f5882 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:13:02 np0005593233 nova_compute[222017]: 2026-01-23 10:13:02.594 222021 DEBUG oslo_concurrency.lockutils [req-f2e32e88-25a5-412e-8fa7-bc142f463e3a req-fa137415-7370-4a39-b7e7-1e8b643f5882 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-26fc3a7b-7f23-4938-9544-1bf54912f642" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:13:02 np0005593233 nova_compute[222017]: 2026-01-23 10:13:02.595 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquired lock "refresh_cache-26fc3a7b-7f23-4938-9544-1bf54912f642" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:13:02 np0005593233 nova_compute[222017]: 2026-01-23 10:13:02.595 222021 DEBUG nova.network.neutron [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:13:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:02.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:02 np0005593233 nova_compute[222017]: 2026-01-23 10:13:02.849 222021 DEBUG nova.network.neutron [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:13:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:13:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:02.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.404 222021 DEBUG nova.network.neutron [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Updating instance_info_cache with network_info: [{"id": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "address": "fa:16:3e:b3:3c:d0", "network": {"id": "58448e5a-7023-4dc6-a80d-27f2b1d9fffe", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-350779733-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2353af0cf8d5454eb7611f611cec5a05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap412f682b-b5", "ovs_interfaceid": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.433 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Releasing lock "refresh_cache-26fc3a7b-7f23-4938-9544-1bf54912f642" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.434 222021 DEBUG nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Instance network_info: |[{"id": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "address": "fa:16:3e:b3:3c:d0", "network": {"id": "58448e5a-7023-4dc6-a80d-27f2b1d9fffe", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-350779733-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2353af0cf8d5454eb7611f611cec5a05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap412f682b-b5", "ovs_interfaceid": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.437 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Start _get_guest_xml network_info=[{"id": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "address": "fa:16:3e:b3:3c:d0", "network": {"id": "58448e5a-7023-4dc6-a80d-27f2b1d9fffe", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-350779733-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2353af0cf8d5454eb7611f611cec5a05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap412f682b-b5", "ovs_interfaceid": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.443 222021 WARNING nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.451 222021 DEBUG nova.virt.libvirt.host [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.451 222021 DEBUG nova.virt.libvirt.host [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.455 222021 DEBUG nova.virt.libvirt.host [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.455 222021 DEBUG nova.virt.libvirt.host [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.457 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.457 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.458 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.458 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.459 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.459 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.459 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.459 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.460 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.460 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.460 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.461 222021 DEBUG nova.virt.hardware [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.464 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:04.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:04.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:13:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/668757486' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:13:04 np0005593233 nova_compute[222017]: 2026-01-23 10:13:04.973 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.011 222021 DEBUG nova.storage.rbd_utils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] rbd image 26fc3a7b-7f23-4938-9544-1bf54912f642_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.017 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:05 np0005593233 podman[274594]: 2026-01-23 10:13:05.098597596 +0000 UTC m=+0.104950379 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 23 05:13:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:13:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3584466881' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.524 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.526 222021 DEBUG nova.virt.libvirt.vif [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:12:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-562777535',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-562777535',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-562777535',id=136,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2353af0cf8d5454eb7611f611cec5a05',ramdisk_id='',reservation_id='r-oo3fxkbn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1093375526',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1093375526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:12:51Z,user_data=None,user_id='e1d2dc288e284cfdb76abe73a933efa5',uuid=26fc3a7b-7f23-4938-9544-1bf54912f642,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "address": "fa:16:3e:b3:3c:d0", "network": {"id": "58448e5a-7023-4dc6-a80d-27f2b1d9fffe", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-350779733-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2353af0cf8d5454eb7611f611cec5a05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap412f682b-b5", "ovs_interfaceid": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.526 222021 DEBUG nova.network.os_vif_util [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Converting VIF {"id": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "address": "fa:16:3e:b3:3c:d0", "network": {"id": "58448e5a-7023-4dc6-a80d-27f2b1d9fffe", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-350779733-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2353af0cf8d5454eb7611f611cec5a05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap412f682b-b5", "ovs_interfaceid": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.528 222021 DEBUG nova.network.os_vif_util [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:3c:d0,bridge_name='br-int',has_traffic_filtering=True,id=412f682b-b5ed-4018-bce1-5f9fd55baa78,network=Network(58448e5a-7023-4dc6-a80d-27f2b1d9fffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap412f682b-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.529 222021 DEBUG nova.objects.instance [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lazy-loading 'pci_devices' on Instance uuid 26fc3a7b-7f23-4938-9544-1bf54912f642 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.554 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <uuid>26fc3a7b-7f23-4938-9544-1bf54912f642</uuid>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <name>instance-00000088</name>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersNegativeTestMultiTenantJSON-server-562777535</nova:name>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:13:04</nova:creationTime>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <nova:user uuid="e1d2dc288e284cfdb76abe73a933efa5">tempest-ServersNegativeTestMultiTenantJSON-1093375526-project-member</nova:user>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <nova:project uuid="2353af0cf8d5454eb7611f611cec5a05">tempest-ServersNegativeTestMultiTenantJSON-1093375526</nova:project>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <nova:port uuid="412f682b-b5ed-4018-bce1-5f9fd55baa78">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <entry name="serial">26fc3a7b-7f23-4938-9544-1bf54912f642</entry>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <entry name="uuid">26fc3a7b-7f23-4938-9544-1bf54912f642</entry>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/26fc3a7b-7f23-4938-9544-1bf54912f642_disk">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/26fc3a7b-7f23-4938-9544-1bf54912f642_disk.config">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:b3:3c:d0"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <target dev="tap412f682b-b5"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642/console.log" append="off"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:13:05 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:13:05 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:13:05 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:13:05 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.556 222021 DEBUG nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Preparing to wait for external event network-vif-plugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.556 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquiring lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.556 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.556 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.557 222021 DEBUG nova.virt.libvirt.vif [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:12:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-562777535',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-562777535',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-562777535',id=136,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2353af0cf8d5454eb7611f611cec5a05',ramdisk_id='',reservation_id='r-oo3fxkbn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJ
SON-1093375526',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1093375526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:12:51Z,user_data=None,user_id='e1d2dc288e284cfdb76abe73a933efa5',uuid=26fc3a7b-7f23-4938-9544-1bf54912f642,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "address": "fa:16:3e:b3:3c:d0", "network": {"id": "58448e5a-7023-4dc6-a80d-27f2b1d9fffe", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-350779733-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2353af0cf8d5454eb7611f611cec5a05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap412f682b-b5", "ovs_interfaceid": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.557 222021 DEBUG nova.network.os_vif_util [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Converting VIF {"id": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "address": "fa:16:3e:b3:3c:d0", "network": {"id": "58448e5a-7023-4dc6-a80d-27f2b1d9fffe", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-350779733-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2353af0cf8d5454eb7611f611cec5a05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap412f682b-b5", "ovs_interfaceid": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.558 222021 DEBUG nova.network.os_vif_util [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:3c:d0,bridge_name='br-int',has_traffic_filtering=True,id=412f682b-b5ed-4018-bce1-5f9fd55baa78,network=Network(58448e5a-7023-4dc6-a80d-27f2b1d9fffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap412f682b-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.558 222021 DEBUG os_vif [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:3c:d0,bridge_name='br-int',has_traffic_filtering=True,id=412f682b-b5ed-4018-bce1-5f9fd55baa78,network=Network(58448e5a-7023-4dc6-a80d-27f2b1d9fffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap412f682b-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.559 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.560 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.560 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.564 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.564 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap412f682b-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.565 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap412f682b-b5, col_values=(('external_ids', {'iface-id': '412f682b-b5ed-4018-bce1-5f9fd55baa78', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:3c:d0', 'vm-uuid': '26fc3a7b-7f23-4938-9544-1bf54912f642'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.566 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:05 np0005593233 NetworkManager[48871]: <info>  [1769163185.5680] manager: (tap412f682b-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.569 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.577 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.578 222021 INFO os_vif [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:3c:d0,bridge_name='br-int',has_traffic_filtering=True,id=412f682b-b5ed-4018-bce1-5f9fd55baa78,network=Network(58448e5a-7023-4dc6-a80d-27f2b1d9fffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap412f682b-b5')#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.650 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.651 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.651 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] No VIF found with MAC fa:16:3e:b3:3c:d0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.652 222021 INFO nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Using config drive#033[00m
Jan 23 05:13:05 np0005593233 nova_compute[222017]: 2026-01-23 10:13:05.684 222021 DEBUG nova.storage.rbd_utils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] rbd image 26fc3a7b-7f23-4938-9544-1bf54912f642_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.157 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.156 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.158 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:13:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.368 222021 INFO nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Creating config drive at /var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642/disk.config#033[00m
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.376 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa6xsh7oc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.528 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa6xsh7oc" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.565 222021 DEBUG nova.storage.rbd_utils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] rbd image 26fc3a7b-7f23-4938-9544-1bf54912f642_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.570 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642/disk.config 26fc3a7b-7f23-4938-9544-1bf54912f642_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:06.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.871 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:06.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.878 222021 DEBUG oslo_concurrency.processutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642/disk.config 26fc3a7b-7f23-4938-9544-1bf54912f642_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.309s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.879 222021 INFO nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Deleting local config drive /var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642/disk.config because it was imported into RBD.#033[00m
Jan 23 05:13:06 np0005593233 kernel: tap412f682b-b5: entered promiscuous mode
Jan 23 05:13:06 np0005593233 NetworkManager[48871]: <info>  [1769163186.9509] manager: (tap412f682b-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/273)
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:06Z|00576|binding|INFO|Claiming lport 412f682b-b5ed-4018-bce1-5f9fd55baa78 for this chassis.
Jan 23 05:13:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:06Z|00577|binding|INFO|412f682b-b5ed-4018-bce1-5f9fd55baa78: Claiming fa:16:3e:b3:3c:d0 10.100.0.13
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.960 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:3c:d0 10.100.0.13'], port_security=['fa:16:3e:b3:3c:d0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '26fc3a7b-7f23-4938-9544-1bf54912f642', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58448e5a-7023-4dc6-a80d-27f2b1d9fffe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2353af0cf8d5454eb7611f611cec5a05', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e18d9781-4a58-4e23-b22f-3453072530c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=00c938c3-8f24-4e1c-91b3-3c377b3168ea, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=412f682b-b5ed-4018-bce1-5f9fd55baa78) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.961 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 412f682b-b5ed-4018-bce1-5f9fd55baa78 in datapath 58448e5a-7023-4dc6-a80d-27f2b1d9fffe bound to our chassis#033[00m
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.963 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58448e5a-7023-4dc6-a80d-27f2b1d9fffe#033[00m
Jan 23 05:13:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:06Z|00578|binding|INFO|Setting lport 412f682b-b5ed-4018-bce1-5f9fd55baa78 ovn-installed in OVS
Jan 23 05:13:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:06Z|00579|binding|INFO|Setting lport 412f682b-b5ed-4018-bce1-5f9fd55baa78 up in Southbound
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.973 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:06 np0005593233 nova_compute[222017]: 2026-01-23 10:13:06.976 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.982 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[04914d35-0a7d-47c8-aceb-9c2e4e162238]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.983 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58448e5a-71 in ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.985 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58448e5a-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.985 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bb129aa7-2e9a-4e4d-970f-7a439059e65d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.986 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[78d00555-db42-4b5d-8001-1990bfd66c95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:06.999 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[c78b4090-54c8-4b68-82e9-3e607364520e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 systemd-machined[190954]: New machine qemu-63-instance-00000088.
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.016 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[09a3c5f2-75e5-416c-84ff-49d4c054e87c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 systemd[1]: Started Virtual Machine qemu-63-instance-00000088.
Jan 23 05:13:07 np0005593233 systemd-udevd[274737]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:13:07 np0005593233 NetworkManager[48871]: <info>  [1769163187.0521] device (tap412f682b-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:13:07 np0005593233 NetworkManager[48871]: <info>  [1769163187.0530] device (tap412f682b-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.061 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[103c4936-66d3-4d83-9aa8-2fe053959204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 NetworkManager[48871]: <info>  [1769163187.0690] manager: (tap58448e5a-70): new Veth device (/org/freedesktop/NetworkManager/Devices/274)
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.069 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0b5b7f-b589-4fc0-a64d-f894b27a31e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.109 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[989e3d55-1267-40c7-a763-c2d12a226b86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.113 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0ec11a-9392-4cb4-b36f-b8faa13fc750]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 NetworkManager[48871]: <info>  [1769163187.1422] device (tap58448e5a-70): carrier: link connected
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.149 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c78f4382-a296-4710-8028-514309afd25f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.176 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bfeb96c2-59fd-43a9-875e-1a3176ca5793]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58448e5a-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:38:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 709977, 'reachable_time': 36182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274766, 'error': None, 'target': 'ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.201 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0972c7-92e7-4511-a19e-284a0ae7b42f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:3882'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 709977, 'tstamp': 709977}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274767, 'error': None, 'target': 'ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.226 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c220f413-03c0-4191-8557-3520f2bcc153]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58448e5a-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:38:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 709977, 'reachable_time': 36182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 274768, 'error': None, 'target': 'ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.236 222021 DEBUG nova.compute.manager [req-eaec54c5-42ba-4903-9f08-b99ffce594cf req-a57350cc-dcf9-4142-a74f-e423c3a7ee1e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Received event network-vif-plugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.236 222021 DEBUG oslo_concurrency.lockutils [req-eaec54c5-42ba-4903-9f08-b99ffce594cf req-a57350cc-dcf9-4142-a74f-e423c3a7ee1e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.236 222021 DEBUG oslo_concurrency.lockutils [req-eaec54c5-42ba-4903-9f08-b99ffce594cf req-a57350cc-dcf9-4142-a74f-e423c3a7ee1e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.237 222021 DEBUG oslo_concurrency.lockutils [req-eaec54c5-42ba-4903-9f08-b99ffce594cf req-a57350cc-dcf9-4142-a74f-e423c3a7ee1e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.237 222021 DEBUG nova.compute.manager [req-eaec54c5-42ba-4903-9f08-b99ffce594cf req-a57350cc-dcf9-4142-a74f-e423c3a7ee1e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Processing event network-vif-plugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.283 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7965dc-d068-458b-ac06-b33b3ae5b7da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.375 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[404f185a-65d6-40cf-a230-7184dc750ef5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.377 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58448e5a-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.377 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.377 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58448e5a-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.379 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:07 np0005593233 NetworkManager[48871]: <info>  [1769163187.3803] manager: (tap58448e5a-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Jan 23 05:13:07 np0005593233 kernel: tap58448e5a-70: entered promiscuous mode
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.383 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58448e5a-70, col_values=(('external_ids', {'iface-id': 'b9754143-ef75-49d0-a9ce-42690cc9cdfe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.384 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:07Z|00580|binding|INFO|Releasing lport b9754143-ef75-49d0-a9ce-42690cc9cdfe from this chassis (sb_readonly=0)
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.397 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.400 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58448e5a-7023-4dc6-a80d-27f2b1d9fffe.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58448e5a-7023-4dc6-a80d-27f2b1d9fffe.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.401 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aabf0848-6634-438b-922b-0df7aa91a744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.402 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-58448e5a-7023-4dc6-a80d-27f2b1d9fffe
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/58448e5a-7023-4dc6-a80d-27f2b1d9fffe.pid.haproxy
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 58448e5a-7023-4dc6-a80d-27f2b1d9fffe
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:13:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:07.403 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe', 'env', 'PROCESS_TAG=haproxy-58448e5a-7023-4dc6-a80d-27f2b1d9fffe', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58448e5a-7023-4dc6-a80d-27f2b1d9fffe.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:13:07 np0005593233 podman[274837]: 2026-01-23 10:13:07.817096049 +0000 UTC m=+0.060338027 container create 487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:13:07 np0005593233 systemd[1]: Started libpod-conmon-487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9.scope.
Jan 23 05:13:07 np0005593233 podman[274837]: 2026-01-23 10:13:07.788889891 +0000 UTC m=+0.032131889 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:13:07 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:13:07 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a43fbeef7f3ea36d61060e3543e327cd111ba0f283dca1addc68b070c22f08b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.907 222021 DEBUG nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.908 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163187.906209, 26fc3a7b-7f23-4938-9544-1bf54912f642 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.908 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] VM Started (Lifecycle Event)#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.911 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.916 222021 INFO nova.virt.libvirt.driver [-] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Instance spawned successfully.#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.916 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.938 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:13:07 np0005593233 podman[274837]: 2026-01-23 10:13:07.945806239 +0000 UTC m=+0.189048287 container init 487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.945 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.951 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.952 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.953 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.953 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:13:07 np0005593233 podman[274837]: 2026-01-23 10:13:07.954083433 +0000 UTC m=+0.197325411 container start 487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.954 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.954 222021 DEBUG nova.virt.libvirt.driver [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:13:07 np0005593233 neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe[274857]: [NOTICE]   (274861) : New worker (274863) forked
Jan 23 05:13:07 np0005593233 neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe[274857]: [NOTICE]   (274861) : Loading success.
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.992 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.993 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163187.9064033, 26fc3a7b-7f23-4938-9544-1bf54912f642 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:13:07 np0005593233 nova_compute[222017]: 2026-01-23 10:13:07.993 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.029 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.034 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163187.911065, 26fc3a7b-7f23-4938-9544-1bf54912f642 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.034 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.044 222021 INFO nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Took 16.02 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.045 222021 DEBUG nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.060 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.065 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.097 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.125 222021 INFO nova.compute.manager [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Took 17.33 seconds to build instance.#033[00m
Jan 23 05:13:08 np0005593233 nova_compute[222017]: 2026-01-23 10:13:08.146 222021 DEBUG oslo_concurrency.lockutils [None req-7134c231-c7b5-41d9-b34b-0ecb9aea6a0d e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.467s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:13:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:08.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:13:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:08.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:09 np0005593233 nova_compute[222017]: 2026-01-23 10:13:09.551 222021 DEBUG nova.compute.manager [req-5330c488-7ba9-4d35-9a94-a48662b4fe56 req-1bdc1296-d8ba-4500-92ab-21a4b676801c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Received event network-vif-plugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:09 np0005593233 nova_compute[222017]: 2026-01-23 10:13:09.552 222021 DEBUG oslo_concurrency.lockutils [req-5330c488-7ba9-4d35-9a94-a48662b4fe56 req-1bdc1296-d8ba-4500-92ab-21a4b676801c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:09 np0005593233 nova_compute[222017]: 2026-01-23 10:13:09.552 222021 DEBUG oslo_concurrency.lockutils [req-5330c488-7ba9-4d35-9a94-a48662b4fe56 req-1bdc1296-d8ba-4500-92ab-21a4b676801c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:09 np0005593233 nova_compute[222017]: 2026-01-23 10:13:09.552 222021 DEBUG oslo_concurrency.lockutils [req-5330c488-7ba9-4d35-9a94-a48662b4fe56 req-1bdc1296-d8ba-4500-92ab-21a4b676801c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:09 np0005593233 nova_compute[222017]: 2026-01-23 10:13:09.553 222021 DEBUG nova.compute.manager [req-5330c488-7ba9-4d35-9a94-a48662b4fe56 req-1bdc1296-d8ba-4500-92ab-21a4b676801c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] No waiting events found dispatching network-vif-plugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:13:09 np0005593233 nova_compute[222017]: 2026-01-23 10:13:09.553 222021 WARNING nova.compute.manager [req-5330c488-7ba9-4d35-9a94-a48662b4fe56 req-1bdc1296-d8ba-4500-92ab-21a4b676801c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Received unexpected event network-vif-plugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:13:10 np0005593233 nova_compute[222017]: 2026-01-23 10:13:10.568 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:10.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:10.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:11.160 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:11 np0005593233 nova_compute[222017]: 2026-01-23 10:13:11.873 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.073 222021 DEBUG oslo_concurrency.lockutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquiring lock "26fc3a7b-7f23-4938-9544-1bf54912f642" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.074 222021 DEBUG oslo_concurrency.lockutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.074 222021 DEBUG oslo_concurrency.lockutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquiring lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.074 222021 DEBUG oslo_concurrency.lockutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.075 222021 DEBUG oslo_concurrency.lockutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.076 222021 INFO nova.compute.manager [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Terminating instance#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.077 222021 DEBUG nova.compute.manager [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:13:12 np0005593233 kernel: tap412f682b-b5 (unregistering): left promiscuous mode
Jan 23 05:13:12 np0005593233 NetworkManager[48871]: <info>  [1769163192.2389] device (tap412f682b-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:13:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:12Z|00581|binding|INFO|Releasing lport 412f682b-b5ed-4018-bce1-5f9fd55baa78 from this chassis (sb_readonly=0)
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.245 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:12Z|00582|binding|INFO|Setting lport 412f682b-b5ed-4018-bce1-5f9fd55baa78 down in Southbound
Jan 23 05:13:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:12Z|00583|binding|INFO|Removing iface tap412f682b-b5 ovn-installed in OVS
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.259 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:3c:d0 10.100.0.13'], port_security=['fa:16:3e:b3:3c:d0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '26fc3a7b-7f23-4938-9544-1bf54912f642', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58448e5a-7023-4dc6-a80d-27f2b1d9fffe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2353af0cf8d5454eb7611f611cec5a05', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e18d9781-4a58-4e23-b22f-3453072530c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=00c938c3-8f24-4e1c-91b3-3c377b3168ea, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=412f682b-b5ed-4018-bce1-5f9fd55baa78) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.260 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 412f682b-b5ed-4018-bce1-5f9fd55baa78 in datapath 58448e5a-7023-4dc6-a80d-27f2b1d9fffe unbound from our chassis#033[00m
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.261 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58448e5a-7023-4dc6-a80d-27f2b1d9fffe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.263 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[75ce3d10-136a-4c19-9688-0349c514963a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.263 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe namespace which is not needed anymore#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.274 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000088.scope: Deactivated successfully.
Jan 23 05:13:12 np0005593233 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000088.scope: Consumed 5.169s CPU time.
Jan 23 05:13:12 np0005593233 systemd-machined[190954]: Machine qemu-63-instance-00000088 terminated.
Jan 23 05:13:12 np0005593233 neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe[274857]: [NOTICE]   (274861) : haproxy version is 2.8.14-c23fe91
Jan 23 05:13:12 np0005593233 neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe[274857]: [NOTICE]   (274861) : path to executable is /usr/sbin/haproxy
Jan 23 05:13:12 np0005593233 neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe[274857]: [WARNING]  (274861) : Exiting Master process...
Jan 23 05:13:12 np0005593233 neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe[274857]: [ALERT]    (274861) : Current worker (274863) exited with code 143 (Terminated)
Jan 23 05:13:12 np0005593233 neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe[274857]: [WARNING]  (274861) : All workers exited. Exiting... (0)
Jan 23 05:13:12 np0005593233 systemd[1]: libpod-487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9.scope: Deactivated successfully.
Jan 23 05:13:12 np0005593233 podman[274896]: 2026-01-23 10:13:12.437959029 +0000 UTC m=+0.070422693 container died 487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.507 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.514 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.522 222021 INFO nova.virt.libvirt.driver [-] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Instance destroyed successfully.#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.523 222021 DEBUG nova.objects.instance [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lazy-loading 'resources' on Instance uuid 26fc3a7b-7f23-4938-9544-1bf54912f642 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.542 222021 DEBUG nova.virt.libvirt.vif [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:12:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-562777535',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-562777535',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-562777535',id=136,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:13:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2353af0cf8d5454eb7611f611cec5a05',ramdisk_id='',reservation_id='r-oo3fxkbn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virti
o',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1093375526',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1093375526-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:13:08Z,user_data=None,user_id='e1d2dc288e284cfdb76abe73a933efa5',uuid=26fc3a7b-7f23-4938-9544-1bf54912f642,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "address": "fa:16:3e:b3:3c:d0", "network": {"id": "58448e5a-7023-4dc6-a80d-27f2b1d9fffe", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-350779733-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2353af0cf8d5454eb7611f611cec5a05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap412f682b-b5", "ovs_interfaceid": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.543 222021 DEBUG nova.network.os_vif_util [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Converting VIF {"id": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "address": "fa:16:3e:b3:3c:d0", "network": {"id": "58448e5a-7023-4dc6-a80d-27f2b1d9fffe", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-350779733-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2353af0cf8d5454eb7611f611cec5a05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap412f682b-b5", "ovs_interfaceid": "412f682b-b5ed-4018-bce1-5f9fd55baa78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.544 222021 DEBUG nova.network.os_vif_util [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:3c:d0,bridge_name='br-int',has_traffic_filtering=True,id=412f682b-b5ed-4018-bce1-5f9fd55baa78,network=Network(58448e5a-7023-4dc6-a80d-27f2b1d9fffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap412f682b-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.545 222021 DEBUG os_vif [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:3c:d0,bridge_name='br-int',has_traffic_filtering=True,id=412f682b-b5ed-4018-bce1-5f9fd55baa78,network=Network(58448e5a-7023-4dc6-a80d-27f2b1d9fffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap412f682b-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.550 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.551 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap412f682b-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.552 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.554 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.559 222021 DEBUG nova.compute.manager [req-4d85ef3a-28eb-4007-8b3e-9abe055bf898 req-2c4a2180-8f34-46f1-9191-4f983766bcd4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Received event network-vif-unplugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.560 222021 DEBUG oslo_concurrency.lockutils [req-4d85ef3a-28eb-4007-8b3e-9abe055bf898 req-2c4a2180-8f34-46f1-9191-4f983766bcd4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.560 222021 DEBUG oslo_concurrency.lockutils [req-4d85ef3a-28eb-4007-8b3e-9abe055bf898 req-2c4a2180-8f34-46f1-9191-4f983766bcd4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.560 222021 DEBUG oslo_concurrency.lockutils [req-4d85ef3a-28eb-4007-8b3e-9abe055bf898 req-2c4a2180-8f34-46f1-9191-4f983766bcd4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.560 222021 DEBUG nova.compute.manager [req-4d85ef3a-28eb-4007-8b3e-9abe055bf898 req-2c4a2180-8f34-46f1-9191-4f983766bcd4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] No waiting events found dispatching network-vif-unplugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.560 222021 DEBUG nova.compute.manager [req-4d85ef3a-28eb-4007-8b3e-9abe055bf898 req-2c4a2180-8f34-46f1-9191-4f983766bcd4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Received event network-vif-unplugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.561 222021 INFO os_vif [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:3c:d0,bridge_name='br-int',has_traffic_filtering=True,id=412f682b-b5ed-4018-bce1-5f9fd55baa78,network=Network(58448e5a-7023-4dc6-a80d-27f2b1d9fffe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap412f682b-b5')#033[00m
Jan 23 05:13:12 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9-userdata-shm.mount: Deactivated successfully.
Jan 23 05:13:12 np0005593233 systemd[1]: var-lib-containers-storage-overlay-3a43fbeef7f3ea36d61060e3543e327cd111ba0f283dca1addc68b070c22f08b-merged.mount: Deactivated successfully.
Jan 23 05:13:12 np0005593233 podman[274896]: 2026-01-23 10:13:12.593707605 +0000 UTC m=+0.226171249 container cleanup 487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:13:12 np0005593233 systemd[1]: libpod-conmon-487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9.scope: Deactivated successfully.
Jan 23 05:13:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:12.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:12.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:12 np0005593233 podman[274952]: 2026-01-23 10:13:12.922850445 +0000 UTC m=+0.301822619 container remove 487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.931 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d897c3-ea15-40c4-a22b-ac6ba9e5720b]: (4, ('Fri Jan 23 10:13:12 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe (487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9)\n487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9\nFri Jan 23 10:13:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe (487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9)\n487d8cf29c83dc0e951f9f537958682a3d4d9ccd97bd0a86e1a9cd8038084cf9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.933 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[82c567cf-dba3-4866-b2f9-c8cf86d98e2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.934 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58448e5a-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.936 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 kernel: tap58448e5a-70: left promiscuous mode
Jan 23 05:13:12 np0005593233 nova_compute[222017]: 2026-01-23 10:13:12.950 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.953 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[15afd5ba-81dd-4683-a27a-c3f13cf2fff9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.975 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ecd052-6e9e-4b30-a435-354bc4474e13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.977 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f0265403-e7e4-44c1-83a9-1ed5282f1d16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:12.998 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8c5e88c8-a500-4c43-a416-5886b16157c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 709968, 'reachable_time': 39308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274970, 'error': None, 'target': 'ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:13.002 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58448e5a-7023-4dc6-a80d-27f2b1d9fffe deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:13:13 np0005593233 systemd[1]: run-netns-ovnmeta\x2d58448e5a\x2d7023\x2d4dc6\x2da80d\x2d27f2b1d9fffe.mount: Deactivated successfully.
Jan 23 05:13:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:13.002 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[6e04b6b4-fde8-4d63-b0a7-bfbbc68f6dda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:13 np0005593233 nova_compute[222017]: 2026-01-23 10:13:13.440 222021 INFO nova.virt.libvirt.driver [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Deleting instance files /var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642_del#033[00m
Jan 23 05:13:13 np0005593233 nova_compute[222017]: 2026-01-23 10:13:13.442 222021 INFO nova.virt.libvirt.driver [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Deletion of /var/lib/nova/instances/26fc3a7b-7f23-4938-9544-1bf54912f642_del complete#033[00m
Jan 23 05:13:13 np0005593233 nova_compute[222017]: 2026-01-23 10:13:13.526 222021 INFO nova.compute.manager [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Took 1.45 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:13:13 np0005593233 nova_compute[222017]: 2026-01-23 10:13:13.527 222021 DEBUG oslo.service.loopingcall [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:13:13 np0005593233 nova_compute[222017]: 2026-01-23 10:13:13.527 222021 DEBUG nova.compute.manager [-] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:13:13 np0005593233 nova_compute[222017]: 2026-01-23 10:13:13.527 222021 DEBUG nova.network.neutron [-] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:13:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:14.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:14.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:15 np0005593233 nova_compute[222017]: 2026-01-23 10:13:15.820 222021 DEBUG nova.compute.manager [req-e6774171-e31f-40ea-967d-972239be057f req-7ee0a7a8-f0bd-4894-b024-6ab605cab87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Received event network-vif-plugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:15 np0005593233 nova_compute[222017]: 2026-01-23 10:13:15.820 222021 DEBUG oslo_concurrency.lockutils [req-e6774171-e31f-40ea-967d-972239be057f req-7ee0a7a8-f0bd-4894-b024-6ab605cab87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:15 np0005593233 nova_compute[222017]: 2026-01-23 10:13:15.821 222021 DEBUG oslo_concurrency.lockutils [req-e6774171-e31f-40ea-967d-972239be057f req-7ee0a7a8-f0bd-4894-b024-6ab605cab87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:15 np0005593233 nova_compute[222017]: 2026-01-23 10:13:15.821 222021 DEBUG oslo_concurrency.lockutils [req-e6774171-e31f-40ea-967d-972239be057f req-7ee0a7a8-f0bd-4894-b024-6ab605cab87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:15 np0005593233 nova_compute[222017]: 2026-01-23 10:13:15.821 222021 DEBUG nova.compute.manager [req-e6774171-e31f-40ea-967d-972239be057f req-7ee0a7a8-f0bd-4894-b024-6ab605cab87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] No waiting events found dispatching network-vif-plugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:13:15 np0005593233 nova_compute[222017]: 2026-01-23 10:13:15.821 222021 WARNING nova.compute.manager [req-e6774171-e31f-40ea-967d-972239be057f req-7ee0a7a8-f0bd-4894-b024-6ab605cab87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Received unexpected event network-vif-plugged-412f682b-b5ed-4018-bce1-5f9fd55baa78 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:13:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:16.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:16 np0005593233 nova_compute[222017]: 2026-01-23 10:13:16.876 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:16.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:17 np0005593233 nova_compute[222017]: 2026-01-23 10:13:17.496 222021 DEBUG nova.network.neutron [-] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:13:17 np0005593233 nova_compute[222017]: 2026-01-23 10:13:17.533 222021 INFO nova.compute.manager [-] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Took 4.01 seconds to deallocate network for instance.#033[00m
Jan 23 05:13:17 np0005593233 nova_compute[222017]: 2026-01-23 10:13:17.553 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:17 np0005593233 nova_compute[222017]: 2026-01-23 10:13:17.596 222021 DEBUG oslo_concurrency.lockutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:17 np0005593233 nova_compute[222017]: 2026-01-23 10:13:17.597 222021 DEBUG oslo_concurrency.lockutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:17 np0005593233 nova_compute[222017]: 2026-01-23 10:13:17.634 222021 DEBUG nova.compute.manager [req-6e256870-5b8e-4d8b-91ac-221df8b8af4f req-7941260a-3a86-4586-bbac-2e1d88abf9cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Received event network-vif-deleted-412f682b-b5ed-4018-bce1-5f9fd55baa78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:17 np0005593233 nova_compute[222017]: 2026-01-23 10:13:17.684 222021 DEBUG oslo_concurrency.processutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:13:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3592858714' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:13:18 np0005593233 nova_compute[222017]: 2026-01-23 10:13:18.235 222021 DEBUG oslo_concurrency.processutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:18 np0005593233 nova_compute[222017]: 2026-01-23 10:13:18.242 222021 DEBUG nova.compute.provider_tree [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:13:18 np0005593233 nova_compute[222017]: 2026-01-23 10:13:18.561 222021 DEBUG nova.scheduler.client.report [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:13:18 np0005593233 nova_compute[222017]: 2026-01-23 10:13:18.589 222021 DEBUG oslo_concurrency.lockutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:18 np0005593233 nova_compute[222017]: 2026-01-23 10:13:18.623 222021 INFO nova.scheduler.client.report [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Deleted allocations for instance 26fc3a7b-7f23-4938-9544-1bf54912f642#033[00m
Jan 23 05:13:18 np0005593233 nova_compute[222017]: 2026-01-23 10:13:18.717 222021 DEBUG oslo_concurrency.lockutils [None req-41f90d80-13ea-426a-80c3-e9038a7e0ea4 e1d2dc288e284cfdb76abe73a933efa5 2353af0cf8d5454eb7611f611cec5a05 - - default default] Lock "26fc3a7b-7f23-4938-9544-1bf54912f642" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:18.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:18.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:20 np0005593233 podman[274994]: 2026-01-23 10:13:20.10207155 +0000 UTC m=+0.098292642 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:13:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:13:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:20.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:13:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:13:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:20.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:13:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:21 np0005593233 nova_compute[222017]: 2026-01-23 10:13:21.389 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:21 np0005593233 nova_compute[222017]: 2026-01-23 10:13:21.429 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 0c274757-9612-49ca-b1fa-8ae80aa5f510 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:13:21 np0005593233 nova_compute[222017]: 2026-01-23 10:13:21.430 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "0c274757-9612-49ca-b1fa-8ae80aa5f510" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:21 np0005593233 nova_compute[222017]: 2026-01-23 10:13:21.430 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:21 np0005593233 nova_compute[222017]: 2026-01-23 10:13:21.468 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:21 np0005593233 nova_compute[222017]: 2026-01-23 10:13:21.877 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:22 np0005593233 nova_compute[222017]: 2026-01-23 10:13:22.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:22.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:13:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:22.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:13:24 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:24Z|00584|binding|INFO|Releasing lport 2f33e608-79d7-4774-b3de-9f2488a517e6 from this chassis (sb_readonly=0)
Jan 23 05:13:24 np0005593233 nova_compute[222017]: 2026-01-23 10:13:24.563 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:24.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:24.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:26 np0005593233 nova_compute[222017]: 2026-01-23 10:13:26.879 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:26.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:26.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:27 np0005593233 nova_compute[222017]: 2026-01-23 10:13:27.521 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163192.5194314, 26fc3a7b-7f23-4938-9544-1bf54912f642 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:13:27 np0005593233 nova_compute[222017]: 2026-01-23 10:13:27.522 222021 INFO nova.compute.manager [-] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:13:27 np0005593233 nova_compute[222017]: 2026-01-23 10:13:27.559 222021 DEBUG nova.compute.manager [None req-708bddb2-c079-40dd-8cdf-e24841891e68 - - - - - -] [instance: 26fc3a7b-7f23-4938-9544-1bf54912f642] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:13:27 np0005593233 nova_compute[222017]: 2026-01-23 10:13:27.598 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:28.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:28.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:30.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:30.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:31 np0005593233 nova_compute[222017]: 2026-01-23 10:13:31.881 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:32 np0005593233 nova_compute[222017]: 2026-01-23 10:13:32.601 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:32.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:32.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:13:34Z|00585|binding|INFO|Releasing lport 2f33e608-79d7-4774-b3de-9f2488a517e6 from this chassis (sb_readonly=0)
Jan 23 05:13:34 np0005593233 nova_compute[222017]: 2026-01-23 10:13:34.487 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:34.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:34.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e297 e297: 3 total, 3 up, 3 in
Jan 23 05:13:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e298 e298: 3 total, 3 up, 3 in
Jan 23 05:13:36 np0005593233 podman[275014]: 2026-01-23 10:13:36.138575995 +0000 UTC m=+0.137775219 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 05:13:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:36 np0005593233 nova_compute[222017]: 2026-01-23 10:13:36.885 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:13:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:36.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:13:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:36.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e299 e299: 3 total, 3 up, 3 in
Jan 23 05:13:37 np0005593233 nova_compute[222017]: 2026-01-23 10:13:37.426 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:37 np0005593233 nova_compute[222017]: 2026-01-23 10:13:37.604 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:38 np0005593233 nova_compute[222017]: 2026-01-23 10:13:38.649 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:38.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:38.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:40 np0005593233 nova_compute[222017]: 2026-01-23 10:13:40.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:40.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:40.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:41 np0005593233 nova_compute[222017]: 2026-01-23 10:13:41.888 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e300 e300: 3 total, 3 up, 3 in
Jan 23 05:13:42 np0005593233 nova_compute[222017]: 2026-01-23 10:13:42.606 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:42.676 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:42.677 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:13:42.678 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 05:13:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:42.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 05:13:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:42.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:43 np0005593233 nova_compute[222017]: 2026-01-23 10:13:43.410 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:43 np0005593233 nova_compute[222017]: 2026-01-23 10:13:43.439 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:43 np0005593233 nova_compute[222017]: 2026-01-23 10:13:43.439 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:43 np0005593233 nova_compute[222017]: 2026-01-23 10:13:43.440 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:43 np0005593233 nova_compute[222017]: 2026-01-23 10:13:43.440 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:13:43 np0005593233 nova_compute[222017]: 2026-01-23 10:13:43.440 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:13:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3004500598' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:13:43 np0005593233 nova_compute[222017]: 2026-01-23 10:13:43.907 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:44 np0005593233 nova_compute[222017]: 2026-01-23 10:13:44.773 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000084 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:13:44 np0005593233 nova_compute[222017]: 2026-01-23 10:13:44.774 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000084 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:13:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:44.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:44.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:45 np0005593233 nova_compute[222017]: 2026-01-23 10:13:45.004 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:13:45 np0005593233 nova_compute[222017]: 2026-01-23 10:13:45.006 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4215MB free_disk=20.68169403076172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:13:45 np0005593233 nova_compute[222017]: 2026-01-23 10:13:45.006 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:45 np0005593233 nova_compute[222017]: 2026-01-23 10:13:45.007 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:45 np0005593233 nova_compute[222017]: 2026-01-23 10:13:45.728 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0c274757-9612-49ca-b1fa-8ae80aa5f510 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:13:45 np0005593233 nova_compute[222017]: 2026-01-23 10:13:45.729 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:13:45 np0005593233 nova_compute[222017]: 2026-01-23 10:13:45.730 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:13:45 np0005593233 nova_compute[222017]: 2026-01-23 10:13:45.956 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:46 np0005593233 nova_compute[222017]: 2026-01-23 10:13:46.532 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.575s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:46 np0005593233 nova_compute[222017]: 2026-01-23 10:13:46.538 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:13:46 np0005593233 nova_compute[222017]: 2026-01-23 10:13:46.567 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:13:46 np0005593233 nova_compute[222017]: 2026-01-23 10:13:46.600 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:13:46 np0005593233 nova_compute[222017]: 2026-01-23 10:13:46.601 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:46 np0005593233 nova_compute[222017]: 2026-01-23 10:13:46.892 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:46.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:46.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:47 np0005593233 nova_compute[222017]: 2026-01-23 10:13:47.576 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:47 np0005593233 nova_compute[222017]: 2026-01-23 10:13:47.578 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:47 np0005593233 nova_compute[222017]: 2026-01-23 10:13:47.578 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:47 np0005593233 nova_compute[222017]: 2026-01-23 10:13:47.578 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:47 np0005593233 nova_compute[222017]: 2026-01-23 10:13:47.579 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:13:47 np0005593233 nova_compute[222017]: 2026-01-23 10:13:47.608 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:48 np0005593233 nova_compute[222017]: 2026-01-23 10:13:48.388 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:48 np0005593233 nova_compute[222017]: 2026-01-23 10:13:48.389 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:13:48 np0005593233 nova_compute[222017]: 2026-01-23 10:13:48.389 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:13:48 np0005593233 nova_compute[222017]: 2026-01-23 10:13:48.703 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:13:48 np0005593233 nova_compute[222017]: 2026-01-23 10:13:48.703 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:13:48 np0005593233 nova_compute[222017]: 2026-01-23 10:13:48.703 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:13:48 np0005593233 nova_compute[222017]: 2026-01-23 10:13:48.704 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0c274757-9612-49ca-b1fa-8ae80aa5f510 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:48.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:48.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:49 np0005593233 nova_compute[222017]: 2026-01-23 10:13:49.382 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:50 np0005593233 podman[275111]: 2026-01-23 10:13:50.814879614 +0000 UTC m=+0.069161967 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:13:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:50.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:50.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:51 np0005593233 nova_compute[222017]: 2026-01-23 10:13:51.379 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updating instance_info_cache with network_info: [{"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:13:51 np0005593233 nova_compute[222017]: 2026-01-23 10:13:51.519 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:13:51 np0005593233 nova_compute[222017]: 2026-01-23 10:13:51.519 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:13:51 np0005593233 nova_compute[222017]: 2026-01-23 10:13:51.893 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:13:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:13:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:13:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:13:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:13:52 np0005593233 nova_compute[222017]: 2026-01-23 10:13:52.611 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:52.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:52.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:13:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/214289845' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:13:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:54.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:54.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:13:55 np0005593233 nova_compute[222017]: 2026-01-23 10:13:55.511 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:56 np0005593233 nova_compute[222017]: 2026-01-23 10:13:56.896 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:56.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:13:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:56.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:13:57 np0005593233 nova_compute[222017]: 2026-01-23 10:13:57.614 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:58 np0005593233 nova_compute[222017]: 2026-01-23 10:13:58.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:58.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:13:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:13:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:58.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:14:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:14:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:14:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:00.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:14:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:00.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:01 np0005593233 nova_compute[222017]: 2026-01-23 10:14:01.157 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:01 np0005593233 nova_compute[222017]: 2026-01-23 10:14:01.899 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:02 np0005593233 nova_compute[222017]: 2026-01-23 10:14:02.647 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:02.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:02.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:14:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:04.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:14:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:14:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:04.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:14:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:06.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:06 np0005593233 nova_compute[222017]: 2026-01-23 10:14:06.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:06.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:07 np0005593233 podman[275407]: 2026-01-23 10:14:07.102023795 +0000 UTC m=+0.109270752 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:14:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:07Z|00586|binding|INFO|Releasing lport 2f33e608-79d7-4774-b3de-9f2488a517e6 from this chassis (sb_readonly=0)
Jan 23 05:14:07 np0005593233 nova_compute[222017]: 2026-01-23 10:14:07.453 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:07 np0005593233 nova_compute[222017]: 2026-01-23 10:14:07.651 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.124 222021 DEBUG oslo_concurrency.lockutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "0c274757-9612-49ca-b1fa-8ae80aa5f510" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.125 222021 DEBUG oslo_concurrency.lockutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.126 222021 DEBUG oslo_concurrency.lockutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.126 222021 DEBUG oslo_concurrency.lockutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.126 222021 DEBUG oslo_concurrency.lockutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.127 222021 INFO nova.compute.manager [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Terminating instance#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.128 222021 DEBUG nova.compute.manager [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:14:08 np0005593233 kernel: tap6096923e-37 (unregistering): left promiscuous mode
Jan 23 05:14:08 np0005593233 NetworkManager[48871]: <info>  [1769163248.2448] device (tap6096923e-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:14:08 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:08Z|00587|binding|INFO|Releasing lport 6096923e-378a-47e1-96e5-11c98a23abfa from this chassis (sb_readonly=0)
Jan 23 05:14:08 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:08Z|00588|binding|INFO|Setting lport 6096923e-378a-47e1-96e5-11c98a23abfa down in Southbound
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.254 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:08Z|00589|binding|INFO|Removing iface tap6096923e-37 ovn-installed in OVS
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.256 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.263 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:9d:7a 10.100.0.6'], port_security=['fa:16:3e:5f:9d:7a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0c274757-9612-49ca-b1fa-8ae80aa5f510', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b179a4bf-7ae4-400d-ac79-526f3efe132c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=159e112b-eac7-4007-8b5d-563cdf20827d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=6096923e-378a-47e1-96e5-11c98a23abfa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.266 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 6096923e-378a-47e1-96e5-11c98a23abfa in datapath b909e0f1-3092-44cc-b25b-a0acc5a5cd0c unbound from our chassis#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.269 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b909e0f1-3092-44cc-b25b-a0acc5a5cd0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.272 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.273 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad45047-0c48-444f-a778-f67428d2a280]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.275 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c namespace which is not needed anymore#033[00m
Jan 23 05:14:08 np0005593233 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000084.scope: Deactivated successfully.
Jan 23 05:14:08 np0005593233 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000084.scope: Consumed 18.946s CPU time.
Jan 23 05:14:08 np0005593233 systemd-machined[190954]: Machine qemu-62-instance-00000084 terminated.
Jan 23 05:14:08 np0005593233 neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c[274114]: [NOTICE]   (274125) : haproxy version is 2.8.14-c23fe91
Jan 23 05:14:08 np0005593233 neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c[274114]: [NOTICE]   (274125) : path to executable is /usr/sbin/haproxy
Jan 23 05:14:08 np0005593233 neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c[274114]: [WARNING]  (274125) : Exiting Master process...
Jan 23 05:14:08 np0005593233 neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c[274114]: [WARNING]  (274125) : Exiting Master process...
Jan 23 05:14:08 np0005593233 neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c[274114]: [ALERT]    (274125) : Current worker (274127) exited with code 143 (Terminated)
Jan 23 05:14:08 np0005593233 neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c[274114]: [WARNING]  (274125) : All workers exited. Exiting... (0)
Jan 23 05:14:08 np0005593233 systemd[1]: libpod-3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646.scope: Deactivated successfully.
Jan 23 05:14:08 np0005593233 podman[275459]: 2026-01-23 10:14:08.43828235 +0000 UTC m=+0.045851808 container died 3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:14:08 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646-userdata-shm.mount: Deactivated successfully.
Jan 23 05:14:08 np0005593233 systemd[1]: var-lib-containers-storage-overlay-d6370f98d07b9d51896abc3dcae65f51297ea8a8d80e432c367c0487b2f20354-merged.mount: Deactivated successfully.
Jan 23 05:14:08 np0005593233 podman[275459]: 2026-01-23 10:14:08.480290939 +0000 UTC m=+0.087860397 container cleanup 3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:14:08 np0005593233 systemd[1]: libpod-conmon-3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646.scope: Deactivated successfully.
Jan 23 05:14:08 np0005593233 podman[275489]: 2026-01-23 10:14:08.551902315 +0000 UTC m=+0.047724212 container remove 3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:14:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e301 e301: 3 total, 3 up, 3 in
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.559 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0affe2-bf31-4f30-82c1-22abc9ed5790]: (4, ('Fri Jan 23 10:14:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c (3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646)\n3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646\nFri Jan 23 10:14:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c (3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646)\n3a6bca910b12877ad3cc70f7fd8d0cc70737737007e4341f772d49831c0ca646\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.563 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b4be4ad0-ae73-4dcd-b235-2c5ce487f93e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.564 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb909e0f1-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:08 np0005593233 kernel: tapb909e0f1-30: left promiscuous mode
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.566 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.575 222021 INFO nova.virt.libvirt.driver [-] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Instance destroyed successfully.#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.576 222021 DEBUG nova.objects.instance [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'resources' on Instance uuid 0c274757-9612-49ca-b1fa-8ae80aa5f510 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.587 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.590 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[78ab4a42-3ec5-4198-90a2-c14db417d676]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.593 222021 DEBUG nova.virt.libvirt.vif [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:12:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1143807706',display_name='tempest-TestNetworkBasicOps-server-1143807706',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1143807706',id=132,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCimIZk2pOzEd7S89dRSmUHOOjnboCQFRq00t4KqTPGCkTIF8AjIYlOUd1UqUpNzeDj0eIsu4Yppl0TNbbIpwzFXQsRBfBhSqF/JqKHD5mgeWA8E9Qvf/RqS4B++mXDB+Q==',key_name='tempest-TestNetworkBasicOps-1966834096',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:12:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-f3zugqgz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:12:35Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=0c274757-9612-49ca-b1fa-8ae80aa5f510,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.594 222021 DEBUG nova.network.os_vif_util [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.595 222021 DEBUG nova.network.os_vif_util [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:9d:7a,bridge_name='br-int',has_traffic_filtering=True,id=6096923e-378a-47e1-96e5-11c98a23abfa,network=Network(b909e0f1-3092-44cc-b25b-a0acc5a5cd0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6096923e-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.595 222021 DEBUG os_vif [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:9d:7a,bridge_name='br-int',has_traffic_filtering=True,id=6096923e-378a-47e1-96e5-11c98a23abfa,network=Network(b909e0f1-3092-44cc-b25b-a0acc5a5cd0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6096923e-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.597 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.598 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6096923e-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.599 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.605 222021 INFO os_vif [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:9d:7a,bridge_name='br-int',has_traffic_filtering=True,id=6096923e-378a-47e1-96e5-11c98a23abfa,network=Network(b909e0f1-3092-44cc-b25b-a0acc5a5cd0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6096923e-37')#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.606 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.607 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[560a8be0-ba8d-4774-bd16-2f9232fcf94e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.609 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[639b2c35-a60f-4e54-84a5-db887e45488b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.627 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.630 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[99b5523b-df20-4b29-b030-327f153625cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706677, 'reachable_time': 33841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275525, 'error': None, 'target': 'ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:08 np0005593233 systemd[1]: run-netns-ovnmeta\x2db909e0f1\x2d3092\x2d44cc\x2db25b\x2da0acc5a5cd0c.mount: Deactivated successfully.
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.634 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b909e0f1-3092-44cc-b25b-a0acc5a5cd0c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.634 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[c6198e82-9680-44de-834d-2f4b2a5424e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:08.635 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.793 222021 DEBUG nova.compute.manager [req-a1d2c8cf-6252-4291-8ede-ce95d6d01822 req-9875b01e-f45c-4a77-b555-a535dd67caf7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received event network-vif-unplugged-6096923e-378a-47e1-96e5-11c98a23abfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.793 222021 DEBUG oslo_concurrency.lockutils [req-a1d2c8cf-6252-4291-8ede-ce95d6d01822 req-9875b01e-f45c-4a77-b555-a535dd67caf7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.794 222021 DEBUG oslo_concurrency.lockutils [req-a1d2c8cf-6252-4291-8ede-ce95d6d01822 req-9875b01e-f45c-4a77-b555-a535dd67caf7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.794 222021 DEBUG oslo_concurrency.lockutils [req-a1d2c8cf-6252-4291-8ede-ce95d6d01822 req-9875b01e-f45c-4a77-b555-a535dd67caf7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.794 222021 DEBUG nova.compute.manager [req-a1d2c8cf-6252-4291-8ede-ce95d6d01822 req-9875b01e-f45c-4a77-b555-a535dd67caf7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] No waiting events found dispatching network-vif-unplugged-6096923e-378a-47e1-96e5-11c98a23abfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:14:08 np0005593233 nova_compute[222017]: 2026-01-23 10:14:08.794 222021 DEBUG nova.compute.manager [req-a1d2c8cf-6252-4291-8ede-ce95d6d01822 req-9875b01e-f45c-4a77-b555-a535dd67caf7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received event network-vif-unplugged-6096923e-378a-47e1-96e5-11c98a23abfa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:14:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:08.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:08.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.146 222021 INFO nova.virt.libvirt.driver [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Deleting instance files /var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510_del#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.147 222021 INFO nova.virt.libvirt.driver [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Deletion of /var/lib/nova/instances/0c274757-9612-49ca-b1fa-8ae80aa5f510_del complete#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.219 222021 INFO nova.compute.manager [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Took 1.09 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.220 222021 DEBUG oslo.service.loopingcall [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.220 222021 DEBUG nova.compute.manager [-] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.221 222021 DEBUG nova.network.neutron [-] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.245 222021 DEBUG nova.compute.manager [req-a7d03055-e748-44fe-82c3-906eb8ae5759 req-16f91b71-e348-4247-85df-64ed034295ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received event network-changed-6096923e-378a-47e1-96e5-11c98a23abfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.245 222021 DEBUG nova.compute.manager [req-a7d03055-e748-44fe-82c3-906eb8ae5759 req-16f91b71-e348-4247-85df-64ed034295ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Refreshing instance network info cache due to event network-changed-6096923e-378a-47e1-96e5-11c98a23abfa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.246 222021 DEBUG oslo_concurrency.lockutils [req-a7d03055-e748-44fe-82c3-906eb8ae5759 req-16f91b71-e348-4247-85df-64ed034295ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.246 222021 DEBUG oslo_concurrency.lockutils [req-a7d03055-e748-44fe-82c3-906eb8ae5759 req-16f91b71-e348-4247-85df-64ed034295ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.247 222021 DEBUG nova.network.neutron [req-a7d03055-e748-44fe-82c3-906eb8ae5759 req-16f91b71-e348-4247-85df-64ed034295ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Refreshing network info cache for port 6096923e-378a-47e1-96e5-11c98a23abfa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:14:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e302 e302: 3 total, 3 up, 3 in
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.784 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.827 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:09 np0005593233 nova_compute[222017]: 2026-01-23 10:14:09.827 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.052 222021 DEBUG nova.network.neutron [-] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.091 222021 INFO nova.compute.manager [-] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Took 0.87 seconds to deallocate network for instance.#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.148 222021 DEBUG oslo_concurrency.lockutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.149 222021 DEBUG oslo_concurrency.lockutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.225 222021 DEBUG oslo_concurrency.processutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:14:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3340292720' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.676 222021 DEBUG oslo_concurrency.processutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.685 222021 DEBUG nova.compute.provider_tree [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.706 222021 DEBUG nova.scheduler.client.report [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.731 222021 DEBUG oslo_concurrency.lockutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e303 e303: 3 total, 3 up, 3 in
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.761 222021 INFO nova.scheduler.client.report [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Deleted allocations for instance 0c274757-9612-49ca-b1fa-8ae80aa5f510#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.814 222021 DEBUG nova.compute.manager [req-2ed9ff97-286a-4b77-a3c2-6a20d4be7cf7 req-b0d41572-d741-4f8a-a580-fc070eb69981 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received event network-vif-deleted-6096923e-378a-47e1-96e5-11c98a23abfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.822 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.832 222021 DEBUG nova.network.neutron [req-a7d03055-e748-44fe-82c3-906eb8ae5759 req-16f91b71-e348-4247-85df-64ed034295ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updated VIF entry in instance network info cache for port 6096923e-378a-47e1-96e5-11c98a23abfa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.832 222021 DEBUG nova.network.neutron [req-a7d03055-e748-44fe-82c3-906eb8ae5759 req-16f91b71-e348-4247-85df-64ed034295ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Updating instance_info_cache with network_info: [{"id": "6096923e-378a-47e1-96e5-11c98a23abfa", "address": "fa:16:3e:5f:9d:7a", "network": {"id": "b909e0f1-3092-44cc-b25b-a0acc5a5cd0c", "bridge": "br-int", "label": "tempest-network-smoke--1717323184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6096923e-37", "ovs_interfaceid": "6096923e-378a-47e1-96e5-11c98a23abfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.857 222021 DEBUG oslo_concurrency.lockutils [None req-b3ece788-ba45-459c-ae84-0f89d7a7d6d6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.861 222021 DEBUG oslo_concurrency.lockutils [req-a7d03055-e748-44fe-82c3-906eb8ae5759 req-16f91b71-e348-4247-85df-64ed034295ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0c274757-9612-49ca-b1fa-8ae80aa5f510" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:14:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:10.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.974 222021 DEBUG nova.compute.manager [req-9c5b459e-8ed2-4f50-8733-c740bcfe6532 req-cc68f34b-8666-4d5b-a546-ec4af191fc8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received event network-vif-plugged-6096923e-378a-47e1-96e5-11c98a23abfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.975 222021 DEBUG oslo_concurrency.lockutils [req-9c5b459e-8ed2-4f50-8733-c740bcfe6532 req-cc68f34b-8666-4d5b-a546-ec4af191fc8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.975 222021 DEBUG oslo_concurrency.lockutils [req-9c5b459e-8ed2-4f50-8733-c740bcfe6532 req-cc68f34b-8666-4d5b-a546-ec4af191fc8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.976 222021 DEBUG oslo_concurrency.lockutils [req-9c5b459e-8ed2-4f50-8733-c740bcfe6532 req-cc68f34b-8666-4d5b-a546-ec4af191fc8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0c274757-9612-49ca-b1fa-8ae80aa5f510-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.976 222021 DEBUG nova.compute.manager [req-9c5b459e-8ed2-4f50-8733-c740bcfe6532 req-cc68f34b-8666-4d5b-a546-ec4af191fc8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] No waiting events found dispatching network-vif-plugged-6096923e-378a-47e1-96e5-11c98a23abfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:14:10 np0005593233 nova_compute[222017]: 2026-01-23 10:14:10.977 222021 WARNING nova.compute.manager [req-9c5b459e-8ed2-4f50-8733-c740bcfe6532 req-cc68f34b-8666-4d5b-a546-ec4af191fc8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Received unexpected event network-vif-plugged-6096923e-378a-47e1-96e5-11c98a23abfa for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:14:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:10.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:11 np0005593233 nova_compute[222017]: 2026-01-23 10:14:11.956 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:12 np0005593233 nova_compute[222017]: 2026-01-23 10:14:12.406 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:12 np0005593233 nova_compute[222017]: 2026-01-23 10:14:12.407 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:14:12 np0005593233 nova_compute[222017]: 2026-01-23 10:14:12.435 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:14:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:12.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:12.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:13 np0005593233 nova_compute[222017]: 2026-01-23 10:14:13.601 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:14.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:14.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:15 np0005593233 nova_compute[222017]: 2026-01-23 10:14:15.105 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.497 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.498 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.541 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.664 222021 DEBUG nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.835 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.864 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.865 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.874 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.874 222021 INFO nova.compute.claims [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:14:16 np0005593233 nova_compute[222017]: 2026-01-23 10:14:16.960 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:16.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:16.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.017 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:14:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1294116114' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.481 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.490 222021 DEBUG nova.compute.provider_tree [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.513 222021 DEBUG nova.scheduler.client.report [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.536 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.537 222021 DEBUG nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.589 222021 DEBUG nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.589 222021 DEBUG nova.network.neutron [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.614 222021 INFO nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.637 222021 DEBUG nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.729 222021 DEBUG nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.731 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.732 222021 INFO nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Creating image(s)#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.771 222021 DEBUG nova.storage.rbd_utils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.804 222021 DEBUG nova.storage.rbd_utils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.835 222021 DEBUG nova.storage.rbd_utils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.839 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.916 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.917 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.918 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.918 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.952 222021 DEBUG nova.storage.rbd_utils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:17 np0005593233 nova_compute[222017]: 2026-01-23 10:14:17.957 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cca30801-d289-4e95-89b2-afcc3d0199a7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.123 222021 DEBUG nova.policy [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fae914e59ec54f6b80928ef3cc68dbdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.522 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cca30801-d289-4e95-89b2-afcc3d0199a7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.565s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.634 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:18.637 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.643 222021 DEBUG nova.storage.rbd_utils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] resizing rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:14:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e304 e304: 3 total, 3 up, 3 in
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.776 222021 DEBUG nova.objects.instance [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'migration_context' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.794 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.794 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Ensure instance console log exists: /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.795 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.795 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:18 np0005593233 nova_compute[222017]: 2026-01-23 10:14:18.814 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.019s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:14:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:18.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:14:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:19.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:20 np0005593233 nova_compute[222017]: 2026-01-23 10:14:20.435 222021 DEBUG nova.network.neutron [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Successfully created port: b7f30c18-45b6-4931-8d26-193df386ae94 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:14:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:20.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:21.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:21 np0005593233 podman[275749]: 2026-01-23 10:14:21.118278445 +0000 UTC m=+0.110789654 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 05:14:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e304 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:21 np0005593233 nova_compute[222017]: 2026-01-23 10:14:21.962 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e305 e305: 3 total, 3 up, 3 in
Jan 23 05:14:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:22.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:23.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:23 np0005593233 nova_compute[222017]: 2026-01-23 10:14:23.575 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163248.5730953, 0c274757-9612-49ca-b1fa-8ae80aa5f510 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:14:23 np0005593233 nova_compute[222017]: 2026-01-23 10:14:23.576 222021 INFO nova.compute.manager [-] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:14:23 np0005593233 nova_compute[222017]: 2026-01-23 10:14:23.636 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:24 np0005593233 nova_compute[222017]: 2026-01-23 10:14:24.209 222021 DEBUG nova.network.neutron [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Successfully updated port: b7f30c18-45b6-4931-8d26-193df386ae94 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:14:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:24.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:25.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:26.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:27.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:27 np0005593233 nova_compute[222017]: 2026-01-23 10:14:27.099 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:28 np0005593233 nova_compute[222017]: 2026-01-23 10:14:28.639 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:28.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:29.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:29 np0005593233 nova_compute[222017]: 2026-01-23 10:14:29.095 222021 DEBUG nova.compute.manager [None req-411fb57c-4e1a-48a3-baf6-a2cc8ad3c36d - - - - - -] [instance: 0c274757-9612-49ca-b1fa-8ae80aa5f510] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:29 np0005593233 nova_compute[222017]: 2026-01-23 10:14:29.106 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:29 np0005593233 nova_compute[222017]: 2026-01-23 10:14:29.106 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquired lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:29 np0005593233 nova_compute[222017]: 2026-01-23 10:14:29.107 222021 DEBUG nova.network.neutron [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:14:30 np0005593233 nova_compute[222017]: 2026-01-23 10:14:30.241 222021 DEBUG nova.network.neutron [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:14:30 np0005593233 nova_compute[222017]: 2026-01-23 10:14:30.684 222021 DEBUG nova.compute.manager [req-3a54223d-e6e8-4c44-aae8-c9b3ddbdbca9 req-970ffd1c-5426-4758-9a78-fcbada7a7e64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-changed-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:30 np0005593233 nova_compute[222017]: 2026-01-23 10:14:30.684 222021 DEBUG nova.compute.manager [req-3a54223d-e6e8-4c44-aae8-c9b3ddbdbca9 req-970ffd1c-5426-4758-9a78-fcbada7a7e64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Refreshing instance network info cache due to event network-changed-b7f30c18-45b6-4931-8d26-193df386ae94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:14:30 np0005593233 nova_compute[222017]: 2026-01-23 10:14:30.685 222021 DEBUG oslo_concurrency.lockutils [req-3a54223d-e6e8-4c44-aae8-c9b3ddbdbca9 req-970ffd1c-5426-4758-9a78-fcbada7a7e64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:30.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:14:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:31.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:14:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:32 np0005593233 nova_compute[222017]: 2026-01-23 10:14:32.102 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:32.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:33.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:33 np0005593233 nova_compute[222017]: 2026-01-23 10:14:33.641 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.590 222021 DEBUG nova.network.neutron [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.757 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Releasing lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.758 222021 DEBUG nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance network_info: |[{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.759 222021 DEBUG oslo_concurrency.lockutils [req-3a54223d-e6e8-4c44-aae8-c9b3ddbdbca9 req-970ffd1c-5426-4758-9a78-fcbada7a7e64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.759 222021 DEBUG nova.network.neutron [req-3a54223d-e6e8-4c44-aae8-c9b3ddbdbca9 req-970ffd1c-5426-4758-9a78-fcbada7a7e64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Refreshing network info cache for port b7f30c18-45b6-4931-8d26-193df386ae94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.762 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Start _get_guest_xml network_info=[{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.767 222021 WARNING nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.779 222021 DEBUG nova.virt.libvirt.host [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.780 222021 DEBUG nova.virt.libvirt.host [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.787 222021 DEBUG nova.virt.libvirt.host [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.787 222021 DEBUG nova.virt.libvirt.host [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.789 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.790 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.790 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.791 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.791 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.791 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.791 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.791 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.791 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.792 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.792 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.792 222021 DEBUG nova.virt.hardware [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:14:34 np0005593233 nova_compute[222017]: 2026-01-23 10:14:34.795 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:34.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:35.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:14:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3812440563' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:14:35 np0005593233 nova_compute[222017]: 2026-01-23 10:14:35.261 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:35 np0005593233 nova_compute[222017]: 2026-01-23 10:14:35.294 222021 DEBUG nova.storage.rbd_utils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:35 np0005593233 nova_compute[222017]: 2026-01-23 10:14:35.298 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:14:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3437505308' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:14:35 np0005593233 nova_compute[222017]: 2026-01-23 10:14:35.830 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:35 np0005593233 nova_compute[222017]: 2026-01-23 10:14:35.833 222021 DEBUG nova.virt.libvirt.vif [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:14:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1133977145',display_name='tempest-ServerRescueNegativeTestJSON-server-1133977145',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1133977145',id=139,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA8EX9GibC86Iq3T2qC/IrlQ+r5p/SWpdEE8xXluu9bLU1lGamBBUxI8TbZp+bpGCmD0iIc57s4GniCMNRyOJdr1+wsaQQ63CuK5/FMyW72KViBWccZ5JBdjf1rnPzLdhg==',key_name='tempest-keypair-1649200342',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a6ba16c4b9d49d3bc24cd7b44935d1f',ramdisk_id='',reservation_id='r-ajqso3l2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-87224704',owner_user_name='tempest-ServerRescueNegativeTestJSON-87224704-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:14:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fae914e59ec54f6b80928ef3cc68dbdb',uuid=cca30801-d289-4e95-89b2-afcc3d0199a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:14:35 np0005593233 nova_compute[222017]: 2026-01-23 10:14:35.834 222021 DEBUG nova.network.os_vif_util [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converting VIF {"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:14:35 np0005593233 nova_compute[222017]: 2026-01-23 10:14:35.836 222021 DEBUG nova.network.os_vif_util [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:65:5d,bridge_name='br-int',has_traffic_filtering=True,id=b7f30c18-45b6-4931-8d26-193df386ae94,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f30c18-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:14:35 np0005593233 nova_compute[222017]: 2026-01-23 10:14:35.839 222021 DEBUG nova.objects.instance [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'pci_devices' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.029 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <uuid>cca30801-d289-4e95-89b2-afcc3d0199a7</uuid>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <name>instance-0000008b</name>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1133977145</nova:name>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:14:34</nova:creationTime>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <nova:user uuid="fae914e59ec54f6b80928ef3cc68dbdb">tempest-ServerRescueNegativeTestJSON-87224704-project-member</nova:user>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <nova:project uuid="0a6ba16c4b9d49d3bc24cd7b44935d1f">tempest-ServerRescueNegativeTestJSON-87224704</nova:project>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <nova:port uuid="b7f30c18-45b6-4931-8d26-193df386ae94">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <entry name="serial">cca30801-d289-4e95-89b2-afcc3d0199a7</entry>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <entry name="uuid">cca30801-d289-4e95-89b2-afcc3d0199a7</entry>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/cca30801-d289-4e95-89b2-afcc3d0199a7_disk">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:b8:65:5d"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <target dev="tapb7f30c18-45"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/console.log" append="off"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:14:36 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:14:36 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:14:36 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:14:36 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.031 222021 DEBUG nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Preparing to wait for external event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.032 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.033 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.033 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.034 222021 DEBUG nova.virt.libvirt.vif [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:14:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1133977145',display_name='tempest-ServerRescueNegativeTestJSON-server-1133977145',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1133977145',id=139,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA8EX9GibC86Iq3T2qC/IrlQ+r5p/SWpdEE8xXluu9bLU1lGamBBUxI8TbZp+bpGCmD0iIc57s4GniCMNRyOJdr1+wsaQQ63CuK5/FMyW72KViBWccZ5JBdjf1rnPzLdhg==',key_name='tempest-keypair-1649200342',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a6ba16c4b9d49d3bc24cd7b44935d1f',ramdisk_id='',reservation_id='r-ajqso3l2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-87224704',owner_user_name='tempest-ServerRescueNegativeTestJSON-87224704-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:14:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fae914e59ec54f6b80928ef3cc68dbdb',uuid=cca30801-d289-4e95-89b2-afcc3d0199a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.035 222021 DEBUG nova.network.os_vif_util [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converting VIF {"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.036 222021 DEBUG nova.network.os_vif_util [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:65:5d,bridge_name='br-int',has_traffic_filtering=True,id=b7f30c18-45b6-4931-8d26-193df386ae94,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f30c18-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.037 222021 DEBUG os_vif [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:65:5d,bridge_name='br-int',has_traffic_filtering=True,id=b7f30c18-45b6-4931-8d26-193df386ae94,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f30c18-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.038 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.039 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.040 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.045 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.046 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7f30c18-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.046 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7f30c18-45, col_values=(('external_ids', {'iface-id': 'b7f30c18-45b6-4931-8d26-193df386ae94', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:65:5d', 'vm-uuid': 'cca30801-d289-4e95-89b2-afcc3d0199a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.049 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:36 np0005593233 NetworkManager[48871]: <info>  [1769163276.0517] manager: (tapb7f30c18-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.053 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.063 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.064 222021 INFO os_vif [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:65:5d,bridge_name='br-int',has_traffic_filtering=True,id=b7f30c18-45b6-4931-8d26-193df386ae94,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f30c18-45')#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.134 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.135 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.136 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No VIF found with MAC fa:16:3e:b8:65:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.136 222021 INFO nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Using config drive#033[00m
Jan 23 05:14:36 np0005593233 nova_compute[222017]: 2026-01-23 10:14:36.165 222021 DEBUG nova.storage.rbd_utils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:14:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:36.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:14:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:37.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.132 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.136 222021 INFO nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Creating config drive at /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.144 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl9skh0bj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.235 222021 DEBUG nova.network.neutron [req-3a54223d-e6e8-4c44-aae8-c9b3ddbdbca9 req-970ffd1c-5426-4758-9a78-fcbada7a7e64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updated VIF entry in instance network info cache for port b7f30c18-45b6-4931-8d26-193df386ae94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.236 222021 DEBUG nova.network.neutron [req-3a54223d-e6e8-4c44-aae8-c9b3ddbdbca9 req-970ffd1c-5426-4758-9a78-fcbada7a7e64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.287 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl9skh0bj" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.315 222021 DEBUG nova.storage.rbd_utils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.319 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.517 222021 DEBUG oslo_concurrency.processutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.520 222021 INFO nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Deleting local config drive /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config because it was imported into RBD.#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.562 222021 DEBUG oslo_concurrency.lockutils [req-3a54223d-e6e8-4c44-aae8-c9b3ddbdbca9 req-970ffd1c-5426-4758-9a78-fcbada7a7e64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:14:37 np0005593233 kernel: tapb7f30c18-45: entered promiscuous mode
Jan 23 05:14:37 np0005593233 NetworkManager[48871]: <info>  [1769163277.6067] manager: (tapb7f30c18-45): new Tun device (/org/freedesktop/NetworkManager/Devices/277)
Jan 23 05:14:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:37Z|00590|binding|INFO|Claiming lport b7f30c18-45b6-4931-8d26-193df386ae94 for this chassis.
Jan 23 05:14:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:37Z|00591|binding|INFO|b7f30c18-45b6-4931-8d26-193df386ae94: Claiming fa:16:3e:b8:65:5d 10.100.0.13
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.607 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.624 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:65:5d 10.100.0.13'], port_security=['fa:16:3e:b8:65:5d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cca30801-d289-4e95-89b2-afcc3d0199a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd87239e3-bcf1-4a1e-b5bc-c1125963e6fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=b7f30c18-45b6-4931-8d26-193df386ae94) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.626 140224 INFO neutron.agent.ovn.metadata.agent [-] Port b7f30c18-45b6-4931-8d26-193df386ae94 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 bound to our chassis#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.627 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.645 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1b5eba-ead2-4b9b-a227-8125855f7754]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.646 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00bd3319-b1 in ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.648 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00bd3319-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.649 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2f60e4fa-9e15-41fd-81a8-b7229552127e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.649 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dc4f5758-f184-4b47-a344-0d2a73604f19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.668 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[46c6293d-fbe9-489f-a32e-b6600ae3953f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 systemd-machined[190954]: New machine qemu-64-instance-0000008b.
Jan 23 05:14:37 np0005593233 systemd[1]: Started Virtual Machine qemu-64-instance-0000008b.
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.678 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:37Z|00592|binding|INFO|Setting lport b7f30c18-45b6-4931-8d26-193df386ae94 ovn-installed in OVS
Jan 23 05:14:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:37Z|00593|binding|INFO|Setting lport b7f30c18-45b6-4931-8d26-193df386ae94 up in Southbound
Jan 23 05:14:37 np0005593233 systemd-udevd[275920]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:14:37 np0005593233 nova_compute[222017]: 2026-01-23 10:14:37.689 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.700 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d54ab24f-b4a1-4474-92a8-044669330086]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 NetworkManager[48871]: <info>  [1769163277.7122] device (tapb7f30c18-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:14:37 np0005593233 NetworkManager[48871]: <info>  [1769163277.7130] device (tapb7f30c18-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.745 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b233529c-2b09-4e8f-b150-deb73cd4cdc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 NetworkManager[48871]: <info>  [1769163277.7551] manager: (tap00bd3319-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/278)
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.756 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[684ba498-d4fd-44f3-9293-dc4e3210e247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 podman[275905]: 2026-01-23 10:14:37.795743326 +0000 UTC m=+0.137497061 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.796 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0be54cf4-de30-4835-b857-0667e4c02a3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.800 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab9eded-d4c6-486d-9c8b-f51e580a7d8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 NetworkManager[48871]: <info>  [1769163277.8308] device (tap00bd3319-b0): carrier: link connected
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.839 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f1273d8f-001d-4202-a2c2-1c5a394fd651]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.861 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[10c236a0-5ec0-469e-9240-36c258c1ae27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 719046, 'reachable_time': 20444, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275968, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.882 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c61612e0-6dd5-46b8-9173-8d51a22e0b5a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:83f8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 719046, 'tstamp': 719046}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275969, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.904 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5565af45-65d9-43eb-9bed-35c250f15d4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 719046, 'reachable_time': 20444, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275970, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:37.951 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b2bc38cd-8608-40a9-a4b3-f671c619c13c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:38.050 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[04f4be88-8394-426c-bed4-a40fdd6380a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:38.052 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:38.053 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:38.053 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00bd3319-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.055 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:38 np0005593233 NetworkManager[48871]: <info>  [1769163278.0562] manager: (tap00bd3319-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Jan 23 05:14:38 np0005593233 kernel: tap00bd3319-b0: entered promiscuous mode
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.059 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:38.061 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00bd3319-b0, col_values=(('external_ids', {'iface-id': '1788b5e6-601b-4e3d-a584-c0138c3308f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.063 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:38 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:38Z|00594|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.077 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:38.079 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:38.081 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[95e3d606-997a-4471-86f6-90d39070e8e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:38.081 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:14:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:38.083 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'env', 'PROCESS_TAG=haproxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00bd3319-bfe5-4acd-b2e4-17830ee847f9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.111 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163278.1105196, cca30801-d289-4e95-89b2-afcc3d0199a7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.112 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] VM Started (Lifecycle Event)#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.190 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.196 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163278.1107304, cca30801-d289-4e95-89b2-afcc3d0199a7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.196 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.262 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.265 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:14:38 np0005593233 nova_compute[222017]: 2026-01-23 10:14:38.309 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:14:38 np0005593233 podman[276044]: 2026-01-23 10:14:38.515295738 +0000 UTC m=+0.086588730 container create 15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 05:14:38 np0005593233 podman[276044]: 2026-01-23 10:14:38.467568818 +0000 UTC m=+0.038861810 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:14:38 np0005593233 systemd[1]: Started libpod-conmon-15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb.scope.
Jan 23 05:14:38 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:14:38 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ed73050fce28d75f5a559eb184ae93b872bb5e69ef230657de7582df51a266/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:14:38 np0005593233 podman[276044]: 2026-01-23 10:14:38.624891818 +0000 UTC m=+0.196184850 container init 15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:14:38 np0005593233 podman[276044]: 2026-01-23 10:14:38.63343006 +0000 UTC m=+0.204723072 container start 15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:14:38 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[276060]: [NOTICE]   (276064) : New worker (276066) forked
Jan 23 05:14:38 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[276060]: [NOTICE]   (276064) : Loading success.
Jan 23 05:14:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:38.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:14:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:39.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.287 222021 DEBUG nova.compute.manager [req-805db50f-b579-4b06-8218-330dca6cdb03 req-c5ad59eb-d2a5-4c44-8f09-99c51aad750a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.288 222021 DEBUG oslo_concurrency.lockutils [req-805db50f-b579-4b06-8218-330dca6cdb03 req-c5ad59eb-d2a5-4c44-8f09-99c51aad750a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.288 222021 DEBUG oslo_concurrency.lockutils [req-805db50f-b579-4b06-8218-330dca6cdb03 req-c5ad59eb-d2a5-4c44-8f09-99c51aad750a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.288 222021 DEBUG oslo_concurrency.lockutils [req-805db50f-b579-4b06-8218-330dca6cdb03 req-c5ad59eb-d2a5-4c44-8f09-99c51aad750a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.288 222021 DEBUG nova.compute.manager [req-805db50f-b579-4b06-8218-330dca6cdb03 req-c5ad59eb-d2a5-4c44-8f09-99c51aad750a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Processing event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.289 222021 DEBUG nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.294 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163279.2942166, cca30801-d289-4e95-89b2-afcc3d0199a7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.294 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.297 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.301 222021 INFO nova.virt.libvirt.driver [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance spawned successfully.#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.302 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.415 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.692 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.702 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.708 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.709 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.709 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.710 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.711 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.712 222021 DEBUG nova.virt.libvirt.driver [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.749 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.801 222021 INFO nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Took 22.07 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:14:39 np0005593233 nova_compute[222017]: 2026-01-23 10:14:39.801 222021 DEBUG nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:40 np0005593233 nova_compute[222017]: 2026-01-23 10:14:40.162 222021 INFO nova.compute.manager [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Took 23.33 seconds to build instance.#033[00m
Jan 23 05:14:40 np0005593233 nova_compute[222017]: 2026-01-23 10:14:40.205 222021 DEBUG oslo_concurrency.lockutils [None req-f537dc72-c5e6-41e7-a184-ace57b64ed89 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:40.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:41.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:41 np0005593233 nova_compute[222017]: 2026-01-23 10:14:41.051 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:41 np0005593233 nova_compute[222017]: 2026-01-23 10:14:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:41 np0005593233 nova_compute[222017]: 2026-01-23 10:14:41.433 222021 DEBUG nova.compute.manager [req-48cd6555-d268-4de5-bf4c-6098b72d40f5 req-bb91ffcf-2500-4815-8a4f-4c0bd18b0ebb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:41 np0005593233 nova_compute[222017]: 2026-01-23 10:14:41.434 222021 DEBUG oslo_concurrency.lockutils [req-48cd6555-d268-4de5-bf4c-6098b72d40f5 req-bb91ffcf-2500-4815-8a4f-4c0bd18b0ebb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:41 np0005593233 nova_compute[222017]: 2026-01-23 10:14:41.434 222021 DEBUG oslo_concurrency.lockutils [req-48cd6555-d268-4de5-bf4c-6098b72d40f5 req-bb91ffcf-2500-4815-8a4f-4c0bd18b0ebb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:41 np0005593233 nova_compute[222017]: 2026-01-23 10:14:41.435 222021 DEBUG oslo_concurrency.lockutils [req-48cd6555-d268-4de5-bf4c-6098b72d40f5 req-bb91ffcf-2500-4815-8a4f-4c0bd18b0ebb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:41 np0005593233 nova_compute[222017]: 2026-01-23 10:14:41.435 222021 DEBUG nova.compute.manager [req-48cd6555-d268-4de5-bf4c-6098b72d40f5 req-bb91ffcf-2500-4815-8a4f-4c0bd18b0ebb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:14:41 np0005593233 nova_compute[222017]: 2026-01-23 10:14:41.435 222021 WARNING nova.compute.manager [req-48cd6555-d268-4de5-bf4c-6098b72d40f5 req-bb91ffcf-2500-4815-8a4f-4c0bd18b0ebb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:14:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:42 np0005593233 nova_compute[222017]: 2026-01-23 10:14:42.200 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:42.678 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:42.678 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:14:42.679 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:42.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:43.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:43 np0005593233 NetworkManager[48871]: <info>  [1769163283.4029] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Jan 23 05:14:43 np0005593233 NetworkManager[48871]: <info>  [1769163283.4041] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.403 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.503 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.503 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.504 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.504 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.504 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.584 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:43Z|00595|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.621 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.906 222021 DEBUG nova.compute.manager [req-a4982edb-e2d2-4376-8e83-b8ac52bbef18 req-b0a0ff46-09a9-433b-848f-8575b5a1955c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-changed-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.906 222021 DEBUG nova.compute.manager [req-a4982edb-e2d2-4376-8e83-b8ac52bbef18 req-b0a0ff46-09a9-433b-848f-8575b5a1955c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Refreshing instance network info cache due to event network-changed-b7f30c18-45b6-4931-8d26-193df386ae94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.907 222021 DEBUG oslo_concurrency.lockutils [req-a4982edb-e2d2-4376-8e83-b8ac52bbef18 req-b0a0ff46-09a9-433b-848f-8575b5a1955c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.908 222021 DEBUG oslo_concurrency.lockutils [req-a4982edb-e2d2-4376-8e83-b8ac52bbef18 req-b0a0ff46-09a9-433b-848f-8575b5a1955c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.908 222021 DEBUG nova.network.neutron [req-a4982edb-e2d2-4376-8e83-b8ac52bbef18 req-b0a0ff46-09a9-433b-848f-8575b5a1955c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Refreshing network info cache for port b7f30c18-45b6-4931-8d26-193df386ae94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:14:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:14:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1637421043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:14:43 np0005593233 nova_compute[222017]: 2026-01-23 10:14:43.992 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.097 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.098 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.300 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.301 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4278MB free_disk=20.809764862060547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.302 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.302 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.399 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance cca30801-d289-4e95-89b2-afcc3d0199a7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.401 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.401 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.545 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.583 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.583 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.608 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.643 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:14:44 np0005593233 nova_compute[222017]: 2026-01-23 10:14:44.703 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:44.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:45.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:14:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3246014823' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:14:45 np0005593233 nova_compute[222017]: 2026-01-23 10:14:45.180 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:45 np0005593233 nova_compute[222017]: 2026-01-23 10:14:45.186 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:14:45 np0005593233 nova_compute[222017]: 2026-01-23 10:14:45.203 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:14:45 np0005593233 nova_compute[222017]: 2026-01-23 10:14:45.234 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:14:45 np0005593233 nova_compute[222017]: 2026-01-23 10:14:45.234 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.932s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:45 np0005593233 nova_compute[222017]: 2026-01-23 10:14:45.866 222021 DEBUG nova.network.neutron [req-a4982edb-e2d2-4376-8e83-b8ac52bbef18 req-b0a0ff46-09a9-433b-848f-8575b5a1955c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updated VIF entry in instance network info cache for port b7f30c18-45b6-4931-8d26-193df386ae94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:14:45 np0005593233 nova_compute[222017]: 2026-01-23 10:14:45.867 222021 DEBUG nova.network.neutron [req-a4982edb-e2d2-4376-8e83-b8ac52bbef18 req-b0a0ff46-09a9-433b-848f-8575b5a1955c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:46 np0005593233 nova_compute[222017]: 2026-01-23 10:14:46.054 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:46 np0005593233 nova_compute[222017]: 2026-01-23 10:14:46.235 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:46 np0005593233 nova_compute[222017]: 2026-01-23 10:14:46.235 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:46 np0005593233 nova_compute[222017]: 2026-01-23 10:14:46.236 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:46 np0005593233 nova_compute[222017]: 2026-01-23 10:14:46.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:46 np0005593233 nova_compute[222017]: 2026-01-23 10:14:46.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:14:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:46.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:47.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:47 np0005593233 nova_compute[222017]: 2026-01-23 10:14:47.210 222021 DEBUG oslo_concurrency.lockutils [req-a4982edb-e2d2-4376-8e83-b8ac52bbef18 req-b0a0ff46-09a9-433b-848f-8575b5a1955c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:14:47 np0005593233 nova_compute[222017]: 2026-01-23 10:14:47.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:48 np0005593233 nova_compute[222017]: 2026-01-23 10:14:48.338 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:48 np0005593233 nova_compute[222017]: 2026-01-23 10:14:48.338 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:48 np0005593233 nova_compute[222017]: 2026-01-23 10:14:48.365 222021 DEBUG nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:14:48 np0005593233 nova_compute[222017]: 2026-01-23 10:14:48.462 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:48 np0005593233 nova_compute[222017]: 2026-01-23 10:14:48.463 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:48 np0005593233 nova_compute[222017]: 2026-01-23 10:14:48.471 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:14:48 np0005593233 nova_compute[222017]: 2026-01-23 10:14:48.472 222021 INFO nova.compute.claims [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:14:48 np0005593233 nova_compute[222017]: 2026-01-23 10:14:48.642 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:48.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:49.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:14:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1048755449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.148 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.156 222021 DEBUG nova.compute.provider_tree [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.185 222021 DEBUG nova.scheduler.client.report [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.209 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.210 222021 DEBUG nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.256 222021 DEBUG nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.257 222021 DEBUG nova.network.neutron [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.283 222021 INFO nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.319 222021 DEBUG nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.381 222021 INFO nova.virt.block_device [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Booting with volume 3bbb4d6e-08b6-40cf-b719-94bd5e591c16 at /dev/vda#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.425 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.541 222021 DEBUG os_brick.utils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.544 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.559 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.560 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[354132ea-1128-4122-8b00-73abe06eea33]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.562 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.573 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.574 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[1bba359e-5817-48c0-be84-6c2ac887bf46]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.576 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.587 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.587 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[36ed3929-8a98-4ddc-a38c-4886ea876900]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.589 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef80b01-5c64-4beb-b005-4116604284cd]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.590 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.630 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.631 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.632 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.632 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.635 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "nvme version" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.638 222021 DEBUG os_brick.initiator.connectors.lightos [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.638 222021 DEBUG os_brick.initiator.connectors.lightos [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.638 222021 DEBUG os_brick.initiator.connectors.lightos [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.639 222021 DEBUG os_brick.utils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] <== get_connector_properties: return (96ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:14:49 np0005593233 nova_compute[222017]: 2026-01-23 10:14:49.639 222021 DEBUG nova.virt.block_device [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating existing volume attachment record: 32d7d55a-4143-403b-9980-a3decf26d208 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:14:50 np0005593233 nova_compute[222017]: 2026-01-23 10:14:50.377 222021 DEBUG nova.policy [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95ac13194f0940128d42af3d45d130fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3ae621f21a8e438fb95152309b38cee5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:14:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e306 e306: 3 total, 3 up, 3 in
Jan 23 05:14:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:50.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.056 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:14:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:51.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.221 222021 DEBUG nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.223 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.224 222021 INFO nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Creating image(s)
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.224 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.224 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Ensure instance console log exists: /var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.225 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.225 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.226 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:14:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:51 np0005593233 nova_compute[222017]: 2026-01-23 10:14:51.795 222021 DEBUG nova.network.neutron [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Successfully created port: 10b1482b-63d3-4411-b752-5d8f34f77403 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:14:52 np0005593233 podman[276150]: 2026-01-23 10:14:52.052106936 +0000 UTC m=+0.058522536 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:14:52 np0005593233 nova_compute[222017]: 2026-01-23 10:14:52.254 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:14:52 np0005593233 nova_compute[222017]: 2026-01-23 10:14:52.278 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:14:52 np0005593233 nova_compute[222017]: 2026-01-23 10:14:52.309 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:14:52 np0005593233 nova_compute[222017]: 2026-01-23 10:14:52.310 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 05:14:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:14:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:52.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:14:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:53.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:53Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b8:65:5d 10.100.0.13
Jan 23 05:14:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:14:53Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:65:5d 10.100.0.13
Jan 23 05:14:54 np0005593233 nova_compute[222017]: 2026-01-23 10:14:54.965 222021 DEBUG nova.network.neutron [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Successfully updated port: 10b1482b-63d3-4411-b752-5d8f34f77403 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 05:14:54 np0005593233 nova_compute[222017]: 2026-01-23 10:14:54.990 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:14:54 np0005593233 nova_compute[222017]: 2026-01-23 10:14:54.991 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:14:54 np0005593233 nova_compute[222017]: 2026-01-23 10:14:54.991 222021 DEBUG nova.network.neutron [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:14:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:55.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:55.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:55 np0005593233 nova_compute[222017]: 2026-01-23 10:14:55.100 222021 DEBUG nova.compute.manager [req-73d20b03-df80-4ab8-b3a4-6905f1cd952e req-9ef05218-5f37-48a2-886f-33e429641d4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:14:55 np0005593233 nova_compute[222017]: 2026-01-23 10:14:55.100 222021 DEBUG nova.compute.manager [req-73d20b03-df80-4ab8-b3a4-6905f1cd952e req-9ef05218-5f37-48a2-886f-33e429641d4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing instance network info cache due to event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:14:55 np0005593233 nova_compute[222017]: 2026-01-23 10:14:55.101 222021 DEBUG oslo_concurrency.lockutils [req-73d20b03-df80-4ab8-b3a4-6905f1cd952e req-9ef05218-5f37-48a2-886f-33e429641d4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:14:55 np0005593233 nova_compute[222017]: 2026-01-23 10:14:55.230 222021 DEBUG nova.network.neutron [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 05:14:56 np0005593233 nova_compute[222017]: 2026-01-23 10:14:56.060 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:14:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:57.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:14:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:57.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:14:57 np0005593233 nova_compute[222017]: 2026-01-23 10:14:57.294 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:14:57 np0005593233 nova_compute[222017]: 2026-01-23 10:14:57.302 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:14:57 np0005593233 nova_compute[222017]: 2026-01-23 10:14:57.727 222021 DEBUG nova.network.neutron [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:14:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e307 e307: 3 total, 3 up, 3 in
Jan 23 05:14:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:59.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:14:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:59.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.495 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.496 222021 DEBUG nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Instance network_info: |[{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.496 222021 DEBUG oslo_concurrency.lockutils [req-73d20b03-df80-4ab8-b3a4-6905f1cd952e req-9ef05218-5f37-48a2-886f-33e429641d4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.496 222021 DEBUG nova.network.neutron [req-73d20b03-df80-4ab8-b3a4-6905f1cd952e req-9ef05218-5f37-48a2-886f-33e429641d4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.499 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Start _get_guest_xml network_info=[{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-3bbb4d6e-08b6-40cf-b719-94bd5e591c16', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '3bbb4d6e-08b6-40cf-b719-94bd5e591c16', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78', 'attached_at': '', 'detached_at': '', 'volume_id': '3bbb4d6e-08b6-40cf-b719-94bd5e591c16', 'serial': '3bbb4d6e-08b6-40cf-b719-94bd5e591c16'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '32d7d55a-4143-403b-9980-a3decf26d208', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.504 222021 WARNING nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.510 222021 DEBUG nova.virt.libvirt.host [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.511 222021 DEBUG nova.virt.libvirt.host [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.515 222021 DEBUG nova.virt.libvirt.host [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.516 222021 DEBUG nova.virt.libvirt.host [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.518 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.519 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.519 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.519 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.520 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.520 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.520 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.520 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.520 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.521 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.521 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.521 222021 DEBUG nova.virt.hardware [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.560 222021 DEBUG nova.storage.rbd_utils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] rbd image fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:14:59 np0005593233 nova_compute[222017]: 2026-01-23 10:14:59.566 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:15:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:15:00 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/633114137' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.092 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.143 222021 DEBUG nova.virt.libvirt.vif [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1890261987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1890261987',id=141,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPuMczToXGmZUNyxG5fVGeV6xaoJVOpQ6Lh9dx5t6v22bv4xalVGQLUjYNEpg7ajkuOU/WHiNfvMhffjZHY/YojnQQYOX+q0GTa9+NPbkGDFf1XELa+vTNvIe6ZV8CwP9g==',key_name='tempest-TestInstancesWithCinderVolumes-232096272',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ae621f21a8e438fb95152309b38cee5',ramdisk_id='',reservation_id='r-ryo6mpg0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_pr
oject_name='tempest-TestInstancesWithCinderVolumes-565485208',owner_user_name='tempest-TestInstancesWithCinderVolumes-565485208-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:14:49Z,user_data=None,user_id='95ac13194f0940128d42af3d45d130fa',uuid=fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.144 222021 DEBUG nova.network.os_vif_util [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converting VIF {"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.145 222021 DEBUG nova.network.os_vif_util [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:68:ac,bridge_name='br-int',has_traffic_filtering=True,id=10b1482b-63d3-4411-b752-5d8f34f77403,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10b1482b-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.147 222021 DEBUG nova.objects.instance [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'pci_devices' on Instance uuid fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.259 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <uuid>fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78</uuid>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <name>instance-0000008d</name>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestInstancesWithCinderVolumes-server-1890261987</nova:name>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:14:59</nova:creationTime>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <nova:user uuid="95ac13194f0940128d42af3d45d130fa">tempest-TestInstancesWithCinderVolumes-565485208-project-member</nova:user>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <nova:project uuid="3ae621f21a8e438fb95152309b38cee5">tempest-TestInstancesWithCinderVolumes-565485208</nova:project>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <nova:port uuid="10b1482b-63d3-4411-b752-5d8f34f77403">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <entry name="serial">fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78</entry>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <entry name="uuid">fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78</entry>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78_disk.config">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-3bbb4d6e-08b6-40cf-b719-94bd5e591c16">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <serial>3bbb4d6e-08b6-40cf-b719-94bd5e591c16</serial>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:42:68:ac"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <target dev="tap10b1482b-63"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78/console.log" append="off"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:15:00 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:15:00 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:15:00 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:15:00 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.261 222021 DEBUG nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Preparing to wait for external event network-vif-plugged-10b1482b-63d3-4411-b752-5d8f34f77403 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.261 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.261 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.262 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.262 222021 DEBUG nova.virt.libvirt.vif [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1890261987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1890261987',id=141,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPuMczToXGmZUNyxG5fVGeV6xaoJVOpQ6Lh9dx5t6v22bv4xalVGQLUjYNEpg7ajkuOU/WHiNfvMhffjZHY/YojnQQYOX+q0GTa9+NPbkGDFf1XELa+vTNvIe6ZV8CwP9g==',key_name='tempest-TestInstancesWithCinderVolumes-232096272',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ae621f21a8e438fb95152309b38cee5',ramdisk_id='',reservation_id='r-ryo6mpg0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True
',owner_project_name='tempest-TestInstancesWithCinderVolumes-565485208',owner_user_name='tempest-TestInstancesWithCinderVolumes-565485208-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:14:49Z,user_data=None,user_id='95ac13194f0940128d42af3d45d130fa',uuid=fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.263 222021 DEBUG nova.network.os_vif_util [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converting VIF {"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.263 222021 DEBUG nova.network.os_vif_util [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:68:ac,bridge_name='br-int',has_traffic_filtering=True,id=10b1482b-63d3-4411-b752-5d8f34f77403,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10b1482b-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.264 222021 DEBUG os_vif [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:68:ac,bridge_name='br-int',has_traffic_filtering=True,id=10b1482b-63d3-4411-b752-5d8f34f77403,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10b1482b-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.265 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.265 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.266 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.270 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.270 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap10b1482b-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.271 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap10b1482b-63, col_values=(('external_ids', {'iface-id': '10b1482b-63d3-4411-b752-5d8f34f77403', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:68:ac', 'vm-uuid': 'fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.273 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:00 np0005593233 NetworkManager[48871]: <info>  [1769163300.2742] manager: (tap10b1482b-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.276 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.285 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.287 222021 INFO os_vif [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:68:ac,bridge_name='br-int',has_traffic_filtering=True,id=10b1482b-63d3-4411-b752-5d8f34f77403,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10b1482b-63')#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.428 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.429 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.429 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No VIF found with MAC fa:16:3e:42:68:ac, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.429 222021 INFO nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Using config drive#033[00m
Jan 23 05:15:00 np0005593233 nova_compute[222017]: 2026-01-23 10:15:00.473 222021 DEBUG nova.storage.rbd_utils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] rbd image fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:15:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:01.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:01.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:01 np0005593233 nova_compute[222017]: 2026-01-23 10:15:01.143 222021 DEBUG oslo_concurrency.lockutils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:01 np0005593233 nova_compute[222017]: 2026-01-23 10:15:01.144 222021 DEBUG oslo_concurrency.lockutils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:01 np0005593233 nova_compute[222017]: 2026-01-23 10:15:01.217 222021 DEBUG nova.objects.instance [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'flavor' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:15:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:15:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:15:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:01 np0005593233 nova_compute[222017]: 2026-01-23 10:15:01.511 222021 DEBUG oslo_concurrency.lockutils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e307 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e308 e308: 3 total, 3 up, 3 in
Jan 23 05:15:02 np0005593233 nova_compute[222017]: 2026-01-23 10:15:02.298 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:15:02 np0005593233 nova_compute[222017]: 2026-01-23 10:15:02.666 222021 INFO nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Creating config drive at /var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78/disk.config#033[00m
Jan 23 05:15:02 np0005593233 nova_compute[222017]: 2026-01-23 10:15:02.674 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp35k93irt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:02 np0005593233 nova_compute[222017]: 2026-01-23 10:15:02.816 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp35k93irt" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:02 np0005593233 nova_compute[222017]: 2026-01-23 10:15:02.885 222021 DEBUG nova.storage.rbd_utils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] rbd image fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:15:02 np0005593233 nova_compute[222017]: 2026-01-23 10:15:02.890 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78/disk.config fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:03.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:03.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.101 222021 DEBUG oslo_concurrency.lockutils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.101 222021 DEBUG oslo_concurrency.lockutils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.102 222021 INFO nova.compute.manager [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Attaching volume c633bb7d-c04b-4d7b-a574-1649ca824520 to /dev/vdb#033[00m
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.188 222021 DEBUG oslo_concurrency.processutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78/disk.config fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.298s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.190 222021 INFO nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Deleting local config drive /var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78/disk.config because it was imported into RBD.#033[00m
Jan 23 05:15:03 np0005593233 kernel: tap10b1482b-63: entered promiscuous mode
Jan 23 05:15:03 np0005593233 NetworkManager[48871]: <info>  [1769163303.2515] manager: (tap10b1482b-63): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Jan 23 05:15:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:03Z|00596|binding|INFO|Claiming lport 10b1482b-63d3-4411-b752-5d8f34f77403 for this chassis.
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.273 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:03Z|00597|binding|INFO|10b1482b-63d3-4411-b752-5d8f34f77403: Claiming fa:16:3e:42:68:ac 10.100.0.9
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.284 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:68:ac 10.100.0.9'], port_security=['fa:16:3e:42:68:ac 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ae621f21a8e438fb95152309b38cee5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3b0a0b41-45a8-4582-a4d2-a9aff1f1a18c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5888498-07d6-4c96-95ee-546974eebd82, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=10b1482b-63d3-4411-b752-5d8f34f77403) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.285 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 10b1482b-63d3-4411-b752-5d8f34f77403 in datapath f98d79de-4a23-4f29-9848-c5d4c5683a5d bound to our chassis#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.288 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f98d79de-4a23-4f29-9848-c5d4c5683a5d#033[00m
Jan 23 05:15:03 np0005593233 systemd-udevd[276407]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:15:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:03Z|00598|binding|INFO|Setting lport 10b1482b-63d3-4411-b752-5d8f34f77403 ovn-installed in OVS
Jan 23 05:15:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:03Z|00599|binding|INFO|Setting lport 10b1482b-63d3-4411-b752-5d8f34f77403 up in Southbound
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.293 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:03 np0005593233 NetworkManager[48871]: <info>  [1769163303.3066] device (tap10b1482b-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:15:03 np0005593233 NetworkManager[48871]: <info>  [1769163303.3071] device (tap10b1482b-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.305 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e40fa5cc-1f73-45b6-a95f-43d854d72a05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.307 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf98d79de-41 in ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.311 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf98d79de-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.311 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e46b03-5799-4e4d-aa1b-83694c28bfd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.312 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[485c2342-a492-4c54-88d7-d292d63dcd09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 systemd-machined[190954]: New machine qemu-65-instance-0000008d.
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.330 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[bb53363e-ac26-4ff0-adb6-575066aabbc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 systemd[1]: Started Virtual Machine qemu-65-instance-0000008d.
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.349 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[18d8a020-1a17-4c77-a047-4789cbfea00c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.386 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[35113328-a071-4c41-b8ae-b0bb53b69dc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.391 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7bade729-ec88-4a47-969f-ed655bc7d0ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 NetworkManager[48871]: <info>  [1769163303.3930] manager: (tapf98d79de-40): new Veth device (/org/freedesktop/NetworkManager/Devices/284)
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.437 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[42f87afb-160f-4aa5-8be0-0adeb21ffa9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.444 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[cba4fe25-c5b3-4bf9-b63e-1e1bed81d509]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 NetworkManager[48871]: <info>  [1769163303.4813] device (tapf98d79de-40): carrier: link connected
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.488 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7db29426-8ff2-4396-8725-6e3651534b61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.512 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e832cbd9-9a5f-4083-b3cf-8985bf727b82]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf98d79de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:3d:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 721611, 'reachable_time': 20715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276443, 'error': None, 'target': 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.532 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e8b76d-792c-4079-ba07-51fe81be8413]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:3d5f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 721611, 'tstamp': 721611}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276444, 'error': None, 'target': 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.559 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5dfbb3e5-110e-4723-9222-e44333353352]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf98d79de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:3d:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 721611, 'reachable_time': 20715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 276445, 'error': None, 'target': 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.606 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[84427b28-3f22-4f89-afc6-7998732d5d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.688 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7b91613c-04f0-4dfa-a9e1-883e3ccbb8e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.690 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf98d79de-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.690 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.690 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf98d79de-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:03 np0005593233 NetworkManager[48871]: <info>  [1769163303.6933] manager: (tapf98d79de-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.692 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:03 np0005593233 kernel: tapf98d79de-40: entered promiscuous mode
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.699 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf98d79de-40, col_values=(('external_ids', {'iface-id': '2c16e447-27d9-4516-bf23-ec948f375c10'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:03Z|00600|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.698 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:03 np0005593233 nova_compute[222017]: 2026-01-23 10:15:03.717 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.718 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f98d79de-4a23-4f29-9848-c5d4c5683a5d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f98d79de-4a23-4f29-9848-c5d4c5683a5d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.719 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a638e268-e881-4b7c-ad98-52668bb3f98d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.720 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-f98d79de-4a23-4f29-9848-c5d4c5683a5d
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/f98d79de-4a23-4f29-9848-c5d4c5683a5d.pid.haproxy
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID f98d79de-4a23-4f29-9848-c5d4c5683a5d
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:15:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:03.721 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'env', 'PROCESS_TAG=haproxy-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f98d79de-4a23-4f29-9848-c5d4c5683a5d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.031 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163304.0300684, fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.031 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] VM Started (Lifecycle Event)#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.078 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.083 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163304.0316875, fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.083 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.151 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.156 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:15:04 np0005593233 podman[276519]: 2026-01-23 10:15:04.161706556 +0000 UTC m=+0.056397246 container create c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.184 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:15:04 np0005593233 systemd[1]: Started libpod-conmon-c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764.scope.
Jan 23 05:15:04 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:15:04 np0005593233 podman[276519]: 2026-01-23 10:15:04.134132366 +0000 UTC m=+0.028823076 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:15:04 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f839200124fde9106996e5116aa450621a11da822a96e76bfe65398066e12919/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:15:04 np0005593233 podman[276519]: 2026-01-23 10:15:04.252875055 +0000 UTC m=+0.147565835 container init c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:15:04 np0005593233 podman[276519]: 2026-01-23 10:15:04.260521591 +0000 UTC m=+0.155212311 container start c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:15:04 np0005593233 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[276534]: [NOTICE]   (276538) : New worker (276540) forked
Jan 23 05:15:04 np0005593233 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[276534]: [NOTICE]   (276538) : Loading success.
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.412 222021 DEBUG os_brick.utils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.413 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.427 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.427 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[a067205e-82b1-4502-89ac-4dfaa2101973]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.429 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.438 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.438 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[5c74bdcd-1ab5-4d20-961a-3199321bcfb2]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.441 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.449 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.450 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[28aa0b7e-c8ee-4c3f-aa63-6a55dcb017f4]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.451 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[cb850683-ce81-4f67-befd-f540fe92dcd3]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.452 222021 DEBUG oslo_concurrency.processutils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.492 222021 DEBUG oslo_concurrency.processutils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "nvme version" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.497 222021 DEBUG os_brick.initiator.connectors.lightos [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.497 222021 DEBUG os_brick.initiator.connectors.lightos [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.497 222021 DEBUG os_brick.initiator.connectors.lightos [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.498 222021 DEBUG os_brick.utils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] <== get_connector_properties: return (85ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:15:04 np0005593233 nova_compute[222017]: 2026-01-23 10:15:04.498 222021 DEBUG nova.virt.block_device [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating existing volume attachment record: be7f16e3-d7f9-4185-88d3-6fc36f851537 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:15:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:05.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:05.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.273 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.958 222021 DEBUG nova.compute.manager [req-5a2d4f30-3008-44fb-8919-32fd86787971 req-fa2e85e0-5deb-4cdb-a944-d71521398e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-vif-plugged-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.959 222021 DEBUG oslo_concurrency.lockutils [req-5a2d4f30-3008-44fb-8919-32fd86787971 req-fa2e85e0-5deb-4cdb-a944-d71521398e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.959 222021 DEBUG oslo_concurrency.lockutils [req-5a2d4f30-3008-44fb-8919-32fd86787971 req-fa2e85e0-5deb-4cdb-a944-d71521398e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.959 222021 DEBUG oslo_concurrency.lockutils [req-5a2d4f30-3008-44fb-8919-32fd86787971 req-fa2e85e0-5deb-4cdb-a944-d71521398e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.959 222021 DEBUG nova.compute.manager [req-5a2d4f30-3008-44fb-8919-32fd86787971 req-fa2e85e0-5deb-4cdb-a944-d71521398e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Processing event network-vif-plugged-10b1482b-63d3-4411-b752-5d8f34f77403 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.960 222021 DEBUG nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.964 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163305.9641702, fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.964 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.967 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.970 222021 INFO nova.virt.libvirt.driver [-] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Instance spawned successfully.#033[00m
Jan 23 05:15:05 np0005593233 nova_compute[222017]: 2026-01-23 10:15:05.970 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.001 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.002 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.002 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.003 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.003 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.003 222021 DEBUG nova.virt.libvirt.driver [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.141 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.146 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.253 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.338 222021 INFO nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Took 15.12 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.339 222021 DEBUG nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.478 222021 INFO nova.compute.manager [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Took 18.05 seconds to build instance.#033[00m
Jan 23 05:15:06 np0005593233 nova_compute[222017]: 2026-01-23 10:15:06.628 222021 DEBUG oslo_concurrency.lockutils [None req-a2b6dc6a-ec1b-4ca3-9151-9219bffda16a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:07.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 e309: 3 total, 3 up, 3 in
Jan 23 05:15:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:07.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.348 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.360 222021 DEBUG nova.objects.instance [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'flavor' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.420 222021 DEBUG nova.virt.libvirt.driver [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Attempting to attach volume c633bb7d-c04b-4d7b-a574-1649ca824520 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.423 222021 DEBUG nova.virt.libvirt.guest [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:15:07 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:15:07 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-c633bb7d-c04b-4d7b-a574-1649ca824520">
Jan 23 05:15:07 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:07 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:07 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:07 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:15:07 np0005593233 nova_compute[222017]:  <auth username="openstack">
Jan 23 05:15:07 np0005593233 nova_compute[222017]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:15:07 np0005593233 nova_compute[222017]:  </auth>
Jan 23 05:15:07 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:15:07 np0005593233 nova_compute[222017]:  <serial>c633bb7d-c04b-4d7b-a574-1649ca824520</serial>
Jan 23 05:15:07 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:15:07 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.566 222021 DEBUG nova.network.neutron [req-73d20b03-df80-4ab8-b3a4-6905f1cd952e req-9ef05218-5f37-48a2-886f-33e429641d4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updated VIF entry in instance network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.567 222021 DEBUG nova.network.neutron [req-73d20b03-df80-4ab8-b3a4-6905f1cd952e req-9ef05218-5f37-48a2-886f-33e429641d4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.673 222021 DEBUG oslo_concurrency.lockutils [req-73d20b03-df80-4ab8-b3a4-6905f1cd952e req-9ef05218-5f37-48a2-886f-33e429641d4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.710 222021 DEBUG nova.virt.libvirt.driver [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.711 222021 DEBUG nova.virt.libvirt.driver [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.711 222021 DEBUG nova.virt.libvirt.driver [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:07 np0005593233 nova_compute[222017]: 2026-01-23 10:15:07.711 222021 DEBUG nova.virt.libvirt.driver [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No VIF found with MAC fa:16:3e:b8:65:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:15:08 np0005593233 podman[276626]: 2026-01-23 10:15:08.122170779 +0000 UTC m=+0.118814592 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:15:08 np0005593233 nova_compute[222017]: 2026-01-23 10:15:08.507 222021 DEBUG nova.compute.manager [req-99b4e489-f694-44a7-911f-0e455336b5df req-b982133b-fcca-4baa-9763-0e19344eaa4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-vif-plugged-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:08 np0005593233 nova_compute[222017]: 2026-01-23 10:15:08.507 222021 DEBUG oslo_concurrency.lockutils [req-99b4e489-f694-44a7-911f-0e455336b5df req-b982133b-fcca-4baa-9763-0e19344eaa4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:08 np0005593233 nova_compute[222017]: 2026-01-23 10:15:08.507 222021 DEBUG oslo_concurrency.lockutils [req-99b4e489-f694-44a7-911f-0e455336b5df req-b982133b-fcca-4baa-9763-0e19344eaa4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:08 np0005593233 nova_compute[222017]: 2026-01-23 10:15:08.507 222021 DEBUG oslo_concurrency.lockutils [req-99b4e489-f694-44a7-911f-0e455336b5df req-b982133b-fcca-4baa-9763-0e19344eaa4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:08 np0005593233 nova_compute[222017]: 2026-01-23 10:15:08.507 222021 DEBUG nova.compute.manager [req-99b4e489-f694-44a7-911f-0e455336b5df req-b982133b-fcca-4baa-9763-0e19344eaa4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] No waiting events found dispatching network-vif-plugged-10b1482b-63d3-4411-b752-5d8f34f77403 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:08 np0005593233 nova_compute[222017]: 2026-01-23 10:15:08.508 222021 WARNING nova.compute.manager [req-99b4e489-f694-44a7-911f-0e455336b5df req-b982133b-fcca-4baa-9763-0e19344eaa4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received unexpected event network-vif-plugged-10b1482b-63d3-4411-b752-5d8f34f77403 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:15:08 np0005593233 nova_compute[222017]: 2026-01-23 10:15:08.690 222021 DEBUG oslo_concurrency.lockutils [None req-99dfa856-df1a-464f-9c08-05608c67015d fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 5.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:09.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:09.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:10 np0005593233 nova_compute[222017]: 2026-01-23 10:15:10.276 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:10 np0005593233 nova_compute[222017]: 2026-01-23 10:15:10.877 222021 INFO nova.compute.manager [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Rescuing#033[00m
Jan 23 05:15:10 np0005593233 nova_compute[222017]: 2026-01-23 10:15:10.878 222021 DEBUG oslo_concurrency.lockutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:10 np0005593233 nova_compute[222017]: 2026-01-23 10:15:10.879 222021 DEBUG oslo_concurrency.lockutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquired lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:10 np0005593233 nova_compute[222017]: 2026-01-23 10:15:10.879 222021 DEBUG nova.network.neutron [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:15:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:11.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:11.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:12 np0005593233 nova_compute[222017]: 2026-01-23 10:15:12.350 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:13.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:13.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:15.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:15.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:15 np0005593233 nova_compute[222017]: 2026-01-23 10:15:15.278 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:17.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:17.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:17 np0005593233 nova_compute[222017]: 2026-01-23 10:15:17.380 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:18 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #50. Immutable memtables: 7.
Jan 23 05:15:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:19.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:19.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:20 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:20Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:68:ac 10.100.0.9
Jan 23 05:15:20 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:20Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:68:ac 10.100.0.9
Jan 23 05:15:20 np0005593233 nova_compute[222017]: 2026-01-23 10:15:20.280 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:21.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:21.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:22 np0005593233 nova_compute[222017]: 2026-01-23 10:15:22.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:23.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:23 np0005593233 podman[276653]: 2026-01-23 10:15:23.081984616 +0000 UTC m=+0.077884434 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:15:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:23.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:24.621 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:15:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:24.622 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:15:24 np0005593233 nova_compute[222017]: 2026-01-23 10:15:24.665 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:25.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:25.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:25 np0005593233 nova_compute[222017]: 2026-01-23 10:15:25.202 222021 DEBUG nova.network.neutron [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:25 np0005593233 nova_compute[222017]: 2026-01-23 10:15:25.224 222021 DEBUG oslo_concurrency.lockutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Releasing lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:25 np0005593233 nova_compute[222017]: 2026-01-23 10:15:25.282 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:25 np0005593233 nova_compute[222017]: 2026-01-23 10:15:25.794 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:15:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:27.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:27.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:27 np0005593233 nova_compute[222017]: 2026-01-23 10:15:27.459 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593233 kernel: tapb7f30c18-45 (unregistering): left promiscuous mode
Jan 23 05:15:28 np0005593233 NetworkManager[48871]: <info>  [1769163328.0920] device (tapb7f30c18-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:15:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:28Z|00601|binding|INFO|Releasing lport b7f30c18-45b6-4931-8d26-193df386ae94 from this chassis (sb_readonly=0)
Jan 23 05:15:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:28Z|00602|binding|INFO|Setting lport b7f30c18-45b6-4931-8d26-193df386ae94 down in Southbound
Jan 23 05:15:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:28Z|00603|binding|INFO|Removing iface tapb7f30c18-45 ovn-installed in OVS
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.116 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.118 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.132 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.142 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:65:5d 10.100.0.13'], port_security=['fa:16:3e:b8:65:5d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cca30801-d289-4e95-89b2-afcc3d0199a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd87239e3-bcf1-4a1e-b5bc-c1125963e6fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=b7f30c18-45b6-4931-8d26-193df386ae94) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.143 140224 INFO neutron.agent.ovn.metadata.agent [-] Port b7f30c18-45b6-4931-8d26-193df386ae94 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 unbound from our chassis#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.145 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.146 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fd553916-8af7-440b-b016-5a73f4c109c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.148 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace which is not needed anymore#033[00m
Jan 23 05:15:28 np0005593233 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Jan 23 05:15:28 np0005593233 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008b.scope: Consumed 15.932s CPU time.
Jan 23 05:15:28 np0005593233 systemd-machined[190954]: Machine qemu-64-instance-0000008b terminated.
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.332 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.339 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[276060]: [NOTICE]   (276064) : haproxy version is 2.8.14-c23fe91
Jan 23 05:15:28 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[276060]: [NOTICE]   (276064) : path to executable is /usr/sbin/haproxy
Jan 23 05:15:28 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[276060]: [WARNING]  (276064) : Exiting Master process...
Jan 23 05:15:28 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[276060]: [ALERT]    (276064) : Current worker (276066) exited with code 143 (Terminated)
Jan 23 05:15:28 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[276060]: [WARNING]  (276064) : All workers exited. Exiting... (0)
Jan 23 05:15:28 np0005593233 systemd[1]: libpod-15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb.scope: Deactivated successfully.
Jan 23 05:15:28 np0005593233 podman[276695]: 2026-01-23 10:15:28.378146788 +0000 UTC m=+0.092430685 container died 15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:15:28 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb-userdata-shm.mount: Deactivated successfully.
Jan 23 05:15:28 np0005593233 systemd[1]: var-lib-containers-storage-overlay-e8ed73050fce28d75f5a559eb184ae93b872bb5e69ef230657de7582df51a266-merged.mount: Deactivated successfully.
Jan 23 05:15:28 np0005593233 podman[276695]: 2026-01-23 10:15:28.617233221 +0000 UTC m=+0.331517088 container cleanup 15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.624 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:28 np0005593233 systemd[1]: libpod-conmon-15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb.scope: Deactivated successfully.
Jan 23 05:15:28 np0005593233 podman[276730]: 2026-01-23 10:15:28.700736703 +0000 UTC m=+0.057730344 container remove 15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.709 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[31500a53-b12f-4cac-9ea9-24c9a430e323]: (4, ('Fri Jan 23 10:15:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb)\n15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb\nFri Jan 23 10:15:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb)\n15ce737c1eeac2042304389993ce20fc4d3e3b57a7118e2ae0217400f8ca3cdb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.712 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6ca32a-3e58-4bbc-b499-08f54748a534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.713 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.715 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593233 kernel: tap00bd3319-b0: left promiscuous mode
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.734 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.740 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[06d9cd17-f0f0-4444-ad60-f98adfdbacaf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.761 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[41f84feb-e16a-49ff-a2df-5180f9d72738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.763 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6d10cd-6030-4110-93a1-86f63c78504d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.784 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[016a7836-1fef-4d72-bfff-0ca9f4a987aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 719036, 'reachable_time': 16718, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276749, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.787 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:15:28 np0005593233 systemd[1]: run-netns-ovnmeta\x2d00bd3319\x2dbfe5\x2d4acd\x2db2e4\x2d17830ee847f9.mount: Deactivated successfully.
Jan 23 05:15:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:28.788 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[b803c39d-62ac-4aa7-99cc-9eeb660e8594]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.811 222021 DEBUG nova.compute.manager [req-f82c6f52-3ef3-45e2-b715-11c920ef0669 req-60175ea0-3a69-46e5-ac9d-66695e2df1d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-unplugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.811 222021 DEBUG oslo_concurrency.lockutils [req-f82c6f52-3ef3-45e2-b715-11c920ef0669 req-60175ea0-3a69-46e5-ac9d-66695e2df1d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.812 222021 DEBUG oslo_concurrency.lockutils [req-f82c6f52-3ef3-45e2-b715-11c920ef0669 req-60175ea0-3a69-46e5-ac9d-66695e2df1d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.812 222021 DEBUG oslo_concurrency.lockutils [req-f82c6f52-3ef3-45e2-b715-11c920ef0669 req-60175ea0-3a69-46e5-ac9d-66695e2df1d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.812 222021 DEBUG nova.compute.manager [req-f82c6f52-3ef3-45e2-b715-11c920ef0669 req-60175ea0-3a69-46e5-ac9d-66695e2df1d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-unplugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.812 222021 WARNING nova.compute.manager [req-f82c6f52-3ef3-45e2-b715-11c920ef0669 req-60175ea0-3a69-46e5-ac9d-66695e2df1d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-unplugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.816 222021 INFO nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.821 222021 INFO nova.virt.libvirt.driver [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance destroyed successfully.#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.822 222021 DEBUG nova.objects.instance [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'numa_topology' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.907 222021 INFO nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Attempting rescue#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.908 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.915 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:15:28 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.916 222021 INFO nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Creating image(s)#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:28.965 222021 DEBUG nova.storage.rbd_utils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.010 222021 DEBUG nova.objects.instance [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'trusted_certs' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:15:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:29.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.068 222021 DEBUG nova.storage.rbd_utils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.101 222021 DEBUG nova.storage.rbd_utils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.107 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:29.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.187 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.188 222021 DEBUG oslo_concurrency.lockutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.189 222021 DEBUG oslo_concurrency.lockutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.189 222021 DEBUG oslo_concurrency.lockutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.223 222021 DEBUG nova.storage.rbd_utils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.229 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cca30801-d289-4e95-89b2-afcc3d0199a7_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.556 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cca30801-d289-4e95-89b2-afcc3d0199a7_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.327s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.558 222021 DEBUG nova.objects.instance [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'migration_context' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.587 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.589 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Start _get_guest_xml network_info=[{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "vif_mac": "fa:16:3e:b8:65:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.589 222021 DEBUG nova.objects.instance [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'resources' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.620 222021 WARNING nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.633 222021 DEBUG nova.virt.libvirt.host [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.634 222021 DEBUG nova.virt.libvirt.host [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.637 222021 DEBUG nova.virt.libvirt.host [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.638 222021 DEBUG nova.virt.libvirt.host [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.639 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.639 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.640 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.640 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.640 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.640 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.640 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.640 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.640 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.641 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.641 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.641 222021 DEBUG nova.virt.hardware [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.641 222021 DEBUG nova.objects.instance [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'vcpu_model' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:29 np0005593233 nova_compute[222017]: 2026-01-23 10:15:29.680 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:15:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/764466613' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:15:30 np0005593233 nova_compute[222017]: 2026-01-23 10:15:30.161 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:30 np0005593233 nova_compute[222017]: 2026-01-23 10:15:30.162 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:30 np0005593233 nova_compute[222017]: 2026-01-23 10:15:30.284 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:15:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1885595960' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:15:30 np0005593233 nova_compute[222017]: 2026-01-23 10:15:30.672 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:30 np0005593233 nova_compute[222017]: 2026-01-23 10:15:30.674 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:31.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.058 222021 DEBUG nova.compute.manager [req-6c7a9fc6-16d7-442c-8519-d2dc515bf3de req-171a3e02-b989-484d-adbe-a74eac0f429c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.059 222021 DEBUG oslo_concurrency.lockutils [req-6c7a9fc6-16d7-442c-8519-d2dc515bf3de req-171a3e02-b989-484d-adbe-a74eac0f429c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.059 222021 DEBUG oslo_concurrency.lockutils [req-6c7a9fc6-16d7-442c-8519-d2dc515bf3de req-171a3e02-b989-484d-adbe-a74eac0f429c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.060 222021 DEBUG oslo_concurrency.lockutils [req-6c7a9fc6-16d7-442c-8519-d2dc515bf3de req-171a3e02-b989-484d-adbe-a74eac0f429c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.060 222021 DEBUG nova.compute.manager [req-6c7a9fc6-16d7-442c-8519-d2dc515bf3de req-171a3e02-b989-484d-adbe-a74eac0f429c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.060 222021 WARNING nova.compute.manager [req-6c7a9fc6-16d7-442c-8519-d2dc515bf3de req-171a3e02-b989-484d-adbe-a74eac0f429c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:15:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:31.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:15:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2354302912' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.262 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.588s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.264 222021 DEBUG nova.virt.libvirt.vif [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:14:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1133977145',display_name='tempest-ServerRescueNegativeTestJSON-server-1133977145',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1133977145',id=139,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA8EX9GibC86Iq3T2qC/IrlQ+r5p/SWpdEE8xXluu9bLU1lGamBBUxI8TbZp+bpGCmD0iIc57s4GniCMNRyOJdr1+wsaQQ63CuK5/FMyW72KViBWccZ5JBdjf1rnPzLdhg==',key_name='tempest-keypair-1649200342',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:14:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a6ba16c4b9d49d3bc24cd7b44935d1f',ramdisk_id='',reservation_id='r-ajqso3l2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-87224704',owner_user_name='tempest-ServerRescueNegativeTestJSON-87224704-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:14:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fae914e59ec54f6b80928ef3cc68dbdb',uuid=cca30801-d289-4e95-89b2-afcc3d0199a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "vif_mac": "fa:16:3e:b8:65:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.264 222021 DEBUG nova.network.os_vif_util [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converting VIF {"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "vif_mac": "fa:16:3e:b8:65:5d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.266 222021 DEBUG nova.network.os_vif_util [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:65:5d,bridge_name='br-int',has_traffic_filtering=True,id=b7f30c18-45b6-4931-8d26-193df386ae94,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f30c18-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.268 222021 DEBUG nova.objects.instance [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'pci_devices' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.299 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <uuid>cca30801-d289-4e95-89b2-afcc3d0199a7</uuid>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <name>instance-0000008b</name>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1133977145</nova:name>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:15:29</nova:creationTime>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <nova:user uuid="fae914e59ec54f6b80928ef3cc68dbdb">tempest-ServerRescueNegativeTestJSON-87224704-project-member</nova:user>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <nova:project uuid="0a6ba16c4b9d49d3bc24cd7b44935d1f">tempest-ServerRescueNegativeTestJSON-87224704</nova:project>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <nova:port uuid="b7f30c18-45b6-4931-8d26-193df386ae94">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <entry name="serial">cca30801-d289-4e95-89b2-afcc3d0199a7</entry>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <entry name="uuid">cca30801-d289-4e95-89b2-afcc3d0199a7</entry>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/cca30801-d289-4e95-89b2-afcc3d0199a7_disk.rescue">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/cca30801-d289-4e95-89b2-afcc3d0199a7_disk">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <target dev="vdb" bus="virtio"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config.rescue">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:b8:65:5d"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <target dev="tapb7f30c18-45"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/console.log" append="off"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:15:31 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:15:31 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:15:31 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:15:31 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.310 222021 INFO nova.virt.libvirt.driver [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance destroyed successfully.#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.385 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.386 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.386 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.386 222021 DEBUG nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No VIF found with MAC fa:16:3e:b8:65:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.387 222021 INFO nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Using config drive#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.426 222021 DEBUG nova.storage.rbd_utils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.474 222021 DEBUG nova.objects.instance [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'ec2_ids' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:31 np0005593233 nova_compute[222017]: 2026-01-23 10:15:31.535 222021 DEBUG nova.objects.instance [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'keypairs' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:32 np0005593233 nova_compute[222017]: 2026-01-23 10:15:32.500 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:32 np0005593233 nova_compute[222017]: 2026-01-23 10:15:32.511 222021 INFO nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Creating config drive at /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config.rescue#033[00m
Jan 23 05:15:32 np0005593233 nova_compute[222017]: 2026-01-23 10:15:32.519 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpks9gsyey execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:32 np0005593233 nova_compute[222017]: 2026-01-23 10:15:32.673 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpks9gsyey" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:32 np0005593233 nova_compute[222017]: 2026-01-23 10:15:32.708 222021 DEBUG nova.storage.rbd_utils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:15:32 np0005593233 nova_compute[222017]: 2026-01-23 10:15:32.712 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config.rescue cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:32 np0005593233 nova_compute[222017]: 2026-01-23 10:15:32.901 222021 DEBUG oslo_concurrency.processutils [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config.rescue cca30801-d289-4e95-89b2-afcc3d0199a7_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.189s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:32 np0005593233 nova_compute[222017]: 2026-01-23 10:15:32.902 222021 INFO nova.virt.libvirt.driver [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Deleting local config drive /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7/disk.config.rescue because it was imported into RBD.#033[00m
Jan 23 05:15:32 np0005593233 kernel: tapb7f30c18-45: entered promiscuous mode
Jan 23 05:15:32 np0005593233 NetworkManager[48871]: <info>  [1769163332.9869] manager: (tapb7f30c18-45): new Tun device (/org/freedesktop/NetworkManager/Devices/286)
Jan 23 05:15:32 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:32Z|00604|binding|INFO|Claiming lport b7f30c18-45b6-4931-8d26-193df386ae94 for this chassis.
Jan 23 05:15:32 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:32Z|00605|binding|INFO|b7f30c18-45b6-4931-8d26-193df386ae94: Claiming fa:16:3e:b8:65:5d 10.100.0.13
Jan 23 05:15:32 np0005593233 nova_compute[222017]: 2026-01-23 10:15:32.987 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:32.999 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:65:5d 10.100.0.13'], port_security=['fa:16:3e:b8:65:5d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cca30801-d289-4e95-89b2-afcc3d0199a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd87239e3-bcf1-4a1e-b5bc-c1125963e6fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=b7f30c18-45b6-4931-8d26-193df386ae94) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.001 140224 INFO neutron.agent.ovn.metadata.agent [-] Port b7f30c18-45b6-4931-8d26-193df386ae94 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 bound to our chassis#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.003 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9#033[00m
Jan 23 05:15:33 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:33Z|00606|binding|INFO|Setting lport b7f30c18-45b6-4931-8d26-193df386ae94 ovn-installed in OVS
Jan 23 05:15:33 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:33Z|00607|binding|INFO|Setting lport b7f30c18-45b6-4931-8d26-193df386ae94 up in Southbound
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.012 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:33 np0005593233 systemd-udevd[276983]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.025 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f10ff4-0669-46f8-9476-5539c516028f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.027 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00bd3319-b1 in ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.031 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00bd3319-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.031 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b009fd-aa2b-481d-a9a9-2dd71354f379]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.032 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[46d6241e-59d4-4649-876b-f3a83d94d1bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 systemd-machined[190954]: New machine qemu-66-instance-0000008b.
Jan 23 05:15:33 np0005593233 NetworkManager[48871]: <info>  [1769163333.0453] device (tapb7f30c18-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:15:33 np0005593233 NetworkManager[48871]: <info>  [1769163333.0461] device (tapb7f30c18-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:15:33 np0005593233 systemd[1]: Started Virtual Machine qemu-66-instance-0000008b.
Jan 23 05:15:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:33.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.052 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[888772f8-f8ee-404a-b44e-7d6cf031448c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.079 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1530ba9f-378a-4d08-bede-c1c5987e3731]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:33.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.136 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6c933394-137c-47d5-ac89-df048bd2cbcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 NetworkManager[48871]: <info>  [1769163333.1435] manager: (tap00bd3319-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/287)
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.142 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f97b946d-6502-483f-84d1-136618551bf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.190 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fb871180-e494-4096-ae92-131ad4b21afa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.197 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[51b042c6-1149-4099-9bf0-b5594cff3949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 NetworkManager[48871]: <info>  [1769163333.2328] device (tap00bd3319-b0): carrier: link connected
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.244 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[4c792d40-20b1-4b22-b917-993fd9d2b098]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.264 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[02d52f25-79ba-4c93-b9c9-6c8a508e7eb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724586, 'reachable_time': 37325, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277016, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.279 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f337af0d-19b1-4f1a-8b84-8acef1f59d84]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:83f8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 724586, 'tstamp': 724586}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277017, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.294 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[76680e3f-d544-448a-ab27-7fad7fa5e0e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724586, 'reachable_time': 37325, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 277018, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.339 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1b552cc5-d88a-4c78-9cf4-2ce1aa720859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.418 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1d43dcf0-9efe-4e3e-9c59-c823decfbd45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.420 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.420 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.421 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00bd3319-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:33 np0005593233 NetworkManager[48871]: <info>  [1769163333.4242] manager: (tap00bd3319-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Jan 23 05:15:33 np0005593233 kernel: tap00bd3319-b0: entered promiscuous mode
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.428 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00bd3319-b0, col_values=(('external_ids', {'iface-id': '1788b5e6-601b-4e3d-a584-c0138c3308f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.428 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:33 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:33Z|00608|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.454 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.456 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.457 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee91f17-156e-45da-a530-1f00ac586ca9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.458 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:15:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:33.459 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'env', 'PROCESS_TAG=haproxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00bd3319-bfe5-4acd-b2e4-17830ee847f9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.705 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for cca30801-d289-4e95-89b2-afcc3d0199a7 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.706 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163333.7042534, cca30801-d289-4e95-89b2-afcc3d0199a7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.706 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.715 222021 DEBUG nova.compute.manager [None req-a8ccbae1-ed68-416c-a171-374b1e3bb2a2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.755 222021 DEBUG nova.compute.manager [req-a73b2bd3-e02e-468d-ba10-62f49ad4cd1d req-31de6068-8b97-4d06-9e22-a27e9df85a79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.755 222021 DEBUG oslo_concurrency.lockutils [req-a73b2bd3-e02e-468d-ba10-62f49ad4cd1d req-31de6068-8b97-4d06-9e22-a27e9df85a79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.756 222021 DEBUG oslo_concurrency.lockutils [req-a73b2bd3-e02e-468d-ba10-62f49ad4cd1d req-31de6068-8b97-4d06-9e22-a27e9df85a79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.756 222021 DEBUG oslo_concurrency.lockutils [req-a73b2bd3-e02e-468d-ba10-62f49ad4cd1d req-31de6068-8b97-4d06-9e22-a27e9df85a79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.757 222021 DEBUG nova.compute.manager [req-a73b2bd3-e02e-468d-ba10-62f49ad4cd1d req-31de6068-8b97-4d06-9e22-a27e9df85a79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.757 222021 WARNING nova.compute.manager [req-a73b2bd3-e02e-468d-ba10-62f49ad4cd1d req-31de6068-8b97-4d06-9e22-a27e9df85a79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.786 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.791 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.819 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.820 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163333.704422, cca30801-d289-4e95-89b2-afcc3d0199a7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.820 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] VM Started (Lifecycle Event)#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.867 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:33 np0005593233 nova_compute[222017]: 2026-01-23 10:15:33.873 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:15:33 np0005593233 podman[277110]: 2026-01-23 10:15:33.893783557 +0000 UTC m=+0.061305315 container create 697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:15:33 np0005593233 systemd[1]: Started libpod-conmon-697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e.scope.
Jan 23 05:15:33 np0005593233 podman[277110]: 2026-01-23 10:15:33.860072954 +0000 UTC m=+0.027594722 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:15:33 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:15:33 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d70a2fbd16e9ecf4afebdaba8a8c25d227637a9ee9f07aa5db126112d89ee5e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:15:33 np0005593233 podman[277110]: 2026-01-23 10:15:33.995668559 +0000 UTC m=+0.163190387 container init 697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:15:34 np0005593233 podman[277110]: 2026-01-23 10:15:34.004041696 +0000 UTC m=+0.171563474 container start 697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:15:34 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277126]: [NOTICE]   (277130) : New worker (277132) forked
Jan 23 05:15:34 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277126]: [NOTICE]   (277130) : Loading success.
Jan 23 05:15:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:35.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:35.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:35 np0005593233 nova_compute[222017]: 2026-01-23 10:15:35.287 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:36 np0005593233 nova_compute[222017]: 2026-01-23 10:15:36.342 222021 DEBUG nova.compute.manager [req-b1140705-6391-43a3-a00f-a762685d2e60 req-be9caf29-b131-4d2c-a2da-4e3cb2027255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:36 np0005593233 nova_compute[222017]: 2026-01-23 10:15:36.344 222021 DEBUG oslo_concurrency.lockutils [req-b1140705-6391-43a3-a00f-a762685d2e60 req-be9caf29-b131-4d2c-a2da-4e3cb2027255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:36 np0005593233 nova_compute[222017]: 2026-01-23 10:15:36.344 222021 DEBUG oslo_concurrency.lockutils [req-b1140705-6391-43a3-a00f-a762685d2e60 req-be9caf29-b131-4d2c-a2da-4e3cb2027255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:36 np0005593233 nova_compute[222017]: 2026-01-23 10:15:36.344 222021 DEBUG oslo_concurrency.lockutils [req-b1140705-6391-43a3-a00f-a762685d2e60 req-be9caf29-b131-4d2c-a2da-4e3cb2027255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:36 np0005593233 nova_compute[222017]: 2026-01-23 10:15:36.344 222021 DEBUG nova.compute.manager [req-b1140705-6391-43a3-a00f-a762685d2e60 req-be9caf29-b131-4d2c-a2da-4e3cb2027255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:36 np0005593233 nova_compute[222017]: 2026-01-23 10:15:36.345 222021 WARNING nova.compute.manager [req-b1140705-6391-43a3-a00f-a762685d2e60 req-be9caf29-b131-4d2c-a2da-4e3cb2027255 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state rescued and task_state None.#033[00m
Jan 23 05:15:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:37.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:37.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:37 np0005593233 nova_compute[222017]: 2026-01-23 10:15:37.567 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:38 np0005593233 nova_compute[222017]: 2026-01-23 10:15:38.289 222021 INFO nova.compute.manager [None req-c3072c03-f392-4b0c-8a37-d490cd00596c fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Unrescuing#033[00m
Jan 23 05:15:38 np0005593233 nova_compute[222017]: 2026-01-23 10:15:38.290 222021 DEBUG oslo_concurrency.lockutils [None req-c3072c03-f392-4b0c-8a37-d490cd00596c fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:38 np0005593233 nova_compute[222017]: 2026-01-23 10:15:38.290 222021 DEBUG oslo_concurrency.lockutils [None req-c3072c03-f392-4b0c-8a37-d490cd00596c fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquired lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:38 np0005593233 nova_compute[222017]: 2026-01-23 10:15:38.290 222021 DEBUG nova.network.neutron [None req-c3072c03-f392-4b0c-8a37-d490cd00596c fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:15:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:39.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:39 np0005593233 podman[277141]: 2026-01-23 10:15:39.086339948 +0000 UTC m=+0.091696505 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Jan 23 05:15:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:39.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:40 np0005593233 nova_compute[222017]: 2026-01-23 10:15:40.289 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:41.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:41.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:41 np0005593233 nova_compute[222017]: 2026-01-23 10:15:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:42 np0005593233 nova_compute[222017]: 2026-01-23 10:15:42.299 222021 DEBUG nova.network.neutron [None req-c3072c03-f392-4b0c-8a37-d490cd00596c fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:42 np0005593233 nova_compute[222017]: 2026-01-23 10:15:42.346 222021 DEBUG oslo_concurrency.lockutils [None req-c3072c03-f392-4b0c-8a37-d490cd00596c fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Releasing lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:42 np0005593233 nova_compute[222017]: 2026-01-23 10:15:42.348 222021 DEBUG nova.objects.instance [None req-c3072c03-f392-4b0c-8a37-d490cd00596c fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'flavor' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:42 np0005593233 nova_compute[222017]: 2026-01-23 10:15:42.569 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:42.678 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:42.679 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:42.680 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:43 np0005593233 kernel: tapb7f30c18-45 (unregistering): left promiscuous mode
Jan 23 05:15:43 np0005593233 NetworkManager[48871]: <info>  [1769163343.0093] device (tapb7f30c18-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:15:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:43Z|00609|binding|INFO|Releasing lport b7f30c18-45b6-4931-8d26-193df386ae94 from this chassis (sb_readonly=0)
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.025 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:43Z|00610|binding|INFO|Setting lport b7f30c18-45b6-4931-8d26-193df386ae94 down in Southbound
Jan 23 05:15:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:43Z|00611|binding|INFO|Removing iface tapb7f30c18-45 ovn-installed in OVS
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.029 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.042 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:43.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:43 np0005593233 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Jan 23 05:15:43 np0005593233 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008b.scope: Consumed 9.895s CPU time.
Jan 23 05:15:43 np0005593233 systemd-machined[190954]: Machine qemu-66-instance-0000008b terminated.
Jan 23 05:15:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:43.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.188 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:65:5d 10.100.0.13'], port_security=['fa:16:3e:b8:65:5d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cca30801-d289-4e95-89b2-afcc3d0199a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'd87239e3-bcf1-4a1e-b5bc-c1125963e6fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.191', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=b7f30c18-45b6-4931-8d26-193df386ae94) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.190 140224 INFO neutron.agent.ovn.metadata.agent [-] Port b7f30c18-45b6-4931-8d26-193df386ae94 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 unbound from our chassis#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.191 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.192 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dc29c7cf-f92d-4913-916c-f2465f6029f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.193 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace which is not needed anymore#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.285 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.290 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.300 222021 INFO nova.virt.libvirt.driver [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance destroyed successfully.#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.301 222021 DEBUG nova.objects.instance [None req-c3072c03-f392-4b0c-8a37-d490cd00596c fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'numa_topology' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:43 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277126]: [NOTICE]   (277130) : haproxy version is 2.8.14-c23fe91
Jan 23 05:15:43 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277126]: [NOTICE]   (277130) : path to executable is /usr/sbin/haproxy
Jan 23 05:15:43 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277126]: [WARNING]  (277130) : Exiting Master process...
Jan 23 05:15:43 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277126]: [WARNING]  (277130) : Exiting Master process...
Jan 23 05:15:43 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277126]: [ALERT]    (277130) : Current worker (277132) exited with code 143 (Terminated)
Jan 23 05:15:43 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277126]: [WARNING]  (277130) : All workers exited. Exiting... (0)
Jan 23 05:15:43 np0005593233 systemd[1]: libpod-697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e.scope: Deactivated successfully.
Jan 23 05:15:43 np0005593233 podman[277193]: 2026-01-23 10:15:43.353200916 +0000 UTC m=+0.054157453 container died 697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:15:43 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e-userdata-shm.mount: Deactivated successfully.
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:43 np0005593233 systemd[1]: var-lib-containers-storage-overlay-4d70a2fbd16e9ecf4afebdaba8a8c25d227637a9ee9f07aa5db126112d89ee5e-merged.mount: Deactivated successfully.
Jan 23 05:15:43 np0005593233 podman[277193]: 2026-01-23 10:15:43.401056539 +0000 UTC m=+0.102013076 container cleanup 697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:15:43 np0005593233 systemd[1]: libpod-conmon-697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e.scope: Deactivated successfully.
Jan 23 05:15:43 np0005593233 kernel: tapb7f30c18-45: entered promiscuous mode
Jan 23 05:15:43 np0005593233 systemd-udevd[277170]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:15:43 np0005593233 NetworkManager[48871]: <info>  [1769163343.4567] manager: (tapb7f30c18-45): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.456 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.456 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.457 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.457 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.457 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:43Z|00612|binding|INFO|Claiming lport b7f30c18-45b6-4931-8d26-193df386ae94 for this chassis.
Jan 23 05:15:43 np0005593233 NetworkManager[48871]: <info>  [1769163343.4780] device (tapb7f30c18-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:15:43 np0005593233 NetworkManager[48871]: <info>  [1769163343.4787] device (tapb7f30c18-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:15:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:43Z|00613|binding|INFO|b7f30c18-45b6-4931-8d26-193df386ae94: Claiming fa:16:3e:b8:65:5d 10.100.0.13
Jan 23 05:15:43 np0005593233 podman[277231]: 2026-01-23 10:15:43.499547095 +0000 UTC m=+0.067362416 container remove 697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:15:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:43Z|00614|binding|INFO|Setting lport b7f30c18-45b6-4931-8d26-193df386ae94 ovn-installed in OVS
Jan 23 05:15:43 np0005593233 systemd-machined[190954]: New machine qemu-67-instance-0000008b.
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.507 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.507 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f079765c-57fb-44e0-9702-d3a4f6f3bdb2]: (4, ('Fri Jan 23 10:15:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e)\n697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e\nFri Jan 23 10:15:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e)\n697e4116b0ea2c728cac30851630b5ead0328d0b9787dbc54f2695113dcdbd7e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.509 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[81211ffc-a3fd-4e24-a98c-fb03b0cf5aed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.510 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.512 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:43 np0005593233 kernel: tap00bd3319-b0: left promiscuous mode
Jan 23 05:15:43 np0005593233 systemd[1]: Started Virtual Machine qemu-67-instance-0000008b.
Jan 23 05:15:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:43Z|00615|binding|INFO|Setting lport b7f30c18-45b6-4931-8d26-193df386ae94 up in Southbound
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.537 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.537 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:65:5d 10.100.0.13'], port_security=['fa:16:3e:b8:65:5d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cca30801-d289-4e95-89b2-afcc3d0199a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'd87239e3-bcf1-4a1e-b5bc-c1125963e6fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.191', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=b7f30c18-45b6-4931-8d26-193df386ae94) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.543 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0d5874-bc61-43e5-ae4c-a0debbb4f784]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.563 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b7df91-c8a1-4378-9915-45c2453b9d10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.564 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0aec88da-1aef-4fd2-8e54-713773b5a8cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.586 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d78200-83e6-42e4-b58d-06b7a6edb697]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 724575, 'reachable_time': 15770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277261, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.590 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.590 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[41bedaa5-2bb0-47d2-9cb0-d6ecec4319f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.590 140224 INFO neutron.agent.ovn.metadata.agent [-] Port b7f30c18-45b6-4931-8d26-193df386ae94 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 unbound from our chassis#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.592 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9#033[00m
Jan 23 05:15:43 np0005593233 systemd[1]: run-netns-ovnmeta\x2d00bd3319\x2dbfe5\x2d4acd\x2db2e4\x2d17830ee847f9.mount: Deactivated successfully.
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.608 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[523f849f-d27c-40ff-81e2-c7a1b6feeee7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.609 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00bd3319-b1 in ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.611 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00bd3319-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.612 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5d2a73e1-e77a-43e3-bd02-48cf6766eecd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.612 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b653c76f-158e-489b-901f-fd43d9a9b55a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.627 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[9436e88d-6225-44e1-9431-604e414f8a7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.662 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1d8d1339-2ec4-4fe2-b16e-cead1321a1c6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.707 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[775829cd-5ebc-4d74-9f90-c36fb88a15a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 NetworkManager[48871]: <info>  [1769163343.7150] manager: (tap00bd3319-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.716 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e2ac18-1d43-4842-bbd0-817c7024107f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.782 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8af169a3-9fdc-45ed-bcf9-524cc5c6ca56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.787 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[694935bf-7fec-4db8-b9fd-05e7cf1955ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 NetworkManager[48871]: <info>  [1769163343.8180] device (tap00bd3319-b0): carrier: link connected
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.825 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[98b6d148-d9f8-40ec-b845-88434199ce68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.848 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3466eb43-dcb5-4f39-9210-e534ae48d94c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 725644, 'reachable_time': 23085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277339, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.871 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3972b635-998d-4a3b-888a-3b06d0dc4db2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:83f8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 725644, 'tstamp': 725644}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277343, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.899 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6a083e36-9d4b-4958-a766-743d2163ab7b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 725644, 'reachable_time': 23085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 277344, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:15:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/626470684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:15:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:43.943 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9083d15a-2a08-48df-bc14-1ba5dcf992ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:43 np0005593233 nova_compute[222017]: 2026-01-23 10:15:43.967 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:44.020 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0a171537-81cd-46ad-bb2c-67021ecb0240]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:44.021 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:44.021 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:44.022 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00bd3319-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:44 np0005593233 NetworkManager[48871]: <info>  [1769163344.0244] manager: (tap00bd3319-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 23 05:15:44 np0005593233 kernel: tap00bd3319-b0: entered promiscuous mode
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.023 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:44.028 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00bd3319-b0, col_values=(('external_ids', {'iface-id': '1788b5e6-601b-4e3d-a584-c0138c3308f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:44 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:44Z|00616|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.030 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.031 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:44.031 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:44.032 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eb524ccc-98ab-42cd-86c8-35bd38f002c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:44.033 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:15:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:15:44.034 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'env', 'PROCESS_TAG=haproxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00bd3319-bfe5-4acd-b2e4-17830ee847f9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.044 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.068 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.068 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.075 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.076 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.076 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.084 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for cca30801-d289-4e95-89b2-afcc3d0199a7 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.084 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163344.0834332, cca30801-d289-4e95-89b2-afcc3d0199a7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.085 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.118 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.123 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.151 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.151 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163344.0846565, cca30801-d289-4e95-89b2-afcc3d0199a7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.152 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] VM Started (Lifecycle Event)#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.190 222021 DEBUG nova.compute.manager [req-5687dc66-b8bf-4c82-b6da-09cef36d0a7d req-e047e51c-5ddc-44bc-9838-01c1c85adc58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-unplugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.190 222021 DEBUG oslo_concurrency.lockutils [req-5687dc66-b8bf-4c82-b6da-09cef36d0a7d req-e047e51c-5ddc-44bc-9838-01c1c85adc58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.190 222021 DEBUG oslo_concurrency.lockutils [req-5687dc66-b8bf-4c82-b6da-09cef36d0a7d req-e047e51c-5ddc-44bc-9838-01c1c85adc58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.191 222021 DEBUG oslo_concurrency.lockutils [req-5687dc66-b8bf-4c82-b6da-09cef36d0a7d req-e047e51c-5ddc-44bc-9838-01c1c85adc58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.191 222021 DEBUG nova.compute.manager [req-5687dc66-b8bf-4c82-b6da-09cef36d0a7d req-e047e51c-5ddc-44bc-9838-01c1c85adc58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-unplugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.191 222021 WARNING nova.compute.manager [req-5687dc66-b8bf-4c82-b6da-09cef36d0a7d req-e047e51c-5ddc-44bc-9838-01c1c85adc58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-unplugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.192 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.196 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.227 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.338 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.339 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4132MB free_disk=20.788330078125GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.339 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.339 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:44 np0005593233 podman[277419]: 2026-01-23 10:15:44.456243846 +0000 UTC m=+0.066071810 container create 43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.503 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance cca30801-d289-4e95-89b2-afcc3d0199a7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.503 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.504 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.504 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:15:44 np0005593233 systemd[1]: Started libpod-conmon-43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d.scope.
Jan 23 05:15:44 np0005593233 podman[277419]: 2026-01-23 10:15:44.421450991 +0000 UTC m=+0.031278975 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:15:44 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:15:44 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/199e49bdf6398893f526be7c1a163339ce01fda05a6c58f981f1b3d945cbbfa4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:15:44 np0005593233 podman[277419]: 2026-01-23 10:15:44.541935339 +0000 UTC m=+0.151763323 container init 43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:15:44 np0005593233 podman[277419]: 2026-01-23 10:15:44.547895398 +0000 UTC m=+0.157723362 container start 43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 05:15:44 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277434]: [NOTICE]   (277438) : New worker (277440) forked
Jan 23 05:15:44 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277434]: [NOTICE]   (277438) : Loading success.
Jan 23 05:15:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:15:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4233451367' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:15:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:15:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4233451367' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:15:44 np0005593233 nova_compute[222017]: 2026-01-23 10:15:44.672 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:45Z|00617|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:15:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:45Z|00618|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:15:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:45.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:45 np0005593233 nova_compute[222017]: 2026-01-23 10:15:45.072 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:45 np0005593233 nova_compute[222017]: 2026-01-23 10:15:45.150 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:45.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:45 np0005593233 nova_compute[222017]: 2026-01-23 10:15:45.158 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:15:45 np0005593233 nova_compute[222017]: 2026-01-23 10:15:45.190 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:15:45 np0005593233 nova_compute[222017]: 2026-01-23 10:15:45.269 222021 DEBUG nova.compute.manager [None req-c3072c03-f392-4b0c-8a37-d490cd00596c fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:45 np0005593233 nova_compute[222017]: 2026-01-23 10:15:45.291 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:45 np0005593233 nova_compute[222017]: 2026-01-23 10:15:45.344 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:15:45 np0005593233 nova_compute[222017]: 2026-01-23 10:15:45.344 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.329 222021 DEBUG nova.compute.manager [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.330 222021 DEBUG oslo_concurrency.lockutils [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.330 222021 DEBUG oslo_concurrency.lockutils [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.330 222021 DEBUG oslo_concurrency.lockutils [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.330 222021 DEBUG nova.compute.manager [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.330 222021 WARNING nova.compute.manager [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.331 222021 DEBUG nova.compute.manager [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.331 222021 DEBUG oslo_concurrency.lockutils [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.331 222021 DEBUG oslo_concurrency.lockutils [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.331 222021 DEBUG oslo_concurrency.lockutils [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.331 222021 DEBUG nova.compute.manager [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.331 222021 WARNING nova.compute.manager [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.331 222021 DEBUG nova.compute.manager [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.332 222021 DEBUG oslo_concurrency.lockutils [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.332 222021 DEBUG oslo_concurrency.lockutils [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.332 222021 DEBUG oslo_concurrency.lockutils [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.332 222021 DEBUG nova.compute.manager [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:46 np0005593233 nova_compute[222017]: 2026-01-23 10:15:46.332 222021 WARNING nova.compute.manager [req-e80d15ae-d2ca-40e5-96b8-f68ee6e70d50 req-5144d90f-a519-444a-82b0-a27f8bf5392e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:15:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #121. Immutable memtables: 0.
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.004828) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 75] Flushing memtable with next log file: 121
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347005019, "job": 75, "event": "flush_started", "num_memtables": 1, "num_entries": 2226, "num_deletes": 256, "total_data_size": 5089161, "memory_usage": 5159808, "flush_reason": "Manual Compaction"}
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 75] Level-0 flush table #122: started
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347022892, "cf_name": "default", "job": 75, "event": "table_file_creation", "file_number": 122, "file_size": 2083545, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 59678, "largest_seqno": 61899, "table_properties": {"data_size": 2076516, "index_size": 3719, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 18820, "raw_average_key_size": 21, "raw_value_size": 2060815, "raw_average_value_size": 2357, "num_data_blocks": 163, "num_entries": 874, "num_filter_entries": 874, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163170, "oldest_key_time": 1769163170, "file_creation_time": 1769163347, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 122, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 75] Flush lasted 18133 microseconds, and 10549 cpu microseconds.
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.022983) [db/flush_job.cc:967] [default] [JOB 75] Level-0 flush table #122: 2083545 bytes OK
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.023019) [db/memtable_list.cc:519] [default] Level-0 commit table #122 started
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.025059) [db/memtable_list.cc:722] [default] Level-0 commit table #122: memtable #1 done
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.025082) EVENT_LOG_v1 {"time_micros": 1769163347025074, "job": 75, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.025108) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 75] Try to delete WAL files size 5079035, prev total WAL file size 5079035, number of live WAL files 2.
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000118.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.026944) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032303036' seq:72057594037927935, type:22 .. '6D6772737461740032323539' seq:0, type:0; will stop at (end)
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 76] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 75 Base level 0, inputs: [122(2034KB)], [120(11MB)]
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347027003, "job": 76, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [122], "files_L6": [120], "score": -1, "input_data_size": 14211950, "oldest_snapshot_seqno": -1}
Jan 23 05:15:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:47.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 76] Generated table #123: 8591 keys, 11575020 bytes, temperature: kUnknown
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347118437, "cf_name": "default", "job": 76, "event": "table_file_creation", "file_number": 123, "file_size": 11575020, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11519953, "index_size": 32491, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21509, "raw_key_size": 221685, "raw_average_key_size": 25, "raw_value_size": 11369519, "raw_average_value_size": 1323, "num_data_blocks": 1276, "num_entries": 8591, "num_filter_entries": 8591, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163347, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 123, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.119240) [db/compaction/compaction_job.cc:1663] [default] [JOB 76] Compacted 1@0 + 1@6 files to L6 => 11575020 bytes
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.127542) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 155.2 rd, 126.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.6 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(12.4) write-amplify(5.6) OK, records in: 9043, records dropped: 452 output_compression: NoCompression
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.127595) EVENT_LOG_v1 {"time_micros": 1769163347127578, "job": 76, "event": "compaction_finished", "compaction_time_micros": 91546, "compaction_time_cpu_micros": 43023, "output_level": 6, "num_output_files": 1, "total_output_size": 11575020, "num_input_records": 9043, "num_output_records": 8591, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000122.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347128193, "job": 76, "event": "table_file_deletion", "file_number": 122}
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000120.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347130793, "job": 76, "event": "table_file_deletion", "file_number": 120}
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.026787) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.130889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.130899) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.130900) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.130902) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:15:47.130904) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:47.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:47 np0005593233 nova_compute[222017]: 2026-01-23 10:15:47.345 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:47 np0005593233 nova_compute[222017]: 2026-01-23 10:15:47.345 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:47 np0005593233 nova_compute[222017]: 2026-01-23 10:15:47.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:47 np0005593233 nova_compute[222017]: 2026-01-23 10:15:47.592 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:48 np0005593233 nova_compute[222017]: 2026-01-23 10:15:48.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:48 np0005593233 nova_compute[222017]: 2026-01-23 10:15:48.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:15:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:49.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:49.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:50 np0005593233 nova_compute[222017]: 2026-01-23 10:15:50.291 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:50 np0005593233 nova_compute[222017]: 2026-01-23 10:15:50.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:50 np0005593233 nova_compute[222017]: 2026-01-23 10:15:50.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:15:50 np0005593233 nova_compute[222017]: 2026-01-23 10:15:50.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:15:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:51.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:51 np0005593233 nova_compute[222017]: 2026-01-23 10:15:51.111 222021 DEBUG nova.compute.manager [req-c6c36d3d-b045-46a1-9d1a-193978d3b66c req-56a80d00-63a5-480c-8d7a-b5b4e4c8ecf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-changed-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:51 np0005593233 nova_compute[222017]: 2026-01-23 10:15:51.111 222021 DEBUG nova.compute.manager [req-c6c36d3d-b045-46a1-9d1a-193978d3b66c req-56a80d00-63a5-480c-8d7a-b5b4e4c8ecf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Refreshing instance network info cache due to event network-changed-b7f30c18-45b6-4931-8d26-193df386ae94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:15:51 np0005593233 nova_compute[222017]: 2026-01-23 10:15:51.111 222021 DEBUG oslo_concurrency.lockutils [req-c6c36d3d-b045-46a1-9d1a-193978d3b66c req-56a80d00-63a5-480c-8d7a-b5b4e4c8ecf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:51 np0005593233 nova_compute[222017]: 2026-01-23 10:15:51.112 222021 DEBUG oslo_concurrency.lockutils [req-c6c36d3d-b045-46a1-9d1a-193978d3b66c req-56a80d00-63a5-480c-8d7a-b5b4e4c8ecf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:51 np0005593233 nova_compute[222017]: 2026-01-23 10:15:51.112 222021 DEBUG nova.network.neutron [req-c6c36d3d-b045-46a1-9d1a-193978d3b66c req-56a80d00-63a5-480c-8d7a-b5b4e4c8ecf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Refreshing network info cache for port b7f30c18-45b6-4931-8d26-193df386ae94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:15:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:51.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:51 np0005593233 nova_compute[222017]: 2026-01-23 10:15:51.463 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:52 np0005593233 nova_compute[222017]: 2026-01-23 10:15:52.596 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:53.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:53.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:53 np0005593233 nova_compute[222017]: 2026-01-23 10:15:53.515 222021 DEBUG nova.compute.manager [req-74cd321e-d56f-4f4b-8fcb-b7e7690254d7 req-da2d1f71-33a2-480c-aea6-f6f579a8b420 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-changed-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:53 np0005593233 nova_compute[222017]: 2026-01-23 10:15:53.516 222021 DEBUG nova.compute.manager [req-74cd321e-d56f-4f4b-8fcb-b7e7690254d7 req-da2d1f71-33a2-480c-aea6-f6f579a8b420 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Refreshing instance network info cache due to event network-changed-b7f30c18-45b6-4931-8d26-193df386ae94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:15:53 np0005593233 nova_compute[222017]: 2026-01-23 10:15:53.516 222021 DEBUG oslo_concurrency.lockutils [req-74cd321e-d56f-4f4b-8fcb-b7e7690254d7 req-da2d1f71-33a2-480c-aea6-f6f579a8b420 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:54 np0005593233 podman[277472]: 2026-01-23 10:15:54.079976061 +0000 UTC m=+0.078521462 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 05:15:54 np0005593233 nova_compute[222017]: 2026-01-23 10:15:54.235 222021 DEBUG nova.network.neutron [req-c6c36d3d-b045-46a1-9d1a-193978d3b66c req-56a80d00-63a5-480c-8d7a-b5b4e4c8ecf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updated VIF entry in instance network info cache for port b7f30c18-45b6-4931-8d26-193df386ae94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:15:54 np0005593233 nova_compute[222017]: 2026-01-23 10:15:54.236 222021 DEBUG nova.network.neutron [req-c6c36d3d-b045-46a1-9d1a-193978d3b66c req-56a80d00-63a5-480c-8d7a-b5b4e4c8ecf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:54 np0005593233 nova_compute[222017]: 2026-01-23 10:15:54.267 222021 DEBUG oslo_concurrency.lockutils [req-c6c36d3d-b045-46a1-9d1a-193978d3b66c req-56a80d00-63a5-480c-8d7a-b5b4e4c8ecf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:54 np0005593233 nova_compute[222017]: 2026-01-23 10:15:54.268 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:54 np0005593233 nova_compute[222017]: 2026-01-23 10:15:54.268 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:15:54 np0005593233 nova_compute[222017]: 2026-01-23 10:15:54.268 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:55.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:15:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:55.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:15:55 np0005593233 nova_compute[222017]: 2026-01-23 10:15:55.293 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:15:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1838503530' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:15:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:15:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1838503530' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:15:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:57.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:57.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:57 np0005593233 nova_compute[222017]: 2026-01-23 10:15:57.641 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:15:58Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:65:5d 10.100.0.13
Jan 23 05:15:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:59.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:15:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:15:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:59.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:15:59 np0005593233 nova_compute[222017]: 2026-01-23 10:15:59.588 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:59 np0005593233 nova_compute[222017]: 2026-01-23 10:15:59.643 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:59 np0005593233 nova_compute[222017]: 2026-01-23 10:15:59.644 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:15:59 np0005593233 nova_compute[222017]: 2026-01-23 10:15:59.644 222021 DEBUG oslo_concurrency.lockutils [req-74cd321e-d56f-4f4b-8fcb-b7e7690254d7 req-da2d1f71-33a2-480c-aea6-f6f579a8b420 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:59 np0005593233 nova_compute[222017]: 2026-01-23 10:15:59.644 222021 DEBUG nova.network.neutron [req-74cd321e-d56f-4f4b-8fcb-b7e7690254d7 req-da2d1f71-33a2-480c-aea6-f6f579a8b420 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Refreshing network info cache for port b7f30c18-45b6-4931-8d26-193df386ae94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:16:00 np0005593233 nova_compute[222017]: 2026-01-23 10:16:00.296 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:01.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:01.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:02 np0005593233 nova_compute[222017]: 2026-01-23 10:16:02.672 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:03.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:03.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:03 np0005593233 nova_compute[222017]: 2026-01-23 10:16:03.640 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:03 np0005593233 nova_compute[222017]: 2026-01-23 10:16:03.862 222021 DEBUG nova.network.neutron [req-74cd321e-d56f-4f4b-8fcb-b7e7690254d7 req-da2d1f71-33a2-480c-aea6-f6f579a8b420 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updated VIF entry in instance network info cache for port b7f30c18-45b6-4931-8d26-193df386ae94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:16:03 np0005593233 nova_compute[222017]: 2026-01-23 10:16:03.863 222021 DEBUG nova.network.neutron [req-74cd321e-d56f-4f4b-8fcb-b7e7690254d7 req-da2d1f71-33a2-480c-aea6-f6f579a8b420 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [{"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:16:04 np0005593233 nova_compute[222017]: 2026-01-23 10:16:04.271 222021 DEBUG oslo_concurrency.lockutils [req-74cd321e-d56f-4f4b-8fcb-b7e7690254d7 req-da2d1f71-33a2-480c-aea6-f6f579a8b420 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-cca30801-d289-4e95-89b2-afcc3d0199a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:16:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:05.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:05.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:05 np0005593233 nova_compute[222017]: 2026-01-23 10:16:05.298 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:05 np0005593233 nova_compute[222017]: 2026-01-23 10:16:05.302 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #124. Immutable memtables: 0.
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:06.924886) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 77] Flushing memtable with next log file: 124
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163366924941, "job": 77, "event": "flush_started", "num_memtables": 1, "num_entries": 445, "num_deletes": 251, "total_data_size": 515422, "memory_usage": 525488, "flush_reason": "Manual Compaction"}
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 77] Level-0 flush table #125: started
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163366939116, "cf_name": "default", "job": 77, "event": "table_file_creation", "file_number": 125, "file_size": 339893, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 61904, "largest_seqno": 62344, "table_properties": {"data_size": 337452, "index_size": 541, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6041, "raw_average_key_size": 18, "raw_value_size": 332596, "raw_average_value_size": 1032, "num_data_blocks": 24, "num_entries": 322, "num_filter_entries": 322, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163348, "oldest_key_time": 1769163348, "file_creation_time": 1769163366, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 125, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 77] Flush lasted 14307 microseconds, and 2815 cpu microseconds.
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:06.939187) [db/flush_job.cc:967] [default] [JOB 77] Level-0 flush table #125: 339893 bytes OK
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:06.939220) [db/memtable_list.cc:519] [default] Level-0 commit table #125 started
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:06.949146) [db/memtable_list.cc:722] [default] Level-0 commit table #125: memtable #1 done
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:06.949210) EVENT_LOG_v1 {"time_micros": 1769163366949196, "job": 77, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:06.949247) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 77] Try to delete WAL files size 512644, prev total WAL file size 512644, number of live WAL files 2.
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000121.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:06.950010) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035323731' seq:72057594037927935, type:22 .. '7061786F730035353233' seq:0, type:0; will stop at (end)
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 78] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 77 Base level 0, inputs: [125(331KB)], [123(11MB)]
Jan 23 05:16:06 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163366950061, "job": 78, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [125], "files_L6": [123], "score": -1, "input_data_size": 11914913, "oldest_snapshot_seqno": -1}
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 78] Generated table #126: 8403 keys, 10025824 bytes, temperature: kUnknown
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163367049731, "cf_name": "default", "job": 78, "event": "table_file_creation", "file_number": 126, "file_size": 10025824, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9973387, "index_size": 30309, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21061, "raw_key_size": 218472, "raw_average_key_size": 25, "raw_value_size": 9827559, "raw_average_value_size": 1169, "num_data_blocks": 1177, "num_entries": 8403, "num_filter_entries": 8403, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163366, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 126, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:07.050671) [db/compaction/compaction_job.cc:1663] [default] [JOB 78] Compacted 1@0 + 1@6 files to L6 => 10025824 bytes
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:07.052618) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 119.2 rd, 100.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 11.0 +0.0 blob) out(9.6 +0.0 blob), read-write-amplify(64.6) write-amplify(29.5) OK, records in: 8913, records dropped: 510 output_compression: NoCompression
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:07.052664) EVENT_LOG_v1 {"time_micros": 1769163367052648, "job": 78, "event": "compaction_finished", "compaction_time_micros": 99971, "compaction_time_cpu_micros": 30233, "output_level": 6, "num_output_files": 1, "total_output_size": 10025824, "num_input_records": 8913, "num_output_records": 8403, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000125.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163367052939, "job": 78, "event": "table_file_deletion", "file_number": 125}
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000123.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163367055508, "job": 78, "event": "table_file_deletion", "file_number": 123}
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:06.949892) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:07.055619) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:07.055630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:07.055633) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:07.055634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:07.055636) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:07.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:07.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:07 np0005593233 nova_compute[222017]: 2026-01-23 10:16:07.610 222021 DEBUG oslo_concurrency.lockutils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:07 np0005593233 nova_compute[222017]: 2026-01-23 10:16:07.611 222021 DEBUG oslo_concurrency.lockutils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:07 np0005593233 nova_compute[222017]: 2026-01-23 10:16:07.635 222021 DEBUG nova.objects.instance [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:16:07 np0005593233 nova_compute[222017]: 2026-01-23 10:16:07.705 222021 DEBUG oslo_concurrency.lockutils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:07 np0005593233 nova_compute[222017]: 2026-01-23 10:16:07.726 222021 DEBUG oslo_concurrency.lockutils [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:07 np0005593233 nova_compute[222017]: 2026-01-23 10:16:07.727 222021 DEBUG oslo_concurrency.lockutils [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:07 np0005593233 nova_compute[222017]: 2026-01-23 10:16:07.734 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:07 np0005593233 nova_compute[222017]: 2026-01-23 10:16:07.757 222021 INFO nova.compute.manager [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Detaching volume c633bb7d-c04b-4d7b-a574-1649ca824520#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.200 222021 INFO nova.virt.block_device [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Attempting to driver detach volume c633bb7d-c04b-4d7b-a574-1649ca824520 from mountpoint /dev/vdb#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.208 222021 DEBUG oslo_concurrency.lockutils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.208 222021 DEBUG oslo_concurrency.lockutils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.209 222021 INFO nova.compute.manager [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Attaching volume 3d24b1fa-0276-448e-a73a-1cba237d818c to /dev/vdb#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.214 222021 DEBUG nova.virt.libvirt.driver [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Attempting to detach device vdb from instance cca30801-d289-4e95-89b2-afcc3d0199a7 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.215 222021 DEBUG nova.virt.libvirt.guest [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-c633bb7d-c04b-4d7b-a574-1649ca824520">
Jan 23 05:16:08 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <serial>c633bb7d-c04b-4d7b-a574-1649ca824520</serial>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:16:08 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.225 222021 INFO nova.virt.libvirt.driver [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Successfully detached device vdb from instance cca30801-d289-4e95-89b2-afcc3d0199a7 from the persistent domain config.#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.225 222021 DEBUG nova.virt.libvirt.driver [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance cca30801-d289-4e95-89b2-afcc3d0199a7 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.226 222021 DEBUG nova.virt.libvirt.guest [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-c633bb7d-c04b-4d7b-a574-1649ca824520">
Jan 23 05:16:08 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <serial>c633bb7d-c04b-4d7b-a574-1649ca824520</serial>
Jan 23 05:16:08 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:16:08 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:16:08 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.348 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Received event <DeviceRemovedEvent: 1769163368.3475718, cca30801-d289-4e95-89b2-afcc3d0199a7 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.349 222021 DEBUG nova.virt.libvirt.driver [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance cca30801-d289-4e95-89b2-afcc3d0199a7 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.352 222021 INFO nova.virt.libvirt.driver [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Successfully detached device vdb from instance cca30801-d289-4e95-89b2-afcc3d0199a7 from the live domain config.#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.517 222021 DEBUG os_brick.utils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.519 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.534 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.534 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[086c7aa9-3acc-4868-9bf2-ec7606e84e8d]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.536 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.546 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.546 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[22dc1190-98f7-469a-bc05-18a1c16faf0b]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.548 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.558 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.559 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[6b04e315-23b5-4a3b-932d-98097418c852]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.560 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[abc67ab9-7524-4946-a584-9314d4a25a4b]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.561 222021 DEBUG oslo_concurrency.processutils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.599 222021 DEBUG oslo_concurrency.processutils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "nvme version" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.602 222021 DEBUG os_brick.initiator.connectors.lightos [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.602 222021 DEBUG os_brick.initiator.connectors.lightos [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.602 222021 DEBUG os_brick.initiator.connectors.lightos [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.603 222021 DEBUG os_brick.utils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] <== get_connector_properties: return (84ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.603 222021 DEBUG nova.virt.block_device [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating existing volume attachment record: 95eb8608-e2ba-4ce0-9d64-d3d14c8f5c8d _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.680 222021 DEBUG nova.objects.instance [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'flavor' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:16:08 np0005593233 nova_compute[222017]: 2026-01-23 10:16:08.791 222021 DEBUG oslo_concurrency.lockutils [None req-a07e21ed-ca16-4633-87d6-a0f66b2522fb fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:16:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:16:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:16:08 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:16:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:09.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:09.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:09 np0005593233 nova_compute[222017]: 2026-01-23 10:16:09.378 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:16:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1858783872' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.057 222021 DEBUG nova.objects.instance [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.062 222021 DEBUG oslo_concurrency.lockutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.062 222021 DEBUG oslo_concurrency.lockutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.063 222021 DEBUG oslo_concurrency.lockutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.063 222021 DEBUG oslo_concurrency.lockutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.063 222021 DEBUG oslo_concurrency.lockutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.064 222021 INFO nova.compute.manager [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Terminating instance#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.065 222021 DEBUG nova.compute.manager [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.089 222021 DEBUG nova.virt.libvirt.driver [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Attempting to attach volume 3d24b1fa-0276-448e-a73a-1cba237d818c with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.093 222021 DEBUG nova.virt.libvirt.guest [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:16:10 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:16:10 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-3d24b1fa-0276-448e-a73a-1cba237d818c">
Jan 23 05:16:10 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:16:10 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:16:10 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:16:10 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:16:10 np0005593233 nova_compute[222017]:  <auth username="openstack">
Jan 23 05:16:10 np0005593233 nova_compute[222017]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:16:10 np0005593233 nova_compute[222017]:  </auth>
Jan 23 05:16:10 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:16:10 np0005593233 nova_compute[222017]:  <serial>3d24b1fa-0276-448e-a73a-1cba237d818c</serial>
Jan 23 05:16:10 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:16:10 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:16:10 np0005593233 podman[277631]: 2026-01-23 10:16:10.14276731 +0000 UTC m=+0.147230535 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.296 222021 DEBUG nova.virt.libvirt.driver [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.296 222021 DEBUG nova.virt.libvirt.driver [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.296 222021 DEBUG nova.virt.libvirt.driver [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.296 222021 DEBUG nova.virt.libvirt.driver [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No VIF found with MAC fa:16:3e:42:68:ac, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.300 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.757 222021 DEBUG oslo_concurrency.lockutils [None req-a2e3c9a5-b01e-41ab-b1be-bdf29e49328e 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:10 np0005593233 kernel: tapb7f30c18-45 (unregistering): left promiscuous mode
Jan 23 05:16:10 np0005593233 NetworkManager[48871]: <info>  [1769163370.7763] device (tapb7f30c18-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.790 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:16:10Z|00619|binding|INFO|Releasing lport b7f30c18-45b6-4931-8d26-193df386ae94 from this chassis (sb_readonly=0)
Jan 23 05:16:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:16:10Z|00620|binding|INFO|Setting lport b7f30c18-45b6-4931-8d26-193df386ae94 down in Southbound
Jan 23 05:16:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:16:10Z|00621|binding|INFO|Removing iface tapb7f30c18-45 ovn-installed in OVS
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.793 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.808 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:10.808 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:65:5d 10.100.0.13'], port_security=['fa:16:3e:b8:65:5d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cca30801-d289-4e95-89b2-afcc3d0199a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd87239e3-bcf1-4a1e-b5bc-c1125963e6fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.191', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=b7f30c18-45b6-4931-8d26-193df386ae94) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:16:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:10.810 140224 INFO neutron.agent.ovn.metadata.agent [-] Port b7f30c18-45b6-4931-8d26-193df386ae94 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 unbound from our chassis#033[00m
Jan 23 05:16:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:10.811 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:16:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:10.814 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[97be1c09-3c58-4a3d-9aa6-4a81b9a6ad7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:10.815 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace which is not needed anymore#033[00m
Jan 23 05:16:10 np0005593233 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Jan 23 05:16:10 np0005593233 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000008b.scope: Consumed 14.806s CPU time.
Jan 23 05:16:10 np0005593233 systemd-machined[190954]: Machine qemu-67-instance-0000008b terminated.
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.930 222021 INFO nova.virt.libvirt.driver [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Instance destroyed successfully.#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.931 222021 DEBUG nova.objects.instance [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'resources' on Instance uuid cca30801-d289-4e95-89b2-afcc3d0199a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:16:10 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277434]: [NOTICE]   (277438) : haproxy version is 2.8.14-c23fe91
Jan 23 05:16:10 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277434]: [NOTICE]   (277438) : path to executable is /usr/sbin/haproxy
Jan 23 05:16:10 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277434]: [WARNING]  (277438) : Exiting Master process...
Jan 23 05:16:10 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277434]: [ALERT]    (277438) : Current worker (277440) exited with code 143 (Terminated)
Jan 23 05:16:10 np0005593233 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[277434]: [WARNING]  (277438) : All workers exited. Exiting... (0)
Jan 23 05:16:10 np0005593233 systemd[1]: libpod-43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d.scope: Deactivated successfully.
Jan 23 05:16:10 np0005593233 podman[277710]: 2026-01-23 10:16:10.991479355 +0000 UTC m=+0.049060748 container died 43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.997 222021 DEBUG nova.virt.libvirt.vif [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:14:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1133977145',display_name='tempest-ServerRescueNegativeTestJSON-server-1133977145',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1133977145',id=139,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA8EX9GibC86Iq3T2qC/IrlQ+r5p/SWpdEE8xXluu9bLU1lGamBBUxI8TbZp+bpGCmD0iIc57s4GniCMNRyOJdr1+wsaQQ63CuK5/FMyW72KViBWccZ5JBdjf1rnPzLdhg==',key_name='tempest-keypair-1649200342',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:15:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a6ba16c4b9d49d3bc24cd7b44935d1f',ramdisk_id='',reservation_id='r-ajqso3l2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-87224704',owner_user_name='tempest-ServerRescueNegativeTestJSON-87224704-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:15:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fae914e59ec54f6b80928ef3cc68dbdb',uuid=cca30801-d289-4e95-89b2-afcc3d0199a7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.998 222021 DEBUG nova.network.os_vif_util [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converting VIF {"id": "b7f30c18-45b6-4931-8d26-193df386ae94", "address": "fa:16:3e:b8:65:5d", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f30c18-45", "ovs_interfaceid": "b7f30c18-45b6-4931-8d26-193df386ae94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.999 222021 DEBUG nova.network.os_vif_util [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:65:5d,bridge_name='br-int',has_traffic_filtering=True,id=b7f30c18-45b6-4931-8d26-193df386ae94,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f30c18-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:16:10 np0005593233 nova_compute[222017]: 2026-01-23 10:16:10.999 222021 DEBUG os_vif [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:65:5d,bridge_name='br-int',has_traffic_filtering=True,id=b7f30c18-45b6-4931-8d26-193df386ae94,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f30c18-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.001 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.002 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7f30c18-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.003 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.008 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.010 222021 INFO os_vif [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:65:5d,bridge_name='br-int',has_traffic_filtering=True,id=b7f30c18-45b6-4931-8d26-193df386ae94,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f30c18-45')#033[00m
Jan 23 05:16:11 np0005593233 systemd[1]: var-lib-containers-storage-overlay-199e49bdf6398893f526be7c1a163339ce01fda05a6c58f981f1b3d945cbbfa4-merged.mount: Deactivated successfully.
Jan 23 05:16:11 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d-userdata-shm.mount: Deactivated successfully.
Jan 23 05:16:11 np0005593233 podman[277710]: 2026-01-23 10:16:11.043030533 +0000 UTC m=+0.100611926 container cleanup 43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:16:11 np0005593233 systemd[1]: libpod-conmon-43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d.scope: Deactivated successfully.
Jan 23 05:16:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:11.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:11 np0005593233 podman[277758]: 2026-01-23 10:16:11.117481129 +0000 UTC m=+0.049134731 container remove 43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:16:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:11.126 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1066b234-43c5-4b37-bd27-0122a584ffdf]: (4, ('Fri Jan 23 10:16:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d)\n43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d\nFri Jan 23 10:16:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d)\n43babdc3bc5fac8a3da6f861366f0df04422e300a5b4884855c504b69f36272d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:11.128 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[78ebd480-57b3-4283-8769-c831f97852c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:11.129 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.131 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:11 np0005593233 kernel: tap00bd3319-b0: left promiscuous mode
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.145 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:11.148 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8422ba2b-4a10-4da8-a32b-ccd00bc8034d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:11.164 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[26fd4f75-18ce-463a-89f3-fb66cfdd0964]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:11.166 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[df3b7cf9-36e0-4b83-8d5c-fb76119c938c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:11.186 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a42ebd11-7026-429a-a34c-1eb361aa0100]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 725632, 'reachable_time': 16252, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277776, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:11 np0005593233 systemd[1]: run-netns-ovnmeta\x2d00bd3319\x2dbfe5\x2d4acd\x2db2e4\x2d17830ee847f9.mount: Deactivated successfully.
Jan 23 05:16:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:11.191 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:16:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:11.191 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e05863-c653-44ce-8617-85c55156ddc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:11.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.349 222021 DEBUG nova.compute.manager [req-70efd6c9-a5fd-4177-ab00-316cdb3844a9 req-441f6a41-0253-4812-b852-c3900441f0c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-unplugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.349 222021 DEBUG oslo_concurrency.lockutils [req-70efd6c9-a5fd-4177-ab00-316cdb3844a9 req-441f6a41-0253-4812-b852-c3900441f0c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.350 222021 DEBUG oslo_concurrency.lockutils [req-70efd6c9-a5fd-4177-ab00-316cdb3844a9 req-441f6a41-0253-4812-b852-c3900441f0c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.350 222021 DEBUG oslo_concurrency.lockutils [req-70efd6c9-a5fd-4177-ab00-316cdb3844a9 req-441f6a41-0253-4812-b852-c3900441f0c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.350 222021 DEBUG nova.compute.manager [req-70efd6c9-a5fd-4177-ab00-316cdb3844a9 req-441f6a41-0253-4812-b852-c3900441f0c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-unplugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.350 222021 DEBUG nova.compute.manager [req-70efd6c9-a5fd-4177-ab00-316cdb3844a9 req-441f6a41-0253-4812-b852-c3900441f0c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-unplugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.389 222021 INFO nova.virt.libvirt.driver [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Deleting instance files /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7_del#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.390 222021 INFO nova.virt.libvirt.driver [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Deletion of /var/lib/nova/instances/cca30801-d289-4e95-89b2-afcc3d0199a7_del complete#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.482 222021 INFO nova.compute.manager [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Took 1.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.482 222021 DEBUG oslo.service.loopingcall [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.483 222021 DEBUG nova.compute.manager [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.483 222021 DEBUG nova.network.neutron [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:16:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.794 222021 DEBUG oslo_concurrency.lockutils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.794 222021 DEBUG oslo_concurrency.lockutils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.827 222021 DEBUG nova.objects.instance [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:16:11 np0005593233 nova_compute[222017]: 2026-01-23 10:16:11.881 222021 DEBUG oslo_concurrency.lockutils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #127. Immutable memtables: 0.
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.008297) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 79] Flushing memtable with next log file: 127
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372008353, "job": 79, "event": "flush_started", "num_memtables": 1, "num_entries": 311, "num_deletes": 250, "total_data_size": 130006, "memory_usage": 137528, "flush_reason": "Manual Compaction"}
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 79] Level-0 flush table #128: started
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372011378, "cf_name": "default", "job": 79, "event": "table_file_creation", "file_number": 128, "file_size": 85993, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 62349, "largest_seqno": 62655, "table_properties": {"data_size": 83980, "index_size": 176, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 4261, "raw_average_key_size": 15, "raw_value_size": 80058, "raw_average_value_size": 283, "num_data_blocks": 8, "num_entries": 282, "num_filter_entries": 282, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163367, "oldest_key_time": 1769163367, "file_creation_time": 1769163372, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 128, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 79] Flush lasted 3137 microseconds, and 1442 cpu microseconds.
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.011433) [db/flush_job.cc:967] [default] [JOB 79] Level-0 flush table #128: 85993 bytes OK
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.011459) [db/memtable_list.cc:519] [default] Level-0 commit table #128 started
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.012785) [db/memtable_list.cc:722] [default] Level-0 commit table #128: memtable #1 done
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.012804) EVENT_LOG_v1 {"time_micros": 1769163372012797, "job": 79, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.012832) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 79] Try to delete WAL files size 127747, prev total WAL file size 127747, number of live WAL files 2.
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000124.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.013263) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323530' seq:72057594037927935, type:22 .. '6B7600353031' seq:0, type:0; will stop at (end)
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 80] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 79 Base level 0, inputs: [128(83KB)], [126(9790KB)]
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372013313, "job": 80, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [128], "files_L6": [126], "score": -1, "input_data_size": 10111817, "oldest_snapshot_seqno": -1}
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 80] Generated table #129: 8176 keys, 9042167 bytes, temperature: kUnknown
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372070976, "cf_name": "default", "job": 80, "event": "table_file_creation", "file_number": 129, "file_size": 9042167, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8992093, "index_size": 28521, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20485, "raw_key_size": 215477, "raw_average_key_size": 26, "raw_value_size": 8850879, "raw_average_value_size": 1082, "num_data_blocks": 1084, "num_entries": 8176, "num_filter_entries": 8176, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163372, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 129, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.071338) [db/compaction/compaction_job.cc:1663] [default] [JOB 80] Compacted 1@0 + 1@6 files to L6 => 9042167 bytes
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.073477) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.0 rd, 156.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 9.6 +0.0 blob) out(8.6 +0.0 blob), read-write-amplify(222.7) write-amplify(105.2) OK, records in: 8685, records dropped: 509 output_compression: NoCompression
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.073511) EVENT_LOG_v1 {"time_micros": 1769163372073497, "job": 80, "event": "compaction_finished", "compaction_time_micros": 57768, "compaction_time_cpu_micros": 24452, "output_level": 6, "num_output_files": 1, "total_output_size": 9042167, "num_input_records": 8685, "num_output_records": 8176, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000128.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372073710, "job": 80, "event": "table_file_deletion", "file_number": 128}
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000126.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372075495, "job": 80, "event": "table_file_deletion", "file_number": 126}
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.013175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.075529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.075534) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.075536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.075538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:16:12.075539) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593233 nova_compute[222017]: 2026-01-23 10:16:12.725 222021 DEBUG oslo_concurrency.lockutils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:12 np0005593233 nova_compute[222017]: 2026-01-23 10:16:12.726 222021 DEBUG oslo_concurrency.lockutils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:12 np0005593233 nova_compute[222017]: 2026-01-23 10:16:12.726 222021 INFO nova.compute.manager [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Attaching volume e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa to /dev/vdc#033[00m
Jan 23 05:16:12 np0005593233 nova_compute[222017]: 2026-01-23 10:16:12.737 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:12 np0005593233 nova_compute[222017]: 2026-01-23 10:16:12.999 222021 DEBUG os_brick.utils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.000 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.014 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.015 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c63025c1-70df-4adc-b20b-13750b4bf797]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.016 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.027 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.027 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[86ed7a64-84d1-406a-8d48-ec3c0b187179]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.029 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.040 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.041 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[4244be84-b2e1-4a7b-ab44-268e550322d9]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.042 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[08a220e1-7f85-44a7-a4cd-7b01ed63dcf5]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.043 222021 DEBUG oslo_concurrency.processutils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.085 222021 DEBUG oslo_concurrency.processutils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "nvme version" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.088 222021 DEBUG os_brick.initiator.connectors.lightos [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.089 222021 DEBUG os_brick.initiator.connectors.lightos [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.089 222021 DEBUG os_brick.initiator.connectors.lightos [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.089 222021 DEBUG os_brick.utils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] <== get_connector_properties: return (89ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.089 222021 DEBUG nova.virt.block_device [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating existing volume attachment record: 4b6ed06c-49b5-4410-93b2-ea6b20f8212b _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:16:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:13.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:16:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:13.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.604 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.637 222021 DEBUG nova.compute.manager [req-e433e8f6-48ec-4441-a1c2-9ba0aa6a092f req-814482f7-6053-4c02-a246-062aeea82077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.638 222021 DEBUG oslo_concurrency.lockutils [req-e433e8f6-48ec-4441-a1c2-9ba0aa6a092f req-814482f7-6053-4c02-a246-062aeea82077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.638 222021 DEBUG oslo_concurrency.lockutils [req-e433e8f6-48ec-4441-a1c2-9ba0aa6a092f req-814482f7-6053-4c02-a246-062aeea82077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.638 222021 DEBUG oslo_concurrency.lockutils [req-e433e8f6-48ec-4441-a1c2-9ba0aa6a092f req-814482f7-6053-4c02-a246-062aeea82077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.639 222021 DEBUG nova.compute.manager [req-e433e8f6-48ec-4441-a1c2-9ba0aa6a092f req-814482f7-6053-4c02-a246-062aeea82077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] No waiting events found dispatching network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:16:13 np0005593233 nova_compute[222017]: 2026-01-23 10:16:13.639 222021 WARNING nova.compute.manager [req-e433e8f6-48ec-4441-a1c2-9ba0aa6a092f req-814482f7-6053-4c02-a246-062aeea82077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received unexpected event network-vif-plugged-b7f30c18-45b6-4931-8d26-193df386ae94 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.252 222021 DEBUG nova.objects.instance [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.294 222021 DEBUG nova.virt.libvirt.driver [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Attempting to attach volume e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.298 222021 DEBUG nova.virt.libvirt.guest [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:16:14 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:16:14 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa">
Jan 23 05:16:14 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:16:14 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:16:14 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:16:14 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:16:14 np0005593233 nova_compute[222017]:  <auth username="openstack">
Jan 23 05:16:14 np0005593233 nova_compute[222017]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:16:14 np0005593233 nova_compute[222017]:  </auth>
Jan 23 05:16:14 np0005593233 nova_compute[222017]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:16:14 np0005593233 nova_compute[222017]:  <serial>e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa</serial>
Jan 23 05:16:14 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:16:14 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.424 222021 DEBUG nova.network.neutron [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.476 222021 INFO nova.compute.manager [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Took 2.99 seconds to deallocate network for instance.#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.488 222021 DEBUG nova.virt.libvirt.driver [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.489 222021 DEBUG nova.virt.libvirt.driver [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.489 222021 DEBUG nova.virt.libvirt.driver [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.489 222021 DEBUG nova.virt.libvirt.driver [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.490 222021 DEBUG nova.virt.libvirt.driver [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No VIF found with MAC fa:16:3e:42:68:ac, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.584 222021 DEBUG oslo_concurrency.lockutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.585 222021 DEBUG oslo_concurrency.lockutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.682 222021 DEBUG nova.compute.manager [req-9e2729f8-1d3b-4e9d-a224-d6bfef413dd9 req-a70f2b7b-cdbc-4e80-ab14-db0f861ac8c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Received event network-vif-deleted-b7f30c18-45b6-4931-8d26-193df386ae94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.752 222021 DEBUG oslo_concurrency.processutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:14 np0005593233 nova_compute[222017]: 2026-01-23 10:16:14.976 222021 DEBUG oslo_concurrency.lockutils [None req-f91a4bb9-01e8-4ebd-b70c-58ee969e7a21 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:15.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:15.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:16:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/283980972' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:16:15 np0005593233 nova_compute[222017]: 2026-01-23 10:16:15.285 222021 DEBUG oslo_concurrency.processutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:15 np0005593233 nova_compute[222017]: 2026-01-23 10:16:15.293 222021 DEBUG nova.compute.provider_tree [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:16:15 np0005593233 nova_compute[222017]: 2026-01-23 10:16:15.346 222021 DEBUG nova.scheduler.client.report [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:16:15 np0005593233 nova_compute[222017]: 2026-01-23 10:16:15.473 222021 DEBUG oslo_concurrency.lockutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:15 np0005593233 nova_compute[222017]: 2026-01-23 10:16:15.700 222021 INFO nova.scheduler.client.report [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Deleted allocations for instance cca30801-d289-4e95-89b2-afcc3d0199a7#033[00m
Jan 23 05:16:15 np0005593233 nova_compute[222017]: 2026-01-23 10:16:15.907 222021 DEBUG oslo_concurrency.lockutils [None req-92260248-f85d-4de4-b04f-e0fc0bc4075e fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "cca30801-d289-4e95-89b2-afcc3d0199a7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:16 np0005593233 nova_compute[222017]: 2026-01-23 10:16:16.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:16:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:16:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:17.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:16:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:17.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:16:17 np0005593233 nova_compute[222017]: 2026-01-23 10:16:17.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:19.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:19.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:19 np0005593233 nova_compute[222017]: 2026-01-23 10:16:19.785 222021 DEBUG nova.compute.manager [req-3ec353bb-e0f8-487b-b0e4-8b26a0180af3 req-92c2d0c5-8d07-4998-b800-f9092fe1420e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:16:19 np0005593233 nova_compute[222017]: 2026-01-23 10:16:19.786 222021 DEBUG nova.compute.manager [req-3ec353bb-e0f8-487b-b0e4-8b26a0180af3 req-92c2d0c5-8d07-4998-b800-f9092fe1420e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing instance network info cache due to event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:16:19 np0005593233 nova_compute[222017]: 2026-01-23 10:16:19.788 222021 DEBUG oslo_concurrency.lockutils [req-3ec353bb-e0f8-487b-b0e4-8b26a0180af3 req-92c2d0c5-8d07-4998-b800-f9092fe1420e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:16:19 np0005593233 nova_compute[222017]: 2026-01-23 10:16:19.788 222021 DEBUG oslo_concurrency.lockutils [req-3ec353bb-e0f8-487b-b0e4-8b26a0180af3 req-92c2d0c5-8d07-4998-b800-f9092fe1420e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:16:19 np0005593233 nova_compute[222017]: 2026-01-23 10:16:19.789 222021 DEBUG nova.network.neutron [req-3ec353bb-e0f8-487b-b0e4-8b26a0180af3 req-92c2d0c5-8d07-4998-b800-f9092fe1420e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:16:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:16:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/807689243' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:16:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:16:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/807689243' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:16:21 np0005593233 nova_compute[222017]: 2026-01-23 10:16:21.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:21.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:21.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:16:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2839213652' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:16:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:16:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2839213652' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:16:22 np0005593233 nova_compute[222017]: 2026-01-23 10:16:22.784 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:23.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:23.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:24 np0005593233 nova_compute[222017]: 2026-01-23 10:16:24.610 222021 DEBUG nova.network.neutron [req-3ec353bb-e0f8-487b-b0e4-8b26a0180af3 req-92c2d0c5-8d07-4998-b800-f9092fe1420e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updated VIF entry in instance network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:16:24 np0005593233 nova_compute[222017]: 2026-01-23 10:16:24.611 222021 DEBUG nova.network.neutron [req-3ec353bb-e0f8-487b-b0e4-8b26a0180af3 req-92c2d0c5-8d07-4998-b800-f9092fe1420e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:16:25 np0005593233 podman[277879]: 2026-01-23 10:16:25.054511649 +0000 UTC m=+0.062166439 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 05:16:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:25.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:25.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:25 np0005593233 nova_compute[222017]: 2026-01-23 10:16:25.235 222021 DEBUG oslo_concurrency.lockutils [req-3ec353bb-e0f8-487b-b0e4-8b26a0180af3 req-92c2d0c5-8d07-4998-b800-f9092fe1420e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:16:25 np0005593233 nova_compute[222017]: 2026-01-23 10:16:25.929 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163370.9272475, cca30801-d289-4e95-89b2-afcc3d0199a7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:16:25 np0005593233 nova_compute[222017]: 2026-01-23 10:16:25.929 222021 INFO nova.compute.manager [-] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] VM Stopped (Lifecycle Event)
Jan 23 05:16:26 np0005593233 nova_compute[222017]: 2026-01-23 10:16:26.010 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:26 np0005593233 nova_compute[222017]: 2026-01-23 10:16:26.015 222021 DEBUG nova.compute.manager [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:16:26 np0005593233 nova_compute[222017]: 2026-01-23 10:16:26.015 222021 DEBUG nova.compute.manager [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing instance network info cache due to event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:16:26 np0005593233 nova_compute[222017]: 2026-01-23 10:16:26.017 222021 DEBUG oslo_concurrency.lockutils [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:16:26 np0005593233 nova_compute[222017]: 2026-01-23 10:16:26.017 222021 DEBUG oslo_concurrency.lockutils [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:16:26 np0005593233 nova_compute[222017]: 2026-01-23 10:16:26.017 222021 DEBUG nova.network.neutron [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:16:26 np0005593233 nova_compute[222017]: 2026-01-23 10:16:26.108 222021 DEBUG nova.compute.manager [None req-70cd80c8-4c6d-4f59-a585-77a0498029c9 - - - - - -] [instance: cca30801-d289-4e95-89b2-afcc3d0199a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:16:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:27.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:27.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:27 np0005593233 nova_compute[222017]: 2026-01-23 10:16:27.826 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:28.030 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:16:28 np0005593233 nova_compute[222017]: 2026-01-23 10:16:28.031 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:28.032 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:16:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:29.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:29.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:30 np0005593233 nova_compute[222017]: 2026-01-23 10:16:30.784 222021 DEBUG nova.network.neutron [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updated VIF entry in instance network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:16:30 np0005593233 nova_compute[222017]: 2026-01-23 10:16:30.785 222021 DEBUG nova.network.neutron [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:16:30 np0005593233 nova_compute[222017]: 2026-01-23 10:16:30.864 222021 DEBUG oslo_concurrency.lockutils [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:16:30 np0005593233 nova_compute[222017]: 2026-01-23 10:16:30.865 222021 DEBUG nova.compute.manager [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:16:30 np0005593233 nova_compute[222017]: 2026-01-23 10:16:30.865 222021 DEBUG nova.compute.manager [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing instance network info cache due to event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:16:30 np0005593233 nova_compute[222017]: 2026-01-23 10:16:30.866 222021 DEBUG oslo_concurrency.lockutils [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:16:30 np0005593233 nova_compute[222017]: 2026-01-23 10:16:30.866 222021 DEBUG oslo_concurrency.lockutils [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:16:30 np0005593233 nova_compute[222017]: 2026-01-23 10:16:30.866 222021 DEBUG nova.network.neutron [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:16:31 np0005593233 nova_compute[222017]: 2026-01-23 10:16:31.012 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:31.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:31.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:32 np0005593233 nova_compute[222017]: 2026-01-23 10:16:32.829 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:33.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:33.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:35.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:35.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:35 np0005593233 ovn_controller[130653]: 2026-01-23T10:16:35Z|00622|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:16:35 np0005593233 nova_compute[222017]: 2026-01-23 10:16:35.954 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:36 np0005593233 nova_compute[222017]: 2026-01-23 10:16:36.015 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:16:36 np0005593233 nova_compute[222017]: 2026-01-23 10:16:36.252 222021 DEBUG nova.network.neutron [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updated VIF entry in instance network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:16:36 np0005593233 nova_compute[222017]: 2026-01-23 10:16:36.253 222021 DEBUG nova.network.neutron [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:16:36 np0005593233 nova_compute[222017]: 2026-01-23 10:16:36.469 222021 DEBUG oslo_concurrency.lockutils [req-23c38d04-a6a8-453e-8c64-70da770e93dd req-2c8831bf-5634-4977-a050-37030a09a43a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:16:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:37.035 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:16:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:16:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:37.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:16:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:37.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:37 np0005593233 nova_compute[222017]: 2026-01-23 10:16:37.830 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:39 np0005593233 nova_compute[222017]: 2026-01-23 10:16:39.049 222021 DEBUG nova.compute.manager [req-7e40182e-a0ed-42da-9fbd-fe55a7618e13 req-0933eecf-81d0-4a44-b79d-25a57bbdf1bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:16:39 np0005593233 nova_compute[222017]: 2026-01-23 10:16:39.049 222021 DEBUG nova.compute.manager [req-7e40182e-a0ed-42da-9fbd-fe55a7618e13 req-0933eecf-81d0-4a44-b79d-25a57bbdf1bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing instance network info cache due to event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:16:39 np0005593233 nova_compute[222017]: 2026-01-23 10:16:39.050 222021 DEBUG oslo_concurrency.lockutils [req-7e40182e-a0ed-42da-9fbd-fe55a7618e13 req-0933eecf-81d0-4a44-b79d-25a57bbdf1bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:16:39 np0005593233 nova_compute[222017]: 2026-01-23 10:16:39.050 222021 DEBUG oslo_concurrency.lockutils [req-7e40182e-a0ed-42da-9fbd-fe55a7618e13 req-0933eecf-81d0-4a44-b79d-25a57bbdf1bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:16:39 np0005593233 nova_compute[222017]: 2026-01-23 10:16:39.051 222021 DEBUG nova.network.neutron [req-7e40182e-a0ed-42da-9fbd-fe55a7618e13 req-0933eecf-81d0-4a44-b79d-25a57bbdf1bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:16:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:39.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:39.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:41 np0005593233 nova_compute[222017]: 2026-01-23 10:16:41.017 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:41 np0005593233 podman[277901]: 2026-01-23 10:16:41.110930397 +0000 UTC m=+0.121375064 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:16:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:41.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:16:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:41.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:16:41 np0005593233 nova_compute[222017]: 2026-01-23 10:16:41.377 222021 DEBUG nova.network.neutron [req-7e40182e-a0ed-42da-9fbd-fe55a7618e13 req-0933eecf-81d0-4a44-b79d-25a57bbdf1bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updated VIF entry in instance network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:16:41 np0005593233 nova_compute[222017]: 2026-01-23 10:16:41.377 222021 DEBUG nova.network.neutron [req-7e40182e-a0ed-42da-9fbd-fe55a7618e13 req-0933eecf-81d0-4a44-b79d-25a57bbdf1bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:16:41 np0005593233 nova_compute[222017]: 2026-01-23 10:16:41.415 222021 DEBUG oslo_concurrency.lockutils [req-7e40182e-a0ed-42da-9fbd-fe55a7618e13 req-0933eecf-81d0-4a44-b79d-25a57bbdf1bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:16:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:42 np0005593233 nova_compute[222017]: 2026-01-23 10:16:42.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:42 np0005593233 nova_compute[222017]: 2026-01-23 10:16:42.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:42.679 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:42.680 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:16:42.680 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:42 np0005593233 nova_compute[222017]: 2026-01-23 10:16:42.833 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:43.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:43.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:44 np0005593233 nova_compute[222017]: 2026-01-23 10:16:44.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:44 np0005593233 nova_compute[222017]: 2026-01-23 10:16:44.519 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:44 np0005593233 nova_compute[222017]: 2026-01-23 10:16:44.519 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:44 np0005593233 nova_compute[222017]: 2026-01-23 10:16:44.520 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:44 np0005593233 nova_compute[222017]: 2026-01-23 10:16:44.520 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:16:44 np0005593233 nova_compute[222017]: 2026-01-23 10:16:44.520 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:44 np0005593233 nova_compute[222017]: 2026-01-23 10:16:44.863 222021 DEBUG oslo_concurrency.lockutils [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:44 np0005593233 nova_compute[222017]: 2026-01-23 10:16:44.865 222021 DEBUG oslo_concurrency.lockutils [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:16:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2797897778' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:16:44 np0005593233 nova_compute[222017]: 2026-01-23 10:16:44.998 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.115 222021 INFO nova.compute.manager [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Detaching volume 3d24b1fa-0276-448e-a73a-1cba237d818c#033[00m
Jan 23 05:16:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:16:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:45.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.157 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.157 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.158 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.158 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:16:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:45.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.359 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.361 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4208MB free_disk=20.987842559814453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.361 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.361 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.552 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.553 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.553 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.570 222021 INFO nova.virt.block_device [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Attempting to driver detach volume 3d24b1fa-0276-448e-a73a-1cba237d818c from mountpoint /dev/vdb#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.584 222021 DEBUG nova.virt.libvirt.driver [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Attempting to detach device vdb from instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.585 222021 DEBUG nova.virt.libvirt.guest [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-3d24b1fa-0276-448e-a73a-1cba237d818c">
Jan 23 05:16:45 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <serial>3d24b1fa-0276-448e-a73a-1cba237d818c</serial>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:16:45 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.595 222021 INFO nova.virt.libvirt.driver [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully detached device vdb from instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 from the persistent domain config.#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.596 222021 DEBUG nova.virt.libvirt.driver [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.596 222021 DEBUG nova.virt.libvirt.guest [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-3d24b1fa-0276-448e-a73a-1cba237d818c">
Jan 23 05:16:45 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <serial>3d24b1fa-0276-448e-a73a-1cba237d818c</serial>
Jan 23 05:16:45 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:16:45 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:16:45 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.638 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.736 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Received event <DeviceRemovedEvent: 1769163405.7356598, fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.739 222021 DEBUG nova.virt.libvirt.driver [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:16:45 np0005593233 nova_compute[222017]: 2026-01-23 10:16:45.744 222021 INFO nova.virt.libvirt.driver [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully detached device vdb from instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 from the live domain config.#033[00m
Jan 23 05:16:46 np0005593233 nova_compute[222017]: 2026-01-23 10:16:46.027 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:16:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/575820677' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:16:46 np0005593233 nova_compute[222017]: 2026-01-23 10:16:46.120 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:46 np0005593233 nova_compute[222017]: 2026-01-23 10:16:46.128 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:16:46 np0005593233 nova_compute[222017]: 2026-01-23 10:16:46.167 222021 DEBUG nova.objects.instance [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:16:46 np0005593233 nova_compute[222017]: 2026-01-23 10:16:46.172 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:16:46 np0005593233 nova_compute[222017]: 2026-01-23 10:16:46.214 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:16:46 np0005593233 nova_compute[222017]: 2026-01-23 10:16:46.214 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:46 np0005593233 nova_compute[222017]: 2026-01-23 10:16:46.241 222021 DEBUG oslo_concurrency.lockutils [None req-9e7d69f5-d3f4-404b-a67b-1462e6d48c6a 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:47.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:47 np0005593233 nova_compute[222017]: 2026-01-23 10:16:47.216 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:47 np0005593233 nova_compute[222017]: 2026-01-23 10:16:47.217 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:47 np0005593233 nova_compute[222017]: 2026-01-23 10:16:47.217 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:47.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:47 np0005593233 nova_compute[222017]: 2026-01-23 10:16:47.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:47 np0005593233 nova_compute[222017]: 2026-01-23 10:16:47.836 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:48 np0005593233 nova_compute[222017]: 2026-01-23 10:16:48.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:48 np0005593233 nova_compute[222017]: 2026-01-23 10:16:48.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:16:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:49.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:49.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:50 np0005593233 ovn_controller[130653]: 2026-01-23T10:16:50Z|00623|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:16:50 np0005593233 nova_compute[222017]: 2026-01-23 10:16:50.773 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:50 np0005593233 nova_compute[222017]: 2026-01-23 10:16:50.871 222021 DEBUG oslo_concurrency.lockutils [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:50 np0005593233 nova_compute[222017]: 2026-01-23 10:16:50.871 222021 DEBUG oslo_concurrency.lockutils [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:51 np0005593233 nova_compute[222017]: 2026-01-23 10:16:51.070 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:51 np0005593233 nova_compute[222017]: 2026-01-23 10:16:51.115 222021 INFO nova.compute.manager [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Detaching volume e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa#033[00m
Jan 23 05:16:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:51.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:16:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:51.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:16:51 np0005593233 nova_compute[222017]: 2026-01-23 10:16:51.621 222021 INFO nova.virt.block_device [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Attempting to driver detach volume e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa from mountpoint /dev/vdc#033[00m
Jan 23 05:16:51 np0005593233 nova_compute[222017]: 2026-01-23 10:16:51.631 222021 DEBUG nova.virt.libvirt.driver [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Attempting to detach device vdc from instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:16:51 np0005593233 nova_compute[222017]: 2026-01-23 10:16:51.632 222021 DEBUG nova.virt.libvirt.guest [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa">
Jan 23 05:16:51 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <serial>e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa</serial>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:16:51 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:16:51 np0005593233 nova_compute[222017]: 2026-01-23 10:16:51.643 222021 INFO nova.virt.libvirt.driver [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully detached device vdc from instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 from the persistent domain config.#033[00m
Jan 23 05:16:51 np0005593233 nova_compute[222017]: 2026-01-23 10:16:51.643 222021 DEBUG nova.virt.libvirt.driver [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:16:51 np0005593233 nova_compute[222017]: 2026-01-23 10:16:51.643 222021 DEBUG nova.virt.libvirt.guest [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa">
Jan 23 05:16:51 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <serial>e7ae8c2b-16eb-4a7d-ae1d-2b336c442dfa</serial>
Jan 23 05:16:51 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 23 05:16:51 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:16:51 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:16:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:52 np0005593233 nova_compute[222017]: 2026-01-23 10:16:52.006 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Received event <DeviceRemovedEvent: 1769163412.0063486, fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:16:52 np0005593233 nova_compute[222017]: 2026-01-23 10:16:52.011 222021 DEBUG nova.virt.libvirt.driver [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:16:52 np0005593233 nova_compute[222017]: 2026-01-23 10:16:52.014 222021 INFO nova.virt.libvirt.driver [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully detached device vdc from instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 from the live domain config.#033[00m
Jan 23 05:16:52 np0005593233 nova_compute[222017]: 2026-01-23 10:16:52.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:52 np0005593233 nova_compute[222017]: 2026-01-23 10:16:52.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:16:52 np0005593233 nova_compute[222017]: 2026-01-23 10:16:52.438 222021 DEBUG nova.objects.instance [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:16:52 np0005593233 nova_compute[222017]: 2026-01-23 10:16:52.531 222021 DEBUG oslo_concurrency.lockutils [None req-795b0b0c-bc8e-4d75-9053-96dbbe858294 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:52 np0005593233 nova_compute[222017]: 2026-01-23 10:16:52.841 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:53.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:53.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:54 np0005593233 nova_compute[222017]: 2026-01-23 10:16:54.379 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:16:54 np0005593233 nova_compute[222017]: 2026-01-23 10:16:54.380 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:16:54 np0005593233 nova_compute[222017]: 2026-01-23 10:16:54.380 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:16:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:16:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:55.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:16:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:55.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:56 np0005593233 nova_compute[222017]: 2026-01-23 10:16:56.989 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:57 np0005593233 podman[277977]: 2026-01-23 10:16:57.077537814 +0000 UTC m=+0.063093196 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:16:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:16:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:57.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:16:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:57.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:57 np0005593233 nova_compute[222017]: 2026-01-23 10:16:57.843 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:59.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:16:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:59.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:01.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:01.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:01 np0005593233 nova_compute[222017]: 2026-01-23 10:17:01.992 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:02 np0005593233 nova_compute[222017]: 2026-01-23 10:17:02.847 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:03.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:03.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:04 np0005593233 nova_compute[222017]: 2026-01-23 10:17:04.249 222021 DEBUG nova.compute.manager [req-b4b373a1-d17f-4bf5-b99c-3597dfa0ba84 req-f1fd3c55-63d6-459d-9f64-d5034ef39c65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:04 np0005593233 nova_compute[222017]: 2026-01-23 10:17:04.249 222021 DEBUG nova.compute.manager [req-b4b373a1-d17f-4bf5-b99c-3597dfa0ba84 req-f1fd3c55-63d6-459d-9f64-d5034ef39c65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing instance network info cache due to event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:17:04 np0005593233 nova_compute[222017]: 2026-01-23 10:17:04.250 222021 DEBUG oslo_concurrency.lockutils [req-b4b373a1-d17f-4bf5-b99c-3597dfa0ba84 req-f1fd3c55-63d6-459d-9f64-d5034ef39c65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:17:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:04.913 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:17:04 np0005593233 nova_compute[222017]: 2026-01-23 10:17:04.913 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:04.915 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:17:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:05.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:05.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:06 np0005593233 nova_compute[222017]: 2026-01-23 10:17:06.994 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:07.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:07.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:07 np0005593233 nova_compute[222017]: 2026-01-23 10:17:07.849 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:09.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:09.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:09 np0005593233 nova_compute[222017]: 2026-01-23 10:17:09.722 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:09 np0005593233 nova_compute[222017]: 2026-01-23 10:17:09.844 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:09 np0005593233 nova_compute[222017]: 2026-01-23 10:17:09.844 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:17:09 np0005593233 nova_compute[222017]: 2026-01-23 10:17:09.845 222021 DEBUG oslo_concurrency.lockutils [req-b4b373a1-d17f-4bf5-b99c-3597dfa0ba84 req-f1fd3c55-63d6-459d-9f64-d5034ef39c65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:17:09 np0005593233 nova_compute[222017]: 2026-01-23 10:17:09.846 222021 DEBUG nova.network.neutron [req-b4b373a1-d17f-4bf5-b99c-3597dfa0ba84 req-f1fd3c55-63d6-459d-9f64-d5034ef39c65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:17:10 np0005593233 nova_compute[222017]: 2026-01-23 10:17:10.117 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:11.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:11.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:11.917 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:11 np0005593233 nova_compute[222017]: 2026-01-23 10:17:11.997 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:12 np0005593233 podman[277997]: 2026-01-23 10:17:12.126212534 +0000 UTC m=+0.125318345 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:17:12 np0005593233 nova_compute[222017]: 2026-01-23 10:17:12.840 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:12 np0005593233 nova_compute[222017]: 2026-01-23 10:17:12.851 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:13.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:13.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:14 np0005593233 nova_compute[222017]: 2026-01-23 10:17:14.101 222021 DEBUG nova.network.neutron [req-b4b373a1-d17f-4bf5-b99c-3597dfa0ba84 req-f1fd3c55-63d6-459d-9f64-d5034ef39c65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updated VIF entry in instance network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:17:14 np0005593233 nova_compute[222017]: 2026-01-23 10:17:14.102 222021 DEBUG nova.network.neutron [req-b4b373a1-d17f-4bf5-b99c-3597dfa0ba84 req-f1fd3c55-63d6-459d-9f64-d5034ef39c65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:14 np0005593233 nova_compute[222017]: 2026-01-23 10:17:14.142 222021 DEBUG oslo_concurrency.lockutils [req-b4b373a1-d17f-4bf5-b99c-3597dfa0ba84 req-f1fd3c55-63d6-459d-9f64-d5034ef39c65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:15.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:15.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:17:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:17:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:17:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:17 np0005593233 nova_compute[222017]: 2026-01-23 10:17:16.999 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:17.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:17.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:17:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1656960970' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:17:17 np0005593233 nova_compute[222017]: 2026-01-23 10:17:17.899 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:19.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:19.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:20 np0005593233 nova_compute[222017]: 2026-01-23 10:17:20.442 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:21.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:21.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:22 np0005593233 nova_compute[222017]: 2026-01-23 10:17:22.001 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:22 np0005593233 nova_compute[222017]: 2026-01-23 10:17:22.903 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:17:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:17:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:23.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:23.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:25.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:25.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:26 np0005593233 nova_compute[222017]: 2026-01-23 10:17:26.727 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:27 np0005593233 nova_compute[222017]: 2026-01-23 10:17:27.003 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:27.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:27.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:27 np0005593233 nova_compute[222017]: 2026-01-23 10:17:27.949 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:28 np0005593233 podman[278203]: 2026-01-23 10:17:28.066972249 +0000 UTC m=+0.075627440 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:17:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:29.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:29.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:31.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:31.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:32 np0005593233 nova_compute[222017]: 2026-01-23 10:17:32.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:32 np0005593233 nova_compute[222017]: 2026-01-23 10:17:32.952 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:33.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:33.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:35.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:35.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:35 np0005593233 ceph-mgr[81930]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 05:17:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.009 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.126 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "95158f57-1f68-4b3e-9d10-e3006c3f2060" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.127 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.166 222021 DEBUG nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:17:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:37.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.317 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.318 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.330 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.331 222021 INFO nova.compute.claims [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:17:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:37.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.493 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:17:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3617567388' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:17:37 np0005593233 nova_compute[222017]: 2026-01-23 10:17:37.946 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:38 np0005593233 nova_compute[222017]: 2026-01-23 10:17:38.005 222021 DEBUG nova.compute.provider_tree [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:17:38 np0005593233 nova_compute[222017]: 2026-01-23 10:17:38.008 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:38 np0005593233 nova_compute[222017]: 2026-01-23 10:17:38.025 222021 DEBUG nova.scheduler.client.report [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:17:38 np0005593233 nova_compute[222017]: 2026-01-23 10:17:38.665 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:38 np0005593233 nova_compute[222017]: 2026-01-23 10:17:38.666 222021 DEBUG nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:17:38 np0005593233 nova_compute[222017]: 2026-01-23 10:17:38.886 222021 DEBUG nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:17:38 np0005593233 nova_compute[222017]: 2026-01-23 10:17:38.887 222021 DEBUG nova.network.neutron [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.011 222021 INFO nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.051 222021 DEBUG nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:17:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:39.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.255 222021 DEBUG nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.259 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.260 222021 INFO nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Creating image(s)#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.293 222021 DEBUG nova.storage.rbd_utils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.333 222021 DEBUG nova.storage.rbd_utils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:39.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.365 222021 DEBUG nova.storage.rbd_utils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.369 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.410 222021 DEBUG nova.policy [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ec99ae7c69d0438280441e0434374cbf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c59351a1b59c4cc9ad389dff900935f2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.448 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.449 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.450 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.450 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.501 222021 DEBUG nova.storage.rbd_utils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.505 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.865 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.360s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:39 np0005593233 nova_compute[222017]: 2026-01-23 10:17:39.929 222021 DEBUG nova.storage.rbd_utils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] resizing rbd image 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:17:40 np0005593233 nova_compute[222017]: 2026-01-23 10:17:40.062 222021 DEBUG nova.objects.instance [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 95158f57-1f68-4b3e-9d10-e3006c3f2060 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:40 np0005593233 nova_compute[222017]: 2026-01-23 10:17:40.109 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:17:40 np0005593233 nova_compute[222017]: 2026-01-23 10:17:40.110 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Ensure instance console log exists: /var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:17:40 np0005593233 nova_compute[222017]: 2026-01-23 10:17:40.110 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:40 np0005593233 nova_compute[222017]: 2026-01-23 10:17:40.111 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:40 np0005593233 nova_compute[222017]: 2026-01-23 10:17:40.111 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:41.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:41.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:42 np0005593233 nova_compute[222017]: 2026-01-23 10:17:42.011 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593233 nova_compute[222017]: 2026-01-23 10:17:42.115 222021 DEBUG nova.network.neutron [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Successfully created port: 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:17:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:42.680 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:42.681 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:42.682 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:43 np0005593233 nova_compute[222017]: 2026-01-23 10:17:43.008 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:43 np0005593233 podman[278412]: 2026-01-23 10:17:43.107643164 +0000 UTC m=+0.115433046 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:17:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:43.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:17:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:43.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:17:43 np0005593233 nova_compute[222017]: 2026-01-23 10:17:43.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:17:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3300825439' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:17:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:17:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3300825439' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:17:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:45.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:45.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:45 np0005593233 nova_compute[222017]: 2026-01-23 10:17:45.951 222021 DEBUG nova.network.neutron [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Successfully updated port: 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.410 222021 DEBUG nova.compute.manager [req-f2bde9c6-591b-4fbf-938f-7326dd0c8b24 req-f5af8999-732a-4c6b-b790-6ad854ef3824 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Received event network-changed-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.410 222021 DEBUG nova.compute.manager [req-f2bde9c6-591b-4fbf-938f-7326dd0c8b24 req-f5af8999-732a-4c6b-b790-6ad854ef3824 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Refreshing instance network info cache due to event network-changed-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.411 222021 DEBUG oslo_concurrency.lockutils [req-f2bde9c6-591b-4fbf-938f-7326dd0c8b24 req-f5af8999-732a-4c6b-b790-6ad854ef3824 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.411 222021 DEBUG oslo_concurrency.lockutils [req-f2bde9c6-591b-4fbf-938f-7326dd0c8b24 req-f5af8999-732a-4c6b-b790-6ad854ef3824 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.412 222021 DEBUG nova.network.neutron [req-f2bde9c6-591b-4fbf-938f-7326dd0c8b24 req-f5af8999-732a-4c6b-b790-6ad854ef3824 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Refreshing network info cache for port 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.449 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.467 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.468 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.468 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.468 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.468 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.804 222021 DEBUG nova.network.neutron [req-f2bde9c6-591b-4fbf-938f-7326dd0c8b24 req-f5af8999-732a-4c6b-b790-6ad854ef3824 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:17:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:17:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/273718332' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:17:46 np0005593233 nova_compute[222017]: 2026-01-23 10:17:46.946 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.015 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:47.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.340 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.340 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-0000008d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:17:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:47.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.554 222021 DEBUG nova.network.neutron [req-f2bde9c6-591b-4fbf-938f-7326dd0c8b24 req-f5af8999-732a-4c6b-b790-6ad854ef3824 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.592 222021 DEBUG oslo_concurrency.lockutils [req-f2bde9c6-591b-4fbf-938f-7326dd0c8b24 req-f5af8999-732a-4c6b-b790-6ad854ef3824 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.593 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquired lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.594 222021 DEBUG nova.network.neutron [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.653 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.655 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4147MB free_disk=20.946033477783203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.655 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.656 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.835 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.836 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 95158f57-1f68-4b3e-9d10-e3006c3f2060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.837 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.837 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.952 222021 DEBUG nova.network.neutron [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:17:47 np0005593233 nova_compute[222017]: 2026-01-23 10:17:47.957 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:48 np0005593233 nova_compute[222017]: 2026-01-23 10:17:48.059 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:17:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3948698968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:17:48 np0005593233 nova_compute[222017]: 2026-01-23 10:17:48.438 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:48 np0005593233 nova_compute[222017]: 2026-01-23 10:17:48.447 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:17:48 np0005593233 nova_compute[222017]: 2026-01-23 10:17:48.489 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:17:48 np0005593233 nova_compute[222017]: 2026-01-23 10:17:48.581 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:17:48 np0005593233 nova_compute[222017]: 2026-01-23 10:17:48.582 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.926s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:49.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:49.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:49 np0005593233 nova_compute[222017]: 2026-01-23 10:17:49.582 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:49 np0005593233 nova_compute[222017]: 2026-01-23 10:17:49.583 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:49 np0005593233 nova_compute[222017]: 2026-01-23 10:17:49.583 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:50 np0005593233 nova_compute[222017]: 2026-01-23 10:17:50.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:50 np0005593233 nova_compute[222017]: 2026-01-23 10:17:50.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:17:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:51.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:51.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.389 222021 DEBUG nova.network.neutron [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updating instance_info_cache with network_info: [{"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.473 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Releasing lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.474 222021 DEBUG nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Instance network_info: |[{"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.476 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Start _get_guest_xml network_info=[{"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.481 222021 WARNING nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.486 222021 DEBUG nova.virt.libvirt.host [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.487 222021 DEBUG nova.virt.libvirt.host [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.494 222021 DEBUG nova.virt.libvirt.host [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.495 222021 DEBUG nova.virt.libvirt.host [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.496 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.497 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.497 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.497 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.498 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.498 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.498 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.498 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.498 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.499 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.499 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.499 222021 DEBUG nova.virt.hardware [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.502 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:17:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3704621548' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:17:51 np0005593233 nova_compute[222017]: 2026-01-23 10:17:51.983 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.015 222021 DEBUG nova.storage.rbd_utils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.023 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.061 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:17:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3312771721' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.519 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.521 222021 DEBUG nova.virt.libvirt.vif [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-395808227',display_name='tempest-₡-395808227',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--395808227',id=146,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c59351a1b59c4cc9ad389dff900935f2',ramdisk_id='',reservation_id='r-42zdutrk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1611255243',owner_user_name='tempest-ServersTestJSON-1611255243-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_cer
ts=None,updated_at=2026-01-23T10:17:39Z,user_data=None,user_id='ec99ae7c69d0438280441e0434374cbf',uuid=95158f57-1f68-4b3e-9d10-e3006c3f2060,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.522 222021 DEBUG nova.network.os_vif_util [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converting VIF {"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.523 222021 DEBUG nova.network.os_vif_util [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:42:f9,bridge_name='br-int',has_traffic_filtering=True,id=758c2b61-c90d-4bca-a2a4-1dbdf2631d0d,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap758c2b61-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.524 222021 DEBUG nova.objects.instance [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 95158f57-1f68-4b3e-9d10-e3006c3f2060 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.570 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <uuid>95158f57-1f68-4b3e-9d10-e3006c3f2060</uuid>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <name>instance-00000092</name>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <nova:name>tempest-₡-395808227</nova:name>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:17:51</nova:creationTime>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <nova:user uuid="ec99ae7c69d0438280441e0434374cbf">tempest-ServersTestJSON-1611255243-project-member</nova:user>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <nova:project uuid="c59351a1b59c4cc9ad389dff900935f2">tempest-ServersTestJSON-1611255243</nova:project>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <nova:port uuid="758c2b61-c90d-4bca-a2a4-1dbdf2631d0d">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <entry name="serial">95158f57-1f68-4b3e-9d10-e3006c3f2060</entry>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <entry name="uuid">95158f57-1f68-4b3e-9d10-e3006c3f2060</entry>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/95158f57-1f68-4b3e-9d10-e3006c3f2060_disk">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/95158f57-1f68-4b3e-9d10-e3006c3f2060_disk.config">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:b4:42:f9"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <target dev="tap758c2b61-c9"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060/console.log" append="off"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:17:52 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:17:52 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:17:52 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:17:52 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.571 222021 DEBUG nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Preparing to wait for external event network-vif-plugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.571 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.572 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.572 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.573 222021 DEBUG nova.virt.libvirt.vif [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-395808227',display_name='tempest-₡-395808227',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--395808227',id=146,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c59351a1b59c4cc9ad389dff900935f2',ramdisk_id='',reservation_id='r-42zdutrk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1611255243',owner_user_name='tempest-ServersTestJSON-1611255243-project-member'},tags=TagList,task_state='spawning',terminated_at=None,t
rusted_certs=None,updated_at=2026-01-23T10:17:39Z,user_data=None,user_id='ec99ae7c69d0438280441e0434374cbf',uuid=95158f57-1f68-4b3e-9d10-e3006c3f2060,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.573 222021 DEBUG nova.network.os_vif_util [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converting VIF {"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.574 222021 DEBUG nova.network.os_vif_util [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:42:f9,bridge_name='br-int',has_traffic_filtering=True,id=758c2b61-c90d-4bca-a2a4-1dbdf2631d0d,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap758c2b61-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.575 222021 DEBUG os_vif [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:42:f9,bridge_name='br-int',has_traffic_filtering=True,id=758c2b61-c90d-4bca-a2a4-1dbdf2631d0d,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap758c2b61-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.575 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.576 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.576 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.580 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.580 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap758c2b61-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.581 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap758c2b61-c9, col_values=(('external_ids', {'iface-id': '758c2b61-c90d-4bca-a2a4-1dbdf2631d0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:42:f9', 'vm-uuid': '95158f57-1f68-4b3e-9d10-e3006c3f2060'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:52 np0005593233 NetworkManager[48871]: <info>  [1769163472.7169] manager: (tap758c2b61-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.716 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.720 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.726 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.728 222021 INFO os_vif [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:42:f9,bridge_name='br-int',has_traffic_filtering=True,id=758c2b61-c90d-4bca-a2a4-1dbdf2631d0d,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap758c2b61-c9')#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.915 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.916 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.916 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] No VIF found with MAC fa:16:3e:b4:42:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.917 222021 INFO nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Using config drive#033[00m
Jan 23 05:17:52 np0005593233 nova_compute[222017]: 2026-01-23 10:17:52.949 222021 DEBUG nova.storage.rbd_utils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.059 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:53.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:53.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.469 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.719 222021 INFO nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Creating config drive at /var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060/disk.config#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.724 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpribsenl6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.867 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpribsenl6" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.907 222021 DEBUG nova.storage.rbd_utils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.912 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060/disk.config 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.969 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.970 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.970 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:17:53 np0005593233 nova_compute[222017]: 2026-01-23 10:17:53.971 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:54 np0005593233 nova_compute[222017]: 2026-01-23 10:17:54.347 222021 DEBUG oslo_concurrency.processutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060/disk.config 95158f57-1f68-4b3e-9d10-e3006c3f2060_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:54 np0005593233 nova_compute[222017]: 2026-01-23 10:17:54.348 222021 INFO nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Deleting local config drive /var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060/disk.config because it was imported into RBD.#033[00m
Jan 23 05:17:54 np0005593233 kernel: tap758c2b61-c9: entered promiscuous mode
Jan 23 05:17:54 np0005593233 NetworkManager[48871]: <info>  [1769163474.4237] manager: (tap758c2b61-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/293)
Jan 23 05:17:54 np0005593233 ovn_controller[130653]: 2026-01-23T10:17:54Z|00624|binding|INFO|Claiming lport 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d for this chassis.
Jan 23 05:17:54 np0005593233 ovn_controller[130653]: 2026-01-23T10:17:54Z|00625|binding|INFO|758c2b61-c90d-4bca-a2a4-1dbdf2631d0d: Claiming fa:16:3e:b4:42:f9 10.100.0.3
Jan 23 05:17:54 np0005593233 nova_compute[222017]: 2026-01-23 10:17:54.423 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:54 np0005593233 ovn_controller[130653]: 2026-01-23T10:17:54Z|00626|binding|INFO|Setting lport 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d ovn-installed in OVS
Jan 23 05:17:54 np0005593233 nova_compute[222017]: 2026-01-23 10:17:54.440 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:54 np0005593233 nova_compute[222017]: 2026-01-23 10:17:54.442 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:54 np0005593233 systemd-udevd[278613]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:17:54 np0005593233 ovn_controller[130653]: 2026-01-23T10:17:54Z|00627|binding|INFO|Setting lport 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d up in Southbound
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.467 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:42:f9 10.100.0.3'], port_security=['fa:16:3e:b4:42:f9 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '95158f57-1f68-4b3e-9d10-e3006c3f2060', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c59351a1b59c4cc9ad389dff900935f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd59f1dd0-018a-40d5-b9a0-54c6c1f9d925', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c808b115-ccf1-41c4-acea-daabae8abf5b, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=758c2b61-c90d-4bca-a2a4-1dbdf2631d0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.468 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d in datapath 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 bound to our chassis#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.469 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7#033[00m
Jan 23 05:17:54 np0005593233 NetworkManager[48871]: <info>  [1769163474.4778] device (tap758c2b61-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:17:54 np0005593233 NetworkManager[48871]: <info>  [1769163474.4794] device (tap758c2b61-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:17:54 np0005593233 nova_compute[222017]: 2026-01-23 10:17:54.701 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.708 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[83e9ba4d-e384-4c6d-a779-3d99fbf5233c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.709 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap43bdb40a-e1 in ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.714 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap43bdb40a-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.714 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1c78b61f-1a7c-4c6e-89c7-8565f5716120]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.716 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d619516b-868c-4ace-8f0e-14e2ff2a11b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.735 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[7c70810e-b7d8-4da7-b390-73245e143019]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 systemd-machined[190954]: New machine qemu-68-instance-00000092.
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.750 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a259e448-1c34-4111-8f52-68b4f70b0e0b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 systemd[1]: Started Virtual Machine qemu-68-instance-00000092.
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.787 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[eecd9866-e53d-4f11-a88d-618b3b13c197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.794 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1a48ddb5-8175-4038-8a77-f2fa35d7ef79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 NetworkManager[48871]: <info>  [1769163474.7956] manager: (tap43bdb40a-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/294)
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.835 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6c71abe4-eae2-40ad-9143-177b1858bb4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.839 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[21cd1701-845c-4232-be84-4b6f2d718e8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 NetworkManager[48871]: <info>  [1769163474.8657] device (tap43bdb40a-e0): carrier: link connected
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.873 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b8dcb447-c3bb-424e-a843-01fe8c08ce93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.895 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[803bde01-ef3f-4255-a112-9e6aa56b0681]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43bdb40a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:5e:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738749, 'reachable_time': 35943, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278651, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.911 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[25e1d8d4-245e-4432-8358-83204c5a5141]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:5ee5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738749, 'tstamp': 738749}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278652, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.929 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2e8dfe54-9284-46be-9d4a-26a566ebc3e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43bdb40a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:5e:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738749, 'reachable_time': 35943, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 278653, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:54.972 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[444846d1-1756-4f6a-acfb-62d5ef78f9ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:55.036 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4195e7-9211-4075-a03e-e2b51e047048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:55.039 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43bdb40a-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:55.040 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:55.041 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43bdb40a-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.093 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:55 np0005593233 NetworkManager[48871]: <info>  [1769163475.0949] manager: (tap43bdb40a-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 23 05:17:55 np0005593233 kernel: tap43bdb40a-e0: entered promiscuous mode
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.098 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:55.099 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap43bdb40a-e0, col_values=(('external_ids', {'iface-id': '8a8ef4f2-2ba5-405a-811e-058c5ff2b91e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.100 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:55 np0005593233 ovn_controller[130653]: 2026-01-23T10:17:55Z|00628|binding|INFO|Releasing lport 8a8ef4f2-2ba5-405a-811e-058c5ff2b91e from this chassis (sb_readonly=0)
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.116 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:55.117 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:55.119 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[83156697-9d2d-4385-94de-5e63aac04986]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:55.120 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7.pid.haproxy
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:17:55 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:17:55.120 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'env', 'PROCESS_TAG=haproxy-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:17:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:55.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.283 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163475.2823691, 95158f57-1f68-4b3e-9d10-e3006c3f2060 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.284 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] VM Started (Lifecycle Event)#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.325 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.330 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163475.282635, 95158f57-1f68-4b3e-9d10-e3006c3f2060 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.331 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:17:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:55.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.387 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.392 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.424 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:17:55 np0005593233 podman[278727]: 2026-01-23 10:17:55.563730315 +0000 UTC m=+0.068062946 container create 8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:17:55 np0005593233 systemd[1]: Started libpod-conmon-8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb.scope.
Jan 23 05:17:55 np0005593233 podman[278727]: 2026-01-23 10:17:55.531344049 +0000 UTC m=+0.035676710 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:17:55 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:17:55 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70ef4d2a17100684f1bb784f2980f0fc67f052e8b5602fea29ec318f6b13da07/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:17:55 np0005593233 podman[278727]: 2026-01-23 10:17:55.665909535 +0000 UTC m=+0.170242226 container init 8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 05:17:55 np0005593233 podman[278727]: 2026-01-23 10:17:55.676763782 +0000 UTC m=+0.181096453 container start 8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:17:55 np0005593233 neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7[278742]: [NOTICE]   (278746) : New worker (278748) forked
Jan 23 05:17:55 np0005593233 neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7[278742]: [NOTICE]   (278746) : Loading success.
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.833 222021 DEBUG nova.compute.manager [req-d9e7cef8-853e-466a-86af-73c0d6536571 req-9bb10870-6b6b-4049-95f3-39c4f611d8cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Received event network-vif-plugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.833 222021 DEBUG oslo_concurrency.lockutils [req-d9e7cef8-853e-466a-86af-73c0d6536571 req-9bb10870-6b6b-4049-95f3-39c4f611d8cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.833 222021 DEBUG oslo_concurrency.lockutils [req-d9e7cef8-853e-466a-86af-73c0d6536571 req-9bb10870-6b6b-4049-95f3-39c4f611d8cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.833 222021 DEBUG oslo_concurrency.lockutils [req-d9e7cef8-853e-466a-86af-73c0d6536571 req-9bb10870-6b6b-4049-95f3-39c4f611d8cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.834 222021 DEBUG nova.compute.manager [req-d9e7cef8-853e-466a-86af-73c0d6536571 req-9bb10870-6b6b-4049-95f3-39c4f611d8cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Processing event network-vif-plugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.834 222021 DEBUG nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.838 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163475.837743, 95158f57-1f68-4b3e-9d10-e3006c3f2060 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.838 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.839 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.844 222021 INFO nova.virt.libvirt.driver [-] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Instance spawned successfully.#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.844 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.875 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.875 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.876 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.877 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.878 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.878 222021 DEBUG nova.virt.libvirt.driver [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.888 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.893 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.957 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:17:55 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.999 222021 INFO nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Took 16.74 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:17:56 np0005593233 nova_compute[222017]: 2026-01-23 10:17:55.999 222021 DEBUG nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:56 np0005593233 nova_compute[222017]: 2026-01-23 10:17:56.171 222021 INFO nova.compute.manager [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Took 18.91 seconds to build instance.#033[00m
Jan 23 05:17:56 np0005593233 nova_compute[222017]: 2026-01-23 10:17:56.216 222021 DEBUG oslo_concurrency.lockutils [None req-961906e7-29f3-4e9b-92f1-5baf0ffa5489 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:57.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:57.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:57 np0005593233 nova_compute[222017]: 2026-01-23 10:17:57.718 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.041 222021 DEBUG nova.compute.manager [req-8ed9e7fc-da70-485e-ba2e-715e07d9e038 req-36d3c7cd-e033-4142-a6cf-c70162614ee0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Received event network-vif-plugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.042 222021 DEBUG oslo_concurrency.lockutils [req-8ed9e7fc-da70-485e-ba2e-715e07d9e038 req-36d3c7cd-e033-4142-a6cf-c70162614ee0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.042 222021 DEBUG oslo_concurrency.lockutils [req-8ed9e7fc-da70-485e-ba2e-715e07d9e038 req-36d3c7cd-e033-4142-a6cf-c70162614ee0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.042 222021 DEBUG oslo_concurrency.lockutils [req-8ed9e7fc-da70-485e-ba2e-715e07d9e038 req-36d3c7cd-e033-4142-a6cf-c70162614ee0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.042 222021 DEBUG nova.compute.manager [req-8ed9e7fc-da70-485e-ba2e-715e07d9e038 req-36d3c7cd-e033-4142-a6cf-c70162614ee0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] No waiting events found dispatching network-vif-plugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.042 222021 WARNING nova.compute.manager [req-8ed9e7fc-da70-485e-ba2e-715e07d9e038 req-36d3c7cd-e033-4142-a6cf-c70162614ee0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Received unexpected event network-vif-plugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.062 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.461 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.483 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:58 np0005593233 nova_compute[222017]: 2026-01-23 10:17:58.484 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:17:59 np0005593233 podman[278757]: 2026-01-23 10:17:59.049413147 +0000 UTC m=+0.062518769 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 05:17:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:17:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:59.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:17:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:17:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:59.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:01.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:01.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #130. Immutable memtables: 0.
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.046756) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 81] Flushing memtable with next log file: 130
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482046878, "job": 81, "event": "flush_started", "num_memtables": 1, "num_entries": 1382, "num_deletes": 257, "total_data_size": 2976011, "memory_usage": 3016560, "flush_reason": "Manual Compaction"}
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 81] Level-0 flush table #131: started
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482073170, "cf_name": "default", "job": 81, "event": "table_file_creation", "file_number": 131, "file_size": 1954264, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 62660, "largest_seqno": 64037, "table_properties": {"data_size": 1948456, "index_size": 3074, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12831, "raw_average_key_size": 19, "raw_value_size": 1936579, "raw_average_value_size": 2970, "num_data_blocks": 136, "num_entries": 652, "num_filter_entries": 652, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163372, "oldest_key_time": 1769163372, "file_creation_time": 1769163482, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 131, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 81] Flush lasted 26473 microseconds, and 13836 cpu microseconds.
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.073242) [db/flush_job.cc:967] [default] [JOB 81] Level-0 flush table #131: 1954264 bytes OK
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.073277) [db/memtable_list.cc:519] [default] Level-0 commit table #131 started
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.075211) [db/memtable_list.cc:722] [default] Level-0 commit table #131: memtable #1 done
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.075232) EVENT_LOG_v1 {"time_micros": 1769163482075225, "job": 81, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.075257) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 81] Try to delete WAL files size 2969380, prev total WAL file size 2969380, number of live WAL files 2.
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000127.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.076742) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032323539' seq:72057594037927935, type:22 .. '6C6F676D0032353132' seq:0, type:0; will stop at (end)
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 82] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 81 Base level 0, inputs: [131(1908KB)], [129(8830KB)]
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482076812, "job": 82, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [131], "files_L6": [129], "score": -1, "input_data_size": 10996431, "oldest_snapshot_seqno": -1}
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 82] Generated table #132: 8301 keys, 10860444 bytes, temperature: kUnknown
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482147268, "cf_name": "default", "job": 82, "event": "table_file_creation", "file_number": 132, "file_size": 10860444, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10807525, "index_size": 31046, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20805, "raw_key_size": 219095, "raw_average_key_size": 26, "raw_value_size": 10662124, "raw_average_value_size": 1284, "num_data_blocks": 1189, "num_entries": 8301, "num_filter_entries": 8301, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163482, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 132, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.147577) [db/compaction/compaction_job.cc:1663] [default] [JOB 82] Compacted 1@0 + 1@6 files to L6 => 10860444 bytes
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.149129) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 155.8 rd, 153.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 8.6 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(11.2) write-amplify(5.6) OK, records in: 8828, records dropped: 527 output_compression: NoCompression
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.149148) EVENT_LOG_v1 {"time_micros": 1769163482149139, "job": 82, "event": "compaction_finished", "compaction_time_micros": 70566, "compaction_time_cpu_micros": 26419, "output_level": 6, "num_output_files": 1, "total_output_size": 10860444, "num_input_records": 8828, "num_output_records": 8301, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000131.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482149581, "job": 82, "event": "table_file_deletion", "file_number": 131}
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000129.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482151142, "job": 82, "event": "table_file_deletion", "file_number": 129}
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.076613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.151175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.151181) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.151183) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.151184) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:18:02.151186) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593233 nova_compute[222017]: 2026-01-23 10:18:02.477 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:02 np0005593233 nova_compute[222017]: 2026-01-23 10:18:02.753 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:03 np0005593233 nova_compute[222017]: 2026-01-23 10:18:03.064 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:03.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:03.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:05.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:05.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:18:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:07.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:18:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:18:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:07.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:18:07 np0005593233 nova_compute[222017]: 2026-01-23 10:18:07.798 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:08 np0005593233 nova_compute[222017]: 2026-01-23 10:18:08.066 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:08 np0005593233 nova_compute[222017]: 2026-01-23 10:18:08.577 222021 DEBUG nova.compute.manager [req-06877ac3-463b-4a3d-9f00-6c0d9be5e356 req-fae6be2f-065f-47ea-8a02-da43827a3917 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:08 np0005593233 nova_compute[222017]: 2026-01-23 10:18:08.577 222021 DEBUG nova.compute.manager [req-06877ac3-463b-4a3d-9f00-6c0d9be5e356 req-fae6be2f-065f-47ea-8a02-da43827a3917 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing instance network info cache due to event network-changed-10b1482b-63d3-4411-b752-5d8f34f77403. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:18:08 np0005593233 nova_compute[222017]: 2026-01-23 10:18:08.577 222021 DEBUG oslo_concurrency.lockutils [req-06877ac3-463b-4a3d-9f00-6c0d9be5e356 req-fae6be2f-065f-47ea-8a02-da43827a3917 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:18:08 np0005593233 nova_compute[222017]: 2026-01-23 10:18:08.578 222021 DEBUG oslo_concurrency.lockutils [req-06877ac3-463b-4a3d-9f00-6c0d9be5e356 req-fae6be2f-065f-47ea-8a02-da43827a3917 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:18:08 np0005593233 nova_compute[222017]: 2026-01-23 10:18:08.578 222021 DEBUG nova.network.neutron [req-06877ac3-463b-4a3d-9f00-6c0d9be5e356 req-fae6be2f-065f-47ea-8a02-da43827a3917 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Refreshing network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:18:09 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:09Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b4:42:f9 10.100.0.3
Jan 23 05:18:09 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:09Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b4:42:f9 10.100.0.3
Jan 23 05:18:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:09.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:09 np0005593233 nova_compute[222017]: 2026-01-23 10:18:09.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:09.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:11.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:11.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:12 np0005593233 nova_compute[222017]: 2026-01-23 10:18:12.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:12 np0005593233 nova_compute[222017]: 2026-01-23 10:18:12.825 222021 DEBUG nova.network.neutron [req-06877ac3-463b-4a3d-9f00-6c0d9be5e356 req-fae6be2f-065f-47ea-8a02-da43827a3917 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updated VIF entry in instance network info cache for port 10b1482b-63d3-4411-b752-5d8f34f77403. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:18:12 np0005593233 nova_compute[222017]: 2026-01-23 10:18:12.825 222021 DEBUG nova.network.neutron [req-06877ac3-463b-4a3d-9f00-6c0d9be5e356 req-fae6be2f-065f-47ea-8a02-da43827a3917 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [{"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:18:12 np0005593233 nova_compute[222017]: 2026-01-23 10:18:12.884 222021 DEBUG oslo_concurrency.lockutils [req-06877ac3-463b-4a3d-9f00-6c0d9be5e356 req-fae6be2f-065f-47ea-8a02-da43827a3917 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:18:13 np0005593233 nova_compute[222017]: 2026-01-23 10:18:13.068 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:13.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:13.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:14 np0005593233 podman[278779]: 2026-01-23 10:18:14.092451258 +0000 UTC m=+0.105267289 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 05:18:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:15.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:18:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:15.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:18:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:17.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:17.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:17 np0005593233 nova_compute[222017]: 2026-01-23 10:18:17.804 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:18 np0005593233 nova_compute[222017]: 2026-01-23 10:18:18.071 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:19.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:19.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:20.225 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:18:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:20.226 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:18:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:20.227 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:20 np0005593233 nova_compute[222017]: 2026-01-23 10:18:20.226 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:21.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:21.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:22 np0005593233 nova_compute[222017]: 2026-01-23 10:18:22.807 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:23 np0005593233 nova_compute[222017]: 2026-01-23 10:18:23.073 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:23.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:23 np0005593233 podman[278980]: 2026-01-23 10:18:23.404958402 +0000 UTC m=+0.083105122 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 05:18:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:23.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:23 np0005593233 podman[278980]: 2026-01-23 10:18:23.537660975 +0000 UTC m=+0.215807675 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Jan 23 05:18:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.003000085s ======
Jan 23 05:18:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:25.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000085s
Jan 23 05:18:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:25.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:18:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:18:26 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:26Z|00629|binding|INFO|Releasing lport 8a8ef4f2-2ba5-405a-811e-058c5ff2b91e from this chassis (sb_readonly=0)
Jan 23 05:18:26 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:26Z|00630|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:18:26 np0005593233 nova_compute[222017]: 2026-01-23 10:18:26.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:26 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:26Z|00631|binding|INFO|Releasing lport 8a8ef4f2-2ba5-405a-811e-058c5ff2b91e from this chassis (sb_readonly=0)
Jan 23 05:18:26 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:26Z|00632|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:18:26 np0005593233 nova_compute[222017]: 2026-01-23 10:18:26.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:27.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:27.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.649 222021 DEBUG oslo_concurrency.lockutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.650 222021 DEBUG oslo_concurrency.lockutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.650 222021 DEBUG oslo_concurrency.lockutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.650 222021 DEBUG oslo_concurrency.lockutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.651 222021 DEBUG oslo_concurrency.lockutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.652 222021 INFO nova.compute.manager [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Terminating instance#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.653 222021 DEBUG nova.compute.manager [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:18:27 np0005593233 kernel: tap10b1482b-63 (unregistering): left promiscuous mode
Jan 23 05:18:27 np0005593233 NetworkManager[48871]: <info>  [1769163507.7250] device (tap10b1482b-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:18:27 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:27Z|00633|binding|INFO|Releasing lport 10b1482b-63d3-4411-b752-5d8f34f77403 from this chassis (sb_readonly=0)
Jan 23 05:18:27 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:27Z|00634|binding|INFO|Setting lport 10b1482b-63d3-4411-b752-5d8f34f77403 down in Southbound
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.790 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:27 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:27Z|00635|binding|INFO|Removing iface tap10b1482b-63 ovn-installed in OVS
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.792 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:27.802 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:68:ac 10.100.0.9'], port_security=['fa:16:3e:42:68:ac 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ae621f21a8e438fb95152309b38cee5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3b0a0b41-45a8-4582-a4d2-a9aff1f1a18c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5888498-07d6-4c96-95ee-546974eebd82, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=10b1482b-63d3-4411-b752-5d8f34f77403) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:18:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:27.803 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 10b1482b-63d3-4411-b752-5d8f34f77403 in datapath f98d79de-4a23-4f29-9848-c5d4c5683a5d unbound from our chassis#033[00m
Jan 23 05:18:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:27.804 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f98d79de-4a23-4f29-9848-c5d4c5683a5d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:18:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:27.806 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1f1034-1dff-4956-a8b7-0607ae9de610]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.807 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:27.807 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d namespace which is not needed anymore#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.808 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:27 np0005593233 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008d.scope: Deactivated successfully.
Jan 23 05:18:27 np0005593233 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008d.scope: Consumed 24.280s CPU time.
Jan 23 05:18:27 np0005593233 systemd-machined[190954]: Machine qemu-65-instance-0000008d terminated.
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.894 222021 INFO nova.virt.libvirt.driver [-] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Instance destroyed successfully.#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.895 222021 DEBUG nova.objects.instance [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'resources' on Instance uuid fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.914 222021 DEBUG nova.virt.libvirt.vif [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-1890261987',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-1890261987',id=141,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPuMczToXGmZUNyxG5fVGeV6xaoJVOpQ6Lh9dx5t6v22bv4xalVGQLUjYNEpg7ajkuOU/WHiNfvMhffjZHY/YojnQQYOX+q0GTa9+NPbkGDFf1XELa+vTNvIe6ZV8CwP9g==',key_name='tempest-TestInstancesWithCinderVolumes-232096272',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:15:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ae621f21a8e438fb95152309b38cee5',ramdisk_id='',reservation_id='r-ryo6mpg0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet
',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestInstancesWithCinderVolumes-565485208',owner_user_name='tempest-TestInstancesWithCinderVolumes-565485208-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:15:06Z,user_data=None,user_id='95ac13194f0940128d42af3d45d130fa',uuid=fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.915 222021 DEBUG nova.network.os_vif_util [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converting VIF {"id": "10b1482b-63d3-4411-b752-5d8f34f77403", "address": "fa:16:3e:42:68:ac", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10b1482b-63", "ovs_interfaceid": "10b1482b-63d3-4411-b752-5d8f34f77403", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.918 222021 DEBUG nova.network.os_vif_util [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:42:68:ac,bridge_name='br-int',has_traffic_filtering=True,id=10b1482b-63d3-4411-b752-5d8f34f77403,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10b1482b-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.919 222021 DEBUG os_vif [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:68:ac,bridge_name='br-int',has_traffic_filtering=True,id=10b1482b-63d3-4411-b752-5d8f34f77403,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10b1482b-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.921 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.922 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap10b1482b-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.923 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.926 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:18:27 np0005593233 nova_compute[222017]: 2026-01-23 10:18:27.928 222021 INFO os_vif [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:68:ac,bridge_name='br-int',has_traffic_filtering=True,id=10b1482b-63d3-4411-b752-5d8f34f77403,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10b1482b-63')#033[00m
Jan 23 05:18:27 np0005593233 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[276534]: [NOTICE]   (276538) : haproxy version is 2.8.14-c23fe91
Jan 23 05:18:27 np0005593233 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[276534]: [NOTICE]   (276538) : path to executable is /usr/sbin/haproxy
Jan 23 05:18:27 np0005593233 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[276534]: [WARNING]  (276538) : Exiting Master process...
Jan 23 05:18:27 np0005593233 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[276534]: [WARNING]  (276538) : Exiting Master process...
Jan 23 05:18:27 np0005593233 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[276534]: [ALERT]    (276538) : Current worker (276540) exited with code 143 (Terminated)
Jan 23 05:18:27 np0005593233 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[276534]: [WARNING]  (276538) : All workers exited. Exiting... (0)
Jan 23 05:18:27 np0005593233 systemd[1]: libpod-c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764.scope: Deactivated successfully.
Jan 23 05:18:27 np0005593233 podman[279267]: 2026-01-23 10:18:27.98331726 +0000 UTC m=+0.053922696 container died c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:18:28 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764-userdata-shm.mount: Deactivated successfully.
Jan 23 05:18:28 np0005593233 systemd[1]: var-lib-containers-storage-overlay-f839200124fde9106996e5116aa450621a11da822a96e76bfe65398066e12919-merged.mount: Deactivated successfully.
Jan 23 05:18:28 np0005593233 podman[279267]: 2026-01-23 10:18:28.031558275 +0000 UTC m=+0.102163681 container cleanup c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:18:28 np0005593233 systemd[1]: libpod-conmon-c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764.scope: Deactivated successfully.
Jan 23 05:18:28 np0005593233 nova_compute[222017]: 2026-01-23 10:18:28.075 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:28 np0005593233 podman[279314]: 2026-01-23 10:18:28.11127931 +0000 UTC m=+0.048462432 container remove c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:18:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:28.118 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[edb68260-f5df-4ed3-8cf8-4cc0bb159f9f]: (4, ('Fri Jan 23 10:18:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d (c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764)\nc1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764\nFri Jan 23 10:18:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d (c1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764)\nc1dbcb2c447a54e25a27e853af2f1f6a65b6906613352048b87a18e26cb86764\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:28.120 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[32c2cc6b-1d7c-42c8-9488-73f9f7e9370e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:28.121 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf98d79de-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:28 np0005593233 nova_compute[222017]: 2026-01-23 10:18:28.122 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:28 np0005593233 kernel: tapf98d79de-40: left promiscuous mode
Jan 23 05:18:28 np0005593233 nova_compute[222017]: 2026-01-23 10:18:28.135 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:28.138 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d6555c0a-719f-4866-9472-42c9ac48dd1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:28 np0005593233 nova_compute[222017]: 2026-01-23 10:18:28.149 222021 INFO nova.virt.libvirt.driver [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Deleting instance files /var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78_del#033[00m
Jan 23 05:18:28 np0005593233 nova_compute[222017]: 2026-01-23 10:18:28.150 222021 INFO nova.virt.libvirt.driver [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Deletion of /var/lib/nova/instances/fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78_del complete#033[00m
Jan 23 05:18:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:28.159 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbafb55-056a-4ff6-a9d6-d20a17a391f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:28.161 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1f237417-4566-4bfe-b8ed-58528649d97c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:28.179 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3d831058-edc0-431c-81fe-ff21754f9d04]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 721601, 'reachable_time': 15087, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279330, 'error': None, 'target': 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:28.181 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:18:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:28.181 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[b3cccdc7-33b6-4a03-94f9-e551de8578b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:28 np0005593233 systemd[1]: run-netns-ovnmeta\x2df98d79de\x2d4a23\x2d4f29\x2d9848\x2dc5d4c5683a5d.mount: Deactivated successfully.
Jan 23 05:18:28 np0005593233 nova_compute[222017]: 2026-01-23 10:18:28.224 222021 INFO nova.compute.manager [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Took 0.57 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:18:28 np0005593233 nova_compute[222017]: 2026-01-23 10:18:28.224 222021 DEBUG oslo.service.loopingcall [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:18:28 np0005593233 nova_compute[222017]: 2026-01-23 10:18:28.225 222021 DEBUG nova.compute.manager [-] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:18:28 np0005593233 nova_compute[222017]: 2026-01-23 10:18:28.225 222021 DEBUG nova.network.neutron [-] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:18:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:18:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:29.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.352 222021 DEBUG nova.compute.manager [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-vif-unplugged-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.353 222021 DEBUG oslo_concurrency.lockutils [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.353 222021 DEBUG oslo_concurrency.lockutils [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.353 222021 DEBUG oslo_concurrency.lockutils [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.353 222021 DEBUG nova.compute.manager [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] No waiting events found dispatching network-vif-unplugged-10b1482b-63d3-4411-b752-5d8f34f77403 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.353 222021 DEBUG nova.compute.manager [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-vif-unplugged-10b1482b-63d3-4411-b752-5d8f34f77403 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.354 222021 DEBUG nova.compute.manager [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-vif-plugged-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.354 222021 DEBUG oslo_concurrency.lockutils [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.354 222021 DEBUG oslo_concurrency.lockutils [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.354 222021 DEBUG oslo_concurrency.lockutils [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.354 222021 DEBUG nova.compute.manager [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] No waiting events found dispatching network-vif-plugged-10b1482b-63d3-4411-b752-5d8f34f77403 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.355 222021 WARNING nova.compute.manager [req-9b28169d-4631-4671-b225-86841c1ebde9 req-074971fd-b09e-4b19-8b46-aa1ecec14bda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received unexpected event network-vif-plugged-10b1482b-63d3-4411-b752-5d8f34f77403 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:18:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:29.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.620 222021 DEBUG nova.network.neutron [-] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.639 222021 INFO nova.compute.manager [-] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Took 1.41 seconds to deallocate network for instance.#033[00m
Jan 23 05:18:29 np0005593233 nova_compute[222017]: 2026-01-23 10:18:29.740 222021 DEBUG nova.compute.manager [req-4e4b8e24-7345-4900-a8f0-7a7422ea911c req-aae2a21c-c4d4-4ee2-aa58-364b13653ccc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Received event network-vif-deleted-10b1482b-63d3-4411-b752-5d8f34f77403 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:30 np0005593233 nova_compute[222017]: 2026-01-23 10:18:30.060 222021 INFO nova.compute.manager [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Took 0.42 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:18:30 np0005593233 podman[279331]: 2026-01-23 10:18:30.073349477 +0000 UTC m=+0.071884143 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:18:30 np0005593233 nova_compute[222017]: 2026-01-23 10:18:30.111 222021 DEBUG oslo_concurrency.lockutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:30 np0005593233 nova_compute[222017]: 2026-01-23 10:18:30.112 222021 DEBUG oslo_concurrency.lockutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:30 np0005593233 nova_compute[222017]: 2026-01-23 10:18:30.256 222021 DEBUG oslo_concurrency.processutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:18:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2707073906' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:18:30 np0005593233 nova_compute[222017]: 2026-01-23 10:18:30.780 222021 DEBUG oslo_concurrency.processutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:30 np0005593233 nova_compute[222017]: 2026-01-23 10:18:30.788 222021 DEBUG nova.compute.provider_tree [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:18:30 np0005593233 nova_compute[222017]: 2026-01-23 10:18:30.828 222021 DEBUG nova.scheduler.client.report [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:18:30 np0005593233 nova_compute[222017]: 2026-01-23 10:18:30.880 222021 DEBUG oslo_concurrency.lockutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:30 np0005593233 nova_compute[222017]: 2026-01-23 10:18:30.928 222021 INFO nova.scheduler.client.report [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Deleted allocations for instance fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78#033[00m
Jan 23 05:18:31 np0005593233 nova_compute[222017]: 2026-01-23 10:18:31.065 222021 DEBUG oslo_concurrency.lockutils [None req-478711b3-fa1d-4841-8feb-0805bf4fe9a1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:31.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:31.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:32 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:32 np0005593233 nova_compute[222017]: 2026-01-23 10:18:32.926 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:33 np0005593233 nova_compute[222017]: 2026-01-23 10:18:33.076 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:33.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:33.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:35.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:18:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:35.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:18:35 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [P] New memtable created with log file: #51. Immutable memtables: 0.
Jan 23 05:18:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:37.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:18:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:37.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:18:37 np0005593233 nova_compute[222017]: 2026-01-23 10:18:37.976 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:38 np0005593233 nova_compute[222017]: 2026-01-23 10:18:38.079 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:39.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:39.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:18:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3154585024' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:18:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:18:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3154585024' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:18:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:41.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:41.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:42.681 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:42.683 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:18:42.684 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:42 np0005593233 nova_compute[222017]: 2026-01-23 10:18:42.894 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163507.8924856, fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:18:42 np0005593233 nova_compute[222017]: 2026-01-23 10:18:42.895 222021 INFO nova.compute.manager [-] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:18:42 np0005593233 nova_compute[222017]: 2026-01-23 10:18:42.954 222021 DEBUG nova.compute.manager [None req-5f58368e-8374-44a3-a269-db6a3e32ae96 - - - - - -] [instance: fd9d9b93-c87f-40e9-9b2d-1e40b9a74a78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:42 np0005593233 nova_compute[222017]: 2026-01-23 10:18:42.980 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:43 np0005593233 nova_compute[222017]: 2026-01-23 10:18:43.125 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:43.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:43 np0005593233 nova_compute[222017]: 2026-01-23 10:18:43.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:43.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:45 np0005593233 podman[279424]: 2026-01-23 10:18:45.137083377 +0000 UTC m=+0.132964592 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 23 05:18:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:45.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:45.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e310 e310: 3 total, 3 up, 3 in
Jan 23 05:18:46 np0005593233 nova_compute[222017]: 2026-01-23 10:18:46.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:46 np0005593233 nova_compute[222017]: 2026-01-23 10:18:46.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:46 np0005593233 nova_compute[222017]: 2026-01-23 10:18:46.790 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:46 np0005593233 nova_compute[222017]: 2026-01-23 10:18:46.791 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:46 np0005593233 nova_compute[222017]: 2026-01-23 10:18:46.792 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:46 np0005593233 nova_compute[222017]: 2026-01-23 10:18:46.792 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:18:46 np0005593233 nova_compute[222017]: 2026-01-23 10:18:46.793 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:18:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3245639709' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.278 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:47.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.369 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.370 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:18:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:47.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.659 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.662 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4219MB free_disk=20.922027587890625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.663 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.663 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.798 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 95158f57-1f68-4b3e-9d10-e3006c3f2060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.798 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:18:47 np0005593233 nova_compute[222017]: 2026-01-23 10:18:47.799 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.128 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.130 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.130 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.131 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.201 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.202 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.206 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:18:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/346486064' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.668 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.678 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.704 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.746 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:18:48 np0005593233 nova_compute[222017]: 2026-01-23 10:18:48.747 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:49.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:49.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:50 np0005593233 nova_compute[222017]: 2026-01-23 10:18:50.746 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:50 np0005593233 nova_compute[222017]: 2026-01-23 10:18:50.747 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:50 np0005593233 nova_compute[222017]: 2026-01-23 10:18:50.747 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:51.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:18:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:51.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:18:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 e311: 3 total, 3 up, 3 in
Jan 23 05:18:52 np0005593233 nova_compute[222017]: 2026-01-23 10:18:52.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:52 np0005593233 nova_compute[222017]: 2026-01-23 10:18:52.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:18:53 np0005593233 nova_compute[222017]: 2026-01-23 10:18:53.202 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:53 np0005593233 nova_compute[222017]: 2026-01-23 10:18:53.205 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:18:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:53.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:53.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:54 np0005593233 ovn_controller[130653]: 2026-01-23T10:18:54Z|00636|binding|INFO|Releasing lport 8a8ef4f2-2ba5-405a-811e-058c5ff2b91e from this chassis (sb_readonly=0)
Jan 23 05:18:54 np0005593233 nova_compute[222017]: 2026-01-23 10:18:54.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:54 np0005593233 nova_compute[222017]: 2026-01-23 10:18:54.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:54 np0005593233 nova_compute[222017]: 2026-01-23 10:18:54.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:18:54 np0005593233 nova_compute[222017]: 2026-01-23 10:18:54.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:18:54 np0005593233 nova_compute[222017]: 2026-01-23 10:18:54.778 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:18:54 np0005593233 nova_compute[222017]: 2026-01-23 10:18:54.779 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:18:54 np0005593233 nova_compute[222017]: 2026-01-23 10:18:54.779 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:18:54 np0005593233 nova_compute[222017]: 2026-01-23 10:18:54.779 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 95158f57-1f68-4b3e-9d10-e3006c3f2060 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:18:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:55.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:55.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:57.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:18:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:18:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:57.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:18:57 np0005593233 nova_compute[222017]: 2026-01-23 10:18:57.938 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updating instance_info_cache with network_info: [{"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:18:57 np0005593233 nova_compute[222017]: 2026-01-23 10:18:57.977 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:18:57 np0005593233 nova_compute[222017]: 2026-01-23 10:18:57.978 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:18:58 np0005593233 nova_compute[222017]: 2026-01-23 10:18:58.206 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:18:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:59.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:18:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:18:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:18:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:59.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:00 np0005593233 nova_compute[222017]: 2026-01-23 10:19:00.972 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:01 np0005593233 podman[279495]: 2026-01-23 10:19:01.05303227 +0000 UTC m=+0.062076577 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:19:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:01.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:01.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:03 np0005593233 nova_compute[222017]: 2026-01-23 10:19:03.208 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:03 np0005593233 nova_compute[222017]: 2026-01-23 10:19:03.210 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:03 np0005593233 nova_compute[222017]: 2026-01-23 10:19:03.211 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:19:03 np0005593233 nova_compute[222017]: 2026-01-23 10:19:03.211 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:03 np0005593233 nova_compute[222017]: 2026-01-23 10:19:03.265 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:03 np0005593233 nova_compute[222017]: 2026-01-23 10:19:03.266 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:03.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:03.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:05.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:05.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:07.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:07.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:08 np0005593233 nova_compute[222017]: 2026-01-23 10:19:08.267 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:08 np0005593233 nova_compute[222017]: 2026-01-23 10:19:08.270 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:08 np0005593233 nova_compute[222017]: 2026-01-23 10:19:08.271 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:19:08 np0005593233 nova_compute[222017]: 2026-01-23 10:19:08.271 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:08 np0005593233 nova_compute[222017]: 2026-01-23 10:19:08.295 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:08 np0005593233 nova_compute[222017]: 2026-01-23 10:19:08.296 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:08 np0005593233 nova_compute[222017]: 2026-01-23 10:19:08.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:09.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:19:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:09.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:19:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:11.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:11.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:12 np0005593233 nova_compute[222017]: 2026-01-23 10:19:12.411 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:12 np0005593233 nova_compute[222017]: 2026-01-23 10:19:12.412 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:19:12 np0005593233 nova_compute[222017]: 2026-01-23 10:19:12.477 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:19:13 np0005593233 nova_compute[222017]: 2026-01-23 10:19:13.296 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:13 np0005593233 nova_compute[222017]: 2026-01-23 10:19:13.298 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:13 np0005593233 nova_compute[222017]: 2026-01-23 10:19:13.299 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:19:13 np0005593233 nova_compute[222017]: 2026-01-23 10:19:13.299 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:13.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:13 np0005593233 nova_compute[222017]: 2026-01-23 10:19:13.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:13 np0005593233 nova_compute[222017]: 2026-01-23 10:19:13.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:13.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:14 np0005593233 nova_compute[222017]: 2026-01-23 10:19:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:14 np0005593233 nova_compute[222017]: 2026-01-23 10:19:14.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:19:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:15.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:15.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:16 np0005593233 podman[279515]: 2026-01-23 10:19:16.117877007 +0000 UTC m=+0.118752819 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 23 05:19:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:19:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:17.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:19:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:19:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:17.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:19:18 np0005593233 nova_compute[222017]: 2026-01-23 10:19:18.359 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:18 np0005593233 nova_compute[222017]: 2026-01-23 10:19:18.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:18 np0005593233 nova_compute[222017]: 2026-01-23 10:19:18.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:19:18 np0005593233 nova_compute[222017]: 2026-01-23 10:19:18.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:18 np0005593233 nova_compute[222017]: 2026-01-23 10:19:18.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:18 np0005593233 nova_compute[222017]: 2026-01-23 10:19:18.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:19.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:19.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:20.996 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:20.997 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:19:21 np0005593233 nova_compute[222017]: 2026-01-23 10:19:21.022 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:21.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:21.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:23.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:23 np0005593233 nova_compute[222017]: 2026-01-23 10:19:23.412 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:23.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.064 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.065 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.107 222021 DEBUG nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.147 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.147 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.208 222021 DEBUG nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.231 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.232 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.241 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.242 222021 INFO nova.compute.claims [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.330 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:24 np0005593233 nova_compute[222017]: 2026-01-23 10:19:24.614 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:19:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/256884339' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.102 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.109 222021 DEBUG nova.compute.provider_tree [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.159 222021 DEBUG nova.scheduler.client.report [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.202 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.971s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.204 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.873s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.210 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.210 222021 INFO nova.compute.claims [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.313 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "3715b058-9d33-4d04-860e-4f15aebb0134" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.313 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "3715b058-9d33-4d04-860e-4f15aebb0134" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.331 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "3715b058-9d33-4d04-860e-4f15aebb0134" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.332 222021 DEBUG nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:19:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:25.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.515 222021 DEBUG nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.516 222021 DEBUG nova.network.neutron [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:19:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:25.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.569 222021 INFO nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.609 222021 DEBUG nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.664 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.787 222021 DEBUG nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.790 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.790 222021 INFO nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Creating image(s)#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.839 222021 DEBUG nova.storage.rbd_utils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] rbd image dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.875 222021 DEBUG nova.storage.rbd_utils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] rbd image dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.911 222021 DEBUG nova.storage.rbd_utils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] rbd image dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.916 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:25 np0005593233 nova_compute[222017]: 2026-01-23 10:19:25.960 222021 DEBUG nova.policy [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c2c5617ce3104251a0aaf4950da1708c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '363e1e6f82e8475f84a35d534d110de1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:19:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:25.999 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.010 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.011 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.012 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.012 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.046 222021 DEBUG nova.storage.rbd_utils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] rbd image dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.051 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:19:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2951901486' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.152 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.159 222021 DEBUG nova.compute.provider_tree [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.211 222021 DEBUG nova.scheduler.client.report [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.256 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.257 222021 DEBUG nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.330 222021 DEBUG nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.330 222021 DEBUG nova.network.neutron [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.371 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.404 222021 INFO nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.455 222021 DEBUG nova.storage.rbd_utils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] resizing rbd image dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.488 222021 DEBUG nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.570 222021 DEBUG nova.objects.instance [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lazy-loading 'migration_context' on Instance uuid dffff7fd-2faf-4f94-953a-fbfb7c752e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.630 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.631 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Ensure instance console log exists: /var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.631 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.631 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.632 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.729 222021 DEBUG nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.731 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.732 222021 INFO nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Creating image(s)#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.766 222021 DEBUG nova.storage.rbd_utils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.799 222021 DEBUG nova.storage.rbd_utils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.826 222021 DEBUG nova.storage.rbd_utils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.831 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.909 222021 DEBUG nova.policy [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ec99ae7c69d0438280441e0434374cbf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c59351a1b59c4cc9ad389dff900935f2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.927 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.928 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.929 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.930 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.964 222021 DEBUG nova.storage.rbd_utils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:26 np0005593233 nova_compute[222017]: 2026-01-23 10:19:26.970 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:27.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:27 np0005593233 nova_compute[222017]: 2026-01-23 10:19:27.411 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:27 np0005593233 nova_compute[222017]: 2026-01-23 10:19:27.486 222021 DEBUG nova.storage.rbd_utils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] resizing rbd image 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:19:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:27.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:27 np0005593233 nova_compute[222017]: 2026-01-23 10:19:27.676 222021 DEBUG nova.network.neutron [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Successfully created port: c402b0f5-30ab-49f8-ac15-df817598cdb4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:19:27 np0005593233 nova_compute[222017]: 2026-01-23 10:19:27.993 222021 DEBUG nova.objects.instance [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.023 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.024 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Ensure instance console log exists: /var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.025 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.025 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.025 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.457 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:28 np0005593233 nova_compute[222017]: 2026-01-23 10:19:28.458 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:29 np0005593233 nova_compute[222017]: 2026-01-23 10:19:29.281 222021 DEBUG nova.network.neutron [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Successfully created port: 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:19:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:29.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:29.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.674 222021 DEBUG nova.network.neutron [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Successfully updated port: c402b0f5-30ab-49f8-ac15-df817598cdb4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.711 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "refresh_cache-dffff7fd-2faf-4f94-953a-fbfb7c752e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.712 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquired lock "refresh_cache-dffff7fd-2faf-4f94-953a-fbfb7c752e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.712 222021 DEBUG nova.network.neutron [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.867 222021 DEBUG nova.network.neutron [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Successfully updated port: 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.901 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "refresh_cache-54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.902 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquired lock "refresh_cache-54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.902 222021 DEBUG nova.network.neutron [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.953 222021 DEBUG nova.compute.manager [req-36343872-a693-4117-ad87-fbb8ea5d54b7 req-fa8e446c-74e2-4a0b-815e-bca0b69a367c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Received event network-changed-c402b0f5-30ab-49f8-ac15-df817598cdb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.953 222021 DEBUG nova.compute.manager [req-36343872-a693-4117-ad87-fbb8ea5d54b7 req-fa8e446c-74e2-4a0b-815e-bca0b69a367c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Refreshing instance network info cache due to event network-changed-c402b0f5-30ab-49f8-ac15-df817598cdb4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:19:30 np0005593233 nova_compute[222017]: 2026-01-23 10:19:30.954 222021 DEBUG oslo_concurrency.lockutils [req-36343872-a693-4117-ad87-fbb8ea5d54b7 req-fa8e446c-74e2-4a0b-815e-bca0b69a367c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-dffff7fd-2faf-4f94-953a-fbfb7c752e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:31 np0005593233 nova_compute[222017]: 2026-01-23 10:19:31.088 222021 DEBUG nova.network.neutron [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:19:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:31.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:31.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:32 np0005593233 podman[279941]: 2026-01-23 10:19:32.047375547 +0000 UTC m=+0.069905578 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:19:32 np0005593233 nova_compute[222017]: 2026-01-23 10:19:32.222 222021 DEBUG nova.network.neutron [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:19:33 np0005593233 nova_compute[222017]: 2026-01-23 10:19:33.086 222021 DEBUG nova.compute.manager [req-d7fb4730-b3ec-4a59-9155-3030d5ba7e5a req-76cf5f7c-93c0-4484-9465-4b6ba21076d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Received event network-changed-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:33 np0005593233 nova_compute[222017]: 2026-01-23 10:19:33.089 222021 DEBUG nova.compute.manager [req-d7fb4730-b3ec-4a59-9155-3030d5ba7e5a req-76cf5f7c-93c0-4484-9465-4b6ba21076d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Refreshing instance network info cache due to event network-changed-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:19:33 np0005593233 nova_compute[222017]: 2026-01-23 10:19:33.089 222021 DEBUG oslo_concurrency.lockutils [req-d7fb4730-b3ec-4a59-9155-3030d5ba7e5a req-76cf5f7c-93c0-4484-9465-4b6ba21076d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:33.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:33 np0005593233 nova_compute[222017]: 2026-01-23 10:19:33.459 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:33 np0005593233 nova_compute[222017]: 2026-01-23 10:19:33.460 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:33 np0005593233 nova_compute[222017]: 2026-01-23 10:19:33.461 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:19:33 np0005593233 nova_compute[222017]: 2026-01-23 10:19:33.461 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:33 np0005593233 nova_compute[222017]: 2026-01-23 10:19:33.495 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:33 np0005593233 nova_compute[222017]: 2026-01-23 10:19:33.496 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:19:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:33.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #133. Immutable memtables: 0.
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.819262) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 83] Flushing memtable with next log file: 133
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573819296, "job": 83, "event": "flush_started", "num_memtables": 1, "num_entries": 1179, "num_deletes": 253, "total_data_size": 2528200, "memory_usage": 2551856, "flush_reason": "Manual Compaction"}
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 83] Level-0 flush table #134: started
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573837288, "cf_name": "default", "job": 83, "event": "table_file_creation", "file_number": 134, "file_size": 1670104, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 64042, "largest_seqno": 65216, "table_properties": {"data_size": 1664752, "index_size": 2747, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11991, "raw_average_key_size": 20, "raw_value_size": 1653925, "raw_average_value_size": 2808, "num_data_blocks": 120, "num_entries": 589, "num_filter_entries": 589, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163483, "oldest_key_time": 1769163483, "file_creation_time": 1769163573, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 134, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 83] Flush lasted 18081 microseconds, and 5101 cpu microseconds.
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.837337) [db/flush_job.cc:967] [default] [JOB 83] Level-0 flush table #134: 1670104 bytes OK
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.837363) [db/memtable_list.cc:519] [default] Level-0 commit table #134 started
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.839617) [db/memtable_list.cc:722] [default] Level-0 commit table #134: memtable #1 done
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.839631) EVENT_LOG_v1 {"time_micros": 1769163573839626, "job": 83, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.839649) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 83] Try to delete WAL files size 2522394, prev total WAL file size 2522394, number of live WAL files 2.
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000130.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.840468) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035353232' seq:72057594037927935, type:22 .. '7061786F730035373734' seq:0, type:0; will stop at (end)
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 84] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 83 Base level 0, inputs: [134(1630KB)], [132(10MB)]
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573840520, "job": 84, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [134], "files_L6": [132], "score": -1, "input_data_size": 12530548, "oldest_snapshot_seqno": -1}
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 84] Generated table #135: 8365 keys, 10637969 bytes, temperature: kUnknown
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573962443, "cf_name": "default", "job": 84, "event": "table_file_creation", "file_number": 135, "file_size": 10637969, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10584850, "index_size": 31118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20933, "raw_key_size": 221220, "raw_average_key_size": 26, "raw_value_size": 10438551, "raw_average_value_size": 1247, "num_data_blocks": 1187, "num_entries": 8365, "num_filter_entries": 8365, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163573, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 135, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.962894) [db/compaction/compaction_job.cc:1663] [default] [JOB 84] Compacted 1@0 + 1@6 files to L6 => 10637969 bytes
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.964998) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 102.7 rd, 87.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 10.4 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(13.9) write-amplify(6.4) OK, records in: 8890, records dropped: 525 output_compression: NoCompression
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.965021) EVENT_LOG_v1 {"time_micros": 1769163573965011, "job": 84, "event": "compaction_finished", "compaction_time_micros": 122056, "compaction_time_cpu_micros": 42613, "output_level": 6, "num_output_files": 1, "total_output_size": 10637969, "num_input_records": 8890, "num_output_records": 8365, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000134.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573965583, "job": 84, "event": "table_file_deletion", "file_number": 134}
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000132.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573967850, "job": 84, "event": "table_file_deletion", "file_number": 132}
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.840378) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.968094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.968107) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.968113) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.968117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:19:33.968122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.052 222021 DEBUG nova.network.neutron [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Updating instance_info_cache with network_info: [{"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.158 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Releasing lock "refresh_cache-dffff7fd-2faf-4f94-953a-fbfb7c752e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.158 222021 DEBUG nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Instance network_info: |[{"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.159 222021 DEBUG oslo_concurrency.lockutils [req-36343872-a693-4117-ad87-fbb8ea5d54b7 req-fa8e446c-74e2-4a0b-815e-bca0b69a367c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-dffff7fd-2faf-4f94-953a-fbfb7c752e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.159 222021 DEBUG nova.network.neutron [req-36343872-a693-4117-ad87-fbb8ea5d54b7 req-fa8e446c-74e2-4a0b-815e-bca0b69a367c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Refreshing network info cache for port c402b0f5-30ab-49f8-ac15-df817598cdb4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.167 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Start _get_guest_xml network_info=[{"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.174 222021 WARNING nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.187 222021 DEBUG nova.virt.libvirt.host [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.188 222021 DEBUG nova.virt.libvirt.host [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.201 222021 DEBUG nova.virt.libvirt.host [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.202 222021 DEBUG nova.virt.libvirt.host [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.204 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.204 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.204 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.205 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.205 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.205 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.205 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.205 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.206 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.206 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.206 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.206 222021 DEBUG nova.virt.hardware [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.209 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:19:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/943411179' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.674 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.705 222021 DEBUG nova.storage.rbd_utils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] rbd image dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:34 np0005593233 nova_compute[222017]: 2026-01-23 10:19:34.710 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.032 222021 DEBUG nova.network.neutron [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Updating instance_info_cache with network_info: [{"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:19:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.075 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Releasing lock "refresh_cache-54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.076 222021 DEBUG nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Instance network_info: |[{"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.077 222021 DEBUG oslo_concurrency.lockutils [req-d7fb4730-b3ec-4a59-9155-3030d5ba7e5a req-76cf5f7c-93c0-4484-9465-4b6ba21076d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.077 222021 DEBUG nova.network.neutron [req-d7fb4730-b3ec-4a59-9155-3030d5ba7e5a req-76cf5f7c-93c0-4484-9465-4b6ba21076d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Refreshing network info cache for port 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.080 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Start _get_guest_xml network_info=[{"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.085 222021 WARNING nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.095 222021 DEBUG nova.virt.libvirt.host [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.097 222021 DEBUG nova.virt.libvirt.host [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.103 222021 DEBUG nova.virt.libvirt.host [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.103 222021 DEBUG nova.virt.libvirt.host [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.105 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.105 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.106 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.106 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.106 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.106 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.106 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.107 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.107 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.107 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.107 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.107 222021 DEBUG nova.virt.hardware [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.111 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:19:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2390321078' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.220 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.222 222021 DEBUG nova.virt.libvirt.vif [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:19:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1205625057',display_name='tempest-ServerGroupTestJSON-server-1205625057',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1205625057',id=151,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='363e1e6f82e8475f84a35d534d110de1',ramdisk_id='',reservation_id='r-wcvswefz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1092690866',owner_user_name='tempest-ServerGroupTestJSON-1092690866-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:19:25Z,user_data=None,user_id='c2c5617ce3104251a0aaf4950da1708c',uuid=dffff7fd-2faf-4f94-953a-fbfb7c752e62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.222 222021 DEBUG nova.network.os_vif_util [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Converting VIF {"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.223 222021 DEBUG nova.network.os_vif_util [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:2c:de,bridge_name='br-int',has_traffic_filtering=True,id=c402b0f5-30ab-49f8-ac15-df817598cdb4,network=Network(5263810a-9fb5-4c3f-837e-8d26c2010e34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc402b0f5-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.224 222021 DEBUG nova.objects.instance [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lazy-loading 'pci_devices' on Instance uuid dffff7fd-2faf-4f94-953a-fbfb7c752e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.251 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <uuid>dffff7fd-2faf-4f94-953a-fbfb7c752e62</uuid>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <name>instance-00000097</name>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerGroupTestJSON-server-1205625057</nova:name>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:19:34</nova:creationTime>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <nova:user uuid="c2c5617ce3104251a0aaf4950da1708c">tempest-ServerGroupTestJSON-1092690866-project-member</nova:user>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <nova:project uuid="363e1e6f82e8475f84a35d534d110de1">tempest-ServerGroupTestJSON-1092690866</nova:project>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <nova:port uuid="c402b0f5-30ab-49f8-ac15-df817598cdb4">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <entry name="serial">dffff7fd-2faf-4f94-953a-fbfb7c752e62</entry>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <entry name="uuid">dffff7fd-2faf-4f94-953a-fbfb7c752e62</entry>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk.config">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:58:2c:de"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <target dev="tapc402b0f5-30"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62/console.log" append="off"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:19:35 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:19:35 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:19:35 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:19:35 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.253 222021 DEBUG nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Preparing to wait for external event network-vif-plugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.253 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.253 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.254 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.254 222021 DEBUG nova.virt.libvirt.vif [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:19:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1205625057',display_name='tempest-ServerGroupTestJSON-server-1205625057',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1205625057',id=151,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='363e1e6f82e8475f84a35d534d110de1',ramdisk_id='',reservation_id='r-wcvswefz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-1092690866',owner_user_name='tempest-ServerGroupTestJ
SON-1092690866-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:19:25Z,user_data=None,user_id='c2c5617ce3104251a0aaf4950da1708c',uuid=dffff7fd-2faf-4f94-953a-fbfb7c752e62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.255 222021 DEBUG nova.network.os_vif_util [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Converting VIF {"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.255 222021 DEBUG nova.network.os_vif_util [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:2c:de,bridge_name='br-int',has_traffic_filtering=True,id=c402b0f5-30ab-49f8-ac15-df817598cdb4,network=Network(5263810a-9fb5-4c3f-837e-8d26c2010e34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc402b0f5-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.256 222021 DEBUG os_vif [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:2c:de,bridge_name='br-int',has_traffic_filtering=True,id=c402b0f5-30ab-49f8-ac15-df817598cdb4,network=Network(5263810a-9fb5-4c3f-837e-8d26c2010e34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc402b0f5-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.257 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.257 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.258 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.262 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.262 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc402b0f5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.263 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc402b0f5-30, col_values=(('external_ids', {'iface-id': 'c402b0f5-30ab-49f8-ac15-df817598cdb4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:2c:de', 'vm-uuid': 'dffff7fd-2faf-4f94-953a-fbfb7c752e62'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.265 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.266 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:35 np0005593233 NetworkManager[48871]: <info>  [1769163575.2663] manager: (tapc402b0f5-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.273 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.274 222021 INFO os_vif [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:2c:de,bridge_name='br-int',has_traffic_filtering=True,id=c402b0f5-30ab-49f8-ac15-df817598cdb4,network=Network(5263810a-9fb5-4c3f-837e-8d26c2010e34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc402b0f5-30')#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.346 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.346 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.347 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] No VIF found with MAC fa:16:3e:58:2c:de, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.347 222021 INFO nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Using config drive#033[00m
Jan 23 05:19:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:35.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.392 222021 DEBUG nova.storage.rbd_utils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] rbd image dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:35.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:19:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3260707024' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.634 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.669 222021 DEBUG nova.storage.rbd_utils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:35 np0005593233 nova_compute[222017]: 2026-01-23 10:19:35.675 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:19:36 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/522953428' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.208 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.211 222021 DEBUG nova.virt.libvirt.vif [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2121647420',display_name='tempest-ServersTestJSON-server-2121647420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-2121647420',id=152,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c59351a1b59c4cc9ad389dff900935f2',ramdisk_id='',reservation_id='r-dqfutryh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1611255243',owner_user_name='tempest-ServersTestJSON-1611255243-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:19:26Z,user_data=None,user_id='ec99ae7c69d0438280441e0434374cbf',uuid=54d74a3a-c2da-4981-bc4a-9fa6e950d2d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.211 222021 DEBUG nova.network.os_vif_util [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converting VIF {"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.213 222021 DEBUG nova.network.os_vif_util [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:80:8a,bridge_name='br-int',has_traffic_filtering=True,id=7c5a83e6-b3a1-4a9c-9b2e-9012a943012a,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c5a83e6-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.214 222021 DEBUG nova.objects.instance [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.247 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <uuid>54d74a3a-c2da-4981-bc4a-9fa6e950d2d0</uuid>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <name>instance-00000098</name>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersTestJSON-server-2121647420</nova:name>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:19:35</nova:creationTime>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <nova:user uuid="ec99ae7c69d0438280441e0434374cbf">tempest-ServersTestJSON-1611255243-project-member</nova:user>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <nova:project uuid="c59351a1b59c4cc9ad389dff900935f2">tempest-ServersTestJSON-1611255243</nova:project>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <nova:port uuid="7c5a83e6-b3a1-4a9c-9b2e-9012a943012a">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <entry name="serial">54d74a3a-c2da-4981-bc4a-9fa6e950d2d0</entry>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <entry name="uuid">54d74a3a-c2da-4981-bc4a-9fa6e950d2d0</entry>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk.config">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:e0:80:8a"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <target dev="tap7c5a83e6-b3"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0/console.log" append="off"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:19:36 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:19:36 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:19:36 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:19:36 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.248 222021 DEBUG nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Preparing to wait for external event network-vif-plugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.248 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.249 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.249 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.250 222021 DEBUG nova.virt.libvirt.vif [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2121647420',display_name='tempest-ServersTestJSON-server-2121647420',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-2121647420',id=152,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c59351a1b59c4cc9ad389dff900935f2',ramdisk_id='',reservation_id='r-dqfutryh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1611255243',owner_user_name='tempest-ServersTestJSON-1611255243-proje
ct-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:19:26Z,user_data=None,user_id='ec99ae7c69d0438280441e0434374cbf',uuid=54d74a3a-c2da-4981-bc4a-9fa6e950d2d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.250 222021 DEBUG nova.network.os_vif_util [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converting VIF {"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.251 222021 DEBUG nova.network.os_vif_util [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:80:8a,bridge_name='br-int',has_traffic_filtering=True,id=7c5a83e6-b3a1-4a9c-9b2e-9012a943012a,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c5a83e6-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.251 222021 DEBUG os_vif [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:80:8a,bridge_name='br-int',has_traffic_filtering=True,id=7c5a83e6-b3a1-4a9c-9b2e-9012a943012a,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c5a83e6-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.253 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.253 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.257 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.257 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c5a83e6-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.258 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7c5a83e6-b3, col_values=(('external_ids', {'iface-id': '7c5a83e6-b3a1-4a9c-9b2e-9012a943012a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:80:8a', 'vm-uuid': '54d74a3a-c2da-4981-bc4a-9fa6e950d2d0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.260 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:36 np0005593233 NetworkManager[48871]: <info>  [1769163576.2620] manager: (tap7c5a83e6-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.262 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.269 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.271 222021 INFO os_vif [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:80:8a,bridge_name='br-int',has_traffic_filtering=True,id=7c5a83e6-b3a1-4a9c-9b2e-9012a943012a,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c5a83e6-b3')#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.335 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.335 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.336 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] No VIF found with MAC fa:16:3e:e0:80:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.336 222021 INFO nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Using config drive#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.367 222021 DEBUG nova.storage.rbd_utils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.739 222021 INFO nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Creating config drive at /var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62/disk.config#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.744 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpucg29v2v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.891 222021 DEBUG nova.network.neutron [req-36343872-a693-4117-ad87-fbb8ea5d54b7 req-fa8e446c-74e2-4a0b-815e-bca0b69a367c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Updated VIF entry in instance network info cache for port c402b0f5-30ab-49f8-ac15-df817598cdb4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.892 222021 DEBUG nova.network.neutron [req-36343872-a693-4117-ad87-fbb8ea5d54b7 req-fa8e446c-74e2-4a0b-815e-bca0b69a367c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Updating instance_info_cache with network_info: [{"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.896 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpucg29v2v" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.934 222021 DEBUG nova.storage.rbd_utils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] rbd image dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.940 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62/disk.config dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:36 np0005593233 nova_compute[222017]: 2026-01-23 10:19:36.976 222021 DEBUG oslo_concurrency.lockutils [req-36343872-a693-4117-ad87-fbb8ea5d54b7 req-fa8e446c-74e2-4a0b-815e-bca0b69a367c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-dffff7fd-2faf-4f94-953a-fbfb7c752e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.164 222021 DEBUG oslo_concurrency.processutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62/disk.config dffff7fd-2faf-4f94-953a-fbfb7c752e62_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.225s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.166 222021 INFO nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Deleting local config drive /var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62/disk.config because it was imported into RBD.#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.225 222021 INFO nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Creating config drive at /var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0/disk.config#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.232 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuafpe5m_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:37 np0005593233 kernel: tapc402b0f5-30: entered promiscuous mode
Jan 23 05:19:37 np0005593233 NetworkManager[48871]: <info>  [1769163577.2605] manager: (tapc402b0f5-30): new Tun device (/org/freedesktop/NetworkManager/Devices/298)
Jan 23 05:19:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:37Z|00637|binding|INFO|Claiming lport c402b0f5-30ab-49f8-ac15-df817598cdb4 for this chassis.
Jan 23 05:19:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:37Z|00638|binding|INFO|c402b0f5-30ab-49f8-ac15-df817598cdb4: Claiming fa:16:3e:58:2c:de 10.100.0.9
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.289 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.291 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:2c:de 10.100.0.9'], port_security=['fa:16:3e:58:2c:de 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'dffff7fd-2faf-4f94-953a-fbfb7c752e62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5263810a-9fb5-4c3f-837e-8d26c2010e34', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '363e1e6f82e8475f84a35d534d110de1', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b21c3cf2-7e29-49c7-8b15-9965809bce5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa406281-67f0-4e7c-9b1f-52c20eb59848, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=c402b0f5-30ab-49f8-ac15-df817598cdb4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.294 140224 INFO neutron.agent.ovn.metadata.agent [-] Port c402b0f5-30ab-49f8-ac15-df817598cdb4 in datapath 5263810a-9fb5-4c3f-837e-8d26c2010e34 bound to our chassis#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.297 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5263810a-9fb5-4c3f-837e-8d26c2010e34#033[00m
Jan 23 05:19:37 np0005593233 systemd-machined[190954]: New machine qemu-69-instance-00000097.
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.321 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8b1a07-b6df-41a2-b821-eeae76c5265e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.324 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5263810a-91 in ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.328 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5263810a-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.328 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c1d12090-cebd-41e2-93ec-5f8d81dc3916]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.330 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed1e1b9-f51f-4c38-8675-2b45694a7043]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.347 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:37Z|00639|binding|INFO|Setting lport c402b0f5-30ab-49f8-ac15-df817598cdb4 ovn-installed in OVS
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.352 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[2414aeb1-5452-4dce-abe0-269ff2d12ffc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:37Z|00640|binding|INFO|Setting lport c402b0f5-30ab-49f8-ac15-df817598cdb4 up in Southbound
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.354 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593233 systemd[1]: Started Virtual Machine qemu-69-instance-00000097.
Jan 23 05:19:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:19:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:37.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:19:37 np0005593233 systemd-udevd[280295]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.379 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8261c830-6b0e-4534-9937-d3dbc49feb8f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 NetworkManager[48871]: <info>  [1769163577.3883] device (tapc402b0f5-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:19:37 np0005593233 NetworkManager[48871]: <info>  [1769163577.3901] device (tapc402b0f5-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.406 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuafpe5m_" returned: 0 in 0.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.413 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[efd39fbd-4b3a-45a0-8e50-91619f9d5009]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 systemd-udevd[280298]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.421 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fc187ded-5e0b-4923-a786-55a20135b0b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 NetworkManager[48871]: <info>  [1769163577.4229] manager: (tap5263810a-90): new Veth device (/org/freedesktop/NetworkManager/Devices/299)
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.460 222021 DEBUG nova.storage.rbd_utils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.468 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0/disk.config 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.471 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3036d296-8938-4aae-b42a-88b7fba75211]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.477 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[73f6dd99-05fe-4e56-86b9-0fc83ef023f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 NetworkManager[48871]: <info>  [1769163577.5103] device (tap5263810a-90): carrier: link connected
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.518 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9d07c854-cb03-4ba0-982e-22040d846e41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.541 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3fcc46-2e30-4f8b-8d20-d11bccab2def]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5263810a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ce:5a:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 749014, 'reachable_time': 16513, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280347, 'error': None, 'target': 'ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:37.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.568 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b85b65-74a7-44fb-9608-43bb10664506]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fece:5ae2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 749014, 'tstamp': 749014}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280355, 'error': None, 'target': 'ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.595 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b03c5684-d56c-4829-94db-0352d424a454]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5263810a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ce:5a:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 192], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 749014, 'reachable_time': 16513, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280373, 'error': None, 'target': 'ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.628 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ee03706c-0bae-4590-95ec-77ac11387b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.664 222021 DEBUG oslo_concurrency.processutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0/disk.config 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.196s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.665 222021 INFO nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Deleting local config drive /var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0/disk.config because it was imported into RBD.#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.698 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9561ace4-c811-4b79-bb4f-1f4996a4af84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.701 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5263810a-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.700 222021 DEBUG nova.network.neutron [req-d7fb4730-b3ec-4a59-9155-3030d5ba7e5a req-76cf5f7c-93c0-4484-9465-4b6ba21076d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Updated VIF entry in instance network info cache for port 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.702 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.702 222021 DEBUG nova.network.neutron [req-d7fb4730-b3ec-4a59-9155-3030d5ba7e5a req-76cf5f7c-93c0-4484-9465-4b6ba21076d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Updating instance_info_cache with network_info: [{"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.702 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5263810a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.704 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593233 NetworkManager[48871]: <info>  [1769163577.7058] manager: (tap5263810a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Jan 23 05:19:37 np0005593233 kernel: tap5263810a-90: entered promiscuous mode
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.712 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.715 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5263810a-90, col_values=(('external_ids', {'iface-id': '8861971c-f545-4d51-9527-662749049344'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.717 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:37Z|00641|binding|INFO|Releasing lport 8861971c-f545-4d51-9527-662749049344 from this chassis (sb_readonly=0)
Jan 23 05:19:37 np0005593233 NetworkManager[48871]: <info>  [1769163577.7247] manager: (tap7c5a83e6-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/301)
Jan 23 05:19:37 np0005593233 systemd-udevd[280337]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:19:37 np0005593233 kernel: tap7c5a83e6-b3: entered promiscuous mode
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.737 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:37Z|00642|binding|INFO|Claiming lport 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a for this chassis.
Jan 23 05:19:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:37Z|00643|binding|INFO|7c5a83e6-b3a1-4a9c-9b2e-9012a943012a: Claiming fa:16:3e:e0:80:8a 10.100.0.9
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.739 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5263810a-9fb5-4c3f-837e-8d26c2010e34.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5263810a-9fb5-4c3f-837e-8d26c2010e34.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.740 222021 DEBUG oslo_concurrency.lockutils [req-d7fb4730-b3ec-4a59-9155-3030d5ba7e5a req-76cf5f7c-93c0-4484-9465-4b6ba21076d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:37 np0005593233 NetworkManager[48871]: <info>  [1769163577.7420] device (tap7c5a83e6-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:19:37 np0005593233 NetworkManager[48871]: <info>  [1769163577.7437] device (tap7c5a83e6-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.740 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[713e7d69-6b7f-4217-8ffe-847a161458aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.741 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-5263810a-9fb5-4c3f-837e-8d26c2010e34
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/5263810a-9fb5-4c3f-837e-8d26c2010e34.pid.haproxy
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 5263810a-9fb5-4c3f-837e-8d26c2010e34
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.742 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34', 'env', 'PROCESS_TAG=haproxy-5263810a-9fb5-4c3f-837e-8d26c2010e34', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5263810a-9fb5-4c3f-837e-8d26c2010e34.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:19:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:37.766 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:80:8a 10.100.0.9'], port_security=['fa:16:3e:e0:80:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '54d74a3a-c2da-4981-bc4a-9fa6e950d2d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c59351a1b59c4cc9ad389dff900935f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd59f1dd0-018a-40d5-b9a0-54c6c1f9d925', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c808b115-ccf1-41c4-acea-daabae8abf5b, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=7c5a83e6-b3a1-4a9c-9b2e-9012a943012a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:37Z|00644|binding|INFO|Setting lport 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a ovn-installed in OVS
Jan 23 05:19:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:37Z|00645|binding|INFO|Setting lport 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a up in Southbound
Jan 23 05:19:37 np0005593233 systemd-machined[190954]: New machine qemu-70-instance-00000098.
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.779 222021 DEBUG nova.compute.manager [req-6b7c78aa-f125-4940-a230-8d37f733ff41 req-3a1e6200-8683-4f34-8a9b-55395e3d0f9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Received event network-vif-plugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.780 222021 DEBUG oslo_concurrency.lockutils [req-6b7c78aa-f125-4940-a230-8d37f733ff41 req-3a1e6200-8683-4f34-8a9b-55395e3d0f9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.780 222021 DEBUG oslo_concurrency.lockutils [req-6b7c78aa-f125-4940-a230-8d37f733ff41 req-3a1e6200-8683-4f34-8a9b-55395e3d0f9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.780 222021 DEBUG oslo_concurrency.lockutils [req-6b7c78aa-f125-4940-a230-8d37f733ff41 req-3a1e6200-8683-4f34-8a9b-55395e3d0f9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.780 222021 DEBUG nova.compute.manager [req-6b7c78aa-f125-4940-a230-8d37f733ff41 req-3a1e6200-8683-4f34-8a9b-55395e3d0f9e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Processing event network-vif-plugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593233 systemd[1]: Started Virtual Machine qemu-70-instance-00000098.
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.829 222021 DEBUG nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.832 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163577.828433, dffff7fd-2faf-4f94-953a-fbfb7c752e62 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.834 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] VM Started (Lifecycle Event)#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.847 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.857 222021 INFO nova.virt.libvirt.driver [-] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Instance spawned successfully.#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.858 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.862 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.866 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.893 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.894 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.894 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.894 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.895 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.895 222021 DEBUG nova.virt.libvirt.driver [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.900 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.901 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163577.8396692, dffff7fd-2faf-4f94-953a-fbfb7c752e62 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.901 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.935 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.940 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163577.8464806, dffff7fd-2faf-4f94-953a-fbfb7c752e62 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:19:37 np0005593233 nova_compute[222017]: 2026-01-23 10:19:37.940 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.002 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.006 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.020 222021 INFO nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Took 12.23 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.021 222021 DEBUG nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.043 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.117 222021 INFO nova.compute.manager [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Took 13.94 seconds to build instance.#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.165 222021 DEBUG oslo_concurrency.lockutils [None req-5db1816c-2db1-4bc4-a211-29673dc7e35c c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:38 np0005593233 podman[280470]: 2026-01-23 10:19:38.233314336 +0000 UTC m=+0.080342983 container create 187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:19:38 np0005593233 podman[280470]: 2026-01-23 10:19:38.188063516 +0000 UTC m=+0.035092253 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:19:38 np0005593233 systemd[1]: Started libpod-conmon-187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa.scope.
Jan 23 05:19:38 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:19:38 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e42b7c57232782fb359534d22fb138cf68eeccfe370ff9be5af6c1703a05b43/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.350 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163578.3491852, 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.351 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] VM Started (Lifecycle Event)#033[00m
Jan 23 05:19:38 np0005593233 podman[280470]: 2026-01-23 10:19:38.365840195 +0000 UTC m=+0.212868892 container init 187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:19:38 np0005593233 podman[280470]: 2026-01-23 10:19:38.373363698 +0000 UTC m=+0.220392365 container start 187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:19:38 np0005593233 neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34[280520]: [NOTICE]   (280525) : New worker (280527) forked
Jan 23 05:19:38 np0005593233 neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34[280520]: [NOTICE]   (280525) : Loading success.
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.469 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a in datapath 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 unbound from our chassis#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.472 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.495 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[03d9978d-9c9e-4ecb-8219-25fcef181237]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.544 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.547 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6072ec58-90e5-4c42-a1f2-01cef7b3ba1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.552 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c7144b-cb92-4a49-9c75-461cafb83a33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.591 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.594 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7a56b9-3ebe-4978-9f00-c2a9e203eeaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.595 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163578.3493524, 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.596 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.616 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3735ce03-71cf-4b57-98e8-bda0952e00c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43bdb40a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:5e:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738749, 'reachable_time': 26119, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280541, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.621 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.624 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.643 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a78eb388-432e-452b-a812-c370cd37ea79]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap43bdb40a-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738763, 'tstamp': 738763}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280542, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap43bdb40a-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738766, 'tstamp': 738766}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280542, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.647 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43bdb40a-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.650 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.651 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.652 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.654 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43bdb40a-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.654 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.655 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap43bdb40a-e0, col_values=(('external_ids', {'iface-id': '8a8ef4f2-2ba5-405a-811e-058c5ff2b91e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:38.655 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.872 222021 DEBUG nova.compute.manager [req-f3a7aa36-2e87-4ba0-844f-e28227cef371 req-60292a4c-b925-4c1d-a240-cdc5ec51b0f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Received event network-vif-plugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.873 222021 DEBUG oslo_concurrency.lockutils [req-f3a7aa36-2e87-4ba0-844f-e28227cef371 req-60292a4c-b925-4c1d-a240-cdc5ec51b0f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.873 222021 DEBUG oslo_concurrency.lockutils [req-f3a7aa36-2e87-4ba0-844f-e28227cef371 req-60292a4c-b925-4c1d-a240-cdc5ec51b0f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.873 222021 DEBUG oslo_concurrency.lockutils [req-f3a7aa36-2e87-4ba0-844f-e28227cef371 req-60292a4c-b925-4c1d-a240-cdc5ec51b0f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.873 222021 DEBUG nova.compute.manager [req-f3a7aa36-2e87-4ba0-844f-e28227cef371 req-60292a4c-b925-4c1d-a240-cdc5ec51b0f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Processing event network-vif-plugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.874 222021 DEBUG nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.881 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163578.8812943, 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.882 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.884 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.887 222021 INFO nova.virt.libvirt.driver [-] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Instance spawned successfully.#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.887 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.964 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.968 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.980 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.980 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.981 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.981 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.982 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.982 222021 DEBUG nova.virt.libvirt.driver [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:38 np0005593233 nova_compute[222017]: 2026-01-23 10:19:38.992 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:19:39 np0005593233 nova_compute[222017]: 2026-01-23 10:19:39.049 222021 INFO nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Took 12.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:19:39 np0005593233 nova_compute[222017]: 2026-01-23 10:19:39.050 222021 DEBUG nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:39 np0005593233 nova_compute[222017]: 2026-01-23 10:19:39.136 222021 INFO nova.compute.manager [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Took 14.83 seconds to build instance.#033[00m
Jan 23 05:19:39 np0005593233 nova_compute[222017]: 2026-01-23 10:19:39.166 222021 DEBUG oslo_concurrency.lockutils [None req-33819a78-549e-42b9-8139-28a669a37752 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:39.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:39.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:41 np0005593233 nova_compute[222017]: 2026-01-23 10:19:41.260 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:41.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:41.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:41 np0005593233 nova_compute[222017]: 2026-01-23 10:19:41.817 222021 DEBUG nova.compute.manager [req-94bee0a9-d706-4248-b79d-3a5f9140e7b1 req-bf667138-3254-479f-9835-c678d97af6fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Received event network-vif-plugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:41 np0005593233 nova_compute[222017]: 2026-01-23 10:19:41.817 222021 DEBUG oslo_concurrency.lockutils [req-94bee0a9-d706-4248-b79d-3a5f9140e7b1 req-bf667138-3254-479f-9835-c678d97af6fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:41 np0005593233 nova_compute[222017]: 2026-01-23 10:19:41.817 222021 DEBUG oslo_concurrency.lockutils [req-94bee0a9-d706-4248-b79d-3a5f9140e7b1 req-bf667138-3254-479f-9835-c678d97af6fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:41 np0005593233 nova_compute[222017]: 2026-01-23 10:19:41.817 222021 DEBUG oslo_concurrency.lockutils [req-94bee0a9-d706-4248-b79d-3a5f9140e7b1 req-bf667138-3254-479f-9835-c678d97af6fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:41 np0005593233 nova_compute[222017]: 2026-01-23 10:19:41.818 222021 DEBUG nova.compute.manager [req-94bee0a9-d706-4248-b79d-3a5f9140e7b1 req-bf667138-3254-479f-9835-c678d97af6fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] No waiting events found dispatching network-vif-plugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:41 np0005593233 nova_compute[222017]: 2026-01-23 10:19:41.818 222021 WARNING nova.compute.manager [req-94bee0a9-d706-4248-b79d-3a5f9140e7b1 req-bf667138-3254-479f-9835-c678d97af6fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Received unexpected event network-vif-plugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:19:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.399 222021 DEBUG nova.compute.manager [req-3af5eb78-5a70-4077-afc9-ed23241b6ea8 req-8f41af8a-1e8c-4d58-bd0f-e94a549b5ccb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Received event network-vif-plugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.400 222021 DEBUG oslo_concurrency.lockutils [req-3af5eb78-5a70-4077-afc9-ed23241b6ea8 req-8f41af8a-1e8c-4d58-bd0f-e94a549b5ccb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.400 222021 DEBUG oslo_concurrency.lockutils [req-3af5eb78-5a70-4077-afc9-ed23241b6ea8 req-8f41af8a-1e8c-4d58-bd0f-e94a549b5ccb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.400 222021 DEBUG oslo_concurrency.lockutils [req-3af5eb78-5a70-4077-afc9-ed23241b6ea8 req-8f41af8a-1e8c-4d58-bd0f-e94a549b5ccb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.401 222021 DEBUG nova.compute.manager [req-3af5eb78-5a70-4077-afc9-ed23241b6ea8 req-8f41af8a-1e8c-4d58-bd0f-e94a549b5ccb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] No waiting events found dispatching network-vif-plugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.401 222021 WARNING nova.compute.manager [req-3af5eb78-5a70-4077-afc9-ed23241b6ea8 req-8f41af8a-1e8c-4d58-bd0f-e94a549b5ccb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Received unexpected event network-vif-plugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a for instance with vm_state active and task_state None.#033[00m
Jan 23 05:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:42.682 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:42.684 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:42.684 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.702 222021 DEBUG oslo_concurrency.lockutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.704 222021 DEBUG oslo_concurrency.lockutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.704 222021 DEBUG oslo_concurrency.lockutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.704 222021 DEBUG oslo_concurrency.lockutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.704 222021 DEBUG oslo_concurrency.lockutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.705 222021 INFO nova.compute.manager [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Terminating instance#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.706 222021 DEBUG nova.compute.manager [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:19:42 np0005593233 kernel: tapc402b0f5-30 (unregistering): left promiscuous mode
Jan 23 05:19:42 np0005593233 NetworkManager[48871]: <info>  [1769163582.7586] device (tapc402b0f5-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:19:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:42Z|00646|binding|INFO|Releasing lport c402b0f5-30ab-49f8-ac15-df817598cdb4 from this chassis (sb_readonly=0)
Jan 23 05:19:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:42Z|00647|binding|INFO|Setting lport c402b0f5-30ab-49f8-ac15-df817598cdb4 down in Southbound
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.771 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:42Z|00648|binding|INFO|Removing iface tapc402b0f5-30 ovn-installed in OVS
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.777 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.815 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:42 np0005593233 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000097.scope: Deactivated successfully.
Jan 23 05:19:42 np0005593233 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000097.scope: Consumed 5.607s CPU time.
Jan 23 05:19:42 np0005593233 systemd-machined[190954]: Machine qemu-69-instance-00000097 terminated.
Jan 23 05:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:42.921 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:2c:de 10.100.0.9'], port_security=['fa:16:3e:58:2c:de 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'dffff7fd-2faf-4f94-953a-fbfb7c752e62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5263810a-9fb5-4c3f-837e-8d26c2010e34', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '363e1e6f82e8475f84a35d534d110de1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b21c3cf2-7e29-49c7-8b15-9965809bce5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aa406281-67f0-4e7c-9b1f-52c20eb59848, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=c402b0f5-30ab-49f8-ac15-df817598cdb4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:42.923 140224 INFO neutron.agent.ovn.metadata.agent [-] Port c402b0f5-30ab-49f8-ac15-df817598cdb4 in datapath 5263810a-9fb5-4c3f-837e-8d26c2010e34 unbound from our chassis#033[00m
Jan 23 05:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:42.926 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5263810a-9fb5-4c3f-837e-8d26c2010e34, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:42.928 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4c2c3ddc-9b5b-4102-83d4-5fb62a275cbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:42.929 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34 namespace which is not needed anymore#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.969 222021 INFO nova.virt.libvirt.driver [-] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Instance destroyed successfully.#033[00m
Jan 23 05:19:42 np0005593233 nova_compute[222017]: 2026-01-23 10:19:42.970 222021 DEBUG nova.objects.instance [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lazy-loading 'resources' on Instance uuid dffff7fd-2faf-4f94-953a-fbfb7c752e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.003 222021 DEBUG nova.virt.libvirt.vif [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:19:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1205625057',display_name='tempest-ServerGroupTestJSON-server-1205625057',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1205625057',id=151,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:19:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='363e1e6f82e8475f84a35d534d110de1',ramdisk_id='',reservation_id='r-wcvswefz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-1092690866',owner_user_name='tempest-ServerGroupTestJSON-1092690866-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:19:38Z,user_data=None,user_id='c2c5617ce3104251a0aaf4950da1708c',uuid=dffff7fd-2faf-4f94-953a-fbfb7c752e62,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.004 222021 DEBUG nova.network.os_vif_util [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Converting VIF {"id": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "address": "fa:16:3e:58:2c:de", "network": {"id": "5263810a-9fb5-4c3f-837e-8d26c2010e34", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-182787493-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "363e1e6f82e8475f84a35d534d110de1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc402b0f5-30", "ovs_interfaceid": "c402b0f5-30ab-49f8-ac15-df817598cdb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.005 222021 DEBUG nova.network.os_vif_util [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:2c:de,bridge_name='br-int',has_traffic_filtering=True,id=c402b0f5-30ab-49f8-ac15-df817598cdb4,network=Network(5263810a-9fb5-4c3f-837e-8d26c2010e34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc402b0f5-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.005 222021 DEBUG os_vif [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:2c:de,bridge_name='br-int',has_traffic_filtering=True,id=c402b0f5-30ab-49f8-ac15-df817598cdb4,network=Network(5263810a-9fb5-4c3f-837e-8d26c2010e34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc402b0f5-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.008 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc402b0f5-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.011 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.012 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.012 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.015 222021 INFO os_vif [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:2c:de,bridge_name='br-int',has_traffic_filtering=True,id=c402b0f5-30ab-49f8-ac15-df817598cdb4,network=Network(5263810a-9fb5-4c3f-837e-8d26c2010e34),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc402b0f5-30')#033[00m
Jan 23 05:19:43 np0005593233 neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34[280520]: [NOTICE]   (280525) : haproxy version is 2.8.14-c23fe91
Jan 23 05:19:43 np0005593233 neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34[280520]: [NOTICE]   (280525) : path to executable is /usr/sbin/haproxy
Jan 23 05:19:43 np0005593233 neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34[280520]: [WARNING]  (280525) : Exiting Master process...
Jan 23 05:19:43 np0005593233 neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34[280520]: [WARNING]  (280525) : Exiting Master process...
Jan 23 05:19:43 np0005593233 neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34[280520]: [ALERT]    (280525) : Current worker (280527) exited with code 143 (Terminated)
Jan 23 05:19:43 np0005593233 neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34[280520]: [WARNING]  (280525) : All workers exited. Exiting... (0)
Jan 23 05:19:43 np0005593233 systemd[1]: libpod-187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa.scope: Deactivated successfully.
Jan 23 05:19:43 np0005593233 podman[280645]: 2026-01-23 10:19:43.131065189 +0000 UTC m=+0.063896778 container died 187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:19:43 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa-userdata-shm.mount: Deactivated successfully.
Jan 23 05:19:43 np0005593233 systemd[1]: var-lib-containers-storage-overlay-0e42b7c57232782fb359534d22fb138cf68eeccfe370ff9be5af6c1703a05b43-merged.mount: Deactivated successfully.
Jan 23 05:19:43 np0005593233 podman[280645]: 2026-01-23 10:19:43.197656343 +0000 UTC m=+0.130487932 container cleanup 187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:19:43 np0005593233 systemd[1]: libpod-conmon-187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa.scope: Deactivated successfully.
Jan 23 05:19:43 np0005593233 podman[280673]: 2026-01-23 10:19:43.277709877 +0000 UTC m=+0.050908641 container remove 187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:19:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:43.285 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ffed0b31-e8ba-448c-8aaa-113cc2b62c8b]: (4, ('Fri Jan 23 10:19:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34 (187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa)\n187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa\nFri Jan 23 10:19:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34 (187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa)\n187b20fb9b005ac636ea359ebfe0b194f107f6660c1b8d7ee1f5f4c51dcababa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:43.288 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[09a46c4a-b731-4b0e-af16-f43b1f195a10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:43.290 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5263810a-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.292 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:43 np0005593233 kernel: tap5263810a-90: left promiscuous mode
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.307 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.308 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:43.313 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[555e7e08-0df5-4ba9-968c-8e602256dda5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:43.330 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0dffdb7a-056b-4cd6-b312-03ffa270c2cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:43.332 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b260d7-143f-49b7-85d3-3025942fd496]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:43.351 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2f53c273-78da-4993-973c-22c0c064f7c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 749004, 'reachable_time': 38485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280690, 'error': None, 'target': 'ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:43 np0005593233 systemd[1]: run-netns-ovnmeta\x2d5263810a\x2d9fb5\x2d4c3f\x2d837e\x2d8d26c2010e34.mount: Deactivated successfully.
Jan 23 05:19:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:43.356 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5263810a-9fb5-4c3f-837e-8d26c2010e34 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:19:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:43.356 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[7521e958-49ff-468c-8e27-1674a8be6d78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:43.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.506 222021 INFO nova.virt.libvirt.driver [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Deleting instance files /var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62_del#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.507 222021 INFO nova.virt.libvirt.driver [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Deletion of /var/lib/nova/instances/dffff7fd-2faf-4f94-953a-fbfb7c752e62_del complete#033[00m
Jan 23 05:19:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:43.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.585 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.734 222021 INFO nova.compute.manager [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Took 1.03 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.734 222021 DEBUG oslo.service.loopingcall [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.734 222021 DEBUG nova.compute.manager [-] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:19:43 np0005593233 nova_compute[222017]: 2026-01-23 10:19:43.735 222021 DEBUG nova.network.neutron [-] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:19:44 np0005593233 nova_compute[222017]: 2026-01-23 10:19:44.143 222021 DEBUG nova.compute.manager [req-815fc436-7b6a-48dd-bc1e-f736501e5982 req-86f2e0bb-9788-4e4f-8371-7e18f38e413e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Received event network-vif-unplugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:44 np0005593233 nova_compute[222017]: 2026-01-23 10:19:44.143 222021 DEBUG oslo_concurrency.lockutils [req-815fc436-7b6a-48dd-bc1e-f736501e5982 req-86f2e0bb-9788-4e4f-8371-7e18f38e413e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:44 np0005593233 nova_compute[222017]: 2026-01-23 10:19:44.143 222021 DEBUG oslo_concurrency.lockutils [req-815fc436-7b6a-48dd-bc1e-f736501e5982 req-86f2e0bb-9788-4e4f-8371-7e18f38e413e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:44 np0005593233 nova_compute[222017]: 2026-01-23 10:19:44.143 222021 DEBUG oslo_concurrency.lockutils [req-815fc436-7b6a-48dd-bc1e-f736501e5982 req-86f2e0bb-9788-4e4f-8371-7e18f38e413e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:44 np0005593233 nova_compute[222017]: 2026-01-23 10:19:44.143 222021 DEBUG nova.compute.manager [req-815fc436-7b6a-48dd-bc1e-f736501e5982 req-86f2e0bb-9788-4e4f-8371-7e18f38e413e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] No waiting events found dispatching network-vif-unplugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:44 np0005593233 nova_compute[222017]: 2026-01-23 10:19:44.144 222021 DEBUG nova.compute.manager [req-815fc436-7b6a-48dd-bc1e-f736501e5982 req-86f2e0bb-9788-4e4f-8371-7e18f38e413e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Received event network-vif-unplugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:19:44 np0005593233 nova_compute[222017]: 2026-01-23 10:19:44.411 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:45.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.404 222021 DEBUG oslo_concurrency.lockutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.405 222021 DEBUG oslo_concurrency.lockutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.406 222021 DEBUG oslo_concurrency.lockutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.407 222021 DEBUG oslo_concurrency.lockutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.407 222021 DEBUG oslo_concurrency.lockutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.409 222021 INFO nova.compute.manager [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Terminating instance#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.411 222021 DEBUG nova.compute.manager [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:19:45 np0005593233 kernel: tap7c5a83e6-b3 (unregistering): left promiscuous mode
Jan 23 05:19:45 np0005593233 NetworkManager[48871]: <info>  [1769163585.4597] device (tap7c5a83e6-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:19:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:45Z|00649|binding|INFO|Releasing lport 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a from this chassis (sb_readonly=0)
Jan 23 05:19:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:45Z|00650|binding|INFO|Setting lport 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a down in Southbound
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.471 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:19:45Z|00651|binding|INFO|Removing iface tap7c5a83e6-b3 ovn-installed in OVS
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.476 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.487 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:80:8a 10.100.0.9'], port_security=['fa:16:3e:e0:80:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '54d74a3a-c2da-4981-bc4a-9fa6e950d2d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c59351a1b59c4cc9ad389dff900935f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd59f1dd0-018a-40d5-b9a0-54c6c1f9d925', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c808b115-ccf1-41c4-acea-daabae8abf5b, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=7c5a83e6-b3a1-4a9c-9b2e-9012a943012a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.489 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 7c5a83e6-b3a1-4a9c-9b2e-9012a943012a in datapath 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 unbound from our chassis#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.490 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.506 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.518 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c6fa94b6-e48c-43aa-82a4-6829741dbed8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:45 np0005593233 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000098.scope: Deactivated successfully.
Jan 23 05:19:45 np0005593233 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000098.scope: Consumed 7.180s CPU time.
Jan 23 05:19:45 np0005593233 systemd-machined[190954]: Machine qemu-70-instance-00000098 terminated.
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.563 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[83956103-d0a1-4b3a-8819-160f2b07bf52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.568 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d33f8d64-bc82-4a2d-a422-1d31392250c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:45.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.609 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[710eb6a6-c06e-445c-92f4-17b2edc3cab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.639 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a4f6fcf9-283e-4fed-8e19-a2f0a697e2ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43bdb40a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:5e:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738749, 'reachable_time': 26119, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280701, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.657 222021 INFO nova.virt.libvirt.driver [-] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Instance destroyed successfully.#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.658 222021 DEBUG nova.objects.instance [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lazy-loading 'resources' on Instance uuid 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.674 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[42c192bb-fa41-4bc6-a0b8-363c1fc8e2e7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap43bdb40a-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738763, 'tstamp': 738763}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280708, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap43bdb40a-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738766, 'tstamp': 738766}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280708, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.677 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43bdb40a-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.679 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.689 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43bdb40a-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.686 222021 DEBUG nova.virt.libvirt.vif [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:19:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2121647420',display_name='tempest-ServersTestJSON-server-2121647420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-2121647420',id=152,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:19:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c59351a1b59c4cc9ad389dff900935f2',ramdisk_id='',reservation_id='r-dqfutryh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',imag
e_min_ram='0',owner_project_name='tempest-ServersTestJSON-1611255243',owner_user_name='tempest-ServersTestJSON-1611255243-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:19:39Z,user_data=None,user_id='ec99ae7c69d0438280441e0434374cbf',uuid=54d74a3a-c2da-4981-bc4a-9fa6e950d2d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.689 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.687 222021 DEBUG nova.network.os_vif_util [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converting VIF {"id": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "address": "fa:16:3e:e0:80:8a", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c5a83e6-b3", "ovs_interfaceid": "7c5a83e6-b3a1-4a9c-9b2e-9012a943012a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.688 222021 DEBUG nova.network.os_vif_util [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:80:8a,bridge_name='br-int',has_traffic_filtering=True,id=7c5a83e6-b3a1-4a9c-9b2e-9012a943012a,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c5a83e6-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.690 222021 DEBUG os_vif [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:80:8a,bridge_name='br-int',has_traffic_filtering=True,id=7c5a83e6-b3a1-4a9c-9b2e-9012a943012a,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c5a83e6-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.690 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap43bdb40a-e0, col_values=(('external_ids', {'iface-id': '8a8ef4f2-2ba5-405a-811e-058c5ff2b91e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:19:45.691 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.692 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.693 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c5a83e6-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.694 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.696 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.697 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.701 222021 INFO os_vif [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:80:8a,bridge_name='br-int',has_traffic_filtering=True,id=7c5a83e6-b3a1-4a9c-9b2e-9012a943012a,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c5a83e6-b3')#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.766 222021 DEBUG nova.network.neutron [-] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.794 222021 INFO nova.compute.manager [-] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Took 2.06 seconds to deallocate network for instance.#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.892 222021 DEBUG oslo_concurrency.lockutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.894 222021 DEBUG oslo_concurrency.lockutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:45 np0005593233 nova_compute[222017]: 2026-01-23 10:19:45.980 222021 DEBUG nova.scheduler.client.report [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.008 222021 DEBUG nova.scheduler.client.report [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.009 222021 DEBUG nova.compute.provider_tree [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.124 222021 DEBUG nova.compute.manager [req-f894009f-487d-43b2-babb-0a7119b39a11 req-17430667-515c-4ae7-b4f6-ab2263f8457b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Received event network-vif-deleted-c402b0f5-30ab-49f8-ac15-df817598cdb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.308 222021 DEBUG nova.scheduler.client.report [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.352 222021 DEBUG nova.compute.manager [req-6eb72846-4a64-40f3-ac94-6f6f4c1d9259 req-0c9254ee-1fe8-4af3-bc32-cde2d615d412 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Received event network-vif-plugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.352 222021 DEBUG oslo_concurrency.lockutils [req-6eb72846-4a64-40f3-ac94-6f6f4c1d9259 req-0c9254ee-1fe8-4af3-bc32-cde2d615d412 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.353 222021 DEBUG oslo_concurrency.lockutils [req-6eb72846-4a64-40f3-ac94-6f6f4c1d9259 req-0c9254ee-1fe8-4af3-bc32-cde2d615d412 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.353 222021 DEBUG oslo_concurrency.lockutils [req-6eb72846-4a64-40f3-ac94-6f6f4c1d9259 req-0c9254ee-1fe8-4af3-bc32-cde2d615d412 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.353 222021 DEBUG nova.compute.manager [req-6eb72846-4a64-40f3-ac94-6f6f4c1d9259 req-0c9254ee-1fe8-4af3-bc32-cde2d615d412 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] No waiting events found dispatching network-vif-plugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.354 222021 WARNING nova.compute.manager [req-6eb72846-4a64-40f3-ac94-6f6f4c1d9259 req-0c9254ee-1fe8-4af3-bc32-cde2d615d412 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Received unexpected event network-vif-plugged-c402b0f5-30ab-49f8-ac15-df817598cdb4 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.356 222021 DEBUG nova.scheduler.client.report [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.424 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.436 222021 INFO nova.virt.libvirt.driver [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Deleting instance files /var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_del#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.438 222021 INFO nova.virt.libvirt.driver [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Deletion of /var/lib/nova/instances/54d74a3a-c2da-4981-bc4a-9fa6e950d2d0_del complete#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.480 222021 DEBUG oslo_concurrency.processutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.540 222021 INFO nova.compute.manager [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Took 1.13 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.542 222021 DEBUG oslo.service.loopingcall [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.543 222021 DEBUG nova.compute.manager [-] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:19:46 np0005593233 nova_compute[222017]: 2026-01-23 10:19:46.543 222021 DEBUG nova.network.neutron [-] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:19:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:19:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1931455427' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.054 222021 DEBUG oslo_concurrency.processutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.574s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.063 222021 DEBUG nova.compute.provider_tree [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.108 222021 DEBUG nova.scheduler.client.report [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:19:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:47 np0005593233 podman[280754]: 2026-01-23 10:19:47.164855645 +0000 UTC m=+0.160601164 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.170 222021 DEBUG oslo_concurrency.lockutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.177 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.178 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.180 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.181 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.260 222021 INFO nova.scheduler.client.report [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Deleted allocations for instance dffff7fd-2faf-4f94-953a-fbfb7c752e62#033[00m
Jan 23 05:19:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:47.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:47.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:19:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2715903321' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:47 np0005593233 nova_compute[222017]: 2026-01-23 10:19:47.712 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:48 np0005593233 nova_compute[222017]: 2026-01-23 10:19:48.629 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:49.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.378 222021 DEBUG oslo_concurrency.lockutils [None req-b3d6fc89-ee08-4efe-8d2a-4780d2de098d c2c5617ce3104251a0aaf4950da1708c 363e1e6f82e8475f84a35d534d110de1 - - default default] Lock "dffff7fd-2faf-4f94-953a-fbfb7c752e62" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.449 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.449 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.540 222021 DEBUG nova.compute.manager [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Received event network-vif-unplugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.541 222021 DEBUG oslo_concurrency.lockutils [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.541 222021 DEBUG oslo_concurrency.lockutils [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.541 222021 DEBUG oslo_concurrency.lockutils [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.542 222021 DEBUG nova.compute.manager [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] No waiting events found dispatching network-vif-unplugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.542 222021 DEBUG nova.compute.manager [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Received event network-vif-unplugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.542 222021 DEBUG nova.compute.manager [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Received event network-vif-plugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.543 222021 DEBUG oslo_concurrency.lockutils [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.543 222021 DEBUG oslo_concurrency.lockutils [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.543 222021 DEBUG oslo_concurrency.lockutils [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.543 222021 DEBUG nova.compute.manager [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] No waiting events found dispatching network-vif-plugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.544 222021 WARNING nova.compute.manager [req-2c5376b4-03c6-411a-a8ec-2891fb2cde84 req-8d427e9d-833b-4ad5-88ff-d1f760d946e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Received unexpected event network-vif-plugged-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a for instance with vm_state active and task_state deleting.
Jan 23 05:19:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:49.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.693 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.698 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4141MB free_disk=20.818321228027344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.699 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.699 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.798 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 95158f57-1f68-4b3e-9d10-e3006c3f2060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.799 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.799 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.799 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:19:49 np0005593233 nova_compute[222017]: 2026-01-23 10:19:49.900 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.319 222021 DEBUG nova.network.neutron [-] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:19:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:19:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1410008606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.383 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.390 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.418 222021 INFO nova.compute.manager [-] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Took 3.87 seconds to deallocate network for instance.
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.491 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.555 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.555 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.561 222021 DEBUG oslo_concurrency.lockutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.561 222021 DEBUG oslo_concurrency.lockutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.599 222021 DEBUG nova.compute.manager [req-a53ef76f-9776-42e2-9d9b-1d115bbe83cf req-a5be496b-9237-4ff9-b4d0-503209df45fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Received event network-vif-deleted-7c5a83e6-b3a1-4a9c-9b2e-9012a943012a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.656 222021 DEBUG oslo_concurrency.processutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:19:50 np0005593233 nova_compute[222017]: 2026-01-23 10:19:50.694 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:19:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1571576233' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.107 222021 DEBUG oslo_concurrency.processutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.115 222021 DEBUG nova.compute.provider_tree [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.151 222021 DEBUG nova.scheduler.client.report [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.232 222021 DEBUG oslo_concurrency.lockutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:51.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.401 222021 INFO nova.scheduler.client.report [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Deleted allocations for instance 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.486 222021 DEBUG oslo_concurrency.lockutils [None req-2db43f58-f8c8-4db3-84e1-7ef1dee66966 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "54d74a3a-c2da-4981-bc4a-9fa6e950d2d0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.556 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.556 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.557 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:19:51 np0005593233 nova_compute[222017]: 2026-01-23 10:19:51.557 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:19:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:51.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:53.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:19:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:53.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:19:53 np0005593233 nova_compute[222017]: 2026-01-23 10:19:53.632 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:54 np0005593233 nova_compute[222017]: 2026-01-23 10:19:54.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:19:54 np0005593233 nova_compute[222017]: 2026-01-23 10:19:54.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:19:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:55.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:19:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:55.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:19:55 np0005593233 nova_compute[222017]: 2026-01-23 10:19:55.696 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:56 np0005593233 nova_compute[222017]: 2026-01-23 10:19:56.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:19:56 np0005593233 nova_compute[222017]: 2026-01-23 10:19:56.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:19:56 np0005593233 nova_compute[222017]: 2026-01-23 10:19:56.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:19:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 e312: 3 total, 3 up, 3 in
Jan 23 05:19:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:57.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:57.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:57 np0005593233 nova_compute[222017]: 2026-01-23 10:19:57.965 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163582.9640872, dffff7fd-2faf-4f94-953a-fbfb7c752e62 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:19:57 np0005593233 nova_compute[222017]: 2026-01-23 10:19:57.966 222021 INFO nova.compute.manager [-] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] VM Stopped (Lifecycle Event)
Jan 23 05:19:58 np0005593233 nova_compute[222017]: 2026-01-23 10:19:58.634 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:59.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:19:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:19:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:19:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:59.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 05:20:00 np0005593233 nova_compute[222017]: 2026-01-23 10:20:00.655 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163585.653427, 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:20:00 np0005593233 nova_compute[222017]: 2026-01-23 10:20:00.655 222021 INFO nova.compute.manager [-] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] VM Stopped (Lifecycle Event)
Jan 23 05:20:00 np0005593233 nova_compute[222017]: 2026-01-23 10:20:00.697 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:20:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:01.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:01.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:03 np0005593233 podman[280849]: 2026-01-23 10:20:03.059639848 +0000 UTC m=+0.061030707 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:20:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:03.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:03.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:03 np0005593233 nova_compute[222017]: 2026-01-23 10:20:03.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:20:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:05.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:05.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:05 np0005593233 nova_compute[222017]: 2026-01-23 10:20:05.698 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:20:06 np0005593233 nova_compute[222017]: 2026-01-23 10:20:06.715 222021 DEBUG nova.compute.manager [None req-93f25d2e-2869-4c76-8bc2-1d9d91cf83ed - - - - - -] [instance: 54d74a3a-c2da-4981-bc4a-9fa6e950d2d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:20:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:07.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:07.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:08 np0005593233 nova_compute[222017]: 2026-01-23 10:20:08.663 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:08 np0005593233 nova_compute[222017]: 2026-01-23 10:20:08.976 222021 DEBUG nova.compute.manager [None req-a98ebcab-5dcc-4de7-ba1d-af54509fe6ce - - - - - -] [instance: dffff7fd-2faf-4f94-953a-fbfb7c752e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:20:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:09.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:09.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:10 np0005593233 nova_compute[222017]: 2026-01-23 10:20:10.701 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:11 np0005593233 nova_compute[222017]: 2026-01-23 10:20:11.139 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:20:11 np0005593233 nova_compute[222017]: 2026-01-23 10:20:11.140 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:20:11 np0005593233 nova_compute[222017]: 2026-01-23 10:20:11.140 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:20:11 np0005593233 nova_compute[222017]: 2026-01-23 10:20:11.141 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 95158f57-1f68-4b3e-9d10-e3006c3f2060 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:20:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:20:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:11.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:20:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:11.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:13.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:13.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:13 np0005593233 nova_compute[222017]: 2026-01-23 10:20:13.666 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:15.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:15.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:15 np0005593233 nova_compute[222017]: 2026-01-23 10:20:15.742 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:17.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:17.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:18 np0005593233 podman[280868]: 2026-01-23 10:20:18.10787733 +0000 UTC m=+0.105924107 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:20:18 np0005593233 nova_compute[222017]: 2026-01-23 10:20:18.669 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:19.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:19.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:20 np0005593233 nova_compute[222017]: 2026-01-23 10:20:20.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:21.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:20:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:21.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:20:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:23.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:20:23.497 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:20:23 np0005593233 nova_compute[222017]: 2026-01-23 10:20:23.499 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:20:23.500 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:20:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:23.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:23 np0005593233 nova_compute[222017]: 2026-01-23 10:20:23.672 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:25.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:25.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:25 np0005593233 nova_compute[222017]: 2026-01-23 10:20:25.746 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:26 np0005593233 nova_compute[222017]: 2026-01-23 10:20:26.989 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updating instance_info_cache with network_info: [{"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:20:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:27.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:20:27.503 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:20:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:27.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:28 np0005593233 nova_compute[222017]: 2026-01-23 10:20:28.705 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:29.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:29.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:30 np0005593233 ovn_controller[130653]: 2026-01-23T10:20:30Z|00652|binding|INFO|Releasing lport 8a8ef4f2-2ba5-405a-811e-058c5ff2b91e from this chassis (sb_readonly=0)
Jan 23 05:20:30 np0005593233 nova_compute[222017]: 2026-01-23 10:20:30.548 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:30 np0005593233 nova_compute[222017]: 2026-01-23 10:20:30.749 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:30 np0005593233 nova_compute[222017]: 2026-01-23 10:20:30.784 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:20:30 np0005593233 nova_compute[222017]: 2026-01-23 10:20:30.784 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:20:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:31.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:20:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:31.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:20:31 np0005593233 nova_compute[222017]: 2026-01-23 10:20:31.779 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:31 np0005593233 nova_compute[222017]: 2026-01-23 10:20:31.779 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:33.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:33.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:33 np0005593233 nova_compute[222017]: 2026-01-23 10:20:33.751 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:34 np0005593233 podman[280893]: 2026-01-23 10:20:34.068274529 +0000 UTC m=+0.069289961 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 23 05:20:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:35.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:35.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:35 np0005593233 nova_compute[222017]: 2026-01-23 10:20:35.751 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:37.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:37.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:38 np0005593233 nova_compute[222017]: 2026-01-23 10:20:38.784 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:39.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:39.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:40 np0005593233 nova_compute[222017]: 2026-01-23 10:20:40.755 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:20:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:41.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:20:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:41.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:20:42.683 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:20:42.684 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:20:42.686 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:20:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:20:43 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:20:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:43.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:20:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:43.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:20:43 np0005593233 nova_compute[222017]: 2026-01-23 10:20:43.832 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:44 np0005593233 nova_compute[222017]: 2026-01-23 10:20:44.652 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "b442ed8c-1696-44f7-bef1-b826cbc895be" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:44 np0005593233 nova_compute[222017]: 2026-01-23 10:20:44.653 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:44 np0005593233 nova_compute[222017]: 2026-01-23 10:20:44.793 222021 DEBUG nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:20:45 np0005593233 nova_compute[222017]: 2026-01-23 10:20:45.302 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:45 np0005593233 nova_compute[222017]: 2026-01-23 10:20:45.302 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:45 np0005593233 nova_compute[222017]: 2026-01-23 10:20:45.311 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:20:45 np0005593233 nova_compute[222017]: 2026-01-23 10:20:45.311 222021 INFO nova.compute.claims [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:20:45 np0005593233 nova_compute[222017]: 2026-01-23 10:20:45.391 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:45.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:45.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:45 np0005593233 nova_compute[222017]: 2026-01-23 10:20:45.759 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:45 np0005593233 nova_compute[222017]: 2026-01-23 10:20:45.954 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:20:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2281018177' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:20:46 np0005593233 nova_compute[222017]: 2026-01-23 10:20:46.449 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:46 np0005593233 nova_compute[222017]: 2026-01-23 10:20:46.460 222021 DEBUG nova.compute.provider_tree [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:20:46 np0005593233 nova_compute[222017]: 2026-01-23 10:20:46.571 222021 DEBUG nova.scheduler.client.report [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:20:46 np0005593233 nova_compute[222017]: 2026-01-23 10:20:46.611 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:46 np0005593233 nova_compute[222017]: 2026-01-23 10:20:46.612 222021 DEBUG nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:20:46 np0005593233 nova_compute[222017]: 2026-01-23 10:20:46.792 222021 DEBUG nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:20:46 np0005593233 nova_compute[222017]: 2026-01-23 10:20:46.792 222021 DEBUG nova.network.neutron [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:20:46 np0005593233 nova_compute[222017]: 2026-01-23 10:20:46.877 222021 INFO nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:20:46 np0005593233 nova_compute[222017]: 2026-01-23 10:20:46.936 222021 DEBUG nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:20:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.194 222021 DEBUG nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.195 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.196 222021 INFO nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Creating image(s)#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.225 222021 DEBUG nova.storage.rbd_utils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image b442ed8c-1696-44f7-bef1-b826cbc895be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.255 222021 DEBUG nova.storage.rbd_utils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image b442ed8c-1696-44f7-bef1-b826cbc895be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.283 222021 DEBUG nova.storage.rbd_utils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image b442ed8c-1696-44f7-bef1-b826cbc895be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.287 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.371 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.374 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.375 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.376 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.437 222021 DEBUG nova.storage.rbd_utils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image b442ed8c-1696-44f7-bef1-b826cbc895be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.448 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b442ed8c-1696-44f7-bef1-b826cbc895be_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:47.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.496 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.498 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:20:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:47.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.856 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b442ed8c-1696-44f7-bef1-b826cbc895be_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:47 np0005593233 nova_compute[222017]: 2026-01-23 10:20:47.956 222021 DEBUG nova.storage.rbd_utils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] resizing rbd image b442ed8c-1696-44f7-bef1-b826cbc895be_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:20:48 np0005593233 nova_compute[222017]: 2026-01-23 10:20:48.098 222021 DEBUG nova.objects.instance [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lazy-loading 'migration_context' on Instance uuid b442ed8c-1696-44f7-bef1-b826cbc895be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:20:48 np0005593233 nova_compute[222017]: 2026-01-23 10:20:48.866 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:49 np0005593233 podman[281230]: 2026-01-23 10:20:49.122250214 +0000 UTC m=+0.129643638 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:20:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:49.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.507 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.508 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Ensure instance console log exists: /var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.508 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.508 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.509 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.516 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.516 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.516 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.516 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.517 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:49 np0005593233 nova_compute[222017]: 2026-01-23 10:20:49.679 222021 DEBUG nova.policy [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ec99ae7c69d0438280441e0434374cbf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c59351a1b59c4cc9ad389dff900935f2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:20:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:49.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:20:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1982093040' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.016 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.223 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.223 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.434 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.436 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4130MB free_disk=20.830928802490234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.437 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.438 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:20:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.614 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 95158f57-1f68-4b3e-9d10-e3006c3f2060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.614 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance b442ed8c-1696-44f7-bef1-b826cbc895be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.615 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.615 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.763 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:50 np0005593233 nova_compute[222017]: 2026-01-23 10:20:50.791 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:20:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1549956689' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:20:51 np0005593233 nova_compute[222017]: 2026-01-23 10:20:51.306 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:51 np0005593233 nova_compute[222017]: 2026-01-23 10:20:51.314 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:20:51 np0005593233 nova_compute[222017]: 2026-01-23 10:20:51.414 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:20:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:51.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:51 np0005593233 nova_compute[222017]: 2026-01-23 10:20:51.499 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:20:51 np0005593233 nova_compute[222017]: 2026-01-23 10:20:51.500 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:20:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:51.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:20:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:53 np0005593233 nova_compute[222017]: 2026-01-23 10:20:53.390 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:53 np0005593233 nova_compute[222017]: 2026-01-23 10:20:53.392 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:53 np0005593233 nova_compute[222017]: 2026-01-23 10:20:53.392 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:53.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:53.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:53 np0005593233 nova_compute[222017]: 2026-01-23 10:20:53.762 222021 DEBUG nova.network.neutron [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Successfully created port: 50456029-5bfc-40d5-8093-da54e568d718 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:20:53 np0005593233 nova_compute[222017]: 2026-01-23 10:20:53.869 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:54 np0005593233 nova_compute[222017]: 2026-01-23 10:20:54.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:54 np0005593233 nova_compute[222017]: 2026-01-23 10:20:54.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:20:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:55.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:55.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:55 np0005593233 nova_compute[222017]: 2026-01-23 10:20:55.805 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:56 np0005593233 nova_compute[222017]: 2026-01-23 10:20:56.513 222021 DEBUG nova.network.neutron [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Successfully updated port: 50456029-5bfc-40d5-8093-da54e568d718 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:20:56 np0005593233 nova_compute[222017]: 2026-01-23 10:20:56.546 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "refresh_cache-b442ed8c-1696-44f7-bef1-b826cbc895be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:20:56 np0005593233 nova_compute[222017]: 2026-01-23 10:20:56.546 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquired lock "refresh_cache-b442ed8c-1696-44f7-bef1-b826cbc895be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:20:56 np0005593233 nova_compute[222017]: 2026-01-23 10:20:56.547 222021 DEBUG nova.network.neutron [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:20:56 np0005593233 nova_compute[222017]: 2026-01-23 10:20:56.649 222021 DEBUG nova.compute.manager [req-44f8c1fa-9b8c-42b0-851b-03418e3c08b3 req-69d46056-a93c-4b31-b0e8-0dba5728a2a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Received event network-changed-50456029-5bfc-40d5-8093-da54e568d718 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:20:56 np0005593233 nova_compute[222017]: 2026-01-23 10:20:56.650 222021 DEBUG nova.compute.manager [req-44f8c1fa-9b8c-42b0-851b-03418e3c08b3 req-69d46056-a93c-4b31-b0e8-0dba5728a2a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Refreshing instance network info cache due to event network-changed-50456029-5bfc-40d5-8093-da54e568d718. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:20:56 np0005593233 nova_compute[222017]: 2026-01-23 10:20:56.650 222021 DEBUG oslo_concurrency.lockutils [req-44f8c1fa-9b8c-42b0-851b-03418e3c08b3 req-69d46056-a93c-4b31-b0e8-0dba5728a2a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b442ed8c-1696-44f7-bef1-b826cbc895be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:20:57 np0005593233 nova_compute[222017]: 2026-01-23 10:20:57.093 222021 DEBUG nova.network.neutron [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:20:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:57.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:57.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:58 np0005593233 nova_compute[222017]: 2026-01-23 10:20:58.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:58 np0005593233 nova_compute[222017]: 2026-01-23 10:20:58.383 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:58 np0005593233 nova_compute[222017]: 2026-01-23 10:20:58.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:20:58 np0005593233 nova_compute[222017]: 2026-01-23 10:20:58.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:20:58 np0005593233 nova_compute[222017]: 2026-01-23 10:20:58.453 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:20:58 np0005593233 nova_compute[222017]: 2026-01-23 10:20:58.871 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:59 np0005593233 nova_compute[222017]: 2026-01-23 10:20:59.171 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:20:59 np0005593233 nova_compute[222017]: 2026-01-23 10:20:59.172 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:20:59 np0005593233 nova_compute[222017]: 2026-01-23 10:20:59.172 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:20:59 np0005593233 nova_compute[222017]: 2026-01-23 10:20:59.173 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 95158f57-1f68-4b3e-9d10-e3006c3f2060 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:20:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:59.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:20:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:59.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.202 222021 DEBUG nova.network.neutron [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Updating instance_info_cache with network_info: [{"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.304 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Releasing lock "refresh_cache-b442ed8c-1696-44f7-bef1-b826cbc895be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.305 222021 DEBUG nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Instance network_info: |[{"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.305 222021 DEBUG oslo_concurrency.lockutils [req-44f8c1fa-9b8c-42b0-851b-03418e3c08b3 req-69d46056-a93c-4b31-b0e8-0dba5728a2a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b442ed8c-1696-44f7-bef1-b826cbc895be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.306 222021 DEBUG nova.network.neutron [req-44f8c1fa-9b8c-42b0-851b-03418e3c08b3 req-69d46056-a93c-4b31-b0e8-0dba5728a2a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Refreshing network info cache for port 50456029-5bfc-40d5-8093-da54e568d718 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.309 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Start _get_guest_xml network_info=[{"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.314 222021 WARNING nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.320 222021 DEBUG nova.virt.libvirt.host [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.321 222021 DEBUG nova.virt.libvirt.host [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.325 222021 DEBUG nova.virt.libvirt.host [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.326 222021 DEBUG nova.virt.libvirt.host [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.327 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.328 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.328 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.328 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.329 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.329 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.329 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.330 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.330 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.330 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.330 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.331 222021 DEBUG nova.virt.hardware [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.334 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:21:00 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/533658098' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.861 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.884 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.550s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.919 222021 DEBUG nova.storage.rbd_utils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image b442ed8c-1696-44f7-bef1-b826cbc895be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:00 np0005593233 nova_compute[222017]: 2026-01-23 10:21:00.925 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:21:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2658687205' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.406 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.408 222021 DEBUG nova.virt.libvirt.vif [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:20:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-702642272',display_name='tempest-ServersTestJSON-server-702642272',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-702642272',id=155,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c59351a1b59c4cc9ad389dff900935f2',ramdisk_id='',reservation_id='r-6wzi63bb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1611255243',owner_user_name='tempest-ServersTestJSON-1611255243-project-member'},t
ags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:20:47Z,user_data=None,user_id='ec99ae7c69d0438280441e0434374cbf',uuid=b442ed8c-1696-44f7-bef1-b826cbc895be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.408 222021 DEBUG nova.network.os_vif_util [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converting VIF {"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.410 222021 DEBUG nova.network.os_vif_util [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:e5:96,bridge_name='br-int',has_traffic_filtering=True,id=50456029-5bfc-40d5-8093-da54e568d718,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50456029-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.411 222021 DEBUG nova.objects.instance [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid b442ed8c-1696-44f7-bef1-b826cbc895be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:01.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:01.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.805 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <uuid>b442ed8c-1696-44f7-bef1-b826cbc895be</uuid>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <name>instance-0000009b</name>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServersTestJSON-server-702642272</nova:name>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:21:00</nova:creationTime>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <nova:user uuid="ec99ae7c69d0438280441e0434374cbf">tempest-ServersTestJSON-1611255243-project-member</nova:user>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <nova:project uuid="c59351a1b59c4cc9ad389dff900935f2">tempest-ServersTestJSON-1611255243</nova:project>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <nova:port uuid="50456029-5bfc-40d5-8093-da54e568d718">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <entry name="serial">b442ed8c-1696-44f7-bef1-b826cbc895be</entry>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <entry name="uuid">b442ed8c-1696-44f7-bef1-b826cbc895be</entry>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/b442ed8c-1696-44f7-bef1-b826cbc895be_disk">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/b442ed8c-1696-44f7-bef1-b826cbc895be_disk.config">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:85:e5:96"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <target dev="tap50456029-5b"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be/console.log" append="off"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:21:01 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:21:01 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:21:01 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:21:01 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.806 222021 DEBUG nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Preparing to wait for external event network-vif-plugged-50456029-5bfc-40d5-8093-da54e568d718 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.806 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.807 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.807 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.808 222021 DEBUG nova.virt.libvirt.vif [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:20:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-702642272',display_name='tempest-ServersTestJSON-server-702642272',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-702642272',id=155,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c59351a1b59c4cc9ad389dff900935f2',ramdisk_id='',reservation_id='r-6wzi63bb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1611255243',owner_user_name='tempest-ServersTestJSON-1611255243-project-
member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:20:47Z,user_data=None,user_id='ec99ae7c69d0438280441e0434374cbf',uuid=b442ed8c-1696-44f7-bef1-b826cbc895be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.808 222021 DEBUG nova.network.os_vif_util [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converting VIF {"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.809 222021 DEBUG nova.network.os_vif_util [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:e5:96,bridge_name='br-int',has_traffic_filtering=True,id=50456029-5bfc-40d5-8093-da54e568d718,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50456029-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.810 222021 DEBUG os_vif [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:e5:96,bridge_name='br-int',has_traffic_filtering=True,id=50456029-5bfc-40d5-8093-da54e568d718,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50456029-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.811 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.812 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.812 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.818 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.818 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50456029-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.819 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50456029-5b, col_values=(('external_ids', {'iface-id': '50456029-5bfc-40d5-8093-da54e568d718', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:e5:96', 'vm-uuid': 'b442ed8c-1696-44f7-bef1-b826cbc895be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:01 np0005593233 NetworkManager[48871]: <info>  [1769163661.8216] manager: (tap50456029-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.820 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.826 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.831 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.832 222021 INFO os_vif [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:e5:96,bridge_name='br-int',has_traffic_filtering=True,id=50456029-5bfc-40d5-8093-da54e568d718,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50456029-5b')#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.955 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.956 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.956 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] No VIF found with MAC fa:16:3e:85:e5:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:21:01 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.957 222021 INFO nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Using config drive#033[00m
Jan 23 05:21:02 np0005593233 nova_compute[222017]: 2026-01-23 10:21:01.999 222021 DEBUG nova.storage.rbd_utils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image b442ed8c-1696-44f7-bef1-b826cbc895be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:03.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:03.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:03 np0005593233 nova_compute[222017]: 2026-01-23 10:21:03.873 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.042 222021 INFO nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Creating config drive at /var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be/disk.config#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.049 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpajorxrdg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.199 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpajorxrdg" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.243 222021 DEBUG nova.storage.rbd_utils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] rbd image b442ed8c-1696-44f7-bef1-b826cbc895be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.248 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be/disk.config b442ed8c-1696-44f7-bef1-b826cbc895be_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.479 222021 DEBUG oslo_concurrency.processutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be/disk.config b442ed8c-1696-44f7-bef1-b826cbc895be_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.230s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.481 222021 INFO nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Deleting local config drive /var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be/disk.config because it was imported into RBD.#033[00m
Jan 23 05:21:04 np0005593233 kernel: tap50456029-5b: entered promiscuous mode
Jan 23 05:21:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:21:04Z|00653|binding|INFO|Claiming lport 50456029-5bfc-40d5-8093-da54e568d718 for this chassis.
Jan 23 05:21:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:21:04Z|00654|binding|INFO|50456029-5bfc-40d5-8093-da54e568d718: Claiming fa:16:3e:85:e5:96 10.100.0.6
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.549 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:04 np0005593233 NetworkManager[48871]: <info>  [1769163664.5498] manager: (tap50456029-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/303)
Jan 23 05:21:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:21:04Z|00655|binding|INFO|Setting lport 50456029-5bfc-40d5-8093-da54e568d718 ovn-installed in OVS
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.564 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.568 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:04 np0005593233 systemd-machined[190954]: New machine qemu-71-instance-0000009b.
Jan 23 05:21:04 np0005593233 systemd[1]: Started Virtual Machine qemu-71-instance-0000009b.
Jan 23 05:21:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:21:04Z|00656|binding|INFO|Setting lport 50456029-5bfc-40d5-8093-da54e568d718 up in Southbound
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.620 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:e5:96 10.100.0.6'], port_security=['fa:16:3e:85:e5:96 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b442ed8c-1696-44f7-bef1-b826cbc895be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c59351a1b59c4cc9ad389dff900935f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd59f1dd0-018a-40d5-b9a0-54c6c1f9d925', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c808b115-ccf1-41c4-acea-daabae8abf5b, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=50456029-5bfc-40d5-8093-da54e568d718) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.621 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 50456029-5bfc-40d5-8093-da54e568d718 in datapath 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 bound to our chassis#033[00m
Jan 23 05:21:04 np0005593233 systemd-udevd[281495]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.623 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7#033[00m
Jan 23 05:21:04 np0005593233 NetworkManager[48871]: <info>  [1769163664.6440] device (tap50456029-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:21:04 np0005593233 NetworkManager[48871]: <info>  [1769163664.6451] device (tap50456029-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.650 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd672f3-7006-4191-9ffc-612a937bbc06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:04 np0005593233 podman[281483]: 2026-01-23 10:21:04.666240912 +0000 UTC m=+0.081090334 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.691 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[320a0dae-6667-4c94-a4ff-b6fad6db1717]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.696 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c865e683-ff4b-41fe-91fb-d3e77cba9efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.739 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e24ccd7c-2224-4060-b81c-e81028b912aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.762 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d2207943-e0bf-434e-862b-20db43430a7b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43bdb40a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:5e:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738749, 'reachable_time': 26119, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281519, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.777 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updating instance_info_cache with network_info: [{"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.790 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4c0d6525-f555-42af-b18e-7670411e59b7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap43bdb40a-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738763, 'tstamp': 738763}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281520, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap43bdb40a-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738766, 'tstamp': 738766}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281520, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.793 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43bdb40a-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.795 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.796 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43bdb40a-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.796 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.797 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap43bdb40a-e0, col_values=(('external_ids', {'iface-id': '8a8ef4f2-2ba5-405a-811e-058c5ff2b91e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:04.797 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.992 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:21:04 np0005593233 nova_compute[222017]: 2026-01-23 10:21:04.993 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.015 222021 DEBUG nova.network.neutron [req-44f8c1fa-9b8c-42b0-851b-03418e3c08b3 req-69d46056-a93c-4b31-b0e8-0dba5728a2a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Updated VIF entry in instance network info cache for port 50456029-5bfc-40d5-8093-da54e568d718. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.016 222021 DEBUG nova.network.neutron [req-44f8c1fa-9b8c-42b0-851b-03418e3c08b3 req-69d46056-a93c-4b31-b0e8-0dba5728a2a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Updating instance_info_cache with network_info: [{"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.036 222021 DEBUG oslo_concurrency.lockutils [req-44f8c1fa-9b8c-42b0-851b-03418e3c08b3 req-69d46056-a93c-4b31-b0e8-0dba5728a2a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b442ed8c-1696-44f7-bef1-b826cbc895be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.150 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163665.1493456, b442ed8c-1696-44f7-bef1-b826cbc895be => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.150 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] VM Started (Lifecycle Event)#033[00m
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.437 222021 DEBUG nova.compute.manager [req-95d153ed-e5b0-4d89-8a90-319c3b482012 req-d3f0219a-c7fa-4048-a95e-94fed71560ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Received event network-vif-plugged-50456029-5bfc-40d5-8093-da54e568d718 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.438 222021 DEBUG oslo_concurrency.lockutils [req-95d153ed-e5b0-4d89-8a90-319c3b482012 req-d3f0219a-c7fa-4048-a95e-94fed71560ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.440 222021 DEBUG oslo_concurrency.lockutils [req-95d153ed-e5b0-4d89-8a90-319c3b482012 req-d3f0219a-c7fa-4048-a95e-94fed71560ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.440 222021 DEBUG oslo_concurrency.lockutils [req-95d153ed-e5b0-4d89-8a90-319c3b482012 req-d3f0219a-c7fa-4048-a95e-94fed71560ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.441 222021 DEBUG nova.compute.manager [req-95d153ed-e5b0-4d89-8a90-319c3b482012 req-d3f0219a-c7fa-4048-a95e-94fed71560ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Processing event network-vif-plugged-50456029-5bfc-40d5-8093-da54e568d718 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.442 222021 DEBUG nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.447 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.452 222021 INFO nova.virt.libvirt.driver [-] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Instance spawned successfully.
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.452 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.458 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.466 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:21:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:05.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.572 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.572 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.573 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.573 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.573 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.574 222021 DEBUG nova.virt.libvirt.driver [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.577 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.577 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163665.149513, b442ed8c-1696-44f7-bef1-b826cbc895be => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.577 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] VM Paused (Lifecycle Event)
Jan 23 05:21:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:05.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.837 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.841 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163665.4463055, b442ed8c-1696-44f7-bef1-b826cbc895be => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.842 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] VM Resumed (Lifecycle Event)
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.914 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:21:05 np0005593233 nova_compute[222017]: 2026-01-23 10:21:05.919 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:21:06 np0005593233 nova_compute[222017]: 2026-01-23 10:21:06.088 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:21:06 np0005593233 nova_compute[222017]: 2026-01-23 10:21:06.130 222021 INFO nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Took 18.94 seconds to spawn the instance on the hypervisor.
Jan 23 05:21:06 np0005593233 nova_compute[222017]: 2026-01-23 10:21:06.131 222021 DEBUG nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:21:06 np0005593233 nova_compute[222017]: 2026-01-23 10:21:06.373 222021 INFO nova.compute.manager [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Took 21.28 seconds to build instance.
Jan 23 05:21:06 np0005593233 nova_compute[222017]: 2026-01-23 10:21:06.617 222021 DEBUG oslo_concurrency.lockutils [None req-042228e6-b903-4a7e-adca-af4efc18c3f6 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:21:06 np0005593233 nova_compute[222017]: 2026-01-23 10:21:06.821 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:07.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:07.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:08 np0005593233 nova_compute[222017]: 2026-01-23 10:21:08.877 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:09.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:09.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:09 np0005593233 nova_compute[222017]: 2026-01-23 10:21:09.763 222021 DEBUG nova.compute.manager [req-cccd9d23-aa95-421b-bc08-5985a18d2488 req-1680847b-034b-44f1-82b1-c4c1929f237d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Received event network-vif-plugged-50456029-5bfc-40d5-8093-da54e568d718 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:21:09 np0005593233 nova_compute[222017]: 2026-01-23 10:21:09.764 222021 DEBUG oslo_concurrency.lockutils [req-cccd9d23-aa95-421b-bc08-5985a18d2488 req-1680847b-034b-44f1-82b1-c4c1929f237d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:21:09 np0005593233 nova_compute[222017]: 2026-01-23 10:21:09.764 222021 DEBUG oslo_concurrency.lockutils [req-cccd9d23-aa95-421b-bc08-5985a18d2488 req-1680847b-034b-44f1-82b1-c4c1929f237d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:21:09 np0005593233 nova_compute[222017]: 2026-01-23 10:21:09.764 222021 DEBUG oslo_concurrency.lockutils [req-cccd9d23-aa95-421b-bc08-5985a18d2488 req-1680847b-034b-44f1-82b1-c4c1929f237d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:21:09 np0005593233 nova_compute[222017]: 2026-01-23 10:21:09.765 222021 DEBUG nova.compute.manager [req-cccd9d23-aa95-421b-bc08-5985a18d2488 req-1680847b-034b-44f1-82b1-c4c1929f237d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] No waiting events found dispatching network-vif-plugged-50456029-5bfc-40d5-8093-da54e568d718 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:21:09 np0005593233 nova_compute[222017]: 2026-01-23 10:21:09.765 222021 WARNING nova.compute.manager [req-cccd9d23-aa95-421b-bc08-5985a18d2488 req-1680847b-034b-44f1-82b1-c4c1929f237d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Received unexpected event network-vif-plugged-50456029-5bfc-40d5-8093-da54e568d718 for instance with vm_state active and task_state None.
Jan 23 05:21:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:11.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:11.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:11 np0005593233 nova_compute[222017]: 2026-01-23 10:21:11.824 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:13.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:13.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:13 np0005593233 nova_compute[222017]: 2026-01-23 10:21:13.879 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:15.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:15.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:16 np0005593233 nova_compute[222017]: 2026-01-23 10:21:16.769 222021 DEBUG oslo_concurrency.lockutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "b442ed8c-1696-44f7-bef1-b826cbc895be" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:21:16 np0005593233 nova_compute[222017]: 2026-01-23 10:21:16.770 222021 DEBUG oslo_concurrency.lockutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:21:16 np0005593233 nova_compute[222017]: 2026-01-23 10:21:16.770 222021 DEBUG oslo_concurrency.lockutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:21:16 np0005593233 nova_compute[222017]: 2026-01-23 10:21:16.770 222021 DEBUG oslo_concurrency.lockutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:21:16 np0005593233 nova_compute[222017]: 2026-01-23 10:21:16.770 222021 DEBUG oslo_concurrency.lockutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:21:16 np0005593233 nova_compute[222017]: 2026-01-23 10:21:16.772 222021 INFO nova.compute.manager [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Terminating instance
Jan 23 05:21:16 np0005593233 nova_compute[222017]: 2026-01-23 10:21:16.772 222021 DEBUG nova.compute.manager [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 05:21:16 np0005593233 nova_compute[222017]: 2026-01-23 10:21:16.825 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:17 np0005593233 kernel: tap50456029-5b (unregistering): left promiscuous mode
Jan 23 05:21:17 np0005593233 NetworkManager[48871]: <info>  [1769163677.3317] device (tap50456029-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:21:17 np0005593233 ovn_controller[130653]: 2026-01-23T10:21:17Z|00657|binding|INFO|Releasing lport 50456029-5bfc-40d5-8093-da54e568d718 from this chassis (sb_readonly=0)
Jan 23 05:21:17 np0005593233 ovn_controller[130653]: 2026-01-23T10:21:17Z|00658|binding|INFO|Setting lport 50456029-5bfc-40d5-8093-da54e568d718 down in Southbound
Jan 23 05:21:17 np0005593233 ovn_controller[130653]: 2026-01-23T10:21:17Z|00659|binding|INFO|Removing iface tap50456029-5b ovn-installed in OVS
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.379 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.382 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.390 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:e5:96 10.100.0.6'], port_security=['fa:16:3e:85:e5:96 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b442ed8c-1696-44f7-bef1-b826cbc895be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c59351a1b59c4cc9ad389dff900935f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd59f1dd0-018a-40d5-b9a0-54c6c1f9d925', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c808b115-ccf1-41c4-acea-daabae8abf5b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=50456029-5bfc-40d5-8093-da54e568d718) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.393 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 50456029-5bfc-40d5-8093-da54e568d718 in datapath 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 unbound from our chassis
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.395 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.400 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:17 np0005593233 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Jan 23 05:21:17 np0005593233 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000009b.scope: Consumed 12.166s CPU time.
Jan 23 05:21:17 np0005593233 systemd-machined[190954]: Machine qemu-71-instance-0000009b terminated.
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.425 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2425cdb3-7012-4be0-99d1-0e3b44c71e42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.476 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e39485c1-d43c-4438-8736-94fd144a9f38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.481 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[28ce6126-5379-4e67-a022-d8cc78e80d52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:21:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:17.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.518 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5d897262-bda7-4d8c-a781-88c51e2be32c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.546 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[66d59378-43aa-43f2-9306-a08f44f27888]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap43bdb40a-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2b:5e:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738749, 'reachable_time': 26119, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281576, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.575 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5874fe6d-8525-4707-b926-83d87d7853db]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap43bdb40a-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738763, 'tstamp': 738763}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281577, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap43bdb40a-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738766, 'tstamp': 738766}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281577, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.578 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43bdb40a-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.581 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.589 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.590 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43bdb40a-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.590 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.591 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap43bdb40a-e0, col_values=(('external_ids', {'iface-id': '8a8ef4f2-2ba5-405a-811e-058c5ff2b91e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:17.591 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.617 222021 INFO nova.virt.libvirt.driver [-] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Instance destroyed successfully.#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.618 222021 DEBUG nova.objects.instance [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lazy-loading 'resources' on Instance uuid b442ed8c-1696-44f7-bef1-b826cbc895be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.643 222021 DEBUG nova.virt.libvirt.vif [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:202:202,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:20:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-702642272',display_name='tempest-ServersTestJSON-server-702642272',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-702642272',id=155,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c59351a1b59c4cc9ad389dff900935f2',ramdisk_id='',reservation_id='r-6wzi63bb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_dis
k='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1611255243',owner_user_name='tempest-ServersTestJSON-1611255243-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:11Z,user_data=None,user_id='ec99ae7c69d0438280441e0434374cbf',uuid=b442ed8c-1696-44f7-bef1-b826cbc895be,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.644 222021 DEBUG nova.network.os_vif_util [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converting VIF {"id": "50456029-5bfc-40d5-8093-da54e568d718", "address": "fa:16:3e:85:e5:96", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50456029-5b", "ovs_interfaceid": "50456029-5bfc-40d5-8093-da54e568d718", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.645 222021 DEBUG nova.network.os_vif_util [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:e5:96,bridge_name='br-int',has_traffic_filtering=True,id=50456029-5bfc-40d5-8093-da54e568d718,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50456029-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.646 222021 DEBUG os_vif [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:e5:96,bridge_name='br-int',has_traffic_filtering=True,id=50456029-5bfc-40d5-8093-da54e568d718,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50456029-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.647 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.648 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50456029-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.651 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.654 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:21:17 np0005593233 nova_compute[222017]: 2026-01-23 10:21:17.658 222021 INFO os_vif [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:e5:96,bridge_name='br-int',has_traffic_filtering=True,id=50456029-5bfc-40d5-8093-da54e568d718,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50456029-5b')#033[00m
Jan 23 05:21:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:17.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:21:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.8 total, 600.0 interval#012Cumulative writes: 46K writes, 181K keys, 46K commit groups, 1.0 writes per commit group, ingest: 0.17 GB, 0.04 MB/s#012Cumulative WAL: 46K writes, 17K syncs, 2.71 writes per sync, written: 0.17 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 7501 writes, 29K keys, 7501 commit groups, 1.0 writes per commit group, ingest: 28.62 MB, 0.05 MB/s#012Interval WAL: 7501 writes, 3029 syncs, 2.48 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:21:18 np0005593233 nova_compute[222017]: 2026-01-23 10:21:18.200 222021 DEBUG nova.compute.manager [req-0e96b712-b7e8-47d1-b768-22cfc62135ca req-c5d8de51-8cfb-4c17-882e-3d1870c83ab5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Received event network-vif-unplugged-50456029-5bfc-40d5-8093-da54e568d718 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:18 np0005593233 nova_compute[222017]: 2026-01-23 10:21:18.201 222021 DEBUG oslo_concurrency.lockutils [req-0e96b712-b7e8-47d1-b768-22cfc62135ca req-c5d8de51-8cfb-4c17-882e-3d1870c83ab5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:18 np0005593233 nova_compute[222017]: 2026-01-23 10:21:18.201 222021 DEBUG oslo_concurrency.lockutils [req-0e96b712-b7e8-47d1-b768-22cfc62135ca req-c5d8de51-8cfb-4c17-882e-3d1870c83ab5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:18 np0005593233 nova_compute[222017]: 2026-01-23 10:21:18.201 222021 DEBUG oslo_concurrency.lockutils [req-0e96b712-b7e8-47d1-b768-22cfc62135ca req-c5d8de51-8cfb-4c17-882e-3d1870c83ab5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:18 np0005593233 nova_compute[222017]: 2026-01-23 10:21:18.202 222021 DEBUG nova.compute.manager [req-0e96b712-b7e8-47d1-b768-22cfc62135ca req-c5d8de51-8cfb-4c17-882e-3d1870c83ab5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] No waiting events found dispatching network-vif-unplugged-50456029-5bfc-40d5-8093-da54e568d718 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:21:18 np0005593233 nova_compute[222017]: 2026-01-23 10:21:18.202 222021 DEBUG nova.compute.manager [req-0e96b712-b7e8-47d1-b768-22cfc62135ca req-c5d8de51-8cfb-4c17-882e-3d1870c83ab5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Received event network-vif-unplugged-50456029-5bfc-40d5-8093-da54e568d718 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:21:18 np0005593233 nova_compute[222017]: 2026-01-23 10:21:18.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:19 np0005593233 nova_compute[222017]: 2026-01-23 10:21:19.031 222021 INFO nova.virt.libvirt.driver [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Deleting instance files /var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be_del#033[00m
Jan 23 05:21:19 np0005593233 nova_compute[222017]: 2026-01-23 10:21:19.033 222021 INFO nova.virt.libvirt.driver [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Deletion of /var/lib/nova/instances/b442ed8c-1696-44f7-bef1-b826cbc895be_del complete#033[00m
Jan 23 05:21:19 np0005593233 nova_compute[222017]: 2026-01-23 10:21:19.116 222021 INFO nova.compute.manager [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Took 2.34 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:21:19 np0005593233 nova_compute[222017]: 2026-01-23 10:21:19.117 222021 DEBUG oslo.service.loopingcall [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:21:19 np0005593233 nova_compute[222017]: 2026-01-23 10:21:19.117 222021 DEBUG nova.compute.manager [-] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:21:19 np0005593233 nova_compute[222017]: 2026-01-23 10:21:19.118 222021 DEBUG nova.network.neutron [-] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:21:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:19.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:19.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:20 np0005593233 podman[281609]: 2026-01-23 10:21:20.155148598 +0000 UTC m=+0.155353275 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:21:20 np0005593233 nova_compute[222017]: 2026-01-23 10:21:20.469 222021 DEBUG nova.compute.manager [req-4d9e0a81-3e28-42f8-9a73-15f3e026dde5 req-e82726ea-4a26-4266-b688-bb180ee77e6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Received event network-vif-plugged-50456029-5bfc-40d5-8093-da54e568d718 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:20 np0005593233 nova_compute[222017]: 2026-01-23 10:21:20.469 222021 DEBUG oslo_concurrency.lockutils [req-4d9e0a81-3e28-42f8-9a73-15f3e026dde5 req-e82726ea-4a26-4266-b688-bb180ee77e6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:20 np0005593233 nova_compute[222017]: 2026-01-23 10:21:20.469 222021 DEBUG oslo_concurrency.lockutils [req-4d9e0a81-3e28-42f8-9a73-15f3e026dde5 req-e82726ea-4a26-4266-b688-bb180ee77e6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:20 np0005593233 nova_compute[222017]: 2026-01-23 10:21:20.470 222021 DEBUG oslo_concurrency.lockutils [req-4d9e0a81-3e28-42f8-9a73-15f3e026dde5 req-e82726ea-4a26-4266-b688-bb180ee77e6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:20 np0005593233 nova_compute[222017]: 2026-01-23 10:21:20.470 222021 DEBUG nova.compute.manager [req-4d9e0a81-3e28-42f8-9a73-15f3e026dde5 req-e82726ea-4a26-4266-b688-bb180ee77e6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] No waiting events found dispatching network-vif-plugged-50456029-5bfc-40d5-8093-da54e568d718 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:21:20 np0005593233 nova_compute[222017]: 2026-01-23 10:21:20.470 222021 WARNING nova.compute.manager [req-4d9e0a81-3e28-42f8-9a73-15f3e026dde5 req-e82726ea-4a26-4266-b688-bb180ee77e6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Received unexpected event network-vif-plugged-50456029-5bfc-40d5-8093-da54e568d718 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:21:20 np0005593233 nova_compute[222017]: 2026-01-23 10:21:20.912 222021 DEBUG nova.network.neutron [-] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:20 np0005593233 nova_compute[222017]: 2026-01-23 10:21:20.964 222021 INFO nova.compute.manager [-] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Took 1.85 seconds to deallocate network for instance.#033[00m
Jan 23 05:21:21 np0005593233 nova_compute[222017]: 2026-01-23 10:21:21.068 222021 DEBUG oslo_concurrency.lockutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:21 np0005593233 nova_compute[222017]: 2026-01-23 10:21:21.069 222021 DEBUG oslo_concurrency.lockutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:21.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:21.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:21 np0005593233 nova_compute[222017]: 2026-01-23 10:21:21.859 222021 DEBUG oslo_concurrency.processutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:22 np0005593233 nova_compute[222017]: 2026-01-23 10:21:22.448 222021 DEBUG nova.compute.manager [req-8f80214d-aa66-4bb4-86ee-0fe7f10ad0ea req-04011f0f-4215-43b3-a63d-1c20193ebc80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Received event network-vif-deleted-50456029-5bfc-40d5-8093-da54e568d718 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:22 np0005593233 nova_compute[222017]: 2026-01-23 10:21:22.651 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:21:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2014856997' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:23 np0005593233 nova_compute[222017]: 2026-01-23 10:21:23.250 222021 DEBUG oslo_concurrency.processutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.391s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:23 np0005593233 nova_compute[222017]: 2026-01-23 10:21:23.261 222021 DEBUG nova.compute.provider_tree [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:21:23 np0005593233 nova_compute[222017]: 2026-01-23 10:21:23.308 222021 DEBUG nova.scheduler.client.report [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:21:23 np0005593233 nova_compute[222017]: 2026-01-23 10:21:23.350 222021 DEBUG oslo_concurrency.lockutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:23 np0005593233 nova_compute[222017]: 2026-01-23 10:21:23.433 222021 INFO nova.scheduler.client.report [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Deleted allocations for instance b442ed8c-1696-44f7-bef1-b826cbc895be#033[00m
Jan 23 05:21:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:23.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:23 np0005593233 nova_compute[222017]: 2026-01-23 10:21:23.636 222021 DEBUG oslo_concurrency.lockutils [None req-f3bb92ad-9357-4ec0-a63e-76b422e29b51 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "b442ed8c-1696-44f7-bef1-b826cbc895be" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:23.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:23 np0005593233 nova_compute[222017]: 2026-01-23 10:21:23.886 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 e313: 3 total, 3 up, 3 in
Jan 23 05:21:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:25.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:25.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:27.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:27 np0005593233 nova_compute[222017]: 2026-01-23 10:21:27.654 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:27.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:28 np0005593233 nova_compute[222017]: 2026-01-23 10:21:28.888 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:29.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:29.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:21:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:31.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:21:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:31.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:32 np0005593233 nova_compute[222017]: 2026-01-23 10:21:32.612 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163677.611084, b442ed8c-1696-44f7-bef1-b826cbc895be => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:21:32 np0005593233 nova_compute[222017]: 2026-01-23 10:21:32.613 222021 INFO nova.compute.manager [-] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] VM Stopped (Lifecycle Event)
Jan 23 05:21:32 np0005593233 nova_compute[222017]: 2026-01-23 10:21:32.657 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:33 np0005593233 nova_compute[222017]: 2026-01-23 10:21:33.082 222021 DEBUG nova.compute.manager [None req-37c268f2-1c99-4be2-b0ff-a45f5d7873c6 - - - - - -] [instance: b442ed8c-1696-44f7-bef1-b826cbc895be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:21:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:33.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:33.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:33 np0005593233 nova_compute[222017]: 2026-01-23 10:21:33.892 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:35 np0005593233 podman[281661]: 2026-01-23 10:21:35.065569899 +0000 UTC m=+0.074747495 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 23 05:21:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:35.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:35.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:37.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:37 np0005593233 nova_compute[222017]: 2026-01-23 10:21:37.658 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:37.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:38.177 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:21:38 np0005593233 nova_compute[222017]: 2026-01-23 10:21:38.177 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:38 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:38.179 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:21:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:38 np0005593233 nova_compute[222017]: 2026-01-23 10:21:38.896 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:39.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:39.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:41.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:41.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:42 np0005593233 nova_compute[222017]: 2026-01-23 10:21:42.661 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:42.684 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:21:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:42.685 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:21:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:42.685 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:21:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:43.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:43.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:43 np0005593233 nova_compute[222017]: 2026-01-23 10:21:43.897 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:21:44.182 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:21:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:45.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:45.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:47 np0005593233 nova_compute[222017]: 2026-01-23 10:21:47.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:21:47 np0005593233 nova_compute[222017]: 2026-01-23 10:21:47.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:21:47 np0005593233 nova_compute[222017]: 2026-01-23 10:21:47.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:21:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:47.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:47 np0005593233 nova_compute[222017]: 2026-01-23 10:21:47.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:47.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:21:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:48 np0005593233 nova_compute[222017]: 2026-01-23 10:21:48.352 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:21:48 np0005593233 nova_compute[222017]: 2026-01-23 10:21:48.353 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:21:48 np0005593233 nova_compute[222017]: 2026-01-23 10:21:48.353 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:21:48 np0005593233 nova_compute[222017]: 2026-01-23 10:21:48.353 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 05:21:48 np0005593233 nova_compute[222017]: 2026-01-23 10:21:48.354 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:21:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:21:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1626713051' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:48 np0005593233 nova_compute[222017]: 2026-01-23 10:21:48.841 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:21:48 np0005593233 nova_compute[222017]: 2026-01-23 10:21:48.900 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:49.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:49.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:50 np0005593233 podman[281777]: 2026-01-23 10:21:50.418024894 +0000 UTC m=+0.154471870 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 05:21:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:51.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:51.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:21:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:21:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:21:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:21:52 np0005593233 nova_compute[222017]: 2026-01-23 10:21:52.743 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.102 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.103 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-00000092 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:21:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.325 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.327 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4155MB free_disk=20.887630462646484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.327 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.327 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:21:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:53.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.748 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 95158f57-1f68-4b3e-9d10-e3006c3f2060 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.749 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.749 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:21:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:53.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.902 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:53 np0005593233 nova_compute[222017]: 2026-01-23 10:21:53.957 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:21:54 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/250605212' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:54 np0005593233 nova_compute[222017]: 2026-01-23 10:21:54.469 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:54 np0005593233 nova_compute[222017]: 2026-01-23 10:21:54.477 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:21:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:21:54 np0005593233 nova_compute[222017]: 2026-01-23 10:21:54.701 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:21:54 np0005593233 nova_compute[222017]: 2026-01-23 10:21:54.979 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:21:54 np0005593233 nova_compute[222017]: 2026-01-23 10:21:54.980 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:55.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:55.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:21:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.0 total, 600.0 interval#012Cumulative writes: 12K writes, 66K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.13 GB, 0.03 MB/s#012Cumulative WAL: 12K writes, 12K syncs, 1.00 writes per sync, written: 0.13 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1546 writes, 7884 keys, 1546 commit groups, 1.0 writes per commit group, ingest: 15.85 MB, 0.03 MB/s#012Interval WAL: 1546 writes, 1546 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     48.1      1.68              0.37        42    0.040       0      0       0.0       0.0#012  L6      1/0   10.15 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   4.9     83.9     71.1      5.55              1.29        41    0.135    273K    22K       0.0       0.0#012 Sum      1/0   10.15 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   5.9     64.4     65.8      7.24              1.67        83    0.087    273K    22K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   9.7    109.8    106.7      0.64              0.24        12    0.053     53K   3044       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0     83.9     71.1      5.55              1.29        41    0.135    273K    22K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     48.2      1.68              0.37        41    0.041       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4800.0 total, 600.0 interval#012Flush(GB): cumulative 0.079, interval 0.007#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.46 GB write, 0.10 MB/s write, 0.45 GB read, 0.10 MB/s read, 7.2 seconds#012Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 51.45 MB table_size: 0 occupancy: 18446744073709551615 collections: 9 last_copies: 0 last_secs: 0.000447 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(2952,49.44 MB,16.2633%) FilterBlock(83,774.80 KB,0.248894%) IndexBlock(83,1.25 MB,0.412605%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:21:56 np0005593233 nova_compute[222017]: 2026-01-23 10:21:56.981 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:56 np0005593233 nova_compute[222017]: 2026-01-23 10:21:56.982 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:56 np0005593233 nova_compute[222017]: 2026-01-23 10:21:56.982 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:56 np0005593233 nova_compute[222017]: 2026-01-23 10:21:56.982 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:56 np0005593233 nova_compute[222017]: 2026-01-23 10:21:56.983 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:21:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:57.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:57 np0005593233 nova_compute[222017]: 2026-01-23 10:21:57.777 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:57.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:58 np0005593233 nova_compute[222017]: 2026-01-23 10:21:58.907 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:59 np0005593233 nova_compute[222017]: 2026-01-23 10:21:59.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:59 np0005593233 nova_compute[222017]: 2026-01-23 10:21:59.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:21:59 np0005593233 nova_compute[222017]: 2026-01-23 10:21:59.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:21:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:59.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:21:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:21:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:59.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:00 np0005593233 nova_compute[222017]: 2026-01-23 10:22:00.478 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:00 np0005593233 nova_compute[222017]: 2026-01-23 10:22:00.478 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:00 np0005593233 nova_compute[222017]: 2026-01-23 10:22:00.478 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:22:00 np0005593233 nova_compute[222017]: 2026-01-23 10:22:00.479 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 95158f57-1f68-4b3e-9d10-e3006c3f2060 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:22:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:22:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:01.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:01.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:02 np0005593233 nova_compute[222017]: 2026-01-23 10:22:02.820 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:03.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:03.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:03 np0005593233 nova_compute[222017]: 2026-01-23 10:22:03.954 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:05.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:05.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:06 np0005593233 podman[281935]: 2026-01-23 10:22:06.098825827 +0000 UTC m=+0.096974754 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 05:22:06 np0005593233 nova_compute[222017]: 2026-01-23 10:22:06.763 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updating instance_info_cache with network_info: [{"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:06 np0005593233 nova_compute[222017]: 2026-01-23 10:22:06.806 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-95158f57-1f68-4b3e-9d10-e3006c3f2060" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:06 np0005593233 nova_compute[222017]: 2026-01-23 10:22:06.806 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:22:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:07.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:07 np0005593233 nova_compute[222017]: 2026-01-23 10:22:07.800 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:07 np0005593233 nova_compute[222017]: 2026-01-23 10:22:07.875 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:07.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:08 np0005593233 nova_compute[222017]: 2026-01-23 10:22:08.957 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:09.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:09.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:22:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:11.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:22:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:11.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:12 np0005593233 nova_compute[222017]: 2026-01-23 10:22:12.901 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:13.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:13.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:13 np0005593233 nova_compute[222017]: 2026-01-23 10:22:13.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e314 e314: 3 total, 3 up, 3 in
Jan 23 05:22:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:15.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:15.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:17 np0005593233 nova_compute[222017]: 2026-01-23 10:22:17.378 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:17.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:17.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:17 np0005593233 nova_compute[222017]: 2026-01-23 10:22:17.904 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e314 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:18 np0005593233 nova_compute[222017]: 2026-01-23 10:22:18.997 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:19.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:19.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:21 np0005593233 podman[281955]: 2026-01-23 10:22:21.180023747 +0000 UTC m=+0.174157447 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:22:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:21.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:21.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 e315: 3 total, 3 up, 3 in
Jan 23 05:22:22 np0005593233 nova_compute[222017]: 2026-01-23 10:22:22.907 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:23.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:23.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:24 np0005593233 nova_compute[222017]: 2026-01-23 10:22:24.011 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:25.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:25.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:27.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:27.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:27 np0005593233 nova_compute[222017]: 2026-01-23 10:22:27.910 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:29 np0005593233 nova_compute[222017]: 2026-01-23 10:22:29.014 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:29.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:29.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:31.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:31.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:32 np0005593233 nova_compute[222017]: 2026-01-23 10:22:32.912 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:33.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:22:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:33.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:22:34 np0005593233 nova_compute[222017]: 2026-01-23 10:22:34.016 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:22:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:35.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:22:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:35.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:37 np0005593233 podman[281983]: 2026-01-23 10:22:37.089703519 +0000 UTC m=+0.093161986 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:22:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:37.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:22:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:37.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:22:37 np0005593233 nova_compute[222017]: 2026-01-23 10:22:37.958 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:39 np0005593233 nova_compute[222017]: 2026-01-23 10:22:39.028 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:22:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:39.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:22:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:39.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:41.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:41.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:42.685 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:42.685 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:42.686 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:42 np0005593233 nova_compute[222017]: 2026-01-23 10:22:42.961 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.549 222021 DEBUG oslo_concurrency.lockutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "95158f57-1f68-4b3e-9d10-e3006c3f2060" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.550 222021 DEBUG oslo_concurrency.lockutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.550 222021 DEBUG oslo_concurrency.lockutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.551 222021 DEBUG oslo_concurrency.lockutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.551 222021 DEBUG oslo_concurrency.lockutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.554 222021 INFO nova.compute.manager [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Terminating instance#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.556 222021 DEBUG nova.compute.manager [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:22:43 np0005593233 kernel: tap758c2b61-c9 (unregistering): left promiscuous mode
Jan 23 05:22:43 np0005593233 NetworkManager[48871]: <info>  [1769163763.6202] device (tap758c2b61-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:22:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:43.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.641 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:22:43Z|00660|binding|INFO|Releasing lport 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d from this chassis (sb_readonly=0)
Jan 23 05:22:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:22:43Z|00661|binding|INFO|Setting lport 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d down in Southbound
Jan 23 05:22:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:22:43Z|00662|binding|INFO|Removing iface tap758c2b61-c9 ovn-installed in OVS
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.645 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:43.655 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:42:f9 10.100.0.3'], port_security=['fa:16:3e:b4:42:f9 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '95158f57-1f68-4b3e-9d10-e3006c3f2060', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c59351a1b59c4cc9ad389dff900935f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd59f1dd0-018a-40d5-b9a0-54c6c1f9d925', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c808b115-ccf1-41c4-acea-daabae8abf5b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=758c2b61-c90d-4bca-a2a4-1dbdf2631d0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:43.657 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 758c2b61-c90d-4bca-a2a4-1dbdf2631d0d in datapath 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 unbound from our chassis#033[00m
Jan 23 05:22:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:43.659 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:22:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:43.660 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[50e6665a-3996-4f0b-b2b8-332a6904e876]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:43 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:43.661 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 namespace which is not needed anymore#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.678 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:43 np0005593233 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000092.scope: Deactivated successfully.
Jan 23 05:22:43 np0005593233 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000092.scope: Consumed 27.554s CPU time.
Jan 23 05:22:43 np0005593233 systemd-machined[190954]: Machine qemu-68-instance-00000092 terminated.
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.810 222021 INFO nova.virt.libvirt.driver [-] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Instance destroyed successfully.#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.812 222021 DEBUG nova.objects.instance [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lazy-loading 'resources' on Instance uuid 95158f57-1f68-4b3e-9d10-e3006c3f2060 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.852 222021 DEBUG nova.virt.libvirt.vif [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-₡-395808227',display_name='tempest-₡-395808227',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest--395808227',id=146,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:17:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c59351a1b59c4cc9ad389dff900935f2',ramdisk_id='',reservation_id='r-42zdutrk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1611255243',owner_user_name='tempest-ServersTestJSON-1611255243-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:17:56Z,user_data=None,user_id='ec99ae7c69d0438280441e0434374cbf',uuid=95158f57-1f68-4b3e-9d10-e3006c3f2060,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.853 222021 DEBUG nova.network.os_vif_util [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converting VIF {"id": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "address": "fa:16:3e:b4:42:f9", "network": {"id": "43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7", "bridge": "br-int", "label": "tempest-ServersTestJSON-244954383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c59351a1b59c4cc9ad389dff900935f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap758c2b61-c9", "ovs_interfaceid": "758c2b61-c90d-4bca-a2a4-1dbdf2631d0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.854 222021 DEBUG nova.network.os_vif_util [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:42:f9,bridge_name='br-int',has_traffic_filtering=True,id=758c2b61-c90d-4bca-a2a4-1dbdf2631d0d,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap758c2b61-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.854 222021 DEBUG os_vif [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:42:f9,bridge_name='br-int',has_traffic_filtering=True,id=758c2b61-c90d-4bca-a2a4-1dbdf2631d0d,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap758c2b61-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.856 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.857 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap758c2b61-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.859 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:43 np0005593233 neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7[278742]: [NOTICE]   (278746) : haproxy version is 2.8.14-c23fe91
Jan 23 05:22:43 np0005593233 neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7[278742]: [NOTICE]   (278746) : path to executable is /usr/sbin/haproxy
Jan 23 05:22:43 np0005593233 neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7[278742]: [WARNING]  (278746) : Exiting Master process...
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.861 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:43 np0005593233 neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7[278742]: [ALERT]    (278746) : Current worker (278748) exited with code 143 (Terminated)
Jan 23 05:22:43 np0005593233 neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7[278742]: [WARNING]  (278746) : All workers exited. Exiting... (0)
Jan 23 05:22:43 np0005593233 systemd[1]: libpod-8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb.scope: Deactivated successfully.
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.865 222021 INFO os_vif [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:42:f9,bridge_name='br-int',has_traffic_filtering=True,id=758c2b61-c90d-4bca-a2a4-1dbdf2631d0d,network=Network(43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap758c2b61-c9')
Jan 23 05:22:43 np0005593233 podman[282033]: 2026-01-23 10:22:43.872636824 +0000 UTC m=+0.055520711 container died 8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:22:43 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb-userdata-shm.mount: Deactivated successfully.
Jan 23 05:22:43 np0005593233 systemd[1]: var-lib-containers-storage-overlay-70ef4d2a17100684f1bb784f2980f0fc67f052e8b5602fea29ec318f6b13da07-merged.mount: Deactivated successfully.
Jan 23 05:22:43 np0005593233 podman[282033]: 2026-01-23 10:22:43.931792067 +0000 UTC m=+0.114675964 container cleanup 8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:22:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:43.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:43 np0005593233 systemd[1]: libpod-conmon-8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb.scope: Deactivated successfully.
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.972 222021 DEBUG nova.compute.manager [req-8f8f3b9a-bb18-4460-bee4-6581771f304d req-634b9d24-10da-4ea2-bdf1-c91d90f1d6ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Received event network-vif-unplugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.973 222021 DEBUG oslo_concurrency.lockutils [req-8f8f3b9a-bb18-4460-bee4-6581771f304d req-634b9d24-10da-4ea2-bdf1-c91d90f1d6ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.973 222021 DEBUG oslo_concurrency.lockutils [req-8f8f3b9a-bb18-4460-bee4-6581771f304d req-634b9d24-10da-4ea2-bdf1-c91d90f1d6ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.973 222021 DEBUG oslo_concurrency.lockutils [req-8f8f3b9a-bb18-4460-bee4-6581771f304d req-634b9d24-10da-4ea2-bdf1-c91d90f1d6ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.974 222021 DEBUG nova.compute.manager [req-8f8f3b9a-bb18-4460-bee4-6581771f304d req-634b9d24-10da-4ea2-bdf1-c91d90f1d6ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] No waiting events found dispatching network-vif-unplugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:22:43 np0005593233 nova_compute[222017]: 2026-01-23 10:22:43.974 222021 DEBUG nova.compute.manager [req-8f8f3b9a-bb18-4460-bee4-6581771f304d req-634b9d24-10da-4ea2-bdf1-c91d90f1d6ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Received event network-vif-unplugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 05:22:44 np0005593233 podman[282087]: 2026-01-23 10:22:44.037320972 +0000 UTC m=+0.065160494 container remove 8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.058 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.069 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2a165535-2bcb-4ac1-9140-c61f621521ba]: (4, ('Fri Jan 23 10:22:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 (8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb)\n8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb\nFri Jan 23 10:22:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 (8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb)\n8dca9182ec653d0699284083aa882b6e5293a1a037ee32797cd44ab5d0110cfb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.072 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b0aba8-0e10-4b90-9ff0-ba4a905514ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.074 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43bdb40a-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.076 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:44 np0005593233 kernel: tap43bdb40a-e0: left promiscuous mode
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.101 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.104 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a3098a-8ef1-4a31-8d70-0ec6a3640503]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.115 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[24ee99a6-cf1e-4976-9409-780dda325df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.117 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a66ec1c4-9595-41de-b837-f905075b48a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.143 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3c233b4e-2c24-4b2a-bd21-dacd26f02cf2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738741, 'reachable_time': 32011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282101, 'error': None, 'target': 'ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.145 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-43bdb40a-eff5-45cd-9cb3-cfdf465ad1f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.146 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[282c1405-73d2-42f3-bf4c-f982df86bb14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:22:44 np0005593233 systemd[1]: run-netns-ovnmeta\x2d43bdb40a\x2deff5\x2d45cd\x2d9cb3\x2dcfdf465ad1f7.mount: Deactivated successfully.
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.503 222021 INFO nova.virt.libvirt.driver [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Deleting instance files /var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060_del
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.504 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.504 222021 INFO nova.virt.libvirt.driver [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Deletion of /var/lib/nova/instances/95158f57-1f68-4b3e-9d10-e3006c3f2060_del complete
Jan 23 05:22:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:44.506 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.508 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:22:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2058355895' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:22:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:22:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2058355895' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.605 222021 INFO nova.compute.manager [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Took 1.05 seconds to destroy the instance on the hypervisor.
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.606 222021 DEBUG oslo.service.loopingcall [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.606 222021 DEBUG nova.compute.manager [-] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 05:22:44 np0005593233 nova_compute[222017]: 2026-01-23 10:22:44.607 222021 DEBUG nova.network.neutron [-] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 05:22:45 np0005593233 nova_compute[222017]: 2026-01-23 10:22:45.592 222021 DEBUG nova.network.neutron [-] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:22:45 np0005593233 nova_compute[222017]: 2026-01-23 10:22:45.613 222021 INFO nova.compute.manager [-] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Took 1.01 seconds to deallocate network for instance.
Jan 23 05:22:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:45.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:45 np0005593233 nova_compute[222017]: 2026-01-23 10:22:45.681 222021 DEBUG oslo_concurrency.lockutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:22:45 np0005593233 nova_compute[222017]: 2026-01-23 10:22:45.682 222021 DEBUG oslo_concurrency.lockutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:22:45 np0005593233 nova_compute[222017]: 2026-01-23 10:22:45.704 222021 DEBUG nova.compute.manager [req-564f800a-ab73-4031-8cd6-788daa194374 req-315d3fb8-5383-4d30-b598-c79d0aeb9468 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Received event network-vif-deleted-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:22:45 np0005593233 nova_compute[222017]: 2026-01-23 10:22:45.733 222021 DEBUG oslo_concurrency.processutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:22:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:45.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.097 222021 DEBUG nova.compute.manager [req-e445c9f3-54de-4341-9e6a-cc4f2f7c81ca req-e2b0cef3-7d8a-47b9-9198-24ec4a803a7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Received event network-vif-plugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.098 222021 DEBUG oslo_concurrency.lockutils [req-e445c9f3-54de-4341-9e6a-cc4f2f7c81ca req-e2b0cef3-7d8a-47b9-9198-24ec4a803a7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.098 222021 DEBUG oslo_concurrency.lockutils [req-e445c9f3-54de-4341-9e6a-cc4f2f7c81ca req-e2b0cef3-7d8a-47b9-9198-24ec4a803a7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.098 222021 DEBUG oslo_concurrency.lockutils [req-e445c9f3-54de-4341-9e6a-cc4f2f7c81ca req-e2b0cef3-7d8a-47b9-9198-24ec4a803a7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.099 222021 DEBUG nova.compute.manager [req-e445c9f3-54de-4341-9e6a-cc4f2f7c81ca req-e2b0cef3-7d8a-47b9-9198-24ec4a803a7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] No waiting events found dispatching network-vif-plugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.099 222021 WARNING nova.compute.manager [req-e445c9f3-54de-4341-9e6a-cc4f2f7c81ca req-e2b0cef3-7d8a-47b9-9198-24ec4a803a7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Received unexpected event network-vif-plugged-758c2b61-c90d-4bca-a2a4-1dbdf2631d0d for instance with vm_state deleted and task_state None.
Jan 23 05:22:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:22:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1719680201' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.189 222021 DEBUG oslo_concurrency.processutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.197 222021 DEBUG nova.compute.provider_tree [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.213 222021 DEBUG nova.scheduler.client.report [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.234 222021 DEBUG oslo_concurrency.lockutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.269 222021 INFO nova.scheduler.client.report [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Deleted allocations for instance 95158f57-1f68-4b3e-9d10-e3006c3f2060
Jan 23 05:22:46 np0005593233 nova_compute[222017]: 2026-01-23 10:22:46.356 222021 DEBUG oslo_concurrency.lockutils [None req-e0e3b5e9-bfc5-44d5-8677-c371d8213e67 ec99ae7c69d0438280441e0434374cbf c59351a1b59c4cc9ad389dff900935f2 - - default default] Lock "95158f57-1f68-4b3e-9d10-e3006c3f2060" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:22:47 np0005593233 nova_compute[222017]: 2026-01-23 10:22:47.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:22:47 np0005593233 nova_compute[222017]: 2026-01-23 10:22:47.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:22:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:22:47.509 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:22:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:47.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:47.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:48 np0005593233 nova_compute[222017]: 2026-01-23 10:22:48.861 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:49 np0005593233 nova_compute[222017]: 2026-01-23 10:22:49.093 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:49 np0005593233 nova_compute[222017]: 2026-01-23 10:22:49.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:22:49 np0005593233 nova_compute[222017]: 2026-01-23 10:22:49.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:22:49 np0005593233 nova_compute[222017]: 2026-01-23 10:22:49.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:22:49 np0005593233 nova_compute[222017]: 2026-01-23 10:22:49.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:22:49 np0005593233 nova_compute[222017]: 2026-01-23 10:22:49.416 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 05:22:49 np0005593233 nova_compute[222017]: 2026-01-23 10:22:49.416 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:22:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:22:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:49.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:22:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:22:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/608598065' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:49 np0005593233 nova_compute[222017]: 2026-01-23 10:22:49.921 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:22:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:49.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.142 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.144 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4400MB free_disk=20.921878814697266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.144 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.144 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.231 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.231 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.257 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:22:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:22:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1316301006' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.762 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.771 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.787 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.834 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:22:50 np0005593233 nova_compute[222017]: 2026-01-23 10:22:50.835 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:22:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:22:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:51.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:22:51 np0005593233 nova_compute[222017]: 2026-01-23 10:22:51.836 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:22:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:22:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:51.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:22:52 np0005593233 podman[282172]: 2026-01-23 10:22:52.109674666 +0000 UTC m=+0.112591526 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:22:52 np0005593233 nova_compute[222017]: 2026-01-23 10:22:52.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:22:52 np0005593233 nova_compute[222017]: 2026-01-23 10:22:52.410 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:53 np0005593233 nova_compute[222017]: 2026-01-23 10:22:53.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:22:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:53.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:53 np0005593233 nova_compute[222017]: 2026-01-23 10:22:53.865 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:53.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:54 np0005593233 nova_compute[222017]: 2026-01-23 10:22:54.096 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:54 np0005593233 nova_compute[222017]: 2026-01-23 10:22:54.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:22:55 np0005593233 nova_compute[222017]: 2026-01-23 10:22:55.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:22:55 np0005593233 nova_compute[222017]: 2026-01-23 10:22:55.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:22:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:55.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:55.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:57.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:57.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:22:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:58 np0005593233 nova_compute[222017]: 2026-01-23 10:22:58.810 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163763.8079667, 95158f57-1f68-4b3e-9d10-e3006c3f2060 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:22:58 np0005593233 nova_compute[222017]: 2026-01-23 10:22:58.810 222021 INFO nova.compute.manager [-] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] VM Stopped (Lifecycle Event)
Jan 23 05:22:58 np0005593233 nova_compute[222017]: 2026-01-23 10:22:58.854 222021 DEBUG nova.compute.manager [None req-636ea967-c923-43e9-971d-bac50347c27d - - - - - -] [instance: 95158f57-1f68-4b3e-9d10-e3006c3f2060] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:22:58 np0005593233 nova_compute[222017]: 2026-01-23 10:22:58.868 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:59 np0005593233 nova_compute[222017]: 2026-01-23 10:22:59.127 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:22:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:22:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:59.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:22:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:22:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:22:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:59.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:00 np0005593233 nova_compute[222017]: 2026-01-23 10:23:00.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:23:01 np0005593233 nova_compute[222017]: 2026-01-23 10:23:01.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:23:01 np0005593233 nova_compute[222017]: 2026-01-23 10:23:01.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:23:01 np0005593233 nova_compute[222017]: 2026-01-23 10:23:01.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:23:01 np0005593233 nova_compute[222017]: 2026-01-23 10:23:01.403 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 05:23:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:01.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #136. Immutable memtables: 0.
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:01.905720) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 85] Flushing memtable with next log file: 136
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163781905811, "job": 85, "event": "flush_started", "num_memtables": 1, "num_entries": 2363, "num_deletes": 252, "total_data_size": 5609686, "memory_usage": 5700048, "flush_reason": "Manual Compaction"}
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 85] Level-0 flush table #137: started
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163781941360, "cf_name": "default", "job": 85, "event": "table_file_creation", "file_number": 137, "file_size": 3670488, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65221, "largest_seqno": 67579, "table_properties": {"data_size": 3660948, "index_size": 5969, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20183, "raw_average_key_size": 20, "raw_value_size": 3641701, "raw_average_value_size": 3700, "num_data_blocks": 260, "num_entries": 984, "num_filter_entries": 984, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163574, "oldest_key_time": 1769163574, "file_creation_time": 1769163781, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 137, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 85] Flush lasted 35710 microseconds, and 11228 cpu microseconds.
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:01.941440) [db/flush_job.cc:967] [default] [JOB 85] Level-0 flush table #137: 3670488 bytes OK
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:01.941472) [db/memtable_list.cc:519] [default] Level-0 commit table #137 started
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:01.944068) [db/memtable_list.cc:722] [default] Level-0 commit table #137: memtable #1 done
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:01.944093) EVENT_LOG_v1 {"time_micros": 1769163781944084, "job": 85, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:01.944121) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 85] Try to delete WAL files size 5599085, prev total WAL file size 5661293, number of live WAL files 2.
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000133.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:01.946642) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035373733' seq:72057594037927935, type:22 .. '7061786F730036303235' seq:0, type:0; will stop at (end)
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 86] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 85 Base level 0, inputs: [137(3584KB)], [135(10MB)]
Jan 23 05:23:01 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163781946704, "job": 86, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [137], "files_L6": [135], "score": -1, "input_data_size": 14308457, "oldest_snapshot_seqno": -1}
Jan 23 05:23:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:01.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 86] Generated table #138: 8826 keys, 12408461 bytes, temperature: kUnknown
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163782106141, "cf_name": "default", "job": 86, "event": "table_file_creation", "file_number": 138, "file_size": 12408461, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12350777, "index_size": 34531, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22085, "raw_key_size": 231631, "raw_average_key_size": 26, "raw_value_size": 12195036, "raw_average_value_size": 1381, "num_data_blocks": 1327, "num_entries": 8826, "num_filter_entries": 8826, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163781, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 138, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:02.106690) [db/compaction/compaction_job.cc:1663] [default] [JOB 86] Compacted 1@0 + 1@6 files to L6 => 12408461 bytes
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:02.108449) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 89.7 rd, 77.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 10.1 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(7.3) write-amplify(3.4) OK, records in: 9349, records dropped: 523 output_compression: NoCompression
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:02.108491) EVENT_LOG_v1 {"time_micros": 1769163782108472, "job": 86, "event": "compaction_finished", "compaction_time_micros": 159585, "compaction_time_cpu_micros": 53466, "output_level": 6, "num_output_files": 1, "total_output_size": 12408461, "num_input_records": 9349, "num_output_records": 8826, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000137.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163782110686, "job": 86, "event": "table_file_deletion", "file_number": 137}
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000135.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163782116707, "job": 86, "event": "table_file_deletion", "file_number": 135}
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:01.946525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:02.116885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:02.116895) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:02.116898) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:02.116901) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:23:02.116904) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:03.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:03 np0005593233 nova_compute[222017]: 2026-01-23 10:23:03.872 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:23:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:23:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:23:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:03.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:23:04 np0005593233 nova_compute[222017]: 2026-01-23 10:23:04.129 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:05.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:05.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:07.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:07.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:08 np0005593233 podman[282333]: 2026-01-23 10:23:08.10140227 +0000 UTC m=+0.096433799 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:23:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:08 np0005593233 nova_compute[222017]: 2026-01-23 10:23:08.876 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:09 np0005593233 nova_compute[222017]: 2026-01-23 10:23:09.159 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:09.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:09.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:11.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:11.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:13.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593233 nova_compute[222017]: 2026-01-23 10:23:13.895 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:13.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:14 np0005593233 nova_compute[222017]: 2026-01-23 10:23:14.162 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:23:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:15.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:23:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:15.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e316 e316: 3 total, 3 up, 3 in
Jan 23 05:23:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:17.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:17.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e316 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:18 np0005593233 nova_compute[222017]: 2026-01-23 10:23:18.932 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:19 np0005593233 nova_compute[222017]: 2026-01-23 10:23:19.164 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e317 e317: 3 total, 3 up, 3 in
Jan 23 05:23:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:19.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:19.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:21.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:21.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:23 np0005593233 podman[282400]: 2026-01-23 10:23:23.102099305 +0000 UTC m=+0.107609865 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:23:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e317 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:23.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:23 np0005593233 nova_compute[222017]: 2026-01-23 10:23:23.934 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:23.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:24 np0005593233 nova_compute[222017]: 2026-01-23 10:23:24.167 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:25 np0005593233 nova_compute[222017]: 2026-01-23 10:23:25.269 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:25 np0005593233 nova_compute[222017]: 2026-01-23 10:23:25.270 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:25 np0005593233 nova_compute[222017]: 2026-01-23 10:23:25.296 222021 DEBUG nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:23:25 np0005593233 nova_compute[222017]: 2026-01-23 10:23:25.441 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:25 np0005593233 nova_compute[222017]: 2026-01-23 10:23:25.442 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:25 np0005593233 nova_compute[222017]: 2026-01-23 10:23:25.454 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:23:25 np0005593233 nova_compute[222017]: 2026-01-23 10:23:25.455 222021 INFO nova.compute.claims [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:23:25 np0005593233 nova_compute[222017]: 2026-01-23 10:23:25.588 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:25.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:23:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:25.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:23:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:23:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2245795766' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.061 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.071 222021 DEBUG nova.compute.provider_tree [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.093 222021 DEBUG nova.scheduler.client.report [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.125 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.126 222021 DEBUG nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.222 222021 DEBUG nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.223 222021 DEBUG nova.network.neutron [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.248 222021 INFO nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.279 222021 DEBUG nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.411 222021 DEBUG nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.412 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.413 222021 INFO nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Creating image(s)#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.444 222021 DEBUG nova.storage.rbd_utils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.480 222021 DEBUG nova.storage.rbd_utils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.514 222021 DEBUG nova.storage.rbd_utils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.519 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "c4be56f0f0c1fc933935bae72309434102ff9887" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.520 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c4be56f0f0c1fc933935bae72309434102ff9887" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:26 np0005593233 nova_compute[222017]: 2026-01-23 10:23:26.568 222021 DEBUG nova.policy [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c041da0a601a4260b29fc9c65719597f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b976daabc8124a99814954633f99ed7b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:23:27 np0005593233 nova_compute[222017]: 2026-01-23 10:23:27.262 222021 DEBUG nova.virt.libvirt.imagebackend [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/4e1fa467-77ba-4764-82a0-700986e94bbd/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/4e1fa467-77ba-4764-82a0-700986e94bbd/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 05:23:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 e318: 3 total, 3 up, 3 in
Jan 23 05:23:27 np0005593233 nova_compute[222017]: 2026-01-23 10:23:27.577 222021 DEBUG nova.network.neutron [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Successfully created port: 5c1030f5-f5e0-41ec-b194-3304e1939c2d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:23:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:23:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:27.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:23:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:27.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:28 np0005593233 nova_compute[222017]: 2026-01-23 10:23:28.939 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.229 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.279 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.333 222021 DEBUG nova.network.neutron [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Successfully updated port: 5c1030f5-f5e0-41ec-b194-3304e1939c2d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.365 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.366 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.366 222021 DEBUG nova.network.neutron [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.397 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887.part --force-share --output=json" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.398 222021 DEBUG nova.virt.images [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] 4e1fa467-77ba-4764-82a0-700986e94bbd was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.399 222021 DEBUG nova.privsep.utils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.400 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887.part /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.470 222021 DEBUG nova.compute.manager [req-dab75729-115e-4de2-9a99-0e4c197f9fa3 req-1cbc4a20-819f-4daf-87fb-5c8f7bf878e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.471 222021 DEBUG nova.compute.manager [req-dab75729-115e-4de2-9a99-0e4c197f9fa3 req-1cbc4a20-819f-4daf-87fb-5c8f7bf878e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing instance network info cache due to event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.471 222021 DEBUG oslo_concurrency.lockutils [req-dab75729-115e-4de2-9a99-0e4c197f9fa3 req-1cbc4a20-819f-4daf-87fb-5c8f7bf878e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.621 222021 DEBUG nova.network.neutron [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.627 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887.part /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887.converted" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.637 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:29.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.726 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887.converted --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.729 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c4be56f0f0c1fc933935bae72309434102ff9887" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.765 222021 DEBUG nova.storage.rbd_utils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:29 np0005593233 nova_compute[222017]: 2026-01-23 10:23:29.771 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887 c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:30.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:30 np0005593233 nova_compute[222017]: 2026-01-23 10:23:30.718 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887 c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.946s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:30 np0005593233 nova_compute[222017]: 2026-01-23 10:23:30.825 222021 DEBUG nova.storage.rbd_utils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] resizing rbd image c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.281 222021 DEBUG nova.network.neutron [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.314 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.315 222021 DEBUG nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Instance network_info: |[{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.315 222021 DEBUG oslo_concurrency.lockutils [req-dab75729-115e-4de2-9a99-0e4c197f9fa3 req-1cbc4a20-819f-4daf-87fb-5c8f7bf878e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.316 222021 DEBUG nova.network.neutron [req-dab75729-115e-4de2-9a99-0e4c197f9fa3 req-1cbc4a20-819f-4daf-87fb-5c8f7bf878e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.387 222021 DEBUG nova.objects.instance [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'migration_context' on Instance uuid c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.404 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.404 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Ensure instance console log exists: /var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.405 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.405 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.406 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.409 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Start _get_guest_xml network_info=[{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:23:17Z,direct_url=<?>,disk_format='qcow2',id=4e1fa467-77ba-4764-82a0-700986e94bbd,min_disk=0,min_ram=0,name='tempest-scenario-img--1164200911',owner='b976daabc8124a99814954633f99ed7b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:23:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '4e1fa467-77ba-4764-82a0-700986e94bbd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.413 222021 WARNING nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.424 222021 DEBUG nova.virt.libvirt.host [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.425 222021 DEBUG nova.virt.libvirt.host [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.428 222021 DEBUG nova.virt.libvirt.host [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.429 222021 DEBUG nova.virt.libvirt.host [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.430 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.431 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:23:17Z,direct_url=<?>,disk_format='qcow2',id=4e1fa467-77ba-4764-82a0-700986e94bbd,min_disk=0,min_ram=0,name='tempest-scenario-img--1164200911',owner='b976daabc8124a99814954633f99ed7b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:23:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.431 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.432 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.432 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.432 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.433 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.433 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.433 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.434 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.434 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.434 222021 DEBUG nova.virt.hardware [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.438 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:23:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:31.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:23:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1347570007' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.955 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:23:31 np0005593233 nova_compute[222017]: 2026-01-23 10:23:31.993 222021 DEBUG nova.storage.rbd_utils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.000 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:23:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:32.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:23:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1520508417' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.601 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.601s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.604 222021 DEBUG nova.virt.libvirt.vif [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:23:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-539158124',display_name='tempest-TestMinimumBasicScenario-server-539158124',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-539158124',id=160,image_ref='4e1fa467-77ba-4764-82a0-700986e94bbd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmueIfk7VkHSUTAjgS/ZYpfWK/4j45WSHaYA6pCbrndOG+c3X8uZMvR7mBDlRDE24oh1oDcVJFtlCOd5K/FEKJpkR/txNCDTfmQxcShVYuyc8F1nHGiNqkP9PslO3GBJw==',key_name='tempest-TestMinimumBasicScenario-596586326',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b976daabc8124a99814954633f99ed7b',ramdisk_id='',reservation_id='r-t0eooc93',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4e1fa467-77ba-4764-82a0-700986e94bbd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1465373740',owner_user_name='tempest-TestMinimumBasicScenario-1465373740-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:23:26Z,user_data=None,user_id='c041da0a601a4260b29fc9c65719597f',uuid=c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.604 222021 DEBUG nova.network.os_vif_util [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converting VIF {"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.605 222021 DEBUG nova.network.os_vif_util [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=5c1030f5-f5e0-41ec-b194-3304e1939c2d,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c1030f5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.607 222021 DEBUG nova.objects.instance [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'pci_devices' on Instance uuid c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.628 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <uuid>c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65</uuid>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <name>instance-000000a0</name>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestMinimumBasicScenario-server-539158124</nova:name>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:23:31</nova:creationTime>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <nova:user uuid="c041da0a601a4260b29fc9c65719597f">tempest-TestMinimumBasicScenario-1465373740-project-member</nova:user>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <nova:project uuid="b976daabc8124a99814954633f99ed7b">tempest-TestMinimumBasicScenario-1465373740</nova:project>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="4e1fa467-77ba-4764-82a0-700986e94bbd"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <nova:port uuid="5c1030f5-f5e0-41ec-b194-3304e1939c2d">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <entry name="serial">c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65</entry>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <entry name="uuid">c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65</entry>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk.config">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:d3:c3:ec"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <target dev="tap5c1030f5-f5"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65/console.log" append="off"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:23:32 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:23:32 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:23:32 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:23:32 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.631 222021 DEBUG nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Preparing to wait for external event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.632 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.633 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.633 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.634 222021 DEBUG nova.virt.libvirt.vif [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:23:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-539158124',display_name='tempest-TestMinimumBasicScenario-server-539158124',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-539158124',id=160,image_ref='4e1fa467-77ba-4764-82a0-700986e94bbd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmueIfk7VkHSUTAjgS/ZYpfWK/4j45WSHaYA6pCbrndOG+c3X8uZMvR7mBDlRDE24oh1oDcVJFtlCOd5K/FEKJpkR/txNCDTfmQxcShVYuyc8F1nHGiNqkP9PslO3GBJw==',key_name='tempest-TestMinimumBasicScenario-596586326',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b976daabc8124a99814954633f99ed7b',ramdisk_id='',reservation_id='r-t0eooc93',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4e1fa467-77ba-4764-82a0-700986e94bbd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1465373740',owner_user_name='tempest-TestMinimumBasicScenario-1465373740-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:23:26Z,user_data=None,user_id='c041da0a601a4260b29fc9c65719597f',uuid=c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.635 222021 DEBUG nova.network.os_vif_util [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converting VIF {"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.636 222021 DEBUG nova.network.os_vif_util [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=5c1030f5-f5e0-41ec-b194-3304e1939c2d,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c1030f5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.637 222021 DEBUG os_vif [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=5c1030f5-f5e0-41ec-b194-3304e1939c2d,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c1030f5-f5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.639 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.640 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.646 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.647 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c1030f5-f5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.648 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5c1030f5-f5, col_values=(('external_ids', {'iface-id': '5c1030f5-f5e0-41ec-b194-3304e1939c2d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:c3:ec', 'vm-uuid': 'c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:32 np0005593233 NetworkManager[48871]: <info>  [1769163812.6873] manager: (tap5c1030f5-f5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.690 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.696 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.697 222021 INFO os_vif [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=5c1030f5-f5e0-41ec-b194-3304e1939c2d,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c1030f5-f5')#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.747 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.748 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.749 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No VIF found with MAC fa:16:3e:d3:c3:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.750 222021 INFO nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Using config drive#033[00m
Jan 23 05:23:32 np0005593233 nova_compute[222017]: 2026-01-23 10:23:32.791 222021 DEBUG nova.storage.rbd_utils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:33 np0005593233 nova_compute[222017]: 2026-01-23 10:23:33.040 222021 DEBUG nova.network.neutron [req-dab75729-115e-4de2-9a99-0e4c197f9fa3 req-1cbc4a20-819f-4daf-87fb-5c8f7bf878e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updated VIF entry in instance network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:23:33 np0005593233 nova_compute[222017]: 2026-01-23 10:23:33.041 222021 DEBUG nova.network.neutron [req-dab75729-115e-4de2-9a99-0e4c197f9fa3 req-1cbc4a20-819f-4daf-87fb-5c8f7bf878e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:33 np0005593233 nova_compute[222017]: 2026-01-23 10:23:33.077 222021 DEBUG oslo_concurrency.lockutils [req-dab75729-115e-4de2-9a99-0e4c197f9fa3 req-1cbc4a20-819f-4daf-87fb-5c8f7bf878e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:33.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:34.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:34 np0005593233 nova_compute[222017]: 2026-01-23 10:23:34.232 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:34 np0005593233 nova_compute[222017]: 2026-01-23 10:23:34.661 222021 INFO nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Creating config drive at /var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65/disk.config#033[00m
Jan 23 05:23:34 np0005593233 nova_compute[222017]: 2026-01-23 10:23:34.671 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkrz5h8rh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:34 np0005593233 nova_compute[222017]: 2026-01-23 10:23:34.819 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkrz5h8rh" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:34 np0005593233 nova_compute[222017]: 2026-01-23 10:23:34.865 222021 DEBUG nova.storage.rbd_utils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:34 np0005593233 nova_compute[222017]: 2026-01-23 10:23:34.870 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65/disk.config c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.078 222021 DEBUG oslo_concurrency.processutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65/disk.config c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.207s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.079 222021 INFO nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Deleting local config drive /var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65/disk.config because it was imported into RBD.#033[00m
Jan 23 05:23:35 np0005593233 kernel: tap5c1030f5-f5: entered promiscuous mode
Jan 23 05:23:35 np0005593233 NetworkManager[48871]: <info>  [1769163815.1617] manager: (tap5c1030f5-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Jan 23 05:23:35 np0005593233 ovn_controller[130653]: 2026-01-23T10:23:35Z|00663|binding|INFO|Claiming lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d for this chassis.
Jan 23 05:23:35 np0005593233 ovn_controller[130653]: 2026-01-23T10:23:35Z|00664|binding|INFO|5c1030f5-f5e0-41ec-b194-3304e1939c2d: Claiming fa:16:3e:d3:c3:ec 10.100.0.10
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.195 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.215 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:c3:ec 10.100.0.10'], port_security=['fa:16:3e:d3:c3:ec 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b976daabc8124a99814954633f99ed7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '77e10692-5f18-4d4e-ba14-6f09047b276a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0878063b-8606-438f-ae03-20f399cd80c4, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5c1030f5-f5e0-41ec-b194-3304e1939c2d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.216 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5c1030f5-f5e0-41ec-b194-3304e1939c2d in datapath 8b38c3ca-73e5-4583-a277-cd0670deffdb bound to our chassis#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.218 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8b38c3ca-73e5-4583-a277-cd0670deffdb#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.238 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[46409c47-5606-4108-ae5b-56b3b92554b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.239 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8b38c3ca-71 in ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.243 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8b38c3ca-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.243 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a8dd82b3-ae50-4781-9258-dbf44f3440ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.244 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c5fceb8b-67c9-4ae5-a1f2-a73d54c039e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 systemd-machined[190954]: New machine qemu-72-instance-000000a0.
Jan 23 05:23:35 np0005593233 systemd-udevd[282767]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.266 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[e9004bea-341b-4e83-80aa-5dfdc44e8e3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.268 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:35 np0005593233 systemd[1]: Started Virtual Machine qemu-72-instance-000000a0.
Jan 23 05:23:35 np0005593233 ovn_controller[130653]: 2026-01-23T10:23:35Z|00665|binding|INFO|Setting lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d ovn-installed in OVS
Jan 23 05:23:35 np0005593233 ovn_controller[130653]: 2026-01-23T10:23:35Z|00666|binding|INFO|Setting lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d up in Southbound
Jan 23 05:23:35 np0005593233 NetworkManager[48871]: <info>  [1769163815.2825] device (tap5c1030f5-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:23:35 np0005593233 NetworkManager[48871]: <info>  [1769163815.2839] device (tap5c1030f5-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.284 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.288 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b378fdfe-fe81-4681-90e9-582e6d6fa8df]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.328 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[380da4b5-9bef-4f3e-ab16-0b8f83bf98c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.334 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ae256b9a-75d2-4838-96fc-ec99401bf83c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 NetworkManager[48871]: <info>  [1769163815.3359] manager: (tap8b38c3ca-70): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.380 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2286256f-7e32-4573-9632-33c00f6a34a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.385 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4130f9-1cad-47db-a418-97f7123b84c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 NetworkManager[48871]: <info>  [1769163815.4177] device (tap8b38c3ca-70): carrier: link connected
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.423 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a60d79-0dbb-4572-a8b8-b27afb9941a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.444 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[86677b01-49d6-465a-8f16-8a104d22a609]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b38c3ca-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:fa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772804, 'reachable_time': 32117, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282798, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.466 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[13390ebe-1645-4500-b4d5-a937ff0679ac]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8e:fa5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 772804, 'tstamp': 772804}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282799, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.490 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8cf623-7f39-461a-b472-1aee9eade1ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b38c3ca-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:fa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772804, 'reachable_time': 32117, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 282807, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.533 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa0ea3d-4bfa-49d3-a65f-2f55ccf53af3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.622 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa34639-7fd0-41cc-8363-d8baf677095f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.623 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b38c3ca-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.623 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.624 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b38c3ca-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:35 np0005593233 NetworkManager[48871]: <info>  [1769163815.6268] manager: (tap8b38c3ca-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.626 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:35 np0005593233 kernel: tap8b38c3ca-70: entered promiscuous mode
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.628 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.630 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8b38c3ca-70, col_values=(('external_ids', {'iface-id': '120d9d64-6853-4b50-a095-bddadd015ba1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.631 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:35 np0005593233 ovn_controller[130653]: 2026-01-23T10:23:35Z|00667|binding|INFO|Releasing lport 120d9d64-6853-4b50-a095-bddadd015ba1 from this chassis (sb_readonly=0)
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.653 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.655 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.656 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[185f0f0f-c1ba-45ad-a891-4f699cb18505]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.657 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-8b38c3ca-73e5-4583-a277-cd0670deffdb
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 8b38c3ca-73e5-4583-a277-cd0670deffdb
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:23:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:35.658 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'env', 'PROCESS_TAG=haproxy-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8b38c3ca-73e5-4583-a277-cd0670deffdb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.708 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163815.7072306, c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.709 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] VM Started (Lifecycle Event)#033[00m
Jan 23 05:23:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:35.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.740 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.751 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163815.70762, c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.752 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.783 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.789 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.825 222021 DEBUG nova.compute.manager [req-d592f07c-250b-4253-8a55-09ae325e9380 req-022d2d3c-ee2c-42f4-aaa7-305024556305 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.826 222021 DEBUG oslo_concurrency.lockutils [req-d592f07c-250b-4253-8a55-09ae325e9380 req-022d2d3c-ee2c-42f4-aaa7-305024556305 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.827 222021 DEBUG oslo_concurrency.lockutils [req-d592f07c-250b-4253-8a55-09ae325e9380 req-022d2d3c-ee2c-42f4-aaa7-305024556305 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.827 222021 DEBUG oslo_concurrency.lockutils [req-d592f07c-250b-4253-8a55-09ae325e9380 req-022d2d3c-ee2c-42f4-aaa7-305024556305 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.828 222021 DEBUG nova.compute.manager [req-d592f07c-250b-4253-8a55-09ae325e9380 req-022d2d3c-ee2c-42f4-aaa7-305024556305 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Processing event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.829 222021 DEBUG nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.835 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.836 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163815.8328846, c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.836 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.840 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.847 222021 INFO nova.virt.libvirt.driver [-] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Instance spawned successfully.#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.847 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.876 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.891 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.896 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.896 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.897 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.898 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.899 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.900 222021 DEBUG nova.virt.libvirt.driver [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.944 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.996 222021 INFO nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Took 9.59 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:23:35 np0005593233 nova_compute[222017]: 2026-01-23 10:23:35.997 222021 DEBUG nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:36.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:36 np0005593233 podman[282874]: 2026-01-23 10:23:36.109940599 +0000 UTC m=+0.078645265 container create 87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:23:36 np0005593233 podman[282874]: 2026-01-23 10:23:36.070642738 +0000 UTC m=+0.039347414 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:23:36 np0005593233 systemd[1]: Started libpod-conmon-87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8.scope.
Jan 23 05:23:36 np0005593233 nova_compute[222017]: 2026-01-23 10:23:36.163 222021 INFO nova.compute.manager [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Took 10.79 seconds to build instance.#033[00m
Jan 23 05:23:36 np0005593233 nova_compute[222017]: 2026-01-23 10:23:36.190 222021 DEBUG oslo_concurrency.lockutils [None req-b0b2ae77-bd7a-4b54-b7d1-62d1082796bc c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:36 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:23:36 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe3eb1ae9318fe5ae0fcf811e9fdb19619073c146f509b158b191982f8f8f060/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:23:36 np0005593233 podman[282874]: 2026-01-23 10:23:36.228779341 +0000 UTC m=+0.197484067 container init 87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:23:36 np0005593233 podman[282874]: 2026-01-23 10:23:36.240819101 +0000 UTC m=+0.209523777 container start 87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:23:36 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[282889]: [NOTICE]   (282893) : New worker (282895) forked
Jan 23 05:23:36 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[282889]: [NOTICE]   (282893) : Loading success.
Jan 23 05:23:37 np0005593233 nova_compute[222017]: 2026-01-23 10:23:37.389 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:37 np0005593233 nova_compute[222017]: 2026-01-23 10:23:37.429 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:23:37 np0005593233 nova_compute[222017]: 2026-01-23 10:23:37.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:37 np0005593233 nova_compute[222017]: 2026-01-23 10:23:37.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:37 np0005593233 nova_compute[222017]: 2026-01-23 10:23:37.457 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:37 np0005593233 nova_compute[222017]: 2026-01-23 10:23:37.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:37.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:38.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:38 np0005593233 nova_compute[222017]: 2026-01-23 10:23:38.659 222021 DEBUG nova.compute.manager [req-8e57106c-ed57-4639-8eb1-afd287273c07 req-019b23a6-f1a3-48ca-98a1-98e258059c62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:38 np0005593233 nova_compute[222017]: 2026-01-23 10:23:38.660 222021 DEBUG oslo_concurrency.lockutils [req-8e57106c-ed57-4639-8eb1-afd287273c07 req-019b23a6-f1a3-48ca-98a1-98e258059c62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:38 np0005593233 nova_compute[222017]: 2026-01-23 10:23:38.660 222021 DEBUG oslo_concurrency.lockutils [req-8e57106c-ed57-4639-8eb1-afd287273c07 req-019b23a6-f1a3-48ca-98a1-98e258059c62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:38 np0005593233 nova_compute[222017]: 2026-01-23 10:23:38.661 222021 DEBUG oslo_concurrency.lockutils [req-8e57106c-ed57-4639-8eb1-afd287273c07 req-019b23a6-f1a3-48ca-98a1-98e258059c62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:38 np0005593233 nova_compute[222017]: 2026-01-23 10:23:38.661 222021 DEBUG nova.compute.manager [req-8e57106c-ed57-4639-8eb1-afd287273c07 req-019b23a6-f1a3-48ca-98a1-98e258059c62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] No waiting events found dispatching network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:23:38 np0005593233 nova_compute[222017]: 2026-01-23 10:23:38.661 222021 WARNING nova.compute.manager [req-8e57106c-ed57-4639-8eb1-afd287273c07 req-019b23a6-f1a3-48ca-98a1-98e258059c62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received unexpected event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:23:39 np0005593233 podman[282904]: 2026-01-23 10:23:39.083055474 +0000 UTC m=+0.101115421 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:23:39 np0005593233 nova_compute[222017]: 2026-01-23 10:23:39.234 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:39.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:40.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:41.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:42.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:42.686 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:42.687 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:42.687 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:42 np0005593233 nova_compute[222017]: 2026-01-23 10:23:42.720 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:43.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:44.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:44 np0005593233 nova_compute[222017]: 2026-01-23 10:23:44.236 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:44 np0005593233 nova_compute[222017]: 2026-01-23 10:23:44.583 222021 DEBUG oslo_concurrency.lockutils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:44 np0005593233 nova_compute[222017]: 2026-01-23 10:23:44.584 222021 DEBUG oslo_concurrency.lockutils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:44 np0005593233 nova_compute[222017]: 2026-01-23 10:23:44.652 222021 DEBUG nova.objects.instance [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'flavor' on Instance uuid c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:23:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2037581528' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:23:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:23:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2037581528' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:23:44 np0005593233 nova_compute[222017]: 2026-01-23 10:23:44.747 222021 DEBUG oslo_concurrency.lockutils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.080 222021 DEBUG oslo_concurrency.lockutils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.081 222021 DEBUG oslo_concurrency.lockutils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.081 222021 INFO nova.compute.manager [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Attaching volume 006fb8d9-c6d3-441f-993c-5b105d3e9681 to /dev/vdb#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.254 222021 DEBUG os_brick.utils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.257 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.279 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.280 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[f6cf4076-f7ba-4c66-8e7c-fe2b75c9503f]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.282 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.297 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.298 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[32719a66-51ab-40eb-8439-26df607c1cb8]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.300 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.316 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.316 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[488e9d82-c457-45c1-8051-ebd79a0471e6]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.320 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[292c6006-466e-49b6-ac0d-142b9e8b3d75]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.320 222021 DEBUG oslo_concurrency.processutils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.371 222021 DEBUG oslo_concurrency.processutils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "nvme version" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.375 222021 DEBUG os_brick.initiator.connectors.lightos [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.375 222021 DEBUG os_brick.initiator.connectors.lightos [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.376 222021 DEBUG os_brick.initiator.connectors.lightos [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.376 222021 DEBUG os_brick.utils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] <== get_connector_properties: return (120ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:23:45 np0005593233 nova_compute[222017]: 2026-01-23 10:23:45.377 222021 DEBUG nova.virt.block_device [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating existing volume attachment record: b92e1ba0-b3c1-47fa-971f-c2b351690c32 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:23:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:45.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:46.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:46 np0005593233 nova_compute[222017]: 2026-01-23 10:23:46.430 222021 DEBUG nova.objects.instance [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'flavor' on Instance uuid c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:46 np0005593233 nova_compute[222017]: 2026-01-23 10:23:46.824 222021 DEBUG nova.virt.libvirt.driver [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Attempting to attach volume 006fb8d9-c6d3-441f-993c-5b105d3e9681 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:23:46 np0005593233 nova_compute[222017]: 2026-01-23 10:23:46.828 222021 DEBUG nova.virt.libvirt.guest [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:23:46 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:23:46 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-006fb8d9-c6d3-441f-993c-5b105d3e9681">
Jan 23 05:23:46 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:23:46 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:23:46 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:23:46 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:23:46 np0005593233 nova_compute[222017]:  <auth username="openstack">
Jan 23 05:23:46 np0005593233 nova_compute[222017]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:23:46 np0005593233 nova_compute[222017]:  </auth>
Jan 23 05:23:46 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:23:46 np0005593233 nova_compute[222017]:  <serial>006fb8d9-c6d3-441f-993c-5b105d3e9681</serial>
Jan 23 05:23:46 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:23:46 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:23:47 np0005593233 nova_compute[222017]: 2026-01-23 10:23:47.023 222021 DEBUG nova.virt.libvirt.driver [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:23:47 np0005593233 nova_compute[222017]: 2026-01-23 10:23:47.024 222021 DEBUG nova.virt.libvirt.driver [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:23:47 np0005593233 nova_compute[222017]: 2026-01-23 10:23:47.024 222021 DEBUG nova.virt.libvirt.driver [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:23:47 np0005593233 nova_compute[222017]: 2026-01-23 10:23:47.025 222021 DEBUG nova.virt.libvirt.driver [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No VIF found with MAC fa:16:3e:d3:c3:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:23:47 np0005593233 nova_compute[222017]: 2026-01-23 10:23:47.317 222021 DEBUG oslo_concurrency.lockutils [None req-4bf0eb62-e46b-496f-b9d3-58a0d0ee6ecf c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:47.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:47 np0005593233 nova_compute[222017]: 2026-01-23 10:23:47.722 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:48.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:48 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #52. Immutable memtables: 8.
Jan 23 05:23:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:48 np0005593233 nova_compute[222017]: 2026-01-23 10:23:48.427 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:48 np0005593233 nova_compute[222017]: 2026-01-23 10:23:48.736 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:48 np0005593233 nova_compute[222017]: 2026-01-23 10:23:48.736 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:48 np0005593233 nova_compute[222017]: 2026-01-23 10:23:48.774 222021 DEBUG nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:23:48 np0005593233 nova_compute[222017]: 2026-01-23 10:23:48.875 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:48.878 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:23:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:48.880 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:23:48 np0005593233 nova_compute[222017]: 2026-01-23 10:23:48.976 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:48 np0005593233 nova_compute[222017]: 2026-01-23 10:23:48.977 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:48 np0005593233 nova_compute[222017]: 2026-01-23 10:23:48.991 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:23:48 np0005593233 nova_compute[222017]: 2026-01-23 10:23:48.991 222021 INFO nova.compute.claims [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.172 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.271 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:23:49Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:c3:ec 10.100.0.10
Jan 23 05:23:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:23:49Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:c3:ec 10.100.0.10
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:23:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/254129724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:49.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.736 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.563s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.742 222021 DEBUG nova.compute.provider_tree [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.770 222021 DEBUG nova.scheduler.client.report [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.802 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.803 222021 DEBUG nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.887 222021 DEBUG nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.888 222021 DEBUG nova.network.neutron [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.910 222021 INFO nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:23:49 np0005593233 nova_compute[222017]: 2026-01-23 10:23:49.934 222021 DEBUG nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:23:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:23:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:50.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.042 222021 DEBUG nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.043 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.044 222021 INFO nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Creating image(s)#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.081 222021 DEBUG nova.storage.rbd_utils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.126 222021 DEBUG nova.storage.rbd_utils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.176 222021 DEBUG nova.storage.rbd_utils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.182 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.224 222021 DEBUG nova.policy [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d6a628e0dcb441fa41457bf719e65a0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.263 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.263 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.264 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.265 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.292 222021 DEBUG nova.storage.rbd_utils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.296 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.614 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.318s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.686 222021 DEBUG nova.storage.rbd_utils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] resizing rbd image 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:23:50 np0005593233 NetworkManager[48871]: <info>  [1769163830.7953] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Jan 23 05:23:50 np0005593233 NetworkManager[48871]: <info>  [1769163830.7984] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.798 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.809 222021 DEBUG nova.objects.instance [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'migration_context' on Instance uuid 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.861 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.862 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Ensure instance console log exists: /var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.863 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.864 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:50 np0005593233 nova_compute[222017]: 2026-01-23 10:23:50.864 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.029 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:23:51Z|00668|binding|INFO|Releasing lport 120d9d64-6853-4b50-a095-bddadd015ba1 from this chassis (sb_readonly=0)
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.057 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.427 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.428 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.428 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.428 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.428 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:51.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.881 222021 DEBUG nova.network.neutron [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Successfully created port: 45b1f068-9743-4164-a7d2-c1ab991c291f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:23:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:23:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1534279326' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.911 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.997 222021 DEBUG nova.compute.manager [req-a0f93139-0f2d-4b45-83e2-490afc9d6f09 req-8a5f205e-f8dc-42fa-81f4-197b46d66528 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.997 222021 DEBUG nova.compute.manager [req-a0f93139-0f2d-4b45-83e2-490afc9d6f09 req-8a5f205e-f8dc-42fa-81f4-197b46d66528 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing instance network info cache due to event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.997 222021 DEBUG oslo_concurrency.lockutils [req-a0f93139-0f2d-4b45-83e2-490afc9d6f09 req-8a5f205e-f8dc-42fa-81f4-197b46d66528 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.998 222021 DEBUG oslo_concurrency.lockutils [req-a0f93139-0f2d-4b45-83e2-490afc9d6f09 req-8a5f205e-f8dc-42fa-81f4-197b46d66528 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:51 np0005593233 nova_compute[222017]: 2026-01-23 10:23:51.998 222021 DEBUG nova.network.neutron [req-a0f93139-0f2d-4b45-83e2-490afc9d6f09 req-8a5f205e-f8dc-42fa-81f4-197b46d66528 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.024 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.024 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.024 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:23:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:52.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.223 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.225 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4104MB free_disk=20.868091583251953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.225 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.226 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.315 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.316 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.316 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.317 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.404 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.728 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:23:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2136454215' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.909 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.918 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.954 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.985 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.985 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.760s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:52 np0005593233 nova_compute[222017]: 2026-01-23 10:23:52.986 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:23:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:53.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:23:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:54.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:54 np0005593233 podman[283185]: 2026-01-23 10:23:54.179533457 +0000 UTC m=+0.178223852 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 05:23:54 np0005593233 nova_compute[222017]: 2026-01-23 10:23:54.274 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.014 222021 DEBUG nova.compute.manager [req-a6824644-3d5c-4878-abae-61b6ce3018da req-e8b1c3cd-1574-4810-a645-978346ce3fb4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.015 222021 DEBUG nova.compute.manager [req-a6824644-3d5c-4878-abae-61b6ce3018da req-e8b1c3cd-1574-4810-a645-978346ce3fb4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing instance network info cache due to event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.015 222021 DEBUG oslo_concurrency.lockutils [req-a6824644-3d5c-4878-abae-61b6ce3018da req-e8b1c3cd-1574-4810-a645-978346ce3fb4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.442 222021 DEBUG nova.network.neutron [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Successfully updated port: 45b1f068-9743-4164-a7d2-c1ab991c291f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.460 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.460 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquired lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.460 222021 DEBUG nova.network.neutron [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.619 222021 DEBUG nova.network.neutron [req-a0f93139-0f2d-4b45-83e2-490afc9d6f09 req-8a5f205e-f8dc-42fa-81f4-197b46d66528 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updated VIF entry in instance network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.620 222021 DEBUG nova.network.neutron [req-a0f93139-0f2d-4b45-83e2-490afc9d6f09 req-8a5f205e-f8dc-42fa-81f4-197b46d66528 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:55.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.939 222021 DEBUG oslo_concurrency.lockutils [req-a0f93139-0f2d-4b45-83e2-490afc9d6f09 req-8a5f205e-f8dc-42fa-81f4-197b46d66528 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.940 222021 DEBUG oslo_concurrency.lockutils [req-a6824644-3d5c-4878-abae-61b6ce3018da req-e8b1c3cd-1574-4810-a645-978346ce3fb4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:55 np0005593233 nova_compute[222017]: 2026-01-23 10:23:55.940 222021 DEBUG nova.network.neutron [req-a6824644-3d5c-4878-abae-61b6ce3018da req-e8b1c3cd-1574-4810-a645-978346ce3fb4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:23:56 np0005593233 nova_compute[222017]: 2026-01-23 10:23:56.008 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:56 np0005593233 nova_compute[222017]: 2026-01-23 10:23:56.009 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:56.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:57 np0005593233 nova_compute[222017]: 2026-01-23 10:23:57.060 222021 DEBUG nova.network.neutron [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:23:57 np0005593233 nova_compute[222017]: 2026-01-23 10:23:57.166 222021 DEBUG nova.compute.manager [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:57 np0005593233 nova_compute[222017]: 2026-01-23 10:23:57.167 222021 DEBUG nova.compute.manager [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing instance network info cache due to event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:23:57 np0005593233 nova_compute[222017]: 2026-01-23 10:23:57.168 222021 DEBUG oslo_concurrency.lockutils [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:57 np0005593233 nova_compute[222017]: 2026-01-23 10:23:57.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:57 np0005593233 nova_compute[222017]: 2026-01-23 10:23:57.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:23:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:57.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:57 np0005593233 nova_compute[222017]: 2026-01-23 10:23:57.775 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:58.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:23:58.884 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:59 np0005593233 nova_compute[222017]: 2026-01-23 10:23:59.278 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:23:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:59.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:00.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.408 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.666 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:01.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.743 222021 DEBUG nova.network.neutron [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updating instance_info_cache with network_info: [{"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.771 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Releasing lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.772 222021 DEBUG nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Instance network_info: |[{"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.777 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Start _get_guest_xml network_info=[{"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.784 222021 WARNING nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.790 222021 DEBUG nova.virt.libvirt.host [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.791 222021 DEBUG nova.virt.libvirt.host [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.800 222021 DEBUG nova.virt.libvirt.host [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.801 222021 DEBUG nova.virt.libvirt.host [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.803 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.804 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.805 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.805 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.806 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.806 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.807 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.807 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.808 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.808 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.809 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.809 222021 DEBUG nova.virt.hardware [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:24:01 np0005593233 nova_compute[222017]: 2026-01-23 10:24:01.815 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:24:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:02.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:24:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:24:02 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/902694289' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.317 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.356 222021 DEBUG nova.storage.rbd_utils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.362 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.446 222021 DEBUG nova.network.neutron [req-a6824644-3d5c-4878-abae-61b6ce3018da req-e8b1c3cd-1574-4810-a645-978346ce3fb4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updated VIF entry in instance network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.447 222021 DEBUG nova.network.neutron [req-a6824644-3d5c-4878-abae-61b6ce3018da req-e8b1c3cd-1574-4810-a645-978346ce3fb4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.467 222021 DEBUG oslo_concurrency.lockutils [req-a6824644-3d5c-4878-abae-61b6ce3018da req-e8b1c3cd-1574-4810-a645-978346ce3fb4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.468 222021 DEBUG oslo_concurrency.lockutils [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.468 222021 DEBUG nova.network.neutron [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.819 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:24:02 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/852090314' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.860 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.862 222021 DEBUG nova.virt.libvirt.vif [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-250274490',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-250274490',id=163,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c27429e1d8f433a8a67ddb76f8798f1',ramdisk_id='',reservation_id='r-bwzrhbs1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:23:49Z,user_data=None,user_id='0d6a628e0dcb441fa41457bf719e65a0',uuid=4b43bf7c-8fc3-4ea4-9401-283826c9ed39,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.863 222021 DEBUG nova.network.os_vif_util [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converting VIF {"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.864 222021 DEBUG nova.network.os_vif_util [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:57:6c,bridge_name='br-int',has_traffic_filtering=True,id=45b1f068-9743-4164-a7d2-c1ab991c291f,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b1f068-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.866 222021 DEBUG nova.objects.instance [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.888 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <uuid>4b43bf7c-8fc3-4ea4-9401-283826c9ed39</uuid>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <name>instance-000000a3</name>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-250274490</nova:name>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:24:01</nova:creationTime>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <nova:user uuid="0d6a628e0dcb441fa41457bf719e65a0">tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member</nova:user>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <nova:project uuid="5c27429e1d8f433a8a67ddb76f8798f1">tempest-ServerBootFromVolumeStableRescueTest-1351337832</nova:project>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <nova:port uuid="45b1f068-9743-4164-a7d2-c1ab991c291f">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <entry name="serial">4b43bf7c-8fc3-4ea4-9401-283826c9ed39</entry>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <entry name="uuid">4b43bf7c-8fc3-4ea4-9401-283826c9ed39</entry>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk.config">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:1e:57:6c"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <target dev="tap45b1f068-97"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39/console.log" append="off"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:24:02 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:24:02 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:24:02 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:24:02 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.890 222021 DEBUG nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Preparing to wait for external event network-vif-plugged-45b1f068-9743-4164-a7d2-c1ab991c291f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.890 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.890 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.890 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.891 222021 DEBUG nova.virt.libvirt.vif [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-250274490',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-250274490',id=163,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c27429e1d8f433a8a67ddb76f8798f1',ramdisk_id='',reservation_id='r-bwzrhbs1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:23:49Z,user_data=None,user_id='0d6a628e0dcb441fa41457bf719e65a0',uuid=4b43bf7c-8fc3-4ea4-9401-283826c9ed39,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.892 222021 DEBUG nova.network.os_vif_util [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converting VIF {"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.893 222021 DEBUG nova.network.os_vif_util [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:57:6c,bridge_name='br-int',has_traffic_filtering=True,id=45b1f068-9743-4164-a7d2-c1ab991c291f,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b1f068-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.894 222021 DEBUG os_vif [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:57:6c,bridge_name='br-int',has_traffic_filtering=True,id=45b1f068-9743-4164-a7d2-c1ab991c291f,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b1f068-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.895 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.895 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.896 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.902 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.903 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45b1f068-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.903 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap45b1f068-97, col_values=(('external_ids', {'iface-id': '45b1f068-9743-4164-a7d2-c1ab991c291f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:57:6c', 'vm-uuid': '4b43bf7c-8fc3-4ea4-9401-283826c9ed39'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.905 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:02 np0005593233 NetworkManager[48871]: <info>  [1769163842.9063] manager: (tap45b1f068-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.907 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.913 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.914 222021 INFO os_vif [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:57:6c,bridge_name='br-int',has_traffic_filtering=True,id=45b1f068-9743-4164-a7d2-c1ab991c291f,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b1f068-97')#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.974 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.974 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.975 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No VIF found with MAC fa:16:3e:1e:57:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:24:02 np0005593233 nova_compute[222017]: 2026-01-23 10:24:02.975 222021 INFO nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Using config drive#033[00m
Jan 23 05:24:03 np0005593233 nova_compute[222017]: 2026-01-23 10:24:03.010 222021 DEBUG nova.storage.rbd_utils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:03 np0005593233 nova_compute[222017]: 2026-01-23 10:24:03.629 222021 INFO nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Creating config drive at /var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39/disk.config#033[00m
Jan 23 05:24:03 np0005593233 nova_compute[222017]: 2026-01-23 10:24:03.638 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9jrpo3aq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:03.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:03 np0005593233 nova_compute[222017]: 2026-01-23 10:24:03.803 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9jrpo3aq" returned: 0 in 0.165s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:03 np0005593233 nova_compute[222017]: 2026-01-23 10:24:03.849 222021 DEBUG nova.storage.rbd_utils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:03 np0005593233 nova_compute[222017]: 2026-01-23 10:24:03.857 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39/disk.config 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:04.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.086 222021 DEBUG oslo_concurrency.processutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39/disk.config 4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.229s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.087 222021 INFO nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Deleting local config drive /var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39/disk.config because it was imported into RBD.#033[00m
Jan 23 05:24:04 np0005593233 kernel: tap45b1f068-97: entered promiscuous mode
Jan 23 05:24:04 np0005593233 NetworkManager[48871]: <info>  [1769163844.1703] manager: (tap45b1f068-97): new Tun device (/org/freedesktop/NetworkManager/Devices/311)
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.171 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:04Z|00669|binding|INFO|Claiming lport 45b1f068-9743-4164-a7d2-c1ab991c291f for this chassis.
Jan 23 05:24:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:04Z|00670|binding|INFO|45b1f068-9743-4164-a7d2-c1ab991c291f: Claiming fa:16:3e:1e:57:6c 10.100.0.10
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.197 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:57:6c 10.100.0.10'], port_security=['fa:16:3e:1e:57:6c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4b43bf7c-8fc3-4ea4-9401-283826c9ed39', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '29028637-714b-453c-9e54-c753b1c8b7f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0dedc65-79e0-4ae8-b1b0-46423e11b58a, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=45b1f068-9743-4164-a7d2-c1ab991c291f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.199 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.200 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 45b1f068-9743-4164-a7d2-c1ab991c291f in datapath fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 bound to our chassis#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.202 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4#033[00m
Jan 23 05:24:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:04Z|00671|binding|INFO|Setting lport 45b1f068-9743-4164-a7d2-c1ab991c291f ovn-installed in OVS
Jan 23 05:24:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:04Z|00672|binding|INFO|Setting lport 45b1f068-9743-4164-a7d2-c1ab991c291f up in Southbound
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.204 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:04 np0005593233 systemd-udevd[283344]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:24:04 np0005593233 systemd-machined[190954]: New machine qemu-73-instance-000000a3.
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.218 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f84b9896-3c59-4cd7-bf25-3c329461ce69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.219 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbd64ab8-91 in ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.222 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbd64ab8-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.222 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[41e45c04-fd89-44a4-ba67-988079088cdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.223 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0823bd-b426-4250-b92d-b892b3549465]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 NetworkManager[48871]: <info>  [1769163844.2347] device (tap45b1f068-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:24:04 np0005593233 NetworkManager[48871]: <info>  [1769163844.2357] device (tap45b1f068-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:24:04 np0005593233 systemd[1]: Started Virtual Machine qemu-73-instance-000000a3.
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.242 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[e74e87fa-83dd-48d9-a962-caecaa2db7a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.269 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cf878b96-9df6-4697-a828-7aec9075b8fc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.280 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.308 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e748c22f-e4c3-4f33-9de2-daa13655a934]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.315 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c3a01b-413f-45a5-942b-a18ee343099a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 systemd-udevd[283348]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:24:04 np0005593233 NetworkManager[48871]: <info>  [1769163844.3164] manager: (tapfbd64ab8-90): new Veth device (/org/freedesktop/NetworkManager/Devices/312)
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.357 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e850feb8-1fc5-4489-822e-48676d7c4be4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.362 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[625dbbcf-555e-4a57-91c2-417852238370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 NetworkManager[48871]: <info>  [1769163844.3956] device (tapfbd64ab8-90): carrier: link connected
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.405 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[92575873-7fc2-45ad-a0f4-3382277b2ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.432 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9e537a6a-30b1-42bb-b9b2-da64246cfe58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd64ab8-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:7c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775702, 'reachable_time': 41204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283377, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.451 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[47e35a16-39af-4813-a4ba-22e367aa63ed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:7c5a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775702, 'tstamp': 775702}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283393, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.470 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4edbf343-b727-4708-8a1a-6394ddad41d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd64ab8-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:7c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775702, 'reachable_time': 41204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283395, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.510 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[04bb0f60-22fc-4dc5-a354-66e16a232a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.591 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c33722bd-033c-4bac-8176-3972c693595c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.594 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd64ab8-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.594 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.595 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd64ab8-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.597 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:04 np0005593233 kernel: tapfbd64ab8-90: entered promiscuous mode
Jan 23 05:24:04 np0005593233 NetworkManager[48871]: <info>  [1769163844.5986] manager: (tapfbd64ab8-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.599 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.603 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd64ab8-90, col_values=(('external_ids', {'iface-id': 'b648300b-e46c-4d3b-b02e-94ff684c03ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.605 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:04Z|00673|binding|INFO|Releasing lport b648300b-e46c-4d3b-b02e-94ff684c03ae from this chassis (sb_readonly=0)
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.606 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.607 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.608 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7622d973-793b-41d8-ac49-9fcfb87eec99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.609 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4.pid.haproxy
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:24:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:04.611 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'env', 'PROCESS_TAG=haproxy-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.624 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.667 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163844.6667693, 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.668 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] VM Started (Lifecycle Event)#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.707 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.712 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163844.6681812, 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.713 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.736 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.741 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.770 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.860 222021 DEBUG nova.compute.manager [req-cdabbc7b-1b6a-4ae6-b418-9903ea886f3b req-3d1afc9c-d848-4daf-99ce-c16528836b08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Received event network-vif-plugged-45b1f068-9743-4164-a7d2-c1ab991c291f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.861 222021 DEBUG oslo_concurrency.lockutils [req-cdabbc7b-1b6a-4ae6-b418-9903ea886f3b req-3d1afc9c-d848-4daf-99ce-c16528836b08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.861 222021 DEBUG oslo_concurrency.lockutils [req-cdabbc7b-1b6a-4ae6-b418-9903ea886f3b req-3d1afc9c-d848-4daf-99ce-c16528836b08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.861 222021 DEBUG oslo_concurrency.lockutils [req-cdabbc7b-1b6a-4ae6-b418-9903ea886f3b req-3d1afc9c-d848-4daf-99ce-c16528836b08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.862 222021 DEBUG nova.compute.manager [req-cdabbc7b-1b6a-4ae6-b418-9903ea886f3b req-3d1afc9c-d848-4daf-99ce-c16528836b08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Processing event network-vif-plugged-45b1f068-9743-4164-a7d2-c1ab991c291f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.863 222021 DEBUG nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.877 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163844.868522, 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.878 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.881 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.885 222021 INFO nova.virt.libvirt.driver [-] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Instance spawned successfully.#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.885 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.910 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.915 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.915 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.916 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.916 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.916 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.917 222021 DEBUG nova.virt.libvirt.driver [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.921 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.953 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.976 222021 INFO nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Took 14.93 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:24:04 np0005593233 nova_compute[222017]: 2026-01-23 10:24:04.977 222021 DEBUG nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.038 222021 INFO nova.compute.manager [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Took 16.10 seconds to build instance.#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.057 222021 DEBUG oslo_concurrency.lockutils [None req-a173f501-1231-41a3-9bbc-20dae0d1f8ce 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:05 np0005593233 podman[283453]: 2026-01-23 10:24:05.088713551 +0000 UTC m=+0.079111239 container create 6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:24:05 np0005593233 systemd[1]: Started libpod-conmon-6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1.scope.
Jan 23 05:24:05 np0005593233 podman[283453]: 2026-01-23 10:24:05.053777143 +0000 UTC m=+0.044174831 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:24:05 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:24:05 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27e5f96a86838744a9e791ac8d7aae98f6b42ca2b6c4f97343e3d203b9f9afc3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:24:05 np0005593233 podman[283453]: 2026-01-23 10:24:05.206496222 +0000 UTC m=+0.196893990 container init 6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:24:05 np0005593233 podman[283453]: 2026-01-23 10:24:05.218000778 +0000 UTC m=+0.208398486 container start 6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:24:05 np0005593233 neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4[283468]: [NOTICE]   (283472) : New worker (283474) forked
Jan 23 05:24:05 np0005593233 neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4[283468]: [NOTICE]   (283472) : Loading success.
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.622 222021 DEBUG nova.network.neutron [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updated VIF entry in instance network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.622 222021 DEBUG nova.network.neutron [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.653 222021 DEBUG oslo_concurrency.lockutils [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.654 222021 DEBUG nova.compute.manager [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Received event network-changed-45b1f068-9743-4164-a7d2-c1ab991c291f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.654 222021 DEBUG nova.compute.manager [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Refreshing instance network info cache due to event network-changed-45b1f068-9743-4164-a7d2-c1ab991c291f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.654 222021 DEBUG oslo_concurrency.lockutils [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.655 222021 DEBUG oslo_concurrency.lockutils [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.655 222021 DEBUG nova.network.neutron [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Refreshing network info cache for port 45b1f068-9743-4164-a7d2-c1ab991c291f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.657 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.657 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:24:05 np0005593233 nova_compute[222017]: 2026-01-23 10:24:05.657 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:05.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:24:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:06.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:24:07 np0005593233 nova_compute[222017]: 2026-01-23 10:24:07.106 222021 DEBUG nova.compute.manager [req-67f67499-1969-4183-8b3c-7e22d14009ea req-dccd0fbc-a991-46a7-bf2c-2198ed371aaa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Received event network-vif-plugged-45b1f068-9743-4164-a7d2-c1ab991c291f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:07 np0005593233 nova_compute[222017]: 2026-01-23 10:24:07.107 222021 DEBUG oslo_concurrency.lockutils [req-67f67499-1969-4183-8b3c-7e22d14009ea req-dccd0fbc-a991-46a7-bf2c-2198ed371aaa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:07 np0005593233 nova_compute[222017]: 2026-01-23 10:24:07.108 222021 DEBUG oslo_concurrency.lockutils [req-67f67499-1969-4183-8b3c-7e22d14009ea req-dccd0fbc-a991-46a7-bf2c-2198ed371aaa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:07 np0005593233 nova_compute[222017]: 2026-01-23 10:24:07.108 222021 DEBUG oslo_concurrency.lockutils [req-67f67499-1969-4183-8b3c-7e22d14009ea req-dccd0fbc-a991-46a7-bf2c-2198ed371aaa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:07 np0005593233 nova_compute[222017]: 2026-01-23 10:24:07.109 222021 DEBUG nova.compute.manager [req-67f67499-1969-4183-8b3c-7e22d14009ea req-dccd0fbc-a991-46a7-bf2c-2198ed371aaa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] No waiting events found dispatching network-vif-plugged-45b1f068-9743-4164-a7d2-c1ab991c291f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:07 np0005593233 nova_compute[222017]: 2026-01-23 10:24:07.109 222021 WARNING nova.compute.manager [req-67f67499-1969-4183-8b3c-7e22d14009ea req-dccd0fbc-a991-46a7-bf2c-2198ed371aaa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Received unexpected event network-vif-plugged-45b1f068-9743-4164-a7d2-c1ab991c291f for instance with vm_state active and task_state None.#033[00m
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #139. Immutable memtables: 0.
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.359671) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 87] Flushing memtable with next log file: 139
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847359715, "job": 87, "event": "flush_started", "num_memtables": 1, "num_entries": 970, "num_deletes": 251, "total_data_size": 1889311, "memory_usage": 1919656, "flush_reason": "Manual Compaction"}
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 87] Level-0 flush table #140: started
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847369494, "cf_name": "default", "job": 87, "event": "table_file_creation", "file_number": 140, "file_size": 824235, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 67584, "largest_seqno": 68549, "table_properties": {"data_size": 820435, "index_size": 1451, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10302, "raw_average_key_size": 21, "raw_value_size": 812251, "raw_average_value_size": 1664, "num_data_blocks": 64, "num_entries": 488, "num_filter_entries": 488, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163781, "oldest_key_time": 1769163781, "file_creation_time": 1769163847, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 140, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 87] Flush lasted 9908 microseconds, and 4008 cpu microseconds.
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.369572) [db/flush_job.cc:967] [default] [JOB 87] Level-0 flush table #140: 824235 bytes OK
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.369603) [db/memtable_list.cc:519] [default] Level-0 commit table #140 started
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.373789) [db/memtable_list.cc:722] [default] Level-0 commit table #140: memtable #1 done
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.373806) EVENT_LOG_v1 {"time_micros": 1769163847373800, "job": 87, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.373827) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 87] Try to delete WAL files size 1884370, prev total WAL file size 1884370, number of live WAL files 2.
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000136.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.374973) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032323538' seq:72057594037927935, type:22 .. '6D6772737461740032353039' seq:0, type:0; will stop at (end)
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 88] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 87 Base level 0, inputs: [140(804KB)], [138(11MB)]
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847375060, "job": 88, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [140], "files_L6": [138], "score": -1, "input_data_size": 13232696, "oldest_snapshot_seqno": -1}
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 88] Generated table #141: 8822 keys, 9795782 bytes, temperature: kUnknown
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847486123, "cf_name": "default", "job": 88, "event": "table_file_creation", "file_number": 141, "file_size": 9795782, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9742027, "index_size": 30602, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22085, "raw_key_size": 231843, "raw_average_key_size": 26, "raw_value_size": 9590365, "raw_average_value_size": 1087, "num_data_blocks": 1165, "num_entries": 8822, "num_filter_entries": 8822, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163847, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 141, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.486507) [db/compaction/compaction_job.cc:1663] [default] [JOB 88] Compacted 1@0 + 1@6 files to L6 => 9795782 bytes
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.487787) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 119.0 rd, 88.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 11.8 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(27.9) write-amplify(11.9) OK, records in: 9314, records dropped: 492 output_compression: NoCompression
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.487809) EVENT_LOG_v1 {"time_micros": 1769163847487798, "job": 88, "event": "compaction_finished", "compaction_time_micros": 111183, "compaction_time_cpu_micros": 26573, "output_level": 6, "num_output_files": 1, "total_output_size": 9795782, "num_input_records": 9314, "num_output_records": 8822, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000140.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847488220, "job": 88, "event": "table_file_deletion", "file_number": 140}
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000138.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847490528, "job": 88, "event": "table_file_deletion", "file_number": 138}
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.374825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.490666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.490677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.490680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.490683) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:07.490685) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:07.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:07 np0005593233 nova_compute[222017]: 2026-01-23 10:24:07.989 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:08.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:08 np0005593233 nova_compute[222017]: 2026-01-23 10:24:08.488 222021 DEBUG nova.compute.manager [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:08 np0005593233 nova_compute[222017]: 2026-01-23 10:24:08.556 222021 INFO nova.compute.manager [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] instance snapshotting#033[00m
Jan 23 05:24:08 np0005593233 nova_compute[222017]: 2026-01-23 10:24:08.970 222021 INFO nova.virt.libvirt.driver [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Beginning live snapshot process#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.164 222021 DEBUG nova.virt.libvirt.imagebackend [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.171 222021 DEBUG nova.network.neutron [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updated VIF entry in instance network info cache for port 45b1f068-9743-4164-a7d2-c1ab991c291f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.171 222021 DEBUG nova.network.neutron [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updating instance_info_cache with network_info: [{"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.178 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.204 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.205 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.205 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.207 222021 DEBUG oslo_concurrency.lockutils [req-87fbf312-2b68-4697-8688-ec8a9ec45d4b req-f40c57be-f049-4d15-a96f-9b5a356e5da7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.284 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:09 np0005593233 nova_compute[222017]: 2026-01-23 10:24:09.494 222021 DEBUG nova.storage.rbd_utils [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] creating snapshot(5d716d7013b941f9942cdba37a7e562b) on rbd image(4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:24:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:24:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:09.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:24:10 np0005593233 podman[283534]: 2026-01-23 10:24:10.054834557 +0000 UTC m=+0.065386760 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 05:24:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:10.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:10 np0005593233 nova_compute[222017]: 2026-01-23 10:24:10.214 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e319 e319: 3 total, 3 up, 3 in
Jan 23 05:24:10 np0005593233 nova_compute[222017]: 2026-01-23 10:24:10.442 222021 DEBUG nova.storage.rbd_utils [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] cloning vms/4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk@5d716d7013b941f9942cdba37a7e562b to images/a891f488-4cba-4fea-b482-6ac469142f81 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:24:10 np0005593233 nova_compute[222017]: 2026-01-23 10:24:10.504 222021 DEBUG nova.compute.manager [req-e3fa0a2f-47d1-44e0-bca5-03946c9acce0 req-fe9ed5f3-ade4-4f0b-9817-774f5d8d453c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:10 np0005593233 nova_compute[222017]: 2026-01-23 10:24:10.505 222021 DEBUG nova.compute.manager [req-e3fa0a2f-47d1-44e0-bca5-03946c9acce0 req-fe9ed5f3-ade4-4f0b-9817-774f5d8d453c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing instance network info cache due to event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:24:10 np0005593233 nova_compute[222017]: 2026-01-23 10:24:10.505 222021 DEBUG oslo_concurrency.lockutils [req-e3fa0a2f-47d1-44e0-bca5-03946c9acce0 req-fe9ed5f3-ade4-4f0b-9817-774f5d8d453c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:10 np0005593233 nova_compute[222017]: 2026-01-23 10:24:10.505 222021 DEBUG oslo_concurrency.lockutils [req-e3fa0a2f-47d1-44e0-bca5-03946c9acce0 req-fe9ed5f3-ade4-4f0b-9817-774f5d8d453c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:10 np0005593233 nova_compute[222017]: 2026-01-23 10:24:10.506 222021 DEBUG nova.network.neutron [req-e3fa0a2f-47d1-44e0-bca5-03946c9acce0 req-fe9ed5f3-ade4-4f0b-9817-774f5d8d453c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:24:10 np0005593233 nova_compute[222017]: 2026-01-23 10:24:10.596 222021 DEBUG nova.storage.rbd_utils [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] flattening images/a891f488-4cba-4fea-b482-6ac469142f81 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:24:10 np0005593233 nova_compute[222017]: 2026-01-23 10:24:10.939 222021 DEBUG nova.storage.rbd_utils [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] removing snapshot(5d716d7013b941f9942cdba37a7e562b) on rbd image(4b43bf7c-8fc3-4ea4-9401-283826c9ed39_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:24:11 np0005593233 nova_compute[222017]: 2026-01-23 10:24:11.221 222021 DEBUG oslo_concurrency.lockutils [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:11 np0005593233 nova_compute[222017]: 2026-01-23 10:24:11.223 222021 DEBUG oslo_concurrency.lockutils [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:11 np0005593233 nova_compute[222017]: 2026-01-23 10:24:11.223 222021 INFO nova.compute.manager [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Rebooting instance#033[00m
Jan 23 05:24:11 np0005593233 nova_compute[222017]: 2026-01-23 10:24:11.237 222021 DEBUG oslo_concurrency.lockutils [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e320 e320: 3 total, 3 up, 3 in
Jan 23 05:24:11 np0005593233 nova_compute[222017]: 2026-01-23 10:24:11.453 222021 DEBUG nova.storage.rbd_utils [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] creating snapshot(snap) on rbd image(a891f488-4cba-4fea-b482-6ac469142f81) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:24:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:11.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:12.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:12 np0005593233 nova_compute[222017]: 2026-01-23 10:24:12.252 222021 DEBUG nova.network.neutron [req-e3fa0a2f-47d1-44e0-bca5-03946c9acce0 req-fe9ed5f3-ade4-4f0b-9817-774f5d8d453c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updated VIF entry in instance network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:24:12 np0005593233 nova_compute[222017]: 2026-01-23 10:24:12.253 222021 DEBUG nova.network.neutron [req-e3fa0a2f-47d1-44e0-bca5-03946c9acce0 req-fe9ed5f3-ade4-4f0b-9817-774f5d8d453c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:12 np0005593233 nova_compute[222017]: 2026-01-23 10:24:12.268 222021 DEBUG oslo_concurrency.lockutils [req-e3fa0a2f-47d1-44e0-bca5-03946c9acce0 req-fe9ed5f3-ade4-4f0b-9817-774f5d8d453c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:12 np0005593233 nova_compute[222017]: 2026-01-23 10:24:12.268 222021 DEBUG oslo_concurrency.lockutils [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:12 np0005593233 nova_compute[222017]: 2026-01-23 10:24:12.269 222021 DEBUG nova.network.neutron [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:24:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e321 e321: 3 total, 3 up, 3 in
Jan 23 05:24:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:13 np0005593233 nova_compute[222017]: 2026-01-23 10:24:13.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:24:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:24:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:13.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:14.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:14 np0005593233 nova_compute[222017]: 2026-01-23 10:24:14.298 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:14 np0005593233 nova_compute[222017]: 2026-01-23 10:24:14.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:14 np0005593233 nova_compute[222017]: 2026-01-23 10:24:14.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:24:14 np0005593233 nova_compute[222017]: 2026-01-23 10:24:14.534 222021 INFO nova.virt.libvirt.driver [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Snapshot image upload complete#033[00m
Jan 23 05:24:14 np0005593233 nova_compute[222017]: 2026-01-23 10:24:14.535 222021 INFO nova.compute.manager [None req-511ef405-790f-4a69-bc68-9cc488917bc6 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Took 5.97 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 05:24:14 np0005593233 nova_compute[222017]: 2026-01-23 10:24:14.800 222021 DEBUG nova.network.neutron [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:14 np0005593233 nova_compute[222017]: 2026-01-23 10:24:14.832 222021 DEBUG oslo_concurrency.lockutils [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:14 np0005593233 nova_compute[222017]: 2026-01-23 10:24:14.835 222021 DEBUG nova.compute.manager [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:15.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:16.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 e322: 3 total, 3 up, 3 in
Jan 23 05:24:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:17.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:17 np0005593233 kernel: tap5c1030f5-f5 (unregistering): left promiscuous mode
Jan 23 05:24:17 np0005593233 NetworkManager[48871]: <info>  [1769163857.8701] device (tap5c1030f5-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:24:17 np0005593233 nova_compute[222017]: 2026-01-23 10:24:17.930 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:17 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:17Z|00674|binding|INFO|Releasing lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d from this chassis (sb_readonly=0)
Jan 23 05:24:17 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:17Z|00675|binding|INFO|Setting lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d down in Southbound
Jan 23 05:24:17 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:17Z|00676|binding|INFO|Removing iface tap5c1030f5-f5 ovn-installed in OVS
Jan 23 05:24:17 np0005593233 nova_compute[222017]: 2026-01-23 10:24:17.933 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:17.939 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:c3:ec 10.100.0.10'], port_security=['fa:16:3e:d3:c3:ec 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b976daabc8124a99814954633f99ed7b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '30cb1bda-deda-42f9-9353-a25e3f64a4f8 77e10692-5f18-4d4e-ba14-6f09047b276a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0878063b-8606-438f-ae03-20f399cd80c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5c1030f5-f5e0-41ec-b194-3304e1939c2d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:17.943 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5c1030f5-f5e0-41ec-b194-3304e1939c2d in datapath 8b38c3ca-73e5-4583-a277-cd0670deffdb unbound from our chassis#033[00m
Jan 23 05:24:17 np0005593233 nova_compute[222017]: 2026-01-23 10:24:17.945 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:17.946 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b38c3ca-73e5-4583-a277-cd0670deffdb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:24:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:17.948 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee8e97c-957f-4844-827e-325944ea342e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:17.949 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb namespace which is not needed anymore#033[00m
Jan 23 05:24:17 np0005593233 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Jan 23 05:24:17 np0005593233 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d000000a0.scope: Consumed 15.399s CPU time.
Jan 23 05:24:17 np0005593233 systemd-machined[190954]: Machine qemu-72-instance-000000a0 terminated.
Jan 23 05:24:18 np0005593233 nova_compute[222017]: 2026-01-23 10:24:18.027 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:18.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:18 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[282889]: [NOTICE]   (282893) : haproxy version is 2.8.14-c23fe91
Jan 23 05:24:18 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[282889]: [NOTICE]   (282893) : path to executable is /usr/sbin/haproxy
Jan 23 05:24:18 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[282889]: [ALERT]    (282893) : Current worker (282895) exited with code 143 (Terminated)
Jan 23 05:24:18 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[282889]: [WARNING]  (282893) : All workers exited. Exiting... (0)
Jan 23 05:24:18 np0005593233 systemd[1]: libpod-87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8.scope: Deactivated successfully.
Jan 23 05:24:18 np0005593233 podman[283917]: 2026-01-23 10:24:18.103284359 +0000 UTC m=+0.048310907 container died 87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 05:24:18 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8-userdata-shm.mount: Deactivated successfully.
Jan 23 05:24:18 np0005593233 systemd[1]: var-lib-containers-storage-overlay-fe3eb1ae9318fe5ae0fcf811e9fdb19619073c146f509b158b191982f8f8f060-merged.mount: Deactivated successfully.
Jan 23 05:24:18 np0005593233 podman[283917]: 2026-01-23 10:24:18.154132378 +0000 UTC m=+0.099158896 container cleanup 87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:24:18 np0005593233 systemd[1]: libpod-conmon-87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8.scope: Deactivated successfully.
Jan 23 05:24:18 np0005593233 podman[283957]: 2026-01-23 10:24:18.230049345 +0000 UTC m=+0.049287065 container remove 87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:24:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:18.236 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[90be2695-74b9-451e-aa47-66a28f0d92af]: (4, ('Fri Jan 23 10:24:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb (87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8)\n87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8\nFri Jan 23 10:24:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb (87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8)\n87a8c1181341e674ad2c65600417d3036d496625f08bb09e08a9276bd97b46b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:18.239 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3d95670a-7b45-49d5-89a5-04449e2b864c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:18.241 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b38c3ca-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:18 np0005593233 kernel: tap8b38c3ca-70: left promiscuous mode
Jan 23 05:24:18 np0005593233 nova_compute[222017]: 2026-01-23 10:24:18.244 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:18 np0005593233 nova_compute[222017]: 2026-01-23 10:24:18.261 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:18.263 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[02b01f86-51da-4529-83f3-199c53dd70d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:18.284 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4e8118-9d77-412d-8db0-259fa9d9086a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:18.287 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d92da61f-a7d5-4164-b457-0970cd14bc98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:18.311 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[369e2210-ea2a-4f95-a511-d471714b0b63]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772795, 'reachable_time': 44204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283977, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:18 np0005593233 systemd[1]: run-netns-ovnmeta\x2d8b38c3ca\x2d73e5\x2d4583\x2da277\x2dcd0670deffdb.mount: Deactivated successfully.
Jan 23 05:24:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:18.319 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:24:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:18.319 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[c4cf73f4-0301-45e3-a227-cdd37f04fdff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:18 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:18Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:57:6c 10.100.0.10
Jan 23 05:24:18 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:18Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:57:6c 10.100.0.10
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.069 222021 DEBUG nova.compute.manager [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-unplugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.070 222021 DEBUG oslo_concurrency.lockutils [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.070 222021 DEBUG oslo_concurrency.lockutils [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.071 222021 DEBUG oslo_concurrency.lockutils [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.071 222021 DEBUG nova.compute.manager [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] No waiting events found dispatching network-vif-unplugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.071 222021 WARNING nova.compute.manager [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received unexpected event network-vif-unplugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d for instance with vm_state active and task_state reboot_started.#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.071 222021 DEBUG nova.compute.manager [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.072 222021 DEBUG oslo_concurrency.lockutils [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.072 222021 DEBUG oslo_concurrency.lockutils [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.072 222021 DEBUG oslo_concurrency.lockutils [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.073 222021 DEBUG nova.compute.manager [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] No waiting events found dispatching network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.073 222021 WARNING nova.compute.manager [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received unexpected event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d for instance with vm_state active and task_state reboot_started.#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.118 222021 INFO nova.virt.libvirt.driver [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Instance shutdown successfully.#033[00m
Jan 23 05:24:19 np0005593233 kernel: tap5c1030f5-f5: entered promiscuous mode
Jan 23 05:24:19 np0005593233 systemd-udevd[283897]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:24:19 np0005593233 NetworkManager[48871]: <info>  [1769163859.2031] manager: (tap5c1030f5-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/314)
Jan 23 05:24:19 np0005593233 NetworkManager[48871]: <info>  [1769163859.2194] device (tap5c1030f5-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:24:19 np0005593233 NetworkManager[48871]: <info>  [1769163859.2204] device (tap5c1030f5-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:24:19 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:19Z|00677|binding|INFO|Claiming lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d for this chassis.
Jan 23 05:24:19 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:19Z|00678|binding|INFO|5c1030f5-f5e0-41ec-b194-3304e1939c2d: Claiming fa:16:3e:d3:c3:ec 10.100.0.10
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.244 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.254 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:c3:ec 10.100.0.10'], port_security=['fa:16:3e:d3:c3:ec 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b976daabc8124a99814954633f99ed7b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '30cb1bda-deda-42f9-9353-a25e3f64a4f8 77e10692-5f18-4d4e-ba14-6f09047b276a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0878063b-8606-438f-ae03-20f399cd80c4, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5c1030f5-f5e0-41ec-b194-3304e1939c2d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.256 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5c1030f5-f5e0-41ec-b194-3304e1939c2d in datapath 8b38c3ca-73e5-4583-a277-cd0670deffdb bound to our chassis#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.259 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8b38c3ca-73e5-4583-a277-cd0670deffdb#033[00m
Jan 23 05:24:19 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:19Z|00679|binding|INFO|Setting lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d ovn-installed in OVS
Jan 23 05:24:19 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:19Z|00680|binding|INFO|Setting lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d up in Southbound
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.262 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.264 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.275 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce3440d0-635e-4459-8cad-0a70944beee4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.276 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8b38c3ca-71 in ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.278 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8b38c3ca-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.279 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[38316d80-3908-4b16-bf3c-b10d243d9865]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 systemd-machined[190954]: New machine qemu-74-instance-000000a0.
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.280 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4a5d6c-310e-4f51-bf60-d90124b86aa7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 systemd[1]: Started Virtual Machine qemu-74-instance-000000a0.
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.294 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[92a9c064-18d3-466d-b86d-478ebb2dec5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.302 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.321 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[99aec8ce-8099-4259-acf0-f161140101c2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.356 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a4292c6b-2f42-40c5-828d-df88390130b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 NetworkManager[48871]: <info>  [1769163859.3655] manager: (tap8b38c3ca-70): new Veth device (/org/freedesktop/NetworkManager/Devices/315)
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.364 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2976ae-469e-4367-a75f-13006ef8cc2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.403 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[1870e1ab-6c51-4d87-8706-eb8ae16a897e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.407 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6f476b77-634a-4e61-b0f7-c8669af19ecf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 NetworkManager[48871]: <info>  [1769163859.4317] device (tap8b38c3ca-70): carrier: link connected
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.437 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[107358be-7e91-4de1-921f-48f9a2ae2da9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.456 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4c445203-2c6e-4041-ab5c-fe53b6baec89]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b38c3ca-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:fa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 777206, 'reachable_time': 31712, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284022, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.478 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[98a2a680-91f6-4a71-a153-73574657e2e5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8e:fa5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 777206, 'tstamp': 777206}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 284023, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.500 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f41c5026-aa44-4f92-9d00-f5ebaef12756]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b38c3ca-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:fa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 777206, 'reachable_time': 31712, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 284031, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.544 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4ab196-9e54-407f-9c6e-1bab44469607]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.624 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[593b2a65-a739-4bf1-988f-4354c6242afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.627 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b38c3ca-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.628 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.628 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b38c3ca-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:19 np0005593233 NetworkManager[48871]: <info>  [1769163859.6313] manager: (tap8b38c3ca-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 23 05:24:19 np0005593233 kernel: tap8b38c3ca-70: entered promiscuous mode
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.633 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.633 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8b38c3ca-70, col_values=(('external_ids', {'iface-id': '120d9d64-6853-4b50-a095-bddadd015ba1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:19 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:19Z|00681|binding|INFO|Releasing lport 120d9d64-6853-4b50-a095-bddadd015ba1 from this chassis (sb_readonly=0)
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.691 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.693 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.695 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[588e0063-5c64-4fc3-b21c-dbc86da73d31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.696 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-8b38c3ca-73e5-4583-a277-cd0670deffdb
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 8b38c3ca-73e5-4583-a277-cd0670deffdb
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:24:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:19.697 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'env', 'PROCESS_TAG=haproxy-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8b38c3ca-73e5-4583-a277-cd0670deffdb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:24:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:19.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.955 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.956 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163859.9547737, c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.956 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.965 222021 INFO nova.virt.libvirt.driver [-] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Instance running successfully.#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.966 222021 INFO nova.virt.libvirt.driver [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Instance soft rebooted successfully.#033[00m
Jan 23 05:24:19 np0005593233 nova_compute[222017]: 2026-01-23 10:24:19.967 222021 DEBUG nova.compute.manager [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:20.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:20 np0005593233 podman[284117]: 2026-01-23 10:24:20.095414037 +0000 UTC m=+0.039891979 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:24:20 np0005593233 nova_compute[222017]: 2026-01-23 10:24:20.199 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:20 np0005593233 nova_compute[222017]: 2026-01-23 10:24:20.205 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:24:20 np0005593233 nova_compute[222017]: 2026-01-23 10:24:20.237 222021 DEBUG oslo_concurrency.lockutils [None req-6d7617d5-1597-457b-aa4c-7b8ec97c1eb8 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 9.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:20 np0005593233 nova_compute[222017]: 2026-01-23 10:24:20.243 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163859.9562821, c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:20 np0005593233 nova_compute[222017]: 2026-01-23 10:24:20.244 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] VM Started (Lifecycle Event)#033[00m
Jan 23 05:24:20 np0005593233 podman[284117]: 2026-01-23 10:24:20.25922124 +0000 UTC m=+0.203699202 container create 9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:24:20 np0005593233 nova_compute[222017]: 2026-01-23 10:24:20.271 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:20 np0005593233 nova_compute[222017]: 2026-01-23 10:24:20.279 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:24:20 np0005593233 systemd[1]: Started libpod-conmon-9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782.scope.
Jan 23 05:24:20 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:24:20 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cba769ff8908f05afcd4424c15ac520b6c99c657a2377e066b97e001440b0af8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:24:20 np0005593233 podman[284117]: 2026-01-23 10:24:20.646627258 +0000 UTC m=+0.591105210 container init 9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 05:24:20 np0005593233 podman[284117]: 2026-01-23 10:24:20.654448669 +0000 UTC m=+0.598926601 container start 9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 05:24:20 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[284183]: [NOTICE]   (284187) : New worker (284189) forked
Jan 23 05:24:20 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[284183]: [NOTICE]   (284187) : Loading success.
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.237 222021 DEBUG nova.compute.manager [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.239 222021 DEBUG oslo_concurrency.lockutils [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.239 222021 DEBUG oslo_concurrency.lockutils [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.240 222021 DEBUG oslo_concurrency.lockutils [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.240 222021 DEBUG nova.compute.manager [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] No waiting events found dispatching network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.241 222021 WARNING nova.compute.manager [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received unexpected event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.242 222021 DEBUG nova.compute.manager [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.242 222021 DEBUG oslo_concurrency.lockutils [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.243 222021 DEBUG oslo_concurrency.lockutils [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.243 222021 DEBUG oslo_concurrency.lockutils [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.244 222021 DEBUG nova.compute.manager [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] No waiting events found dispatching network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.244 222021 WARNING nova.compute.manager [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received unexpected event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d for instance with vm_state active and task_state None.
Jan 23 05:24:21 np0005593233 nova_compute[222017]: 2026-01-23 10:24:21.393 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:24:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:21.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:22.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:22 np0005593233 nova_compute[222017]: 2026-01-23 10:24:22.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:24:22 np0005593233 nova_compute[222017]: 2026-01-23 10:24:22.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 05:24:22 np0005593233 nova_compute[222017]: 2026-01-23 10:24:22.693 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 05:24:23 np0005593233 nova_compute[222017]: 2026-01-23 10:24:23.032 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:23.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:24.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:24 np0005593233 nova_compute[222017]: 2026-01-23 10:24:24.306 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:25 np0005593233 podman[284198]: 2026-01-23 10:24:25.184216373 +0000 UTC m=+0.176767431 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:24:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:25.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:26.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:24:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/610776583' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:24:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:27.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:28 np0005593233 nova_compute[222017]: 2026-01-23 10:24:28.035 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:28.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:29 np0005593233 nova_compute[222017]: 2026-01-23 10:24:29.323 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:29.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:30.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:31.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:32.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:33 np0005593233 nova_compute[222017]: 2026-01-23 10:24:33.039 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:33.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:34.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:34 np0005593233 nova_compute[222017]: 2026-01-23 10:24:34.331 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:35 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:35Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:c3:ec 10.100.0.10
Jan 23 05:24:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:35.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:36.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:37.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:38 np0005593233 nova_compute[222017]: 2026-01-23 10:24:38.043 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:38.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:39 np0005593233 nova_compute[222017]: 2026-01-23 10:24:39.336 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:39.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:40.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:41 np0005593233 podman[284227]: 2026-01-23 10:24:41.116720734 +0000 UTC m=+0.109909290 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:24:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:41.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:42.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #142. Immutable memtables: 0.
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:42.684012) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 89] Flushing memtable with next log file: 142
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163882684188, "job": 89, "event": "flush_started", "num_memtables": 1, "num_entries": 690, "num_deletes": 257, "total_data_size": 1140024, "memory_usage": 1158160, "flush_reason": "Manual Compaction"}
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 89] Level-0 flush table #143: started
Jan 23 05:24:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:42.687 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:24:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:42.688 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:24:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:42.690 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163882695161, "cf_name": "default", "job": 89, "event": "table_file_creation", "file_number": 143, "file_size": 752775, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 68554, "largest_seqno": 69239, "table_properties": {"data_size": 749273, "index_size": 1345, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8106, "raw_average_key_size": 19, "raw_value_size": 742106, "raw_average_value_size": 1766, "num_data_blocks": 58, "num_entries": 420, "num_filter_entries": 420, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163848, "oldest_key_time": 1769163848, "file_creation_time": 1769163882, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 143, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 89] Flush lasted 11192 microseconds, and 6440 cpu microseconds.
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:42.695233) [db/flush_job.cc:967] [default] [JOB 89] Level-0 flush table #143: 752775 bytes OK
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:42.695266) [db/memtable_list.cc:519] [default] Level-0 commit table #143 started
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:42.697587) [db/memtable_list.cc:722] [default] Level-0 commit table #143: memtable #1 done
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:42.697616) EVENT_LOG_v1 {"time_micros": 1769163882697606, "job": 89, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:42.697647) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 89] Try to delete WAL files size 1136197, prev total WAL file size 1136197, number of live WAL files 2.
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000139.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:42.698801) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032353131' seq:72057594037927935, type:22 .. '6C6F676D0032373633' seq:0, type:0; will stop at (end)
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 90] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 89 Base level 0, inputs: [143(735KB)], [141(9566KB)]
Jan 23 05:24:42 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163882698892, "job": 90, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [143], "files_L6": [141], "score": -1, "input_data_size": 10548557, "oldest_snapshot_seqno": -1}
Jan 23 05:24:43 np0005593233 nova_compute[222017]: 2026-01-23 10:24:43.046 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 90] Generated table #144: 8710 keys, 10400839 bytes, temperature: kUnknown
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163883446139, "cf_name": "default", "job": 90, "event": "table_file_creation", "file_number": 144, "file_size": 10400839, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10346625, "index_size": 31317, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21829, "raw_key_size": 230484, "raw_average_key_size": 26, "raw_value_size": 10195727, "raw_average_value_size": 1170, "num_data_blocks": 1192, "num_entries": 8710, "num_filter_entries": 8710, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163882, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 144, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:43.447631) [db/compaction/compaction_job.cc:1663] [default] [JOB 90] Compacted 1@0 + 1@6 files to L6 => 10400839 bytes
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:43.449422) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 14.1 rd, 13.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 9.3 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(27.8) write-amplify(13.8) OK, records in: 9242, records dropped: 532 output_compression: NoCompression
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:43.449458) EVENT_LOG_v1 {"time_micros": 1769163883449441, "job": 90, "event": "compaction_finished", "compaction_time_micros": 747224, "compaction_time_cpu_micros": 38731, "output_level": 6, "num_output_files": 1, "total_output_size": 10400839, "num_input_records": 9242, "num_output_records": 8710, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000143.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163883450027, "job": 90, "event": "table_file_deletion", "file_number": 143}
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000141.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163883453832, "job": 90, "event": "table_file_deletion", "file_number": 141}
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:42.698614) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:43.453961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:43.453966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:43.453968) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:43.453971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:43 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:24:43.453972) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:43.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:44.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:44 np0005593233 nova_compute[222017]: 2026-01-23 10:24:44.340 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:24:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/111370733' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:24:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:24:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/111370733' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:24:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:45.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:46.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:47.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:48 np0005593233 nova_compute[222017]: 2026-01-23 10:24:48.048 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:48.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:49 np0005593233 nova_compute[222017]: 2026-01-23 10:24:49.342 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:49.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:50.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:50 np0005593233 nova_compute[222017]: 2026-01-23 10:24:50.694 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:50 np0005593233 nova_compute[222017]: 2026-01-23 10:24:50.696 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:50 np0005593233 nova_compute[222017]: 2026-01-23 10:24:50.696 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:51 np0005593233 nova_compute[222017]: 2026-01-23 10:24:51.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:51 np0005593233 nova_compute[222017]: 2026-01-23 10:24:51.664 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:51 np0005593233 nova_compute[222017]: 2026-01-23 10:24:51.664 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:51 np0005593233 nova_compute[222017]: 2026-01-23 10:24:51.665 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:51 np0005593233 nova_compute[222017]: 2026-01-23 10:24:51.665 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:24:51 np0005593233 nova_compute[222017]: 2026-01-23 10:24:51.666 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:51.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:52 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:52Z|00682|binding|INFO|Releasing lport 120d9d64-6853-4b50-a095-bddadd015ba1 from this chassis (sb_readonly=0)
Jan 23 05:24:52 np0005593233 ovn_controller[130653]: 2026-01-23T10:24:52Z|00683|binding|INFO|Releasing lport b648300b-e46c-4d3b-b02e-94ff684c03ae from this chassis (sb_readonly=0)
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.129 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:52.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:24:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3722609721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.183 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.567 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.568 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.568 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.572 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.573 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.795 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.798 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3990MB free_disk=20.851356506347656GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.799 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.799 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.917 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.918 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.918 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.918 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.936 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.953 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.953 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.966 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:24:52 np0005593233 nova_compute[222017]: 2026-01-23 10:24:52.986 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:24:53 np0005593233 nova_compute[222017]: 2026-01-23 10:24:53.049 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:53 np0005593233 nova_compute[222017]: 2026-01-23 10:24:53.097 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:24:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2274223017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:53 np0005593233 nova_compute[222017]: 2026-01-23 10:24:53.787 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.738s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:53 np0005593233 nova_compute[222017]: 2026-01-23 10:24:53.797 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:24:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:53.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:53 np0005593233 nova_compute[222017]: 2026-01-23 10:24:53.835 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:24:53 np0005593233 nova_compute[222017]: 2026-01-23 10:24:53.878 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:24:53 np0005593233 nova_compute[222017]: 2026-01-23 10:24:53.878 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:54.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:54 np0005593233 nova_compute[222017]: 2026-01-23 10:24:54.344 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:54 np0005593233 nova_compute[222017]: 2026-01-23 10:24:54.702 222021 DEBUG nova.compute.manager [req-ffd9a68d-a952-498d-b069-a8dd0534f9e0 req-247b5e11-b475-44d6-bb6d-e7d3287026d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:54 np0005593233 nova_compute[222017]: 2026-01-23 10:24:54.703 222021 DEBUG nova.compute.manager [req-ffd9a68d-a952-498d-b069-a8dd0534f9e0 req-247b5e11-b475-44d6-bb6d-e7d3287026d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing instance network info cache due to event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:24:54 np0005593233 nova_compute[222017]: 2026-01-23 10:24:54.703 222021 DEBUG oslo_concurrency.lockutils [req-ffd9a68d-a952-498d-b069-a8dd0534f9e0 req-247b5e11-b475-44d6-bb6d-e7d3287026d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:54 np0005593233 nova_compute[222017]: 2026-01-23 10:24:54.704 222021 DEBUG oslo_concurrency.lockutils [req-ffd9a68d-a952-498d-b069-a8dd0534f9e0 req-247b5e11-b475-44d6-bb6d-e7d3287026d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:54 np0005593233 nova_compute[222017]: 2026-01-23 10:24:54.704 222021 DEBUG nova.network.neutron [req-ffd9a68d-a952-498d-b069-a8dd0534f9e0 req-247b5e11-b475-44d6-bb6d-e7d3287026d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:24:55 np0005593233 nova_compute[222017]: 2026-01-23 10:24:55.166 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:55.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:56.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:56 np0005593233 podman[284294]: 2026-01-23 10:24:56.27753546 +0000 UTC m=+0.112392340 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 05:24:56 np0005593233 nova_compute[222017]: 2026-01-23 10:24:56.335 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:56.337 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:24:56.338 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:24:56 np0005593233 nova_compute[222017]: 2026-01-23 10:24:56.696 222021 DEBUG nova.network.neutron [req-ffd9a68d-a952-498d-b069-a8dd0534f9e0 req-247b5e11-b475-44d6-bb6d-e7d3287026d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updated VIF entry in instance network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:24:56 np0005593233 nova_compute[222017]: 2026-01-23 10:24:56.698 222021 DEBUG nova.network.neutron [req-ffd9a68d-a952-498d-b069-a8dd0534f9e0 req-247b5e11-b475-44d6-bb6d-e7d3287026d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:56 np0005593233 nova_compute[222017]: 2026-01-23 10:24:56.764 222021 DEBUG oslo_concurrency.lockutils [req-ffd9a68d-a952-498d-b069-a8dd0534f9e0 req-247b5e11-b475-44d6-bb6d-e7d3287026d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:57.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:57 np0005593233 nova_compute[222017]: 2026-01-23 10:24:57.879 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:57 np0005593233 nova_compute[222017]: 2026-01-23 10:24:57.880 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:24:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:58.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:24:58 np0005593233 nova_compute[222017]: 2026-01-23 10:24:58.148 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:58 np0005593233 nova_compute[222017]: 2026-01-23 10:24:58.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:58 np0005593233 nova_compute[222017]: 2026-01-23 10:24:58.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:24:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:59 np0005593233 nova_compute[222017]: 2026-01-23 10:24:59.347 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:24:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:59.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:00.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:01.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:02.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:02.341 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.388 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.443 222021 DEBUG nova.compute.manager [req-3ca61acd-6d96-4457-8968-820970f621dd req-7fdf6a73-f312-41c0-b1ef-141d1ea53b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.444 222021 DEBUG nova.compute.manager [req-3ca61acd-6d96-4457-8968-820970f621dd req-7fdf6a73-f312-41c0-b1ef-141d1ea53b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing instance network info cache due to event network-changed-5c1030f5-f5e0-41ec-b194-3304e1939c2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.445 222021 DEBUG oslo_concurrency.lockutils [req-3ca61acd-6d96-4457-8968-820970f621dd req-7fdf6a73-f312-41c0-b1ef-141d1ea53b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.445 222021 DEBUG oslo_concurrency.lockutils [req-3ca61acd-6d96-4457-8968-820970f621dd req-7fdf6a73-f312-41c0-b1ef-141d1ea53b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.446 222021 DEBUG nova.network.neutron [req-3ca61acd-6d96-4457-8968-820970f621dd req-7fdf6a73-f312-41c0-b1ef-141d1ea53b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Refreshing network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.598 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:02 np0005593233 nova_compute[222017]: 2026-01-23 10:25:02.642 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.199 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.227 222021 DEBUG oslo_concurrency.lockutils [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.228 222021 DEBUG oslo_concurrency.lockutils [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.272 222021 INFO nova.compute.manager [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Detaching volume 006fb8d9-c6d3-441f-993c-5b105d3e9681#033[00m
Jan 23 05:25:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.517 222021 INFO nova.virt.block_device [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Attempting to driver detach volume 006fb8d9-c6d3-441f-993c-5b105d3e9681 from mountpoint /dev/vdb#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.532 222021 DEBUG nova.virt.libvirt.driver [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Attempting to detach device vdb from instance c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.533 222021 DEBUG nova.virt.libvirt.guest [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-006fb8d9-c6d3-441f-993c-5b105d3e9681">
Jan 23 05:25:03 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <serial>006fb8d9-c6d3-441f-993c-5b105d3e9681</serial>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:25:03 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.590 222021 INFO nova.virt.libvirt.driver [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully detached device vdb from instance c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 from the persistent domain config.#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.590 222021 DEBUG nova.virt.libvirt.driver [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.591 222021 DEBUG nova.virt.libvirt.guest [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-006fb8d9-c6d3-441f-993c-5b105d3e9681">
Jan 23 05:25:03 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <serial>006fb8d9-c6d3-441f-993c-5b105d3e9681</serial>
Jan 23 05:25:03 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:25:03 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:25:03 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:25:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:03.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.826 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Received event <DeviceRemovedEvent: 1769163903.8259947, c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.828 222021 DEBUG nova.virt.libvirt.driver [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:25:03 np0005593233 nova_compute[222017]: 2026-01-23 10:25:03.833 222021 INFO nova.virt.libvirt.driver [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully detached device vdb from instance c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 from the live domain config.#033[00m
Jan 23 05:25:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:04.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:04 np0005593233 nova_compute[222017]: 2026-01-23 10:25:04.351 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:05.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:06.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:06 np0005593233 nova_compute[222017]: 2026-01-23 10:25:06.775 222021 DEBUG nova.objects.instance [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'flavor' on Instance uuid c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:07 np0005593233 nova_compute[222017]: 2026-01-23 10:25:07.636 222021 DEBUG nova.network.neutron [req-3ca61acd-6d96-4457-8968-820970f621dd req-7fdf6a73-f312-41c0-b1ef-141d1ea53b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updated VIF entry in instance network info cache for port 5c1030f5-f5e0-41ec-b194-3304e1939c2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:25:07 np0005593233 nova_compute[222017]: 2026-01-23 10:25:07.637 222021 DEBUG nova.network.neutron [req-3ca61acd-6d96-4457-8968-820970f621dd req-7fdf6a73-f312-41c0-b1ef-141d1ea53b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:07.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:08.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:08 np0005593233 nova_compute[222017]: 2026-01-23 10:25:08.202 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:08 np0005593233 nova_compute[222017]: 2026-01-23 10:25:08.674 222021 DEBUG oslo_concurrency.lockutils [req-3ca61acd-6d96-4457-8968-820970f621dd req-7fdf6a73-f312-41c0-b1ef-141d1ea53b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:08 np0005593233 nova_compute[222017]: 2026-01-23 10:25:08.676 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:08 np0005593233 nova_compute[222017]: 2026-01-23 10:25:08.676 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:25:08 np0005593233 nova_compute[222017]: 2026-01-23 10:25:08.677 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:09 np0005593233 nova_compute[222017]: 2026-01-23 10:25:09.270 222021 DEBUG oslo_concurrency.lockutils [None req-9b5cbc2c-93c9-41fa-8416-d7da7e246c02 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 6.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:09 np0005593233 nova_compute[222017]: 2026-01-23 10:25:09.354 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:09.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:10.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:11.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:12 np0005593233 podman[284325]: 2026-01-23 10:25:12.105021389 +0000 UTC m=+0.089678857 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:25:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:25:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:12.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:25:12 np0005593233 nova_compute[222017]: 2026-01-23 10:25:12.639 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [{"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:12 np0005593233 nova_compute[222017]: 2026-01-23 10:25:12.737 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:12 np0005593233 nova_compute[222017]: 2026-01-23 10:25:12.737 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:25:13 np0005593233 nova_compute[222017]: 2026-01-23 10:25:13.249 222021 DEBUG oslo_concurrency.lockutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:13 np0005593233 nova_compute[222017]: 2026-01-23 10:25:13.250 222021 DEBUG oslo_concurrency.lockutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:13 np0005593233 nova_compute[222017]: 2026-01-23 10:25:13.251 222021 DEBUG oslo_concurrency.lockutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:13 np0005593233 nova_compute[222017]: 2026-01-23 10:25:13.251 222021 DEBUG oslo_concurrency.lockutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:13 np0005593233 nova_compute[222017]: 2026-01-23 10:25:13.252 222021 DEBUG oslo_concurrency.lockutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:13 np0005593233 nova_compute[222017]: 2026-01-23 10:25:13.254 222021 INFO nova.compute.manager [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Terminating instance#033[00m
Jan 23 05:25:13 np0005593233 nova_compute[222017]: 2026-01-23 10:25:13.256 222021 DEBUG nova.compute.manager [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:25:13 np0005593233 nova_compute[222017]: 2026-01-23 10:25:13.257 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:25:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:13.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:25:14 np0005593233 kernel: tap5c1030f5-f5 (unregistering): left promiscuous mode
Jan 23 05:25:14 np0005593233 NetworkManager[48871]: <info>  [1769163914.1497] device (tap5c1030f5-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:25:14 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:14Z|00684|binding|INFO|Releasing lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d from this chassis (sb_readonly=0)
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.164 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:14 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:14Z|00685|binding|INFO|Setting lport 5c1030f5-f5e0-41ec-b194-3304e1939c2d down in Southbound
Jan 23 05:25:14 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:14Z|00686|binding|INFO|Removing iface tap5c1030f5-f5 ovn-installed in OVS
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.169 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:14.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:14.174 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:c3:ec 10.100.0.10'], port_security=['fa:16:3e:d3:c3:ec 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b976daabc8124a99814954633f99ed7b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '77e10692-5f18-4d4e-ba14-6f09047b276a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0878063b-8606-438f-ae03-20f399cd80c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5c1030f5-f5e0-41ec-b194-3304e1939c2d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:14.176 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5c1030f5-f5e0-41ec-b194-3304e1939c2d in datapath 8b38c3ca-73e5-4583-a277-cd0670deffdb unbound from our chassis#033[00m
Jan 23 05:25:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:14.178 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b38c3ca-73e5-4583-a277-cd0670deffdb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:25:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:14.179 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e3957618-c5c1-4e2b-8470-b661f3c1c885]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:14.180 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb namespace which is not needed anymore#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.186 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:14 np0005593233 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Jan 23 05:25:14 np0005593233 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000a0.scope: Consumed 16.671s CPU time.
Jan 23 05:25:14 np0005593233 systemd-machined[190954]: Machine qemu-74-instance-000000a0 terminated.
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.306 222021 INFO nova.virt.libvirt.driver [-] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Instance destroyed successfully.#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.307 222021 DEBUG nova.objects.instance [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'resources' on Instance uuid c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.334 222021 DEBUG nova.virt.libvirt.vif [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:23:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-539158124',display_name='tempest-TestMinimumBasicScenario-server-539158124',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-539158124',id=160,image_ref='4e1fa467-77ba-4764-82a0-700986e94bbd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmueIfk7VkHSUTAjgS/ZYpfWK/4j45WSHaYA6pCbrndOG+c3X8uZMvR7mBDlRDE24oh1oDcVJFtlCOd5K/FEKJpkR/txNCDTfmQxcShVYuyc8F1nHGiNqkP9PslO3GBJw==',key_name='tempest-TestMinimumBasicScenario-596586326',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:23:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b976daabc8124a99814954633f99ed7b',ramdisk_id='',reservation_id='r-t0eooc93',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4e1fa467-77ba-4764-82a0-700986e94bbd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1465373740',owner_user_name='tempest-TestMinimumBasicScenario-1465373740-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:24:20Z,user_data=None,user_id='c041da0a601a4260b29fc9c65719597f',uuid=c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.335 222021 DEBUG nova.network.os_vif_util [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converting VIF {"id": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "address": "fa:16:3e:d3:c3:ec", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c1030f5-f5", "ovs_interfaceid": "5c1030f5-f5e0-41ec-b194-3304e1939c2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.336 222021 DEBUG nova.network.os_vif_util [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=5c1030f5-f5e0-41ec-b194-3304e1939c2d,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c1030f5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.337 222021 DEBUG os_vif [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=5c1030f5-f5e0-41ec-b194-3304e1939c2d,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c1030f5-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.341 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.341 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c1030f5-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.362 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.365 222021 INFO os_vif [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:c3:ec,bridge_name='br-int',has_traffic_filtering=True,id=5c1030f5-f5e0-41ec-b194-3304e1939c2d,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c1030f5-f5')#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.730 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:14 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[284183]: [NOTICE]   (284187) : haproxy version is 2.8.14-c23fe91
Jan 23 05:25:14 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[284183]: [NOTICE]   (284187) : path to executable is /usr/sbin/haproxy
Jan 23 05:25:14 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[284183]: [WARNING]  (284187) : Exiting Master process...
Jan 23 05:25:14 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[284183]: [ALERT]    (284187) : Current worker (284189) exited with code 143 (Terminated)
Jan 23 05:25:14 np0005593233 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[284183]: [WARNING]  (284187) : All workers exited. Exiting... (0)
Jan 23 05:25:14 np0005593233 systemd[1]: libpod-9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782.scope: Deactivated successfully.
Jan 23 05:25:14 np0005593233 podman[284368]: 2026-01-23 10:25:14.770571995 +0000 UTC m=+0.471399104 container died 9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.920 222021 DEBUG nova.compute.manager [req-71d1a2dc-7edf-49f5-9d3b-56961bfc6d37 req-ca053912-bf7a-4dca-997a-a1df0c05fcb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-unplugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.921 222021 DEBUG oslo_concurrency.lockutils [req-71d1a2dc-7edf-49f5-9d3b-56961bfc6d37 req-ca053912-bf7a-4dca-997a-a1df0c05fcb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.921 222021 DEBUG oslo_concurrency.lockutils [req-71d1a2dc-7edf-49f5-9d3b-56961bfc6d37 req-ca053912-bf7a-4dca-997a-a1df0c05fcb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.922 222021 DEBUG oslo_concurrency.lockutils [req-71d1a2dc-7edf-49f5-9d3b-56961bfc6d37 req-ca053912-bf7a-4dca-997a-a1df0c05fcb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.922 222021 DEBUG nova.compute.manager [req-71d1a2dc-7edf-49f5-9d3b-56961bfc6d37 req-ca053912-bf7a-4dca-997a-a1df0c05fcb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] No waiting events found dispatching network-vif-unplugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:14 np0005593233 nova_compute[222017]: 2026-01-23 10:25:14.923 222021 DEBUG nova.compute.manager [req-71d1a2dc-7edf-49f5-9d3b-56961bfc6d37 req-ca053912-bf7a-4dca-997a-a1df0c05fcb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-unplugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:25:15 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782-userdata-shm.mount: Deactivated successfully.
Jan 23 05:25:15 np0005593233 systemd[1]: var-lib-containers-storage-overlay-cba769ff8908f05afcd4424c15ac520b6c99c657a2377e066b97e001440b0af8-merged.mount: Deactivated successfully.
Jan 23 05:25:15 np0005593233 podman[284368]: 2026-01-23 10:25:15.044147223 +0000 UTC m=+0.744974252 container cleanup 9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:25:15 np0005593233 systemd[1]: libpod-conmon-9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782.scope: Deactivated successfully.
Jan 23 05:25:15 np0005593233 podman[284424]: 2026-01-23 10:25:15.184730319 +0000 UTC m=+0.105249918 container remove 9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:25:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:15.197 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0c706964-f6b9-402f-ac46-7126176f05f3]: (4, ('Fri Jan 23 10:25:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb (9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782)\n9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782\nFri Jan 23 10:25:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb (9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782)\n9144d1bbde9ec65bb29f0f7fe2d568317a5834818cb90b408dc36ad4d3930782\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:15.200 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f0865477-839c-4adb-b8ad-4b34f1e84610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:15.201 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b38c3ca-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:15 np0005593233 nova_compute[222017]: 2026-01-23 10:25:15.204 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:15 np0005593233 kernel: tap8b38c3ca-70: left promiscuous mode
Jan 23 05:25:15 np0005593233 nova_compute[222017]: 2026-01-23 10:25:15.208 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:15.211 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e21c24-648e-4316-9413-d1c853288dbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:15 np0005593233 nova_compute[222017]: 2026-01-23 10:25:15.234 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:15.241 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d68457ec-c53d-4c5b-8f79-fa1a5e4118fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:15.243 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[28a2f025-7847-4798-bee0-a585546c66eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:15.265 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f81edd53-e382-4bfa-b95b-27fa5df046f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 777198, 'reachable_time': 33684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284439, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:15 np0005593233 systemd[1]: run-netns-ovnmeta\x2d8b38c3ca\x2d73e5\x2d4583\x2da277\x2dcd0670deffdb.mount: Deactivated successfully.
Jan 23 05:25:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:15.269 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:25:15 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:15.269 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[71dffc3b-44b6-4be3-bbcb-3ad1d0f2ba6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:15 np0005593233 nova_compute[222017]: 2026-01-23 10:25:15.505 222021 INFO nova.virt.libvirt.driver [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Deleting instance files /var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_del#033[00m
Jan 23 05:25:15 np0005593233 nova_compute[222017]: 2026-01-23 10:25:15.507 222021 INFO nova.virt.libvirt.driver [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Deletion of /var/lib/nova/instances/c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65_del complete#033[00m
Jan 23 05:25:15 np0005593233 nova_compute[222017]: 2026-01-23 10:25:15.567 222021 INFO nova.compute.manager [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Took 2.31 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:25:15 np0005593233 nova_compute[222017]: 2026-01-23 10:25:15.568 222021 DEBUG oslo.service.loopingcall [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:25:15 np0005593233 nova_compute[222017]: 2026-01-23 10:25:15.569 222021 DEBUG nova.compute.manager [-] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:25:15 np0005593233 nova_compute[222017]: 2026-01-23 10:25:15.569 222021 DEBUG nova.network.neutron [-] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:25:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:15.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:16.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:16 np0005593233 nova_compute[222017]: 2026-01-23 10:25:16.998 222021 DEBUG nova.compute.manager [req-3590a248-869a-4e1d-8232-68c21f4aaa14 req-1032e45a-966d-483e-87dc-e0d29f1800b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:17 np0005593233 nova_compute[222017]: 2026-01-23 10:25:16.999 222021 DEBUG oslo_concurrency.lockutils [req-3590a248-869a-4e1d-8232-68c21f4aaa14 req-1032e45a-966d-483e-87dc-e0d29f1800b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:17 np0005593233 nova_compute[222017]: 2026-01-23 10:25:17.000 222021 DEBUG oslo_concurrency.lockutils [req-3590a248-869a-4e1d-8232-68c21f4aaa14 req-1032e45a-966d-483e-87dc-e0d29f1800b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:17 np0005593233 nova_compute[222017]: 2026-01-23 10:25:17.000 222021 DEBUG oslo_concurrency.lockutils [req-3590a248-869a-4e1d-8232-68c21f4aaa14 req-1032e45a-966d-483e-87dc-e0d29f1800b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:17 np0005593233 nova_compute[222017]: 2026-01-23 10:25:17.001 222021 DEBUG nova.compute.manager [req-3590a248-869a-4e1d-8232-68c21f4aaa14 req-1032e45a-966d-483e-87dc-e0d29f1800b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] No waiting events found dispatching network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:17 np0005593233 nova_compute[222017]: 2026-01-23 10:25:17.001 222021 WARNING nova.compute.manager [req-3590a248-869a-4e1d-8232-68c21f4aaa14 req-1032e45a-966d-483e-87dc-e0d29f1800b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received unexpected event network-vif-plugged-5c1030f5-f5e0-41ec-b194-3304e1939c2d for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:25:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:17.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.049 222021 DEBUG nova.network.neutron [-] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.118 222021 INFO nova.compute.manager [-] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Took 2.55 seconds to deallocate network for instance.#033[00m
Jan 23 05:25:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:18.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.214 222021 DEBUG oslo_concurrency.lockutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.215 222021 DEBUG oslo_concurrency.lockutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.245 222021 DEBUG nova.compute.manager [req-ebb19311-da98-4978-a02a-c4c858ebccaa req-2a3049c7-6355-4658-8b89-f4a9db5c0367 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Received event network-vif-deleted-5c1030f5-f5e0-41ec-b194-3304e1939c2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.339 222021 DEBUG oslo_concurrency.processutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:25:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1339878094' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.893 222021 DEBUG oslo_concurrency.processutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.555s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.906 222021 DEBUG nova.compute.provider_tree [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.932 222021 DEBUG nova.scheduler.client.report [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:18 np0005593233 nova_compute[222017]: 2026-01-23 10:25:18.972 222021 DEBUG oslo_concurrency.lockutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:19 np0005593233 nova_compute[222017]: 2026-01-23 10:25:19.037 222021 INFO nova.scheduler.client.report [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Deleted allocations for instance c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65#033[00m
Jan 23 05:25:19 np0005593233 nova_compute[222017]: 2026-01-23 10:25:19.139 222021 DEBUG oslo_concurrency.lockutils [None req-1f7a877b-acb9-410d-8089-24e5e28c3e5f c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:19 np0005593233 nova_compute[222017]: 2026-01-23 10:25:19.362 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:19.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:20.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:21.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:22.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e323 e323: 3 total, 3 up, 3 in
Jan 23 05:25:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:25:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:25:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:25:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:23 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:23 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:25:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:23.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:24.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:24 np0005593233 nova_compute[222017]: 2026-01-23 10:25:24.364 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:25 np0005593233 nova_compute[222017]: 2026-01-23 10:25:25.543 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "cd085899-5656-4c87-b235-43567e6b816e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:25 np0005593233 nova_compute[222017]: 2026-01-23 10:25:25.544 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:25 np0005593233 nova_compute[222017]: 2026-01-23 10:25:25.580 222021 DEBUG nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:25:25 np0005593233 nova_compute[222017]: 2026-01-23 10:25:25.688 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:25 np0005593233 nova_compute[222017]: 2026-01-23 10:25:25.689 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:25 np0005593233 nova_compute[222017]: 2026-01-23 10:25:25.696 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:25:25 np0005593233 nova_compute[222017]: 2026-01-23 10:25:25.697 222021 INFO nova.compute.claims [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:25:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:25.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:25 np0005593233 nova_compute[222017]: 2026-01-23 10:25:25.888 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:26.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:25:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2151418142' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.379 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.385 222021 DEBUG nova.compute.provider_tree [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.406 222021 DEBUG nova.scheduler.client.report [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.445 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.445 222021 DEBUG nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.512 222021 DEBUG nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.513 222021 DEBUG nova.network.neutron [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.537 222021 INFO nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.555 222021 DEBUG nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.695 222021 DEBUG nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.697 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.698 222021 INFO nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Creating image(s)#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.741 222021 DEBUG nova.storage.rbd_utils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image cd085899-5656-4c87-b235-43567e6b816e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.784 222021 DEBUG nova.storage.rbd_utils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image cd085899-5656-4c87-b235-43567e6b816e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.831 222021 DEBUG nova.storage.rbd_utils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image cd085899-5656-4c87-b235-43567e6b816e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.836 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.939 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.940 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.941 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.941 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.970 222021 DEBUG nova.storage.rbd_utils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image cd085899-5656-4c87-b235-43567e6b816e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:26 np0005593233 nova_compute[222017]: 2026-01-23 10:25:26.975 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cd085899-5656-4c87-b235-43567e6b816e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:27 np0005593233 podman[284690]: 2026-01-23 10:25:27.108027424 +0000 UTC m=+0.100603137 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:25:27 np0005593233 nova_compute[222017]: 2026-01-23 10:25:27.160 222021 DEBUG nova.policy [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:25:27 np0005593233 nova_compute[222017]: 2026-01-23 10:25:27.357 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cd085899-5656-4c87-b235-43567e6b816e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.382s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:27 np0005593233 nova_compute[222017]: 2026-01-23 10:25:27.476 222021 DEBUG nova.storage.rbd_utils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] resizing rbd image cd085899-5656-4c87-b235-43567e6b816e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:25:27 np0005593233 nova_compute[222017]: 2026-01-23 10:25:27.600 222021 DEBUG nova.objects.instance [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'migration_context' on Instance uuid cd085899-5656-4c87-b235-43567e6b816e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:27 np0005593233 nova_compute[222017]: 2026-01-23 10:25:27.618 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:25:27 np0005593233 nova_compute[222017]: 2026-01-23 10:25:27.619 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Ensure instance console log exists: /var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:25:27 np0005593233 nova_compute[222017]: 2026-01-23 10:25:27.620 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:27 np0005593233 nova_compute[222017]: 2026-01-23 10:25:27.621 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:27 np0005593233 nova_compute[222017]: 2026-01-23 10:25:27.621 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:27.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:28.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.303 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163914.3004525, c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.304 222021 INFO nova.compute.manager [-] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.328 222021 DEBUG nova.compute.manager [None req-91f06a46-0df9-41ab-9d56-dd0926487198 - - - - - -] [instance: c5c48cf7-dc8c-42fe-bc7e-4a85b6fc2f65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.367 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:29Z|00687|binding|INFO|Releasing lport b648300b-e46c-4d3b-b02e-94ff684c03ae from this chassis (sb_readonly=0)
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.385 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.456 222021 DEBUG nova.network.neutron [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Successfully updated port: 5bed4276-208c-472d-b28a-4a76e26a3285 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.481 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-cd085899-5656-4c87-b235-43567e6b816e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.481 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-cd085899-5656-4c87-b235-43567e6b816e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.481 222021 DEBUG nova.network.neutron [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.549 222021 DEBUG nova.compute.manager [req-5db2454e-8bb4-4575-abac-5e54c7d15d96 req-6cc7b61d-8d97-46d6-b585-7abe132f685c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Received event network-changed-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.549 222021 DEBUG nova.compute.manager [req-5db2454e-8bb4-4575-abac-5e54c7d15d96 req-6cc7b61d-8d97-46d6-b585-7abe132f685c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Refreshing instance network info cache due to event network-changed-5bed4276-208c-472d-b28a-4a76e26a3285. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.550 222021 DEBUG oslo_concurrency.lockutils [req-5db2454e-8bb4-4575-abac-5e54c7d15d96 req-6cc7b61d-8d97-46d6-b585-7abe132f685c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-cd085899-5656-4c87-b235-43567e6b816e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:25:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:29Z|00688|binding|INFO|Releasing lport b648300b-e46c-4d3b-b02e-94ff684c03ae from this chassis (sb_readonly=0)
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.632 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:29 np0005593233 nova_compute[222017]: 2026-01-23 10:25:29.710 222021 DEBUG nova.network.neutron [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:25:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:29 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:25:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:29.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:29 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:25:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:30.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:31.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:32.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.408 222021 DEBUG nova.network.neutron [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Updating instance_info_cache with network_info: [{"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.433 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-cd085899-5656-4c87-b235-43567e6b816e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.433 222021 DEBUG nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Instance network_info: |[{"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.434 222021 DEBUG oslo_concurrency.lockutils [req-5db2454e-8bb4-4575-abac-5e54c7d15d96 req-6cc7b61d-8d97-46d6-b585-7abe132f685c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-cd085899-5656-4c87-b235-43567e6b816e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.435 222021 DEBUG nova.network.neutron [req-5db2454e-8bb4-4575-abac-5e54c7d15d96 req-6cc7b61d-8d97-46d6-b585-7abe132f685c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Refreshing network info cache for port 5bed4276-208c-472d-b28a-4a76e26a3285 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.440 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Start _get_guest_xml network_info=[{"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.447 222021 WARNING nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.454 222021 DEBUG nova.virt.libvirt.host [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.455 222021 DEBUG nova.virt.libvirt.host [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.459 222021 DEBUG nova.virt.libvirt.host [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.460 222021 DEBUG nova.virt.libvirt.host [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.462 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.463 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.464 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.464 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.465 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.465 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.466 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.466 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.467 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.467 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.468 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.468 222021 DEBUG nova.virt.hardware [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.473 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 e324: 3 total, 3 up, 3 in
Jan 23 05:25:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:25:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3941323184' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:25:32 np0005593233 nova_compute[222017]: 2026-01-23 10:25:32.982 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.037 222021 DEBUG nova.storage.rbd_utils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image cd085899-5656-4c87-b235-43567e6b816e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.043 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:25:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/671780183' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.501 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.504 222021 DEBUG nova.virt.libvirt.vif [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-932408062',display_name='tempest-TestNetworkBasicOps-server-932408062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-932408062',id=165,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKK8Atuka4AJtrZoE156bOV/fyJgqZXVsquRFlMysHXHrafbklSy5dVpV/aMD/KRbSZHlePXDsSv775nCAPQZZ1XbAJzMyp1k/QF7F5sNrn+rRNpl+EHkbtgnTpAgIReSw==',key_name='tempest-TestNetworkBasicOps-1199685944',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-l5ylr06b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:25:26Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=cd085899-5656-4c87-b235-43567e6b816e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.505 222021 DEBUG nova.network.os_vif_util [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.506 222021 DEBUG nova.network.os_vif_util [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.508 222021 DEBUG nova.objects.instance [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid cd085899-5656-4c87-b235-43567e6b816e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.583 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <uuid>cd085899-5656-4c87-b235-43567e6b816e</uuid>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <name>instance-000000a5</name>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestNetworkBasicOps-server-932408062</nova:name>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:25:32</nova:creationTime>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <nova:port uuid="5bed4276-208c-472d-b28a-4a76e26a3285">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <entry name="serial">cd085899-5656-4c87-b235-43567e6b816e</entry>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <entry name="uuid">cd085899-5656-4c87-b235-43567e6b816e</entry>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/cd085899-5656-4c87-b235-43567e6b816e_disk">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/cd085899-5656-4c87-b235-43567e6b816e_disk.config">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:38:3d:f1"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <target dev="tap5bed4276-20"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e/console.log" append="off"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:25:33 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:25:33 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:25:33 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:25:33 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.585 222021 DEBUG nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Preparing to wait for external event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.586 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "cd085899-5656-4c87-b235-43567e6b816e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.587 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.587 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.589 222021 DEBUG nova.virt.libvirt.vif [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-932408062',display_name='tempest-TestNetworkBasicOps-server-932408062',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-932408062',id=165,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKK8Atuka4AJtrZoE156bOV/fyJgqZXVsquRFlMysHXHrafbklSy5dVpV/aMD/KRbSZHlePXDsSv775nCAPQZZ1XbAJzMyp1k/QF7F5sNrn+rRNpl+EHkbtgnTpAgIReSw==',key_name='tempest-TestNetworkBasicOps-1199685944',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-l5ylr06b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:25:26Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=cd085899-5656-4c87-b235-43567e6b816e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.589 222021 DEBUG nova.network.os_vif_util [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.590 222021 DEBUG nova.network.os_vif_util [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.591 222021 DEBUG os_vif [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.592 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.593 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.594 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.598 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.599 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bed4276-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.600 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5bed4276-20, col_values=(('external_ids', {'iface-id': '5bed4276-208c-472d-b28a-4a76e26a3285', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:3d:f1', 'vm-uuid': 'cd085899-5656-4c87-b235-43567e6b816e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:33 np0005593233 NetworkManager[48871]: <info>  [1769163933.6044] manager: (tap5bed4276-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.606 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.613 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.615 222021 INFO os_vif [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20')#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.691 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.691 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.691 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:38:3d:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.692 222021 INFO nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Using config drive#033[00m
Jan 23 05:25:33 np0005593233 nova_compute[222017]: 2026-01-23 10:25:33.727 222021 DEBUG nova.storage.rbd_utils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image cd085899-5656-4c87-b235-43567e6b816e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:33.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:34.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:34 np0005593233 nova_compute[222017]: 2026-01-23 10:25:34.372 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:35 np0005593233 nova_compute[222017]: 2026-01-23 10:25:35.588 222021 INFO nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Creating config drive at /var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e/disk.config#033[00m
Jan 23 05:25:35 np0005593233 nova_compute[222017]: 2026-01-23 10:25:35.599 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgwvj0ulx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:35 np0005593233 nova_compute[222017]: 2026-01-23 10:25:35.762 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgwvj0ulx" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:35 np0005593233 nova_compute[222017]: 2026-01-23 10:25:35.815 222021 DEBUG nova.storage.rbd_utils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image cd085899-5656-4c87-b235-43567e6b816e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:35 np0005593233 nova_compute[222017]: 2026-01-23 10:25:35.821 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e/disk.config cd085899-5656-4c87-b235-43567e6b816e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:35.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.065 222021 DEBUG oslo_concurrency.processutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e/disk.config cd085899-5656-4c87-b235-43567e6b816e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.244s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.067 222021 INFO nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Deleting local config drive /var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e/disk.config because it was imported into RBD.#033[00m
Jan 23 05:25:36 np0005593233 kernel: tap5bed4276-20: entered promiscuous mode
Jan 23 05:25:36 np0005593233 NetworkManager[48871]: <info>  [1769163936.1531] manager: (tap5bed4276-20): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Jan 23 05:25:36 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:36Z|00689|binding|INFO|Claiming lport 5bed4276-208c-472d-b28a-4a76e26a3285 for this chassis.
Jan 23 05:25:36 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:36Z|00690|binding|INFO|5bed4276-208c-472d-b28a-4a76e26a3285: Claiming fa:16:3e:38:3d:f1 10.100.0.6
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.175 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:3d:f1 10.100.0.6'], port_security=['fa:16:3e:38:3d:f1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-629113122', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'cd085899-5656-4c87-b235-43567e6b816e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-629113122', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '104c556a-4616-455b-9049-a55a5af0ff57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf31d3f7-11b9-4331-9cba-3dbe5755a315, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5bed4276-208c-472d-b28a-4a76e26a3285) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.178 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5bed4276-208c-472d-b28a-4a76e26a3285 in datapath e713ea69-7ac7-43ba-86db-9b05cba5a525 bound to our chassis#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.181 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e713ea69-7ac7-43ba-86db-9b05cba5a525#033[00m
Jan 23 05:25:36 np0005593233 systemd-udevd[284992]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.204 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[614c62a4-18c0-42ae-935f-3fabfd3e3c69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.205 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape713ea69-71 in ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:25:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:36.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:36 np0005593233 NetworkManager[48871]: <info>  [1769163936.2122] device (tap5bed4276-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.210 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape713ea69-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.210 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4ebed4e6-2d3b-4972-9dea-94e0cccacad3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 NetworkManager[48871]: <info>  [1769163936.2140] device (tap5bed4276-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.214 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8884ee-51d0-4823-9348-edbfc954ebf1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 systemd-machined[190954]: New machine qemu-75-instance-000000a5.
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.234 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[36f37363-bae6-4c53-b961-7a19d1f17fb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 systemd[1]: Started Virtual Machine qemu-75-instance-000000a5.
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:36 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:36Z|00691|binding|INFO|Setting lport 5bed4276-208c-472d-b28a-4a76e26a3285 ovn-installed in OVS
Jan 23 05:25:36 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:36Z|00692|binding|INFO|Setting lport 5bed4276-208c-472d-b28a-4a76e26a3285 up in Southbound
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.260 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.265 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8dca7f35-e75f-4719-a7b7-26cbe5842ac6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.315 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f54ec46a-0fea-4fd6-a6ed-7a08cfa282d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 NetworkManager[48871]: <info>  [1769163936.3254] manager: (tape713ea69-70): new Veth device (/org/freedesktop/NetworkManager/Devices/319)
Jan 23 05:25:36 np0005593233 systemd-udevd[284998]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.326 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b152e3f1-7585-4f22-ad82-5514e8cd6089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.382 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[738f3fd6-c877-4f09-ac76-47997f4b86b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.386 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a665868f-6a39-4c3c-8106-f38e9c8755c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 NetworkManager[48871]: <info>  [1769163936.4221] device (tape713ea69-70): carrier: link connected
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.434 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b33c7a12-26d5-4694-83af-308d4e2cda5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.463 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd16478-75c2-4398-bbb6-385112da47c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape713ea69-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:2a:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 784905, 'reachable_time': 25870, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285028, 'error': None, 'target': 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.491 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0ee4d0-7420-4394-8f27-3163e89c779b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:2ab1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 784905, 'tstamp': 784905}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285029, 'error': None, 'target': 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.519 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[17b4e47a-b6d1-4663-9013-c10fa6bad496]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape713ea69-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:2a:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 784905, 'reachable_time': 25870, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285030, 'error': None, 'target': 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.579 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[786785dc-8d67-4490-8904-ae91ba36764c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.662 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[42df3226-61ff-4359-80ba-cdb809b0936c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.664 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape713ea69-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.664 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.665 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape713ea69-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.706 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:36 np0005593233 kernel: tape713ea69-70: entered promiscuous mode
Jan 23 05:25:36 np0005593233 NetworkManager[48871]: <info>  [1769163936.7075] manager: (tape713ea69-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.711 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape713ea69-70, col_values=(('external_ids', {'iface-id': '5ee55fc7-887b-474a-9dd6-1dad231cc12c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:36 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:36Z|00693|binding|INFO|Releasing lport 5ee55fc7-887b-474a-9dd6-1dad231cc12c from this chassis (sb_readonly=0)
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.719 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.729 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.730 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e713ea69-7ac7-43ba-86db-9b05cba5a525.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e713ea69-7ac7-43ba-86db-9b05cba5a525.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.731 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1030dad9-7dcd-4012-ae45-5e1d297c9fce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.732 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-e713ea69-7ac7-43ba-86db-9b05cba5a525
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/e713ea69-7ac7-43ba-86db-9b05cba5a525.pid.haproxy
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID e713ea69-7ac7-43ba-86db-9b05cba5a525
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:25:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:36.733 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'env', 'PROCESS_TAG=haproxy-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e713ea69-7ac7-43ba-86db-9b05cba5a525.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.817 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163936.817351, cd085899-5656-4c87-b235-43567e6b816e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.818 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] VM Started (Lifecycle Event)#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.841 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.846 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163936.8184795, cd085899-5656-4c87-b235-43567e6b816e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.846 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.866 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.871 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:25:36 np0005593233 nova_compute[222017]: 2026-01-23 10:25:36.895 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:25:37 np0005593233 podman[285104]: 2026-01-23 10:25:37.273830905 +0000 UTC m=+0.076928897 container create 2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:25:37 np0005593233 podman[285104]: 2026-01-23 10:25:37.231395584 +0000 UTC m=+0.034493626 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:25:37 np0005593233 systemd[1]: Started libpod-conmon-2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5.scope.
Jan 23 05:25:37 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:25:37 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3ce9bc2fffbff4f71bbab8f5382dc7ca838f22583e9e1c06d57908181561fc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:25:37 np0005593233 podman[285104]: 2026-01-23 10:25:37.406346323 +0000 UTC m=+0.209444295 container init 2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:25:37 np0005593233 podman[285104]: 2026-01-23 10:25:37.417722505 +0000 UTC m=+0.220820467 container start 2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:25:37 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285121]: [NOTICE]   (285125) : New worker (285127) forked
Jan 23 05:25:37 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285121]: [NOTICE]   (285125) : Loading success.
Jan 23 05:25:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:37.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.879 222021 DEBUG nova.compute.manager [req-871b35e7-58e2-4bb9-a0c6-302c0c25d9df req-676203e7-d2c5-4a37-8546-8cb26668ad28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Received event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.880 222021 DEBUG oslo_concurrency.lockutils [req-871b35e7-58e2-4bb9-a0c6-302c0c25d9df req-676203e7-d2c5-4a37-8546-8cb26668ad28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cd085899-5656-4c87-b235-43567e6b816e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.881 222021 DEBUG oslo_concurrency.lockutils [req-871b35e7-58e2-4bb9-a0c6-302c0c25d9df req-676203e7-d2c5-4a37-8546-8cb26668ad28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.881 222021 DEBUG oslo_concurrency.lockutils [req-871b35e7-58e2-4bb9-a0c6-302c0c25d9df req-676203e7-d2c5-4a37-8546-8cb26668ad28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.882 222021 DEBUG nova.compute.manager [req-871b35e7-58e2-4bb9-a0c6-302c0c25d9df req-676203e7-d2c5-4a37-8546-8cb26668ad28 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Processing event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.883 222021 DEBUG nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.891 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.892 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163937.8907251, cd085899-5656-4c87-b235-43567e6b816e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.893 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.901 222021 INFO nova.virt.libvirt.driver [-] [instance: cd085899-5656-4c87-b235-43567e6b816e] Instance spawned successfully.#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.901 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.916 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.927 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.935 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.936 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.937 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.938 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.938 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.939 222021 DEBUG nova.virt.libvirt.driver [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:37 np0005593233 nova_compute[222017]: 2026-01-23 10:25:37.975 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:25:38 np0005593233 nova_compute[222017]: 2026-01-23 10:25:38.036 222021 INFO nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Took 11.34 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:25:38 np0005593233 nova_compute[222017]: 2026-01-23 10:25:38.037 222021 DEBUG nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:38 np0005593233 nova_compute[222017]: 2026-01-23 10:25:38.124 222021 INFO nova.compute.manager [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Took 12.48 seconds to build instance.#033[00m
Jan 23 05:25:38 np0005593233 nova_compute[222017]: 2026-01-23 10:25:38.147 222021 DEBUG oslo_concurrency.lockutils [None req-729dc440-e43c-4ea2-8f64-e87f66006f98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:38.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:38 np0005593233 nova_compute[222017]: 2026-01-23 10:25:38.605 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:38 np0005593233 nova_compute[222017]: 2026-01-23 10:25:38.796 222021 DEBUG nova.network.neutron [req-5db2454e-8bb4-4575-abac-5e54c7d15d96 req-6cc7b61d-8d97-46d6-b585-7abe132f685c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Updated VIF entry in instance network info cache for port 5bed4276-208c-472d-b28a-4a76e26a3285. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:25:38 np0005593233 nova_compute[222017]: 2026-01-23 10:25:38.797 222021 DEBUG nova.network.neutron [req-5db2454e-8bb4-4575-abac-5e54c7d15d96 req-6cc7b61d-8d97-46d6-b585-7abe132f685c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Updating instance_info_cache with network_info: [{"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:38 np0005593233 nova_compute[222017]: 2026-01-23 10:25:38.980 222021 DEBUG oslo_concurrency.lockutils [req-5db2454e-8bb4-4575-abac-5e54c7d15d96 req-6cc7b61d-8d97-46d6-b585-7abe132f685c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-cd085899-5656-4c87-b235-43567e6b816e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:39 np0005593233 nova_compute[222017]: 2026-01-23 10:25:39.375 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:39.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:40.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:40 np0005593233 nova_compute[222017]: 2026-01-23 10:25:40.859 222021 DEBUG nova.compute.manager [req-80c4582a-3f92-4831-8384-3c237f6b2c09 req-fac6790e-15e2-4777-bf06-2531c77ab261 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Received event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:40 np0005593233 nova_compute[222017]: 2026-01-23 10:25:40.860 222021 DEBUG oslo_concurrency.lockutils [req-80c4582a-3f92-4831-8384-3c237f6b2c09 req-fac6790e-15e2-4777-bf06-2531c77ab261 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cd085899-5656-4c87-b235-43567e6b816e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:40 np0005593233 nova_compute[222017]: 2026-01-23 10:25:40.860 222021 DEBUG oslo_concurrency.lockutils [req-80c4582a-3f92-4831-8384-3c237f6b2c09 req-fac6790e-15e2-4777-bf06-2531c77ab261 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:40 np0005593233 nova_compute[222017]: 2026-01-23 10:25:40.860 222021 DEBUG oslo_concurrency.lockutils [req-80c4582a-3f92-4831-8384-3c237f6b2c09 req-fac6790e-15e2-4777-bf06-2531c77ab261 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:40 np0005593233 nova_compute[222017]: 2026-01-23 10:25:40.861 222021 DEBUG nova.compute.manager [req-80c4582a-3f92-4831-8384-3c237f6b2c09 req-fac6790e-15e2-4777-bf06-2531c77ab261 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] No waiting events found dispatching network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:40 np0005593233 nova_compute[222017]: 2026-01-23 10:25:40.861 222021 WARNING nova.compute.manager [req-80c4582a-3f92-4831-8384-3c237f6b2c09 req-fac6790e-15e2-4777-bf06-2531c77ab261 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Received unexpected event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:25:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:41.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:42.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:42.688 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:42.689 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:42.690 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:43 np0005593233 podman[285136]: 2026-01-23 10:25:43.093949547 +0000 UTC m=+0.089224595 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:25:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:43 np0005593233 nova_compute[222017]: 2026-01-23 10:25:43.608 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:43.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:43 np0005593233 nova_compute[222017]: 2026-01-23 10:25:43.924 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:43 np0005593233 NetworkManager[48871]: <info>  [1769163943.9267] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Jan 23 05:25:43 np0005593233 NetworkManager[48871]: <info>  [1769163943.9297] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Jan 23 05:25:44 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:44Z|00694|binding|INFO|Releasing lport 5ee55fc7-887b-474a-9dd6-1dad231cc12c from this chassis (sb_readonly=0)
Jan 23 05:25:44 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:44Z|00695|binding|INFO|Releasing lport b648300b-e46c-4d3b-b02e-94ff684c03ae from this chassis (sb_readonly=0)
Jan 23 05:25:44 np0005593233 nova_compute[222017]: 2026-01-23 10:25:44.033 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:44 np0005593233 nova_compute[222017]: 2026-01-23 10:25:44.047 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:44.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:44 np0005593233 nova_compute[222017]: 2026-01-23 10:25:44.383 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:25:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3772001341' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:25:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:25:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3772001341' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.217 222021 DEBUG nova.compute.manager [req-d360fc15-b200-465d-9d96-6b3b58eb5427 req-74c88494-8451-4fef-9ca9-a925093d2293 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Received event network-changed-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.218 222021 DEBUG nova.compute.manager [req-d360fc15-b200-465d-9d96-6b3b58eb5427 req-74c88494-8451-4fef-9ca9-a925093d2293 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Refreshing instance network info cache due to event network-changed-5bed4276-208c-472d-b28a-4a76e26a3285. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.219 222021 DEBUG oslo_concurrency.lockutils [req-d360fc15-b200-465d-9d96-6b3b58eb5427 req-74c88494-8451-4fef-9ca9-a925093d2293 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-cd085899-5656-4c87-b235-43567e6b816e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.219 222021 DEBUG oslo_concurrency.lockutils [req-d360fc15-b200-465d-9d96-6b3b58eb5427 req-74c88494-8451-4fef-9ca9-a925093d2293 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-cd085899-5656-4c87-b235-43567e6b816e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.219 222021 DEBUG nova.network.neutron [req-d360fc15-b200-465d-9d96-6b3b58eb5427 req-74c88494-8451-4fef-9ca9-a925093d2293 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Refreshing network info cache for port 5bed4276-208c-472d-b28a-4a76e26a3285 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.646 222021 DEBUG oslo_concurrency.lockutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "cd085899-5656-4c87-b235-43567e6b816e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.647 222021 DEBUG oslo_concurrency.lockutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.647 222021 DEBUG oslo_concurrency.lockutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "cd085899-5656-4c87-b235-43567e6b816e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.649 222021 DEBUG oslo_concurrency.lockutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.649 222021 DEBUG oslo_concurrency.lockutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.651 222021 INFO nova.compute.manager [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Terminating instance#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.653 222021 DEBUG nova.compute.manager [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:25:45 np0005593233 kernel: tap5bed4276-20 (unregistering): left promiscuous mode
Jan 23 05:25:45 np0005593233 NetworkManager[48871]: <info>  [1769163945.7055] device (tap5bed4276-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.708 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:45Z|00696|binding|INFO|Releasing lport 5bed4276-208c-472d-b28a-4a76e26a3285 from this chassis (sb_readonly=0)
Jan 23 05:25:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:45Z|00697|binding|INFO|Setting lport 5bed4276-208c-472d-b28a-4a76e26a3285 down in Southbound
Jan 23 05:25:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:25:45Z|00698|binding|INFO|Removing iface tap5bed4276-20 ovn-installed in OVS
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.713 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:45.719 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:3d:f1 10.100.0.6'], port_security=['fa:16:3e:38:3d:f1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-629113122', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'cd085899-5656-4c87-b235-43567e6b816e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-629113122', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '104c556a-4616-455b-9049-a55a5af0ff57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.242'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf31d3f7-11b9-4331-9cba-3dbe5755a315, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5bed4276-208c-472d-b28a-4a76e26a3285) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:45.720 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5bed4276-208c-472d-b28a-4a76e26a3285 in datapath e713ea69-7ac7-43ba-86db-9b05cba5a525 unbound from our chassis#033[00m
Jan 23 05:25:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:45.721 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e713ea69-7ac7-43ba-86db-9b05cba5a525, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:25:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:45.723 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8e42ca-9264-4080-95a7-1034239c0c85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:45.723 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 namespace which is not needed anymore#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.732 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:45 np0005593233 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Jan 23 05:25:45 np0005593233 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a5.scope: Consumed 8.550s CPU time.
Jan 23 05:25:45 np0005593233 systemd-machined[190954]: Machine qemu-75-instance-000000a5 terminated.
Jan 23 05:25:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:45.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.891 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.899 222021 INFO nova.virt.libvirt.driver [-] [instance: cd085899-5656-4c87-b235-43567e6b816e] Instance destroyed successfully.#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.900 222021 DEBUG nova.objects.instance [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'resources' on Instance uuid cd085899-5656-4c87-b235-43567e6b816e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:45 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285121]: [NOTICE]   (285125) : haproxy version is 2.8.14-c23fe91
Jan 23 05:25:45 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285121]: [NOTICE]   (285125) : path to executable is /usr/sbin/haproxy
Jan 23 05:25:45 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285121]: [WARNING]  (285125) : Exiting Master process...
Jan 23 05:25:45 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285121]: [ALERT]    (285125) : Current worker (285127) exited with code 143 (Terminated)
Jan 23 05:25:45 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285121]: [WARNING]  (285125) : All workers exited. Exiting... (0)
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.922 222021 DEBUG nova.virt.libvirt.vif [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:25:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-932408062',display_name='tempest-TestNetworkBasicOps-server-932408062',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-932408062',id=165,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKK8Atuka4AJtrZoE156bOV/fyJgqZXVsquRFlMysHXHrafbklSy5dVpV/aMD/KRbSZHlePXDsSv775nCAPQZZ1XbAJzMyp1k/QF7F5sNrn+rRNpl+EHkbtgnTpAgIReSw==',key_name='tempest-TestNetworkBasicOps-1199685944',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:25:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-l5ylr06b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:25:38Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=cd085899-5656-4c87-b235-43567e6b816e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.922 222021 DEBUG nova.network.os_vif_util [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.923 222021 DEBUG nova.network.os_vif_util [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:45 np0005593233 systemd[1]: libpod-2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5.scope: Deactivated successfully.
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.923 222021 DEBUG os_vif [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.924 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.925 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bed4276-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.926 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:45 np0005593233 podman[285183]: 2026-01-23 10:25:45.928677136 +0000 UTC m=+0.080297472 container died 2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.930 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.933 222021 INFO os_vif [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20')#033[00m
Jan 23 05:25:45 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5-userdata-shm.mount: Deactivated successfully.
Jan 23 05:25:45 np0005593233 systemd[1]: var-lib-containers-storage-overlay-e3ce9bc2fffbff4f71bbab8f5382dc7ca838f22583e9e1c06d57908181561fc9-merged.mount: Deactivated successfully.
Jan 23 05:25:45 np0005593233 podman[285183]: 2026-01-23 10:25:45.969958054 +0000 UTC m=+0.121578420 container cleanup 2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.975 222021 DEBUG nova.compute.manager [req-21fb5746-9d89-4561-97df-5a31f018270e req-0cd034b5-78cd-4ead-8858-c70595ef0d59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Received event network-vif-unplugged-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.977 222021 DEBUG oslo_concurrency.lockutils [req-21fb5746-9d89-4561-97df-5a31f018270e req-0cd034b5-78cd-4ead-8858-c70595ef0d59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cd085899-5656-4c87-b235-43567e6b816e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.978 222021 DEBUG oslo_concurrency.lockutils [req-21fb5746-9d89-4561-97df-5a31f018270e req-0cd034b5-78cd-4ead-8858-c70595ef0d59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.981 222021 DEBUG oslo_concurrency.lockutils [req-21fb5746-9d89-4561-97df-5a31f018270e req-0cd034b5-78cd-4ead-8858-c70595ef0d59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.982 222021 DEBUG nova.compute.manager [req-21fb5746-9d89-4561-97df-5a31f018270e req-0cd034b5-78cd-4ead-8858-c70595ef0d59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] No waiting events found dispatching network-vif-unplugged-5bed4276-208c-472d-b28a-4a76e26a3285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:45 np0005593233 nova_compute[222017]: 2026-01-23 10:25:45.983 222021 DEBUG nova.compute.manager [req-21fb5746-9d89-4561-97df-5a31f018270e req-0cd034b5-78cd-4ead-8858-c70595ef0d59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Received event network-vif-unplugged-5bed4276-208c-472d-b28a-4a76e26a3285 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:25:46 np0005593233 systemd[1]: libpod-conmon-2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5.scope: Deactivated successfully.
Jan 23 05:25:46 np0005593233 podman[285232]: 2026-01-23 10:25:46.093075506 +0000 UTC m=+0.071955776 container remove 2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:25:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:46.102 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6dce9848-b987-41c3-b690-b3883cabae0b]: (4, ('Fri Jan 23 10:25:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 (2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5)\n2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5\nFri Jan 23 10:25:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 (2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5)\n2d093f1087b572d8a4f013d09fa086af8ab15f2f8126da5518d2bc595db675e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:46.104 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dcd478af-e26d-4abe-ac94-b2edb76f2cb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:46.105 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape713ea69-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.108 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:46 np0005593233 kernel: tape713ea69-70: left promiscuous mode
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.123 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:46.126 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5479a77e-65c4-4426-a310-487304ab326d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:46.143 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5e963a8e-199b-415c-bf2b-844903cd841e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:46.145 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e86210-383a-4f8b-ae07-77dad1299094]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:46.169 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1ccb98b3-71bc-4098-a97f-08a8c3612666]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 784893, 'reachable_time': 36020, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285253, 'error': None, 'target': 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:46 np0005593233 systemd[1]: run-netns-ovnmeta\x2de713ea69\x2d7ac7\x2d43ba\x2d86db\x2d9b05cba5a525.mount: Deactivated successfully.
Jan 23 05:25:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:46.175 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:25:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:46.175 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a78bd6-815c-43ad-9b19-7a07d2e4626c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:46.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.411 222021 INFO nova.virt.libvirt.driver [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Deleting instance files /var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e_del#033[00m
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.413 222021 INFO nova.virt.libvirt.driver [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Deletion of /var/lib/nova/instances/cd085899-5656-4c87-b235-43567e6b816e_del complete#033[00m
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.486 222021 INFO nova.compute.manager [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Took 0.83 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.487 222021 DEBUG oslo.service.loopingcall [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.488 222021 DEBUG nova.compute.manager [-] [instance: cd085899-5656-4c87-b235-43567e6b816e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.488 222021 DEBUG nova.network.neutron [-] [instance: cd085899-5656-4c87-b235-43567e6b816e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.908 222021 DEBUG nova.network.neutron [req-d360fc15-b200-465d-9d96-6b3b58eb5427 req-74c88494-8451-4fef-9ca9-a925093d2293 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Updated VIF entry in instance network info cache for port 5bed4276-208c-472d-b28a-4a76e26a3285. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.909 222021 DEBUG nova.network.neutron [req-d360fc15-b200-465d-9d96-6b3b58eb5427 req-74c88494-8451-4fef-9ca9-a925093d2293 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Updating instance_info_cache with network_info: [{"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:46 np0005593233 nova_compute[222017]: 2026-01-23 10:25:46.934 222021 DEBUG oslo_concurrency.lockutils [req-d360fc15-b200-465d-9d96-6b3b58eb5427 req-74c88494-8451-4fef-9ca9-a925093d2293 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-cd085899-5656-4c87-b235-43567e6b816e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:47.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.178 222021 DEBUG nova.compute.manager [req-db1bf348-b17d-4831-a6f6-beafe3fbbf6d req-7c0420b5-369e-4224-832f-046bc0ef8359 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Received event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.179 222021 DEBUG oslo_concurrency.lockutils [req-db1bf348-b17d-4831-a6f6-beafe3fbbf6d req-7c0420b5-369e-4224-832f-046bc0ef8359 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cd085899-5656-4c87-b235-43567e6b816e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.180 222021 DEBUG oslo_concurrency.lockutils [req-db1bf348-b17d-4831-a6f6-beafe3fbbf6d req-7c0420b5-369e-4224-832f-046bc0ef8359 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.181 222021 DEBUG oslo_concurrency.lockutils [req-db1bf348-b17d-4831-a6f6-beafe3fbbf6d req-7c0420b5-369e-4224-832f-046bc0ef8359 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.181 222021 DEBUG nova.compute.manager [req-db1bf348-b17d-4831-a6f6-beafe3fbbf6d req-7c0420b5-369e-4224-832f-046bc0ef8359 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] No waiting events found dispatching network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.182 222021 WARNING nova.compute.manager [req-db1bf348-b17d-4831-a6f6-beafe3fbbf6d req-7c0420b5-369e-4224-832f-046bc0ef8359 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cd085899-5656-4c87-b235-43567e6b816e] Received unexpected event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:25:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:48.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.826 222021 DEBUG nova.network.neutron [-] [instance: cd085899-5656-4c87-b235-43567e6b816e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.869 222021 INFO nova.compute.manager [-] [instance: cd085899-5656-4c87-b235-43567e6b816e] Took 2.38 seconds to deallocate network for instance.#033[00m
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.958 222021 DEBUG oslo_concurrency.lockutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:48 np0005593233 nova_compute[222017]: 2026-01-23 10:25:48.959 222021 DEBUG oslo_concurrency.lockutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:49 np0005593233 nova_compute[222017]: 2026-01-23 10:25:49.038 222021 DEBUG oslo_concurrency.processutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:49 np0005593233 nova_compute[222017]: 2026-01-23 10:25:49.404 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:25:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/864784306' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:49 np0005593233 nova_compute[222017]: 2026-01-23 10:25:49.547 222021 DEBUG oslo_concurrency.processutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:49 np0005593233 nova_compute[222017]: 2026-01-23 10:25:49.558 222021 DEBUG nova.compute.provider_tree [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:49 np0005593233 nova_compute[222017]: 2026-01-23 10:25:49.635 222021 DEBUG nova.scheduler.client.report [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:49 np0005593233 nova_compute[222017]: 2026-01-23 10:25:49.685 222021 DEBUG oslo_concurrency.lockutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:49 np0005593233 nova_compute[222017]: 2026-01-23 10:25:49.733 222021 INFO nova.scheduler.client.report [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Deleted allocations for instance cd085899-5656-4c87-b235-43567e6b816e#033[00m
Jan 23 05:25:49 np0005593233 nova_compute[222017]: 2026-01-23 10:25:49.847 222021 DEBUG oslo_concurrency.lockutils [None req-40d8658b-cd96-4e02-b982-ea530b8588f0 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "cd085899-5656-4c87-b235-43567e6b816e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:49.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:50.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:50 np0005593233 nova_compute[222017]: 2026-01-23 10:25:50.928 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:51 np0005593233 nova_compute[222017]: 2026-01-23 10:25:51.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:51.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:52.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:52 np0005593233 nova_compute[222017]: 2026-01-23 10:25:52.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:52 np0005593233 nova_compute[222017]: 2026-01-23 10:25:52.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:52 np0005593233 nova_compute[222017]: 2026-01-23 10:25:52.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:52 np0005593233 nova_compute[222017]: 2026-01-23 10:25:52.416 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:52 np0005593233 nova_compute[222017]: 2026-01-23 10:25:52.417 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:52 np0005593233 nova_compute[222017]: 2026-01-23 10:25:52.417 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:52 np0005593233 nova_compute[222017]: 2026-01-23 10:25:52.418 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:25:52 np0005593233 nova_compute[222017]: 2026-01-23 10:25:52.418 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:25:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4238971846' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:52 np0005593233 nova_compute[222017]: 2026-01-23 10:25:52.944 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.073 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.075 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.297 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.299 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4184MB free_disk=20.94263458251953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.299 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.300 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.407 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.408 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.409 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.490 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:53.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:25:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/171495245' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.949 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.957 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:53 np0005593233 nova_compute[222017]: 2026-01-23 10:25:53.977 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:54 np0005593233 nova_compute[222017]: 2026-01-23 10:25:54.007 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:25:54 np0005593233 nova_compute[222017]: 2026-01-23 10:25:54.007 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:54.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:54 np0005593233 nova_compute[222017]: 2026-01-23 10:25:54.405 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:55 np0005593233 nova_compute[222017]: 2026-01-23 10:25:55.212 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:55.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:55 np0005593233 nova_compute[222017]: 2026-01-23 10:25:55.932 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:56.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:56.468 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:25:56.469 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:25:56 np0005593233 nova_compute[222017]: 2026-01-23 10:25:56.510 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:57.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:58 np0005593233 nova_compute[222017]: 2026-01-23 10:25:58.007 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:58 np0005593233 nova_compute[222017]: 2026-01-23 10:25:58.007 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:58 np0005593233 podman[285322]: 2026-01-23 10:25:58.105404956 +0000 UTC m=+0.116562618 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 23 05:25:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:25:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:58.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:25:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:59 np0005593233 nova_compute[222017]: 2026-01-23 10:25:59.409 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:25:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:59.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:00.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:00 np0005593233 nova_compute[222017]: 2026-01-23 10:26:00.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:00 np0005593233 nova_compute[222017]: 2026-01-23 10:26:00.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:26:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:00.471 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:00 np0005593233 nova_compute[222017]: 2026-01-23 10:26:00.898 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163945.8959608, cd085899-5656-4c87-b235-43567e6b816e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:00 np0005593233 nova_compute[222017]: 2026-01-23 10:26:00.898 222021 INFO nova.compute.manager [-] [instance: cd085899-5656-4c87-b235-43567e6b816e] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:26:00 np0005593233 nova_compute[222017]: 2026-01-23 10:26:00.932 222021 DEBUG nova.compute.manager [None req-1b10e6f3-2522-430d-9a49-cb189b39260d - - - - - -] [instance: cd085899-5656-4c87-b235-43567e6b816e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:00 np0005593233 nova_compute[222017]: 2026-01-23 10:26:00.934 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:01.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:02.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:03.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:04.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:04 np0005593233 nova_compute[222017]: 2026-01-23 10:26:04.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:04 np0005593233 nova_compute[222017]: 2026-01-23 10:26:04.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:26:04 np0005593233 nova_compute[222017]: 2026-01-23 10:26:04.411 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:04 np0005593233 nova_compute[222017]: 2026-01-23 10:26:04.767 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "3bfa1478-0947-4811-85ff-d4060ce986d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:04 np0005593233 nova_compute[222017]: 2026-01-23 10:26:04.767 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:04 np0005593233 nova_compute[222017]: 2026-01-23 10:26:04.893 222021 DEBUG nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:26:05 np0005593233 nova_compute[222017]: 2026-01-23 10:26:05.001 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:05 np0005593233 nova_compute[222017]: 2026-01-23 10:26:05.002 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:05 np0005593233 nova_compute[222017]: 2026-01-23 10:26:05.011 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:26:05 np0005593233 nova_compute[222017]: 2026-01-23 10:26:05.012 222021 INFO nova.compute.claims [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:26:05 np0005593233 nova_compute[222017]: 2026-01-23 10:26:05.716 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:05 np0005593233 nova_compute[222017]: 2026-01-23 10:26:05.768 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:05 np0005593233 nova_compute[222017]: 2026-01-23 10:26:05.769 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:05 np0005593233 nova_compute[222017]: 2026-01-23 10:26:05.769 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:26:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:05.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:05 np0005593233 nova_compute[222017]: 2026-01-23 10:26:05.936 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:26:06 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1429278563' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:06 np0005593233 nova_compute[222017]: 2026-01-23 10:26:06.214 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:06 np0005593233 nova_compute[222017]: 2026-01-23 10:26:06.223 222021 DEBUG nova.compute.provider_tree [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:26:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:06.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:06 np0005593233 nova_compute[222017]: 2026-01-23 10:26:06.885 222021 DEBUG nova.scheduler.client.report [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.111 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.112 222021 DEBUG nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.177 222021 DEBUG nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.178 222021 DEBUG nova.network.neutron [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.205 222021 INFO nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.229 222021 DEBUG nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.339 222021 DEBUG nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.341 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.341 222021 INFO nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Creating image(s)#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.374 222021 DEBUG nova.storage.rbd_utils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 3bfa1478-0947-4811-85ff-d4060ce986d5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.414 222021 DEBUG nova.storage.rbd_utils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 3bfa1478-0947-4811-85ff-d4060ce986d5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.490 222021 DEBUG nova.storage.rbd_utils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 3bfa1478-0947-4811-85ff-d4060ce986d5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.495 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.589 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.591 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.592 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.592 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.624 222021 DEBUG nova.storage.rbd_utils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 3bfa1478-0947-4811-85ff-d4060ce986d5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:07 np0005593233 nova_compute[222017]: 2026-01-23 10:26:07.629 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3bfa1478-0947-4811-85ff-d4060ce986d5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:07.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:08 np0005593233 nova_compute[222017]: 2026-01-23 10:26:08.021 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3bfa1478-0947-4811-85ff-d4060ce986d5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:08 np0005593233 nova_compute[222017]: 2026-01-23 10:26:08.130 222021 DEBUG nova.storage.rbd_utils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] resizing rbd image 3bfa1478-0947-4811-85ff-d4060ce986d5_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:26:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:08.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:08 np0005593233 nova_compute[222017]: 2026-01-23 10:26:08.280 222021 DEBUG nova.objects.instance [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 3bfa1478-0947-4811-85ff-d4060ce986d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:09 np0005593233 nova_compute[222017]: 2026-01-23 10:26:09.072 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:26:09 np0005593233 nova_compute[222017]: 2026-01-23 10:26:09.073 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Ensure instance console log exists: /var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:26:09 np0005593233 nova_compute[222017]: 2026-01-23 10:26:09.074 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:09 np0005593233 nova_compute[222017]: 2026-01-23 10:26:09.074 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:09 np0005593233 nova_compute[222017]: 2026-01-23 10:26:09.075 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:09 np0005593233 nova_compute[222017]: 2026-01-23 10:26:09.243 222021 DEBUG nova.policy [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:26:09 np0005593233 nova_compute[222017]: 2026-01-23 10:26:09.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:09.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:10.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:10 np0005593233 nova_compute[222017]: 2026-01-23 10:26:10.939 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:11.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:12.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:12 np0005593233 nova_compute[222017]: 2026-01-23 10:26:12.473 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updating instance_info_cache with network_info: [{"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:12 np0005593233 nova_compute[222017]: 2026-01-23 10:26:12.544 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:12 np0005593233 nova_compute[222017]: 2026-01-23 10:26:12.545 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:26:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:13.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:14 np0005593233 podman[285538]: 2026-01-23 10:26:14.130341441 +0000 UTC m=+0.118310373 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:26:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:14.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:14 np0005593233 nova_compute[222017]: 2026-01-23 10:26:14.417 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:14 np0005593233 nova_compute[222017]: 2026-01-23 10:26:14.539 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:15 np0005593233 nova_compute[222017]: 2026-01-23 10:26:15.843 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:15.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:15 np0005593233 nova_compute[222017]: 2026-01-23 10:26:15.941 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:16 np0005593233 nova_compute[222017]: 2026-01-23 10:26:16.239 222021 DEBUG nova.network.neutron [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Successfully updated port: 5bed4276-208c-472d-b28a-4a76e26a3285 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:26:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:16.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:16 np0005593233 nova_compute[222017]: 2026-01-23 10:26:16.269 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-3bfa1478-0947-4811-85ff-d4060ce986d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:16 np0005593233 nova_compute[222017]: 2026-01-23 10:26:16.270 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-3bfa1478-0947-4811-85ff-d4060ce986d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:16 np0005593233 nova_compute[222017]: 2026-01-23 10:26:16.270 222021 DEBUG nova.network.neutron [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:26:16 np0005593233 nova_compute[222017]: 2026-01-23 10:26:16.397 222021 DEBUG nova.compute.manager [req-d8bc1f7e-d3b2-46a8-98b2-7837ce326061 req-68c2ea66-c34a-47ac-8101-618f4f9f5801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Received event network-changed-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:16 np0005593233 nova_compute[222017]: 2026-01-23 10:26:16.398 222021 DEBUG nova.compute.manager [req-d8bc1f7e-d3b2-46a8-98b2-7837ce326061 req-68c2ea66-c34a-47ac-8101-618f4f9f5801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Refreshing instance network info cache due to event network-changed-5bed4276-208c-472d-b28a-4a76e26a3285. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:16 np0005593233 nova_compute[222017]: 2026-01-23 10:26:16.398 222021 DEBUG oslo_concurrency.lockutils [req-d8bc1f7e-d3b2-46a8-98b2-7837ce326061 req-68c2ea66-c34a-47ac-8101-618f4f9f5801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-3bfa1478-0947-4811-85ff-d4060ce986d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:16 np0005593233 nova_compute[222017]: 2026-01-23 10:26:16.574 222021 DEBUG nova.network.neutron [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:26:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:17.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.146 222021 DEBUG nova.network.neutron [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Updating instance_info_cache with network_info: [{"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.195 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-3bfa1478-0947-4811-85ff-d4060ce986d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.196 222021 DEBUG nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Instance network_info: |[{"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.197 222021 DEBUG oslo_concurrency.lockutils [req-d8bc1f7e-d3b2-46a8-98b2-7837ce326061 req-68c2ea66-c34a-47ac-8101-618f4f9f5801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-3bfa1478-0947-4811-85ff-d4060ce986d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.198 222021 DEBUG nova.network.neutron [req-d8bc1f7e-d3b2-46a8-98b2-7837ce326061 req-68c2ea66-c34a-47ac-8101-618f4f9f5801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Refreshing network info cache for port 5bed4276-208c-472d-b28a-4a76e26a3285 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.203 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Start _get_guest_xml network_info=[{"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.213 222021 WARNING nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.221 222021 DEBUG nova.virt.libvirt.host [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.223 222021 DEBUG nova.virt.libvirt.host [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.233 222021 DEBUG nova.virt.libvirt.host [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.234 222021 DEBUG nova.virt.libvirt.host [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.236 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.237 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.238 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.238 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.239 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.239 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.240 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.240 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.241 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.242 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.242 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.243 222021 DEBUG nova.virt.hardware [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.248 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:18.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:26:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/122562768' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.728 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.770 222021 DEBUG nova.storage.rbd_utils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 3bfa1478-0947-4811-85ff-d4060ce986d5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:18 np0005593233 nova_compute[222017]: 2026-01-23 10:26:18.779 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:26:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2359491398' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.283 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.287 222021 DEBUG nova.virt.libvirt.vif [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1818933455',display_name='tempest-TestNetworkBasicOps-server-1818933455',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1818933455',id=167,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJV3DQUn9v/PqATvS0xW9EKwQ+s1mwCIfuAQUovJGyzqGg9YbwQnX3+Q3f+XhHouwk8ltI98mwJS81A9BK2zM75KOD2Z+eu45Wv9M0CfKV0DODbZ3EI60nuDBJpWTSgPrg==',key_name='tempest-TestNetworkBasicOps-1304979968',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-zja4fskr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:07Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=3bfa1478-0947-4811-85ff-d4060ce986d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.288 222021 DEBUG nova.network.os_vif_util [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.290 222021 DEBUG nova.network.os_vif_util [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.292 222021 DEBUG nova.objects.instance [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3bfa1478-0947-4811-85ff-d4060ce986d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.311 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <uuid>3bfa1478-0947-4811-85ff-d4060ce986d5</uuid>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <name>instance-000000a7</name>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestNetworkBasicOps-server-1818933455</nova:name>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:26:18</nova:creationTime>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <nova:port uuid="5bed4276-208c-472d-b28a-4a76e26a3285">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <entry name="serial">3bfa1478-0947-4811-85ff-d4060ce986d5</entry>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <entry name="uuid">3bfa1478-0947-4811-85ff-d4060ce986d5</entry>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3bfa1478-0947-4811-85ff-d4060ce986d5_disk">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3bfa1478-0947-4811-85ff-d4060ce986d5_disk.config">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:38:3d:f1"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <target dev="tap5bed4276-20"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5/console.log" append="off"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:26:19 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:26:19 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:26:19 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:26:19 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.311 222021 DEBUG nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Preparing to wait for external event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.312 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.312 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.312 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.313 222021 DEBUG nova.virt.libvirt.vif [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1818933455',display_name='tempest-TestNetworkBasicOps-server-1818933455',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1818933455',id=167,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJV3DQUn9v/PqATvS0xW9EKwQ+s1mwCIfuAQUovJGyzqGg9YbwQnX3+Q3f+XhHouwk8ltI98mwJS81A9BK2zM75KOD2Z+eu45Wv9M0CfKV0DODbZ3EI60nuDBJpWTSgPrg==',key_name='tempest-TestNetworkBasicOps-1304979968',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-zja4fskr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:07Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=3bfa1478-0947-4811-85ff-d4060ce986d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.313 222021 DEBUG nova.network.os_vif_util [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.313 222021 DEBUG nova.network.os_vif_util [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.314 222021 DEBUG os_vif [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.314 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.315 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.315 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.318 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.318 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bed4276-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.319 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5bed4276-20, col_values=(('external_ids', {'iface-id': '5bed4276-208c-472d-b28a-4a76e26a3285', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:3d:f1', 'vm-uuid': '3bfa1478-0947-4811-85ff-d4060ce986d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.320 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:19 np0005593233 NetworkManager[48871]: <info>  [1769163979.3216] manager: (tap5bed4276-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.322 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.330 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.330 222021 INFO os_vif [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20')#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.446 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.458 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.458 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.458 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:38:3d:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.459 222021 INFO nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Using config drive#033[00m
Jan 23 05:26:19 np0005593233 nova_compute[222017]: 2026-01-23 10:26:19.490 222021 DEBUG nova.storage.rbd_utils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 3bfa1478-0947-4811-85ff-d4060ce986d5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:19.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:20.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:20 np0005593233 nova_compute[222017]: 2026-01-23 10:26:20.895 222021 INFO nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Creating config drive at /var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5/disk.config#033[00m
Jan 23 05:26:20 np0005593233 nova_compute[222017]: 2026-01-23 10:26:20.907 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbk9d8l6i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.073 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbk9d8l6i" returned: 0 in 0.165s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.123 222021 DEBUG nova.storage.rbd_utils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 3bfa1478-0947-4811-85ff-d4060ce986d5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.129 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5/disk.config 3bfa1478-0947-4811-85ff-d4060ce986d5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.360 222021 DEBUG oslo_concurrency.processutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5/disk.config 3bfa1478-0947-4811-85ff-d4060ce986d5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.231s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.361 222021 INFO nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Deleting local config drive /var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5/disk.config because it was imported into RBD.#033[00m
Jan 23 05:26:21 np0005593233 kernel: tap5bed4276-20: entered promiscuous mode
Jan 23 05:26:21 np0005593233 NetworkManager[48871]: <info>  [1769163981.4452] manager: (tap5bed4276-20): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Jan 23 05:26:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:21Z|00699|binding|INFO|Claiming lport 5bed4276-208c-472d-b28a-4a76e26a3285 for this chassis.
Jan 23 05:26:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:21Z|00700|binding|INFO|5bed4276-208c-472d-b28a-4a76e26a3285: Claiming fa:16:3e:38:3d:f1 10.100.0.6
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.448 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.460 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:3d:f1 10.100.0.6'], port_security=['fa:16:3e:38:3d:f1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-629113122', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3bfa1478-0947-4811-85ff-d4060ce986d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-629113122', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '7', 'neutron:security_group_ids': '104c556a-4616-455b-9049-a55a5af0ff57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.242'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf31d3f7-11b9-4331-9cba-3dbe5755a315, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5bed4276-208c-472d-b28a-4a76e26a3285) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.463 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5bed4276-208c-472d-b28a-4a76e26a3285 in datapath e713ea69-7ac7-43ba-86db-9b05cba5a525 bound to our chassis
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.466 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e713ea69-7ac7-43ba-86db-9b05cba5a525
Jan 23 05:26:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:21Z|00701|binding|INFO|Setting lport 5bed4276-208c-472d-b28a-4a76e26a3285 ovn-installed in OVS
Jan 23 05:26:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:21Z|00702|binding|INFO|Setting lport 5bed4276-208c-472d-b28a-4a76e26a3285 up in Southbound
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.486 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.489 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6046fbf2-1f5b-48e8-81d0-38bc4f444a14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.491 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape713ea69-71 in ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.493 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.495 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape713ea69-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.496 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dd55eac5-0b3a-4445-9431-19a6d80edbe6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.497 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c3124356-c779-40b7-b2b8-45ef78cf9eef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 systemd-udevd[285694]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:26:21 np0005593233 systemd-machined[190954]: New machine qemu-76-instance-000000a7.
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.518 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[729df6fe-79b6-43ee-98cf-61d8129d8934]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 NetworkManager[48871]: <info>  [1769163981.5304] device (tap5bed4276-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:26:21 np0005593233 NetworkManager[48871]: <info>  [1769163981.5314] device (tap5bed4276-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:26:21 np0005593233 systemd[1]: Started Virtual Machine qemu-76-instance-000000a7.
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.553 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d592c02b-6bce-41e4-837f-69e38b110c22]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.599 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[773dcbe3-cf51-4868-a186-1b7fc5392ea5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 systemd-udevd[285697]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:26:21 np0005593233 NetworkManager[48871]: <info>  [1769163981.6072] manager: (tape713ea69-70): new Veth device (/org/freedesktop/NetworkManager/Devices/325)
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.606 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e87cfd0e-6ae6-4dae-a043-bd681620b28e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.660 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b9423b-2080-4e53-a642-d0fe795107d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.664 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9224f3a3-fbb7-4ce7-afd5-5f3bd773666a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 NetworkManager[48871]: <info>  [1769163981.7037] device (tape713ea69-70): carrier: link connected
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.710 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ef135981-2f88-4659-b8d8-5101ac0e32e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.727 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6eab017b-d299-4669-92bc-19cfa344e8a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape713ea69-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:2a:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 789433, 'reachable_time': 31384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285725, 'error': None, 'target': 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.748 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1a000886-aee6-45f8-b808-fda14fc52b6f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:2ab1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 789433, 'tstamp': 789433}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285726, 'error': None, 'target': 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.780 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[39c2464a-68d1-4bf4-8c62-3aa95c6b8075]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape713ea69-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:2a:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 789433, 'reachable_time': 31384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285727, 'error': None, 'target': 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.841 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0fe64e28-5d56-427a-83b5-c6eb4c8c54c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:21.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.938 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[06e8350b-738b-45cf-9595-eb0a01aae876]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.940 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape713ea69-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.941 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.942 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape713ea69-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.945 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593233 NetworkManager[48871]: <info>  [1769163981.9461] manager: (tape713ea69-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Jan 23 05:26:21 np0005593233 kernel: tape713ea69-70: entered promiscuous mode
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.948 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.950 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape713ea69-70, col_values=(('external_ids', {'iface-id': '5ee55fc7-887b-474a-9dd6-1dad231cc12c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.952 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:21Z|00703|binding|INFO|Releasing lport 5ee55fc7-887b-474a-9dd6-1dad231cc12c from this chassis (sb_readonly=0)
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.970 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.972 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e713ea69-7ac7-43ba-86db-9b05cba5a525.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e713ea69-7ac7-43ba-86db-9b05cba5a525.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.973 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[356137cc-6fef-4753-8542-a403945362e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.974 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-e713ea69-7ac7-43ba-86db-9b05cba5a525
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/e713ea69-7ac7-43ba-86db-9b05cba5a525.pid.haproxy
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID e713ea69-7ac7-43ba-86db-9b05cba5a525
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 05:26:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:21.975 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'env', 'PROCESS_TAG=haproxy-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e713ea69-7ac7-43ba-86db-9b05cba5a525.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.991 222021 DEBUG nova.compute.manager [req-b6ac9f40-38ec-4ee4-bd89-a2fd020ddf78 req-d65fdd5a-a627-44de-b2d2-2a1fec8e6954 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Received event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.991 222021 DEBUG oslo_concurrency.lockutils [req-b6ac9f40-38ec-4ee4-bd89-a2fd020ddf78 req-d65fdd5a-a627-44de-b2d2-2a1fec8e6954 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.991 222021 DEBUG oslo_concurrency.lockutils [req-b6ac9f40-38ec-4ee4-bd89-a2fd020ddf78 req-d65fdd5a-a627-44de-b2d2-2a1fec8e6954 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.991 222021 DEBUG oslo_concurrency.lockutils [req-b6ac9f40-38ec-4ee4-bd89-a2fd020ddf78 req-d65fdd5a-a627-44de-b2d2-2a1fec8e6954 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:26:21 np0005593233 nova_compute[222017]: 2026-01-23 10:26:21.992 222021 DEBUG nova.compute.manager [req-b6ac9f40-38ec-4ee4-bd89-a2fd020ddf78 req-d65fdd5a-a627-44de-b2d2-2a1fec8e6954 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Processing event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.028 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163982.028082, 3bfa1478-0947-4811-85ff-d4060ce986d5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.029 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] VM Started (Lifecycle Event)
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.032 222021 DEBUG nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.041 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.045 222021 INFO nova.virt.libvirt.driver [-] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Instance spawned successfully.
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.046 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.091 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.096 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.130 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.130 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163982.0320292, 3bfa1478-0947-4811-85ff-d4060ce986d5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.131 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] VM Paused (Lifecycle Event)
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.167 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.168 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.168 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.168 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.168 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.169 222021 DEBUG nova.virt.libvirt.driver [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:22.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.367 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.372 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769163982.0372543, 3bfa1478-0947-4811-85ff-d4060ce986d5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.372 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.411 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.416 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.426 222021 INFO nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Took 15.09 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.426 222021 DEBUG nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.441 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:26:22 np0005593233 podman[285801]: 2026-01-23 10:26:22.459693353 +0000 UTC m=+0.060119860 container create d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.500 222021 INFO nova.compute.manager [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Took 17.53 seconds to build instance.#033[00m
Jan 23 05:26:22 np0005593233 systemd[1]: Started libpod-conmon-d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c.scope.
Jan 23 05:26:22 np0005593233 nova_compute[222017]: 2026-01-23 10:26:22.521 222021 DEBUG oslo_concurrency.lockutils [None req-f9efd16f-02c5-4263-92ef-6366fd5490ad 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:22 np0005593233 podman[285801]: 2026-01-23 10:26:22.427629072 +0000 UTC m=+0.028055569 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:26:22 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:26:22 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e72a77bd27a9248bd61d15240765b5e72c0363ba9f0e948c5855c3186249137a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:26:22 np0005593233 podman[285801]: 2026-01-23 10:26:22.593884885 +0000 UTC m=+0.194311362 container init d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:26:22 np0005593233 podman[285801]: 2026-01-23 10:26:22.60109029 +0000 UTC m=+0.201516747 container start d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:26:22 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285818]: [NOTICE]   (285822) : New worker (285824) forked
Jan 23 05:26:22 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285818]: [NOTICE]   (285822) : Loading success.
Jan 23 05:26:23 np0005593233 nova_compute[222017]: 2026-01-23 10:26:23.237 222021 DEBUG nova.network.neutron [req-d8bc1f7e-d3b2-46a8-98b2-7837ce326061 req-68c2ea66-c34a-47ac-8101-618f4f9f5801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Updated VIF entry in instance network info cache for port 5bed4276-208c-472d-b28a-4a76e26a3285. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:23 np0005593233 nova_compute[222017]: 2026-01-23 10:26:23.238 222021 DEBUG nova.network.neutron [req-d8bc1f7e-d3b2-46a8-98b2-7837ce326061 req-68c2ea66-c34a-47ac-8101-618f4f9f5801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Updating instance_info_cache with network_info: [{"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:23 np0005593233 nova_compute[222017]: 2026-01-23 10:26:23.258 222021 DEBUG oslo_concurrency.lockutils [req-d8bc1f7e-d3b2-46a8-98b2-7837ce326061 req-68c2ea66-c34a-47ac-8101-618f4f9f5801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-3bfa1478-0947-4811-85ff-d4060ce986d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:23 np0005593233 nova_compute[222017]: 2026-01-23 10:26:23.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:23.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:24 np0005593233 nova_compute[222017]: 2026-01-23 10:26:24.246 222021 DEBUG nova.compute.manager [req-4a6171fb-f2b4-46a0-8fc4-124d4488084c req-c773ead7-48ba-4125-80cf-0fe30cab58d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Received event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:24 np0005593233 nova_compute[222017]: 2026-01-23 10:26:24.247 222021 DEBUG oslo_concurrency.lockutils [req-4a6171fb-f2b4-46a0-8fc4-124d4488084c req-c773ead7-48ba-4125-80cf-0fe30cab58d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:24 np0005593233 nova_compute[222017]: 2026-01-23 10:26:24.247 222021 DEBUG oslo_concurrency.lockutils [req-4a6171fb-f2b4-46a0-8fc4-124d4488084c req-c773ead7-48ba-4125-80cf-0fe30cab58d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:24 np0005593233 nova_compute[222017]: 2026-01-23 10:26:24.248 222021 DEBUG oslo_concurrency.lockutils [req-4a6171fb-f2b4-46a0-8fc4-124d4488084c req-c773ead7-48ba-4125-80cf-0fe30cab58d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:24 np0005593233 nova_compute[222017]: 2026-01-23 10:26:24.248 222021 DEBUG nova.compute.manager [req-4a6171fb-f2b4-46a0-8fc4-124d4488084c req-c773ead7-48ba-4125-80cf-0fe30cab58d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] No waiting events found dispatching network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:24 np0005593233 nova_compute[222017]: 2026-01-23 10:26:24.248 222021 WARNING nova.compute.manager [req-4a6171fb-f2b4-46a0-8fc4-124d4488084c req-c773ead7-48ba-4125-80cf-0fe30cab58d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Received unexpected event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:26:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:24.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:24 np0005593233 nova_compute[222017]: 2026-01-23 10:26:24.323 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:24 np0005593233 nova_compute[222017]: 2026-01-23 10:26:24.449 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.237 222021 DEBUG oslo_concurrency.lockutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "3bfa1478-0947-4811-85ff-d4060ce986d5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.238 222021 DEBUG oslo_concurrency.lockutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.238 222021 DEBUG oslo_concurrency.lockutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.239 222021 DEBUG oslo_concurrency.lockutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.239 222021 DEBUG oslo_concurrency.lockutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.243 222021 INFO nova.compute.manager [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Terminating instance#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.244 222021 DEBUG nova.compute.manager [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:26:25 np0005593233 kernel: tap5bed4276-20 (unregistering): left promiscuous mode
Jan 23 05:26:25 np0005593233 NetworkManager[48871]: <info>  [1769163985.2926] device (tap5bed4276-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:26:25 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:25Z|00704|binding|INFO|Releasing lport 5bed4276-208c-472d-b28a-4a76e26a3285 from this chassis (sb_readonly=0)
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.353 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:25 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:25Z|00705|binding|INFO|Setting lport 5bed4276-208c-472d-b28a-4a76e26a3285 down in Southbound
Jan 23 05:26:25 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:25Z|00706|binding|INFO|Removing iface tap5bed4276-20 ovn-installed in OVS
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.359 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.363 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:3d:f1 10.100.0.6'], port_security=['fa:16:3e:38:3d:f1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-629113122', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3bfa1478-0947-4811-85ff-d4060ce986d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-629113122', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '9', 'neutron:security_group_ids': '104c556a-4616-455b-9049-a55a5af0ff57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.242', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf31d3f7-11b9-4331-9cba-3dbe5755a315, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=5bed4276-208c-472d-b28a-4a76e26a3285) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.364 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 5bed4276-208c-472d-b28a-4a76e26a3285 in datapath e713ea69-7ac7-43ba-86db-9b05cba5a525 unbound from our chassis#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.366 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e713ea69-7ac7-43ba-86db-9b05cba5a525, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.367 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[19dfd38f-7d5c-4264-827f-6ad640ff0776]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.368 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 namespace which is not needed anymore#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.370 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:25 np0005593233 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Jan 23 05:26:25 np0005593233 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a7.scope: Consumed 3.852s CPU time.
Jan 23 05:26:25 np0005593233 systemd-machined[190954]: Machine qemu-76-instance-000000a7 terminated.
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.506 222021 INFO nova.virt.libvirt.driver [-] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Instance destroyed successfully.#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.509 222021 DEBUG nova.objects.instance [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'resources' on Instance uuid 3bfa1478-0947-4811-85ff-d4060ce986d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.535 222021 DEBUG nova.virt.libvirt.vif [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:25:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1818933455',display_name='tempest-TestNetworkBasicOps-server-1818933455',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1818933455',id=167,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJV3DQUn9v/PqATvS0xW9EKwQ+s1mwCIfuAQUovJGyzqGg9YbwQnX3+Q3f+XhHouwk8ltI98mwJS81A9BK2zM75KOD2Z+eu45Wv9M0CfKV0DODbZ3EI60nuDBJpWTSgPrg==',key_name='tempest-TestNetworkBasicOps-1304979968',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:26:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-zja4fskr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:26:22Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=3bfa1478-0947-4811-85ff-d4060ce986d5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.537 222021 DEBUG nova.network.os_vif_util [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "5bed4276-208c-472d-b28a-4a76e26a3285", "address": "fa:16:3e:38:3d:f1", "network": {"id": "e713ea69-7ac7-43ba-86db-9b05cba5a525", "bridge": "br-int", "label": "tempest-network-smoke--1178081706", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5bed4276-20", "ovs_interfaceid": "5bed4276-208c-472d-b28a-4a76e26a3285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.539 222021 DEBUG nova.network.os_vif_util [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.540 222021 DEBUG os_vif [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.544 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:25 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285818]: [NOTICE]   (285822) : haproxy version is 2.8.14-c23fe91
Jan 23 05:26:25 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285818]: [NOTICE]   (285822) : path to executable is /usr/sbin/haproxy
Jan 23 05:26:25 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285818]: [WARNING]  (285822) : Exiting Master process...
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.544 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bed4276-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.548 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:25 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285818]: [ALERT]    (285822) : Current worker (285824) exited with code 143 (Terminated)
Jan 23 05:26:25 np0005593233 neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525[285818]: [WARNING]  (285822) : All workers exited. Exiting... (0)
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.549 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:25 np0005593233 systemd[1]: libpod-d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c.scope: Deactivated successfully.
Jan 23 05:26:25 np0005593233 conmon[285818]: conmon d96d66be4dc1cfc747fb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c.scope/container/memory.events
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.554 222021 INFO os_vif [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:3d:f1,bridge_name='br-int',has_traffic_filtering=True,id=5bed4276-208c-472d-b28a-4a76e26a3285,network=Network(e713ea69-7ac7-43ba-86db-9b05cba5a525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5bed4276-20')#033[00m
Jan 23 05:26:25 np0005593233 podman[285861]: 2026-01-23 10:26:25.557532938 +0000 UTC m=+0.058121932 container died d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:26:25 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c-userdata-shm.mount: Deactivated successfully.
Jan 23 05:26:25 np0005593233 systemd[1]: var-lib-containers-storage-overlay-e72a77bd27a9248bd61d15240765b5e72c0363ba9f0e948c5855c3186249137a-merged.mount: Deactivated successfully.
Jan 23 05:26:25 np0005593233 podman[285861]: 2026-01-23 10:26:25.607611501 +0000 UTC m=+0.108200475 container cleanup d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:26:25 np0005593233 systemd[1]: libpod-conmon-d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c.scope: Deactivated successfully.
Jan 23 05:26:25 np0005593233 podman[285916]: 2026-01-23 10:26:25.710164305 +0000 UTC m=+0.071714288 container remove d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.720 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d945836b-303a-4cff-af5f-044143d8706e]: (4, ('Fri Jan 23 10:26:25 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 (d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c)\nd96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c\nFri Jan 23 10:26:25 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 (d96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c)\nd96d66be4dc1cfc747fb86de16e0eb85bff5faf48f0490d4065d5dee67edaa4c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.722 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fd510482-b911-43b0-a0c8-188218d617dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.724 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape713ea69-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #145. Immutable memtables: 0.
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.727261) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 91] Flushing memtable with next log file: 145
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985727551, "job": 91, "event": "flush_started", "num_memtables": 1, "num_entries": 1252, "num_deletes": 252, "total_data_size": 2760676, "memory_usage": 2793552, "flush_reason": "Manual Compaction"}
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 91] Level-0 flush table #146: started
Jan 23 05:26:25 np0005593233 kernel: tape713ea69-70: left promiscuous mode
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.731 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985750268, "cf_name": "default", "job": 91, "event": "table_file_creation", "file_number": 146, "file_size": 1823916, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 69244, "largest_seqno": 70491, "table_properties": {"data_size": 1818311, "index_size": 2935, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12543, "raw_average_key_size": 20, "raw_value_size": 1806945, "raw_average_value_size": 2938, "num_data_blocks": 128, "num_entries": 615, "num_filter_entries": 615, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163883, "oldest_key_time": 1769163883, "file_creation_time": 1769163985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 146, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 91] Flush lasted 22890 microseconds, and 10581 cpu microseconds.
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.750346) [db/flush_job.cc:967] [default] [JOB 91] Level-0 flush table #146: 1823916 bytes OK
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.750380) [db/memtable_list.cc:519] [default] Level-0 commit table #146 started
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.753613) [db/memtable_list.cc:722] [default] Level-0 commit table #146: memtable #1 done
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.753638) EVENT_LOG_v1 {"time_micros": 1769163985753630, "job": 91, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.753662) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 91] Try to delete WAL files size 2754552, prev total WAL file size 2754552, number of live WAL files 2.
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000142.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.755070) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036303234' seq:72057594037927935, type:22 .. '7061786F730036323736' seq:0, type:0; will stop at (end)
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 92] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 91 Base level 0, inputs: [146(1781KB)], [144(10157KB)]
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985755130, "job": 92, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [146], "files_L6": [144], "score": -1, "input_data_size": 12224755, "oldest_snapshot_seqno": -1}
Jan 23 05:26:25 np0005593233 nova_compute[222017]: 2026-01-23 10:26:25.759 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.762 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8524bc45-056d-4fdf-8ae5-8ef6259ec14f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.779 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b4657474-f4c6-4601-9f79-5fffec0f579c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.780 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[193f16a0-d6a4-419a-a394-0b66958fe39a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.806 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ce299f-c5ef-4d29-841c-6bb7dc998222]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 789422, 'reachable_time': 31161, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285931, 'error': None, 'target': 'ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.810 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e713ea69-7ac7-43ba-86db-9b05cba5a525 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:26:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:25.810 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef5c831-1425-44cf-85f3-03a71314cf91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:25 np0005593233 systemd[1]: run-netns-ovnmeta\x2de713ea69\x2d7ac7\x2d43ba\x2d86db\x2d9b05cba5a525.mount: Deactivated successfully.
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 92] Generated table #147: 8802 keys, 10313682 bytes, temperature: kUnknown
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985898749, "cf_name": "default", "job": 92, "event": "table_file_creation", "file_number": 147, "file_size": 10313682, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10259015, "index_size": 31578, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22021, "raw_key_size": 233255, "raw_average_key_size": 26, "raw_value_size": 10106568, "raw_average_value_size": 1148, "num_data_blocks": 1198, "num_entries": 8802, "num_filter_entries": 8802, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769163985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 147, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.899329) [db/compaction/compaction_job.cc:1663] [default] [JOB 92] Compacted 1@0 + 1@6 files to L6 => 10313682 bytes
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.901959) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 84.9 rd, 71.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 9.9 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(12.4) write-amplify(5.7) OK, records in: 9325, records dropped: 523 output_compression: NoCompression
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.901995) EVENT_LOG_v1 {"time_micros": 1769163985901978, "job": 92, "event": "compaction_finished", "compaction_time_micros": 143987, "compaction_time_cpu_micros": 50940, "output_level": 6, "num_output_files": 1, "total_output_size": 10313682, "num_input_records": 9325, "num_output_records": 8802, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000146.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985902869, "job": 92, "event": "table_file_deletion", "file_number": 146}
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000144.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985906419, "job": 92, "event": "table_file_deletion", "file_number": 144}
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.754950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.906578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.906589) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.906593) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.906598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:26:25.906608) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:25.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.244 222021 INFO nova.virt.libvirt.driver [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Deleting instance files /var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5_del#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.246 222021 INFO nova.virt.libvirt.driver [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Deletion of /var/lib/nova/instances/3bfa1478-0947-4811-85ff-d4060ce986d5_del complete#033[00m
Jan 23 05:26:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:26.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.314 222021 INFO nova.compute.manager [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Took 1.07 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.316 222021 DEBUG oslo.service.loopingcall [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.317 222021 DEBUG nova.compute.manager [-] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.317 222021 DEBUG nova.network.neutron [-] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.386 222021 DEBUG nova.compute.manager [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Received event network-vif-unplugged-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.386 222021 DEBUG oslo_concurrency.lockutils [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.387 222021 DEBUG oslo_concurrency.lockutils [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.387 222021 DEBUG oslo_concurrency.lockutils [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.387 222021 DEBUG nova.compute.manager [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] No waiting events found dispatching network-vif-unplugged-5bed4276-208c-472d-b28a-4a76e26a3285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.388 222021 DEBUG nova.compute.manager [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Received event network-vif-unplugged-5bed4276-208c-472d-b28a-4a76e26a3285 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.388 222021 DEBUG nova.compute.manager [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Received event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.389 222021 DEBUG oslo_concurrency.lockutils [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.389 222021 DEBUG oslo_concurrency.lockutils [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.389 222021 DEBUG oslo_concurrency.lockutils [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.390 222021 DEBUG nova.compute.manager [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] No waiting events found dispatching network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:26 np0005593233 nova_compute[222017]: 2026-01-23 10:26:26.390 222021 WARNING nova.compute.manager [req-a628d618-5bae-427d-b318-ade9bc986a92 req-c99bd59d-e048-43c7-8784-360949de5751 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Received unexpected event network-vif-plugged-5bed4276-208c-472d-b28a-4a76e26a3285 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:26:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:26:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3965147079' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:27.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:28.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:29 np0005593233 podman[285933]: 2026-01-23 10:26:29.155503647 +0000 UTC m=+0.154201142 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:26:29 np0005593233 nova_compute[222017]: 2026-01-23 10:26:29.452 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:29.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:30.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:30 np0005593233 nova_compute[222017]: 2026-01-23 10:26:30.500 222021 DEBUG nova.network.neutron [-] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:30 np0005593233 nova_compute[222017]: 2026-01-23 10:26:30.521 222021 INFO nova.compute.manager [-] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Took 4.20 seconds to deallocate network for instance.#033[00m
Jan 23 05:26:30 np0005593233 nova_compute[222017]: 2026-01-23 10:26:30.548 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:30 np0005593233 nova_compute[222017]: 2026-01-23 10:26:30.577 222021 DEBUG oslo_concurrency.lockutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:30 np0005593233 nova_compute[222017]: 2026-01-23 10:26:30.578 222021 DEBUG oslo_concurrency.lockutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:30 np0005593233 nova_compute[222017]: 2026-01-23 10:26:30.804 222021 DEBUG oslo_concurrency.processutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:26:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2697495844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:31 np0005593233 nova_compute[222017]: 2026-01-23 10:26:31.350 222021 DEBUG oslo_concurrency.processutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:31 np0005593233 nova_compute[222017]: 2026-01-23 10:26:31.360 222021 DEBUG nova.compute.provider_tree [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:26:31 np0005593233 nova_compute[222017]: 2026-01-23 10:26:31.384 222021 DEBUG nova.scheduler.client.report [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:26:31 np0005593233 nova_compute[222017]: 2026-01-23 10:26:31.424 222021 DEBUG oslo_concurrency.lockutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:31 np0005593233 nova_compute[222017]: 2026-01-23 10:26:31.455 222021 INFO nova.scheduler.client.report [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Deleted allocations for instance 3bfa1478-0947-4811-85ff-d4060ce986d5#033[00m
Jan 23 05:26:31 np0005593233 nova_compute[222017]: 2026-01-23 10:26:31.527 222021 DEBUG oslo_concurrency.lockutils [None req-fc62d629-4f0e-4425-a0ea-e23b8aaa766a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "3bfa1478-0947-4811-85ff-d4060ce986d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:26:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:26:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:26:31 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:26:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:31.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:32.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:33.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:34.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:34 np0005593233 nova_compute[222017]: 2026-01-23 10:26:34.455 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e325 e325: 3 total, 3 up, 3 in
Jan 23 05:26:35 np0005593233 nova_compute[222017]: 2026-01-23 10:26:35.550 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:35.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e326 e326: 3 total, 3 up, 3 in
Jan 23 05:26:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:36.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e327 e327: 3 total, 3 up, 3 in
Jan 23 05:26:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:37.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:38.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:39 np0005593233 nova_compute[222017]: 2026-01-23 10:26:39.458 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:26:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:26:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:39.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:40.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:40 np0005593233 nova_compute[222017]: 2026-01-23 10:26:40.506 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163985.5023205, 3bfa1478-0947-4811-85ff-d4060ce986d5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:40 np0005593233 nova_compute[222017]: 2026-01-23 10:26:40.506 222021 INFO nova.compute.manager [-] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:26:40 np0005593233 nova_compute[222017]: 2026-01-23 10:26:40.535 222021 DEBUG nova.compute.manager [None req-177732af-8d33-4aab-99ce-d1d41a7e6af1 - - - - - -] [instance: 3bfa1478-0947-4811-85ff-d4060ce986d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:40 np0005593233 nova_compute[222017]: 2026-01-23 10:26:40.553 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:41.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:42Z|00707|binding|INFO|Releasing lport b648300b-e46c-4d3b-b02e-94ff684c03ae from this chassis (sb_readonly=0)
Jan 23 05:26:42 np0005593233 nova_compute[222017]: 2026-01-23 10:26:42.172 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:42.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:26:42Z|00708|binding|INFO|Releasing lport b648300b-e46c-4d3b-b02e-94ff684c03ae from this chassis (sb_readonly=0)
Jan 23 05:26:42 np0005593233 nova_compute[222017]: 2026-01-23 10:26:42.409 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:42.689 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:42.690 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:26:42.691 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 e328: 3 total, 3 up, 3 in
Jan 23 05:26:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:43.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.050 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.051 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.069 222021 DEBUG nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.184 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.185 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.195 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.196 222021 INFO nova.compute.claims [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:26:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:44.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.339 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.461 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:26:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3074119583' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:26:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:26:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3074119583' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:26:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:26:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4020487012' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.931 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.593s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.938 222021 DEBUG nova.compute.provider_tree [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.961 222021 DEBUG nova.scheduler.client.report [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.995 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:44 np0005593233 nova_compute[222017]: 2026-01-23 10:26:44.996 222021 DEBUG nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:26:45 np0005593233 nova_compute[222017]: 2026-01-23 10:26:45.066 222021 DEBUG nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:26:45 np0005593233 nova_compute[222017]: 2026-01-23 10:26:45.067 222021 DEBUG nova.network.neutron [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:26:45 np0005593233 podman[286189]: 2026-01-23 10:26:45.100833108 +0000 UTC m=+0.102945356 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:26:45 np0005593233 nova_compute[222017]: 2026-01-23 10:26:45.105 222021 INFO nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:26:45 np0005593233 nova_compute[222017]: 2026-01-23 10:26:45.136 222021 DEBUG nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:26:45 np0005593233 nova_compute[222017]: 2026-01-23 10:26:45.185 222021 INFO nova.virt.block_device [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Booting with volume-backed-image 84c0ef19-7f67-4bd3-95d8-507c3e0942ed at /dev/vda#033[00m
Jan 23 05:26:45 np0005593233 nova_compute[222017]: 2026-01-23 10:26:45.294 222021 DEBUG nova.policy [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d6a628e0dcb441fa41457bf719e65a0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:26:45 np0005593233 nova_compute[222017]: 2026-01-23 10:26:45.556 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:45.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:46 np0005593233 nova_compute[222017]: 2026-01-23 10:26:46.134 222021 DEBUG nova.network.neutron [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Successfully created port: dff7c478-3227-40b0-86ed-ff699a5e5ccf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:26:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:46.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:47 np0005593233 nova_compute[222017]: 2026-01-23 10:26:47.090 222021 DEBUG nova.network.neutron [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Successfully updated port: dff7c478-3227-40b0-86ed-ff699a5e5ccf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:26:47 np0005593233 nova_compute[222017]: 2026-01-23 10:26:47.119 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:47 np0005593233 nova_compute[222017]: 2026-01-23 10:26:47.120 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquired lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:47 np0005593233 nova_compute[222017]: 2026-01-23 10:26:47.120 222021 DEBUG nova.network.neutron [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:26:47 np0005593233 nova_compute[222017]: 2026-01-23 10:26:47.217 222021 DEBUG nova.compute.manager [req-2cfd7720-34c8-48b7-976d-27fcce8c0fba req-1aa5e38f-6a7f-4e01-986e-ffe544e13d56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-changed-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:47 np0005593233 nova_compute[222017]: 2026-01-23 10:26:47.218 222021 DEBUG nova.compute.manager [req-2cfd7720-34c8-48b7-976d-27fcce8c0fba req-1aa5e38f-6a7f-4e01-986e-ffe544e13d56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Refreshing instance network info cache due to event network-changed-dff7c478-3227-40b0-86ed-ff699a5e5ccf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:47 np0005593233 nova_compute[222017]: 2026-01-23 10:26:47.218 222021 DEBUG oslo_concurrency.lockutils [req-2cfd7720-34c8-48b7-976d-27fcce8c0fba req-1aa5e38f-6a7f-4e01-986e-ffe544e13d56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:47 np0005593233 nova_compute[222017]: 2026-01-23 10:26:47.373 222021 DEBUG nova.network.neutron [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:26:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:47.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:48.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:48 np0005593233 nova_compute[222017]: 2026-01-23 10:26:48.958 222021 DEBUG nova.network.neutron [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Updating instance_info_cache with network_info: [{"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:48 np0005593233 nova_compute[222017]: 2026-01-23 10:26:48.982 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Releasing lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:48 np0005593233 nova_compute[222017]: 2026-01-23 10:26:48.983 222021 DEBUG nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance network_info: |[{"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:26:48 np0005593233 nova_compute[222017]: 2026-01-23 10:26:48.983 222021 DEBUG oslo_concurrency.lockutils [req-2cfd7720-34c8-48b7-976d-27fcce8c0fba req-1aa5e38f-6a7f-4e01-986e-ffe544e13d56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:48 np0005593233 nova_compute[222017]: 2026-01-23 10:26:48.983 222021 DEBUG nova.network.neutron [req-2cfd7720-34c8-48b7-976d-27fcce8c0fba req-1aa5e38f-6a7f-4e01-986e-ffe544e13d56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Refreshing network info cache for port dff7c478-3227-40b0-86ed-ff699a5e5ccf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:49 np0005593233 nova_compute[222017]: 2026-01-23 10:26:49.465 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:49.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:50.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:50 np0005593233 nova_compute[222017]: 2026-01-23 10:26:50.559 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:51 np0005593233 nova_compute[222017]: 2026-01-23 10:26:51.081 222021 DEBUG nova.network.neutron [req-2cfd7720-34c8-48b7-976d-27fcce8c0fba req-1aa5e38f-6a7f-4e01-986e-ffe544e13d56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Updated VIF entry in instance network info cache for port dff7c478-3227-40b0-86ed-ff699a5e5ccf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:51 np0005593233 nova_compute[222017]: 2026-01-23 10:26:51.082 222021 DEBUG nova.network.neutron [req-2cfd7720-34c8-48b7-976d-27fcce8c0fba req-1aa5e38f-6a7f-4e01-986e-ffe544e13d56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Updating instance_info_cache with network_info: [{"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:51 np0005593233 nova_compute[222017]: 2026-01-23 10:26:51.112 222021 DEBUG oslo_concurrency.lockutils [req-2cfd7720-34c8-48b7-976d-27fcce8c0fba req-1aa5e38f-6a7f-4e01-986e-ffe544e13d56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:51.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:26:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:52.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:26:52 np0005593233 nova_compute[222017]: 2026-01-23 10:26:52.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:52 np0005593233 nova_compute[222017]: 2026-01-23 10:26:52.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:53.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:54.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:54 np0005593233 nova_compute[222017]: 2026-01-23 10:26:54.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:54 np0005593233 nova_compute[222017]: 2026-01-23 10:26:54.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:54 np0005593233 nova_compute[222017]: 2026-01-23 10:26:54.468 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593233 nova_compute[222017]: 2026-01-23 10:26:54.564 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:54 np0005593233 nova_compute[222017]: 2026-01-23 10:26:54.564 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:54 np0005593233 nova_compute[222017]: 2026-01-23 10:26:54.564 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:54 np0005593233 nova_compute[222017]: 2026-01-23 10:26:54.564 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:26:54 np0005593233 nova_compute[222017]: 2026-01-23 10:26:54.565 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:26:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/460805387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:55 np0005593233 nova_compute[222017]: 2026-01-23 10:26:55.068 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:55 np0005593233 nova_compute[222017]: 2026-01-23 10:26:55.510 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:26:55 np0005593233 nova_compute[222017]: 2026-01-23 10:26:55.511 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:26:55 np0005593233 nova_compute[222017]: 2026-01-23 10:26:55.562 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:55 np0005593233 nova_compute[222017]: 2026-01-23 10:26:55.762 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:26:55 np0005593233 nova_compute[222017]: 2026-01-23 10:26:55.764 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4137MB free_disk=20.80698013305664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:26:55 np0005593233 nova_compute[222017]: 2026-01-23 10:26:55.765 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:55 np0005593233 nova_compute[222017]: 2026-01-23 10:26:55.765 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:55.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.094 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.094 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 87859c15-e250-4a5d-aab0-8bc67aae1bc3 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.095 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.095 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.246 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:26:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:56.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.738 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.748 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.771 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.798 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:26:56 np0005593233 nova_compute[222017]: 2026-01-23 10:26:56.799 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:57.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:58.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:59 np0005593233 nova_compute[222017]: 2026-01-23 10:26:59.473 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:59 np0005593233 nova_compute[222017]: 2026-01-23 10:26:59.799 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:59 np0005593233 nova_compute[222017]: 2026-01-23 10:26:59.800 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:26:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:59.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:00 np0005593233 podman[286255]: 2026-01-23 10:27:00.152675475 +0000 UTC m=+0.155959733 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 05:27:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:00.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.459 222021 DEBUG os_brick.utils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.462 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.480 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.480 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[cc537426-8755-4541-9e05-23925f2a0206]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.482 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.493 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.494 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac9dddb-b539-4f53-b600-5415b1b7b83d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.496 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.512 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.512 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[34b63d4a-5b95-4f80-8220-a76d41e813b1]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.515 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[b5172e3b-4bc0-454f-bd37-31c9b9eae6f3]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.515 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.565 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.568 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "nvme version" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.572 222021 DEBUG os_brick.initiator.connectors.lightos [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.572 222021 DEBUG os_brick.initiator.connectors.lightos [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.573 222021 DEBUG os_brick.initiator.connectors.lightos [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.574 222021 DEBUG os_brick.utils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] <== get_connector_properties: return (113ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:27:00 np0005593233 nova_compute[222017]: 2026-01-23 10:27:00.575 222021 DEBUG nova.virt.block_device [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Updating existing volume attachment record: e4c73434-a994-4a1d-92e4-1fe78b682693 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:27:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:27:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3668887486' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:27:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:01.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:02.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:03.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.224 222021 DEBUG nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.226 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.226 222021 INFO nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Creating image(s)#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.226 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.227 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Ensure instance console log exists: /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.227 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.227 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.228 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.230 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Start _get_guest_xml network_info=[{"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-03c13a28-044c-43b3-8aa2-67640721a086', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '03c13a28-044c-43b3-8aa2-67640721a086', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '87859c15-e250-4a5d-aab0-8bc67aae1bc3', 'attached_at': '', 'detached_at': '', 'volume_id': '03c13a28-044c-43b3-8aa2-67640721a086', 'serial': '03c13a28-044c-43b3-8aa2-67640721a086'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'e4c73434-a994-4a1d-92e4-1fe78b682693', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.235 222021 WARNING nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.245 222021 DEBUG nova.virt.libvirt.host [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.246 222021 DEBUG nova.virt.libvirt.host [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.250 222021 DEBUG nova.virt.libvirt.host [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.251 222021 DEBUG nova.virt.libvirt.host [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.252 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.252 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.253 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.253 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.253 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.253 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.254 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.254 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.254 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.254 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.254 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.255 222021 DEBUG nova.virt.hardware [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.288 222021 DEBUG nova.storage.rbd_utils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.293 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:04.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.520 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:27:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1604465940' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:27:04 np0005593233 nova_compute[222017]: 2026-01-23 10:27:04.809 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.345 222021 DEBUG nova.virt.libvirt.vif [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:26:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-240824956',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-240824956',id=170,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c27429e1d8f433a8a67ddb76f8798f1',ramdisk_id='',reservation_id='r-x0tjh5et',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:45Z,user_data=None,user_id='0d6a628e0dcb441fa41457bf719e65a0',uuid=87859c15-e250-4a5d-aab0-8bc67aae1bc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.345 222021 DEBUG nova.network.os_vif_util [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converting VIF {"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.346 222021 DEBUG nova.network.os_vif_util [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:a9:bf,bridge_name='br-int',has_traffic_filtering=True,id=dff7c478-3227-40b0-86ed-ff699a5e5ccf,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdff7c478-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.348 222021 DEBUG nova.objects.instance [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.424 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.441 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <uuid>87859c15-e250-4a5d-aab0-8bc67aae1bc3</uuid>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <name>instance-000000aa</name>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-240824956</nova:name>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:27:04</nova:creationTime>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <nova:user uuid="0d6a628e0dcb441fa41457bf719e65a0">tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member</nova:user>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <nova:project uuid="5c27429e1d8f433a8a67ddb76f8798f1">tempest-ServerBootFromVolumeStableRescueTest-1351337832</nova:project>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <nova:port uuid="dff7c478-3227-40b0-86ed-ff699a5e5ccf">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <entry name="serial">87859c15-e250-4a5d-aab0-8bc67aae1bc3</entry>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <entry name="uuid">87859c15-e250-4a5d-aab0-8bc67aae1bc3</entry>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-03c13a28-044c-43b3-8aa2-67640721a086">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <serial>03c13a28-044c-43b3-8aa2-67640721a086</serial>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:3e:a9:bf"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <target dev="tapdff7c478-32"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/console.log" append="off"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:27:05 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:27:05 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:27:05 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:27:05 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.443 222021 DEBUG nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Preparing to wait for external event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.444 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.444 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.444 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.445 222021 DEBUG nova.virt.libvirt.vif [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:26:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-240824956',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-240824956',id=170,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c27429e1d8f433a8a67ddb76f8798f1',ramdisk_id='',reservation_id='r-x0tjh5et',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:45Z,user_data=None,user_id='0d6a628e0dcb441fa41457bf719e65a0',uuid=87859c15-e250-4a5d-aab0-8bc67aae1bc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.446 222021 DEBUG nova.network.os_vif_util [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converting VIF {"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.446 222021 DEBUG nova.network.os_vif_util [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:a9:bf,bridge_name='br-int',has_traffic_filtering=True,id=dff7c478-3227-40b0-86ed-ff699a5e5ccf,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdff7c478-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.447 222021 DEBUG os_vif [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:a9:bf,bridge_name='br-int',has_traffic_filtering=True,id=dff7c478-3227-40b0-86ed-ff699a5e5ccf,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdff7c478-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.449 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.449 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.450 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.454 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.454 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdff7c478-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.455 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdff7c478-32, col_values=(('external_ids', {'iface-id': 'dff7c478-3227-40b0-86ed-ff699a5e5ccf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:a9:bf', 'vm-uuid': '87859c15-e250-4a5d-aab0-8bc67aae1bc3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.457 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:05 np0005593233 NetworkManager[48871]: <info>  [1769164025.4575] manager: (tapdff7c478-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/327)
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.459 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.464 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:05 np0005593233 nova_compute[222017]: 2026-01-23 10:27:05.466 222021 INFO os_vif [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:a9:bf,bridge_name='br-int',has_traffic_filtering=True,id=dff7c478-3227-40b0-86ed-ff699a5e5ccf,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdff7c478-32')#033[00m
Jan 23 05:27:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:05.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:06.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:06 np0005593233 nova_compute[222017]: 2026-01-23 10:27:06.771 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:27:06 np0005593233 nova_compute[222017]: 2026-01-23 10:27:06.771 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:27:06 np0005593233 nova_compute[222017]: 2026-01-23 10:27:06.771 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:27:06 np0005593233 nova_compute[222017]: 2026-01-23 10:27:06.772 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:27:06 np0005593233 nova_compute[222017]: 2026-01-23 10:27:06.902 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:27:06 np0005593233 nova_compute[222017]: 2026-01-23 10:27:06.902 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:27:06 np0005593233 nova_compute[222017]: 2026-01-23 10:27:06.902 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No VIF found with MAC fa:16:3e:3e:a9:bf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 05:27:06 np0005593233 nova_compute[222017]: 2026-01-23 10:27:06.903 222021 INFO nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Using config drive
Jan 23 05:27:06 np0005593233 nova_compute[222017]: 2026-01-23 10:27:06.929 222021 DEBUG nova.storage.rbd_utils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:27:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:07.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:08.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:09 np0005593233 nova_compute[222017]: 2026-01-23 10:27:09.523 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:10.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:10.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:10 np0005593233 nova_compute[222017]: 2026-01-23 10:27:10.457 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:11 np0005593233 nova_compute[222017]: 2026-01-23 10:27:11.789 222021 INFO nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Creating config drive at /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config
Jan 23 05:27:11 np0005593233 nova_compute[222017]: 2026-01-23 10:27:11.794 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp84t3w01f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:27:11 np0005593233 nova_compute[222017]: 2026-01-23 10:27:11.947 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp84t3w01f" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:27:11 np0005593233 nova_compute[222017]: 2026-01-23 10:27:11.982 222021 DEBUG nova.storage.rbd_utils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:27:11 np0005593233 nova_compute[222017]: 2026-01-23 10:27:11.986 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:27:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:12.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:12 np0005593233 nova_compute[222017]: 2026-01-23 10:27:12.206 222021 DEBUG oslo_concurrency.processutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:27:12 np0005593233 nova_compute[222017]: 2026-01-23 10:27:12.208 222021 INFO nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Deleting local config drive /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config because it was imported into RBD.
Jan 23 05:27:12 np0005593233 kernel: tapdff7c478-32: entered promiscuous mode
Jan 23 05:27:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:12Z|00709|binding|INFO|Claiming lport dff7c478-3227-40b0-86ed-ff699a5e5ccf for this chassis.
Jan 23 05:27:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:12Z|00710|binding|INFO|dff7c478-3227-40b0-86ed-ff699a5e5ccf: Claiming fa:16:3e:3e:a9:bf 10.100.0.12
Jan 23 05:27:12 np0005593233 NetworkManager[48871]: <info>  [1769164032.2970] manager: (tapdff7c478-32): new Tun device (/org/freedesktop/NetworkManager/Devices/328)
Jan 23 05:27:12 np0005593233 nova_compute[222017]: 2026-01-23 10:27:12.296 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:12Z|00711|binding|INFO|Setting lport dff7c478-3227-40b0-86ed-ff699a5e5ccf ovn-installed in OVS
Jan 23 05:27:12 np0005593233 nova_compute[222017]: 2026-01-23 10:27:12.319 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:12 np0005593233 nova_compute[222017]: 2026-01-23 10:27:12.320 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:12 np0005593233 systemd-machined[190954]: New machine qemu-77-instance-000000aa.
Jan 23 05:27:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:12.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:12 np0005593233 systemd[1]: Started Virtual Machine qemu-77-instance-000000aa.
Jan 23 05:27:12 np0005593233 systemd-udevd[286402]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:27:12 np0005593233 NetworkManager[48871]: <info>  [1769164032.4054] device (tapdff7c478-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:27:12 np0005593233 NetworkManager[48871]: <info>  [1769164032.4062] device (tapdff7c478-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:27:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:12Z|00712|binding|INFO|Setting lport dff7c478-3227-40b0-86ed-ff699a5e5ccf up in Southbound
Jan 23 05:27:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:12.979 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:a9:bf 10.100.0.12'], port_security=['fa:16:3e:3e:a9:bf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87859c15-e250-4a5d-aab0-8bc67aae1bc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '29028637-714b-453c-9e54-c753b1c8b7f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0dedc65-79e0-4ae8-b1b0-46423e11b58a, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=dff7c478-3227-40b0-86ed-ff699a5e5ccf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:27:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:12.981 140224 INFO neutron.agent.ovn.metadata.agent [-] Port dff7c478-3227-40b0-86ed-ff699a5e5ccf in datapath fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 bound to our chassis
Jan 23 05:27:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:12.984 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.021 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3cd7428a-e162-4775-b980-545ac4612474]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.082 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7d088313-92f6-4301-a23a-5c999ddffbb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.087 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a08277-ae78-4c75-b5b4-cf8b808f8063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.138 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[844ed991-f104-4043-9cb3-737c4cf8ee61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.160 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d9be79c7-4a82-4d14-b09a-6165797cc4a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd64ab8-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:7c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775702, 'reachable_time': 26634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286458, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.163 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164033.1623383, 87859c15-e250-4a5d-aab0-8bc67aae1bc3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.163 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] VM Started (Lifecycle Event)
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.188 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0d248c18-c716-4bce-9927-a3909795a08b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775717, 'tstamp': 775717}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286459, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775721, 'tstamp': 775721}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286459, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.190 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd64ab8-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.192 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.194 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.194 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd64ab8-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.194 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.194 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd64ab8-90, col_values=(('external_ids', {'iface-id': 'b648300b-e46c-4d3b-b02e-94ff684c03ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:27:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:13.195 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.300 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.307 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164033.1625555, 87859c15-e250-4a5d-aab0-8bc67aae1bc3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.307 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] VM Paused (Lifecycle Event)
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.334 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.338 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.381 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:27:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.966 222021 DEBUG nova.compute.manager [req-1d93263c-0793-4d56-b0b2-fd2a39674937 req-e37a9362-9dca-4023-a5bf-cd1532b82fc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.967 222021 DEBUG oslo_concurrency.lockutils [req-1d93263c-0793-4d56-b0b2-fd2a39674937 req-e37a9362-9dca-4023-a5bf-cd1532b82fc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.968 222021 DEBUG oslo_concurrency.lockutils [req-1d93263c-0793-4d56-b0b2-fd2a39674937 req-e37a9362-9dca-4023-a5bf-cd1532b82fc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.968 222021 DEBUG oslo_concurrency.lockutils [req-1d93263c-0793-4d56-b0b2-fd2a39674937 req-e37a9362-9dca-4023-a5bf-cd1532b82fc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.969 222021 DEBUG nova.compute.manager [req-1d93263c-0793-4d56-b0b2-fd2a39674937 req-e37a9362-9dca-4023-a5bf-cd1532b82fc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Processing event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.970 222021 DEBUG nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.974 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164033.9742465, 87859c15-e250-4a5d-aab0-8bc67aae1bc3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.974 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] VM Resumed (Lifecycle Event)
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.976 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.981 222021 INFO nova.virt.libvirt.driver [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance spawned successfully.
Jan 23 05:27:13 np0005593233 nova_compute[222017]: 2026-01-23 10:27:13.981 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:27:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:14.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.118 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.124 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.128 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.128 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.129 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.129 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.129 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.130 222021 DEBUG nova.virt.libvirt.driver [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.335 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:27:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:14.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.527 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.586 222021 INFO nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Took 10.36 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:27:14 np0005593233 nova_compute[222017]: 2026-01-23 10:27:14.587 222021 DEBUG nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:15 np0005593233 nova_compute[222017]: 2026-01-23 10:27:15.461 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:15 np0005593233 nova_compute[222017]: 2026-01-23 10:27:15.775 222021 INFO nova.compute.manager [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Took 31.63 seconds to build instance.#033[00m
Jan 23 05:27:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:16.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:16 np0005593233 podman[286460]: 2026-01-23 10:27:16.106249233 +0000 UTC m=+0.115059821 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:27:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:16.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:16 np0005593233 nova_compute[222017]: 2026-01-23 10:27:16.831 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updating instance_info_cache with network_info: [{"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:17 np0005593233 nova_compute[222017]: 2026-01-23 10:27:17.732 222021 DEBUG oslo_concurrency.lockutils [None req-2c3d85e4-5a47-4ee2-b6af-a904402c9759 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 33.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:17.815 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:17.816 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:27:17 np0005593233 nova_compute[222017]: 2026-01-23 10:27:17.816 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:17 np0005593233 nova_compute[222017]: 2026-01-23 10:27:17.859 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:27:17 np0005593233 nova_compute[222017]: 2026-01-23 10:27:17.859 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:27:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:18.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:18 np0005593233 nova_compute[222017]: 2026-01-23 10:27:18.127 222021 DEBUG nova.compute.manager [req-9fe50074-a4c4-4330-bf0b-1aebf083716c req-e1de5e29-4962-4968-bbbe-5d2c247d1beb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:18 np0005593233 nova_compute[222017]: 2026-01-23 10:27:18.127 222021 DEBUG oslo_concurrency.lockutils [req-9fe50074-a4c4-4330-bf0b-1aebf083716c req-e1de5e29-4962-4968-bbbe-5d2c247d1beb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:18 np0005593233 nova_compute[222017]: 2026-01-23 10:27:18.128 222021 DEBUG oslo_concurrency.lockutils [req-9fe50074-a4c4-4330-bf0b-1aebf083716c req-e1de5e29-4962-4968-bbbe-5d2c247d1beb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:18 np0005593233 nova_compute[222017]: 2026-01-23 10:27:18.128 222021 DEBUG oslo_concurrency.lockutils [req-9fe50074-a4c4-4330-bf0b-1aebf083716c req-e1de5e29-4962-4968-bbbe-5d2c247d1beb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:18 np0005593233 nova_compute[222017]: 2026-01-23 10:27:18.128 222021 DEBUG nova.compute.manager [req-9fe50074-a4c4-4330-bf0b-1aebf083716c req-e1de5e29-4962-4968-bbbe-5d2c247d1beb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:18 np0005593233 nova_compute[222017]: 2026-01-23 10:27:18.128 222021 WARNING nova.compute.manager [req-9fe50074-a4c4-4330-bf0b-1aebf083716c req-e1de5e29-4962-4968-bbbe-5d2c247d1beb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state active and task_state None.#033[00m
Jan 23 05:27:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:18.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:18 np0005593233 nova_compute[222017]: 2026-01-23 10:27:18.853 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:19 np0005593233 nova_compute[222017]: 2026-01-23 10:27:19.529 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:20.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:20.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:20 np0005593233 nova_compute[222017]: 2026-01-23 10:27:20.463 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:22.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:22 np0005593233 nova_compute[222017]: 2026-01-23 10:27:22.029 222021 INFO nova.compute.manager [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Rescuing#033[00m
Jan 23 05:27:22 np0005593233 nova_compute[222017]: 2026-01-23 10:27:22.030 222021 DEBUG oslo_concurrency.lockutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:27:22 np0005593233 nova_compute[222017]: 2026-01-23 10:27:22.031 222021 DEBUG oslo_concurrency.lockutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquired lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:27:22 np0005593233 nova_compute[222017]: 2026-01-23 10:27:22.031 222021 DEBUG nova.network.neutron [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:27:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:22.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:23.818 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:24.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:24.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:24 np0005593233 nova_compute[222017]: 2026-01-23 10:27:24.532 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:25 np0005593233 nova_compute[222017]: 2026-01-23 10:27:25.508 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:25 np0005593233 nova_compute[222017]: 2026-01-23 10:27:25.630 222021 DEBUG nova.network.neutron [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Updating instance_info_cache with network_info: [{"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:25 np0005593233 nova_compute[222017]: 2026-01-23 10:27:25.736 222021 DEBUG oslo_concurrency.lockutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Releasing lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:27:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:26.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:26.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:26 np0005593233 nova_compute[222017]: 2026-01-23 10:27:26.405 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:27:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:28.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:28.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:29 np0005593233 nova_compute[222017]: 2026-01-23 10:27:29.534 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593233 kernel: tapdff7c478-32 (unregistering): left promiscuous mode
Jan 23 05:27:29 np0005593233 NetworkManager[48871]: <info>  [1769164049.9233] device (tapdff7c478-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:27:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:29Z|00713|binding|INFO|Releasing lport dff7c478-3227-40b0-86ed-ff699a5e5ccf from this chassis (sb_readonly=0)
Jan 23 05:27:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:29Z|00714|binding|INFO|Setting lport dff7c478-3227-40b0-86ed-ff699a5e5ccf down in Southbound
Jan 23 05:27:29 np0005593233 nova_compute[222017]: 2026-01-23 10:27:29.983 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:29Z|00715|binding|INFO|Removing iface tapdff7c478-32 ovn-installed in OVS
Jan 23 05:27:29 np0005593233 nova_compute[222017]: 2026-01-23 10:27:29.987 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:29.993 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:a9:bf 10.100.0.12'], port_security=['fa:16:3e:3e:a9:bf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87859c15-e250-4a5d-aab0-8bc67aae1bc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '29028637-714b-453c-9e54-c753b1c8b7f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0dedc65-79e0-4ae8-b1b0-46423e11b58a, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=dff7c478-3227-40b0-86ed-ff699a5e5ccf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:29.996 140224 INFO neutron.agent.ovn.metadata.agent [-] Port dff7c478-3227-40b0-86ed-ff699a5e5ccf in datapath fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 unbound from our chassis#033[00m
Jan 23 05:27:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:29.999 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.002 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.018 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b5549a-1df5-423f-87df-1d3c01113dff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:30.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:30 np0005593233 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Jan 23 05:27:30 np0005593233 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000aa.scope: Consumed 14.118s CPU time.
Jan 23 05:27:30 np0005593233 systemd-machined[190954]: Machine qemu-77-instance-000000aa terminated.
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.066 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[582b8035-0b26-44f4-bb94-9f9952345fb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.071 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0e0299-1ebb-402f-b784-07a7738ff59a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.112 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5f182db7-85a3-4100-9654-d0916a26c279]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.142 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5e2e7084-1364-4a2f-acb1-09aeb69bd6fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd64ab8-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:7c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775702, 'reachable_time': 26634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286491, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.175 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cb51556a-cd8c-49fd-9132-e042e4406bd7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775717, 'tstamp': 775717}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286494, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775721, 'tstamp': 775721}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286494, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.178 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd64ab8-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.180 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.186 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.187 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd64ab8-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.187 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.188 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd64ab8-90, col_values=(('external_ids', {'iface-id': 'b648300b-e46c-4d3b-b02e-94ff684c03ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:30.188 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:30 np0005593233 podman[286505]: 2026-01-23 10:27:30.376338372 +0000 UTC m=+0.131658522 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 05:27:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:30.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.428 222021 INFO nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance shutdown successfully after 4 seconds.#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.438 222021 INFO nova.virt.libvirt.driver [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance destroyed successfully.#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.438 222021 DEBUG nova.objects.instance [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.484 222021 INFO nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Attempting a stable device rescue#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.510 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.769 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.776 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.777 222021 INFO nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Creating image(s)#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.822 222021 DEBUG nova.storage.rbd_utils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.827 222021 DEBUG nova.objects.instance [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.878 222021 DEBUG nova.storage.rbd_utils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.911 222021 DEBUG nova.storage.rbd_utils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.915 222021 DEBUG oslo_concurrency.lockutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "d2721a4293f1885af36fa2fac7d66751d8acbeea" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:30 np0005593233 nova_compute[222017]: 2026-01-23 10:27:30.916 222021 DEBUG oslo_concurrency.lockutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "d2721a4293f1885af36fa2fac7d66751d8acbeea" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.223 222021 DEBUG nova.compute.manager [req-0e7e2573-7431-4bd9-ba1d-d4b62f6e978d req-918ed83f-9877-449e-a1ac-0b54ee4e0584 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-unplugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.223 222021 DEBUG oslo_concurrency.lockutils [req-0e7e2573-7431-4bd9-ba1d-d4b62f6e978d req-918ed83f-9877-449e-a1ac-0b54ee4e0584 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.224 222021 DEBUG oslo_concurrency.lockutils [req-0e7e2573-7431-4bd9-ba1d-d4b62f6e978d req-918ed83f-9877-449e-a1ac-0b54ee4e0584 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.224 222021 DEBUG oslo_concurrency.lockutils [req-0e7e2573-7431-4bd9-ba1d-d4b62f6e978d req-918ed83f-9877-449e-a1ac-0b54ee4e0584 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.224 222021 DEBUG nova.compute.manager [req-0e7e2573-7431-4bd9-ba1d-d4b62f6e978d req-918ed83f-9877-449e-a1ac-0b54ee4e0584 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-unplugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.224 222021 WARNING nova.compute.manager [req-0e7e2573-7431-4bd9-ba1d-d4b62f6e978d req-918ed83f-9877-449e-a1ac-0b54ee4e0584 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-unplugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.311 222021 DEBUG nova.virt.libvirt.imagebackend [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/468a5657-114a-42e3-900b-a72cd304261c/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/468a5657-114a-42e3-900b-a72cd304261c/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.411 222021 DEBUG nova.virt.libvirt.imagebackend [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/468a5657-114a-42e3-900b-a72cd304261c/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.412 222021 DEBUG nova.storage.rbd_utils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] cloning images/468a5657-114a-42e3-900b-a72cd304261c@snap to None/87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.570 222021 DEBUG oslo_concurrency.lockutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "d2721a4293f1885af36fa2fac7d66751d8acbeea" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.643 222021 DEBUG nova.objects.instance [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'migration_context' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.666 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.672 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Start _get_guest_xml network_info=[{"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "vif_mac": "fa:16:3e:3e:a9:bf"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '468a5657-114a-42e3-900b-a72cd304261c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-03c13a28-044c-43b3-8aa2-67640721a086', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '03c13a28-044c-43b3-8aa2-67640721a086', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '87859c15-e250-4a5d-aab0-8bc67aae1bc3', 'attached_at': '', 'detached_at': '', 'volume_id': '03c13a28-044c-43b3-8aa2-67640721a086', 'serial': '03c13a28-044c-43b3-8aa2-67640721a086'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'e4c73434-a994-4a1d-92e4-1fe78b682693', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.673 222021 DEBUG nova.objects.instance [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'resources' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.710 222021 WARNING nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.718 222021 DEBUG nova.virt.libvirt.host [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.720 222021 DEBUG nova.virt.libvirt.host [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.725 222021 DEBUG nova.virt.libvirt.host [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.726 222021 DEBUG nova.virt.libvirt.host [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.728 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.728 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.729 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.730 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.730 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.731 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.731 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.732 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.732 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.733 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.733 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.734 222021 DEBUG nova.virt.hardware [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.734 222021 DEBUG nova.objects.instance [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:31 np0005593233 nova_compute[222017]: 2026-01-23 10:27:31.810 222021 DEBUG oslo_concurrency.processutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:32.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:27:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2110631224' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:27:32 np0005593233 nova_compute[222017]: 2026-01-23 10:27:32.347 222021 DEBUG oslo_concurrency.processutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.537s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:32.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:32 np0005593233 nova_compute[222017]: 2026-01-23 10:27:32.776 222021 DEBUG oslo_concurrency.processutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:27:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3174985252' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.253 222021 DEBUG oslo_concurrency.processutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.255 222021 DEBUG nova.virt.libvirt.vif [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:26:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-240824956',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-240824956',id=170,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:27:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c27429e1d8f433a8a67ddb76f8798f1',ramdisk_id='',reservation_id='r-x0tjh5et',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:27:15Z,user_data=None,user_id='0d6a628e0dcb441fa41457bf719e65a0',uuid=87859c15-e250-4a5d-aab0-8bc67aae1bc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "vif_mac": "fa:16:3e:3e:a9:bf"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.255 222021 DEBUG nova.network.os_vif_util [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converting VIF {"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "vif_mac": "fa:16:3e:3e:a9:bf"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.257 222021 DEBUG nova.network.os_vif_util [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:a9:bf,bridge_name='br-int',has_traffic_filtering=True,id=dff7c478-3227-40b0-86ed-ff699a5e5ccf,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdff7c478-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.258 222021 DEBUG nova.objects.instance [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.307 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <uuid>87859c15-e250-4a5d-aab0-8bc67aae1bc3</uuid>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <name>instance-000000aa</name>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerBootFromVolumeStableRescueTest-server-240824956</nova:name>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:27:31</nova:creationTime>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <nova:user uuid="0d6a628e0dcb441fa41457bf719e65a0">tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member</nova:user>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <nova:project uuid="5c27429e1d8f433a8a67ddb76f8798f1">tempest-ServerBootFromVolumeStableRescueTest-1351337832</nova:project>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <nova:port uuid="dff7c478-3227-40b0-86ed-ff699a5e5ccf">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <entry name="serial">87859c15-e250-4a5d-aab0-8bc67aae1bc3</entry>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <entry name="uuid">87859c15-e250-4a5d-aab0-8bc67aae1bc3</entry>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-03c13a28-044c-43b3-8aa2-67640721a086">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <serial>03c13a28-044c-43b3-8aa2-67640721a086</serial>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.rescue">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <target dev="vdb" bus="virtio"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <boot order="1"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:3e:a9:bf"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <target dev="tapdff7c478-32"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/console.log" append="off"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:27:33 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:27:33 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:27:33 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:27:33 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.320 222021 INFO nova.virt.libvirt.driver [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance destroyed successfully.#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.429 222021 DEBUG nova.compute.manager [req-308571c3-32c0-4a45-8717-7b57ae5e501e req-f7339076-94ca-4908-9f2c-2f8eda84574b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.429 222021 DEBUG oslo_concurrency.lockutils [req-308571c3-32c0-4a45-8717-7b57ae5e501e req-f7339076-94ca-4908-9f2c-2f8eda84574b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.430 222021 DEBUG oslo_concurrency.lockutils [req-308571c3-32c0-4a45-8717-7b57ae5e501e req-f7339076-94ca-4908-9f2c-2f8eda84574b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.431 222021 DEBUG oslo_concurrency.lockutils [req-308571c3-32c0-4a45-8717-7b57ae5e501e req-f7339076-94ca-4908-9f2c-2f8eda84574b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.431 222021 DEBUG nova.compute.manager [req-308571c3-32c0-4a45-8717-7b57ae5e501e req-f7339076-94ca-4908-9f2c-2f8eda84574b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.432 222021 WARNING nova.compute.manager [req-308571c3-32c0-4a45-8717-7b57ae5e501e req-f7339076-94ca-4908-9f2c-2f8eda84574b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.439 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.439 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.440 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.440 222021 DEBUG nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] No VIF found with MAC fa:16:3e:3e:a9:bf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.441 222021 INFO nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Using config drive#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.482 222021 DEBUG nova.storage.rbd_utils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.509 222021 DEBUG nova.objects.instance [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:33 np0005593233 nova_compute[222017]: 2026-01-23 10:27:33.570 222021 DEBUG nova.objects.instance [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'keypairs' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:34.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.057 222021 INFO nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Creating config drive at /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config.rescue#033[00m
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.066 222021 DEBUG oslo_concurrency.processutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8yvsnum2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.222 222021 DEBUG oslo_concurrency.processutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8yvsnum2" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.271 222021 DEBUG nova.storage.rbd_utils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] rbd image 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.276 222021 DEBUG oslo_concurrency.processutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config.rescue 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:34.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.476 222021 DEBUG oslo_concurrency.processutils [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config.rescue 87859c15-e250-4a5d-aab0-8bc67aae1bc3_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.200s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.478 222021 INFO nova.virt.libvirt.driver [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Deleting local config drive /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3/disk.config.rescue because it was imported into RBD.#033[00m
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.536 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:34 np0005593233 kernel: tapdff7c478-32: entered promiscuous mode
Jan 23 05:27:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:34Z|00716|binding|INFO|Claiming lport dff7c478-3227-40b0-86ed-ff699a5e5ccf for this chassis.
Jan 23 05:27:34 np0005593233 NetworkManager[48871]: <info>  [1769164054.5546] manager: (tapdff7c478-32): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Jan 23 05:27:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:34Z|00717|binding|INFO|dff7c478-3227-40b0-86ed-ff699a5e5ccf: Claiming fa:16:3e:3e:a9:bf 10.100.0.12
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.553 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.565 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:a9:bf 10.100.0.12'], port_security=['fa:16:3e:3e:a9:bf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87859c15-e250-4a5d-aab0-8bc67aae1bc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'neutron:revision_number': '5', 'neutron:security_group_ids': '29028637-714b-453c-9e54-c753b1c8b7f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0dedc65-79e0-4ae8-b1b0-46423e11b58a, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=dff7c478-3227-40b0-86ed-ff699a5e5ccf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.566 140224 INFO neutron.agent.ovn.metadata.agent [-] Port dff7c478-3227-40b0-86ed-ff699a5e5ccf in datapath fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 bound to our chassis#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.568 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4#033[00m
Jan 23 05:27:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:34Z|00718|binding|INFO|Setting lport dff7c478-3227-40b0-86ed-ff699a5e5ccf ovn-installed in OVS
Jan 23 05:27:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:34Z|00719|binding|INFO|Setting lport dff7c478-3227-40b0-86ed-ff699a5e5ccf up in Southbound
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.581 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:34 np0005593233 systemd-udevd[286807]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.587 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.593 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7765e7f8-09e3-4941-be7f-e1e11c0d0600]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:34 np0005593233 systemd-machined[190954]: New machine qemu-78-instance-000000aa.
Jan 23 05:27:34 np0005593233 NetworkManager[48871]: <info>  [1769164054.5999] device (tapdff7c478-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:27:34 np0005593233 NetworkManager[48871]: <info>  [1769164054.6008] device (tapdff7c478-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:27:34 np0005593233 systemd[1]: Started Virtual Machine qemu-78-instance-000000aa.
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.630 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8f67a6c4-3f85-4cb4-8e5b-163de71cd4c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.633 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[28b868b7-090f-4ec6-84d9-0c0a84132026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.665 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ee7c37d6-f3af-4475-9152-dadfded01480]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.688 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a5abe487-4901-4fdc-857c-01b119b5ec8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd64ab8-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:7c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775702, 'reachable_time': 26634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286820, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.710 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c391d6a2-cef0-45ec-93f7-fbdaa53397d0]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775717, 'tstamp': 775717}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286822, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775721, 'tstamp': 775721}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286822, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.712 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd64ab8-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.713 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:34 np0005593233 nova_compute[222017]: 2026-01-23 10:27:34.715 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.715 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd64ab8-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.715 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.715 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd64ab8-90, col_values=(('external_ids', {'iface-id': 'b648300b-e46c-4d3b-b02e-94ff684c03ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:34.716 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.569 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.680 222021 DEBUG nova.compute.manager [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.682 222021 DEBUG oslo_concurrency.lockutils [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.683 222021 DEBUG oslo_concurrency.lockutils [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.683 222021 DEBUG oslo_concurrency.lockutils [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.684 222021 DEBUG nova.compute.manager [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.684 222021 WARNING nova.compute.manager [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.685 222021 DEBUG nova.compute.manager [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.686 222021 DEBUG oslo_concurrency.lockutils [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.687 222021 DEBUG oslo_concurrency.lockutils [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.687 222021 DEBUG oslo_concurrency.lockutils [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.688 222021 DEBUG nova.compute.manager [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:35 np0005593233 nova_compute[222017]: 2026-01-23 10:27:35.688 222021 WARNING nova.compute.manager [req-4567b22b-afa9-4c73-8c25-20cddec0f1f1 req-9f54b49c-b57f-4ea3-bdb6-c34832a9c445 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:27:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:36.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.289 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for 87859c15-e250-4a5d-aab0-8bc67aae1bc3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.290 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164056.288945, 87859c15-e250-4a5d-aab0-8bc67aae1bc3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.291 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.295 222021 DEBUG nova.compute.manager [None req-a42b2979-c35b-4cf9-82c4-274be365b486 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.358 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.361 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.390 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164056.2894812, 87859c15-e250-4a5d-aab0-8bc67aae1bc3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.390 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] VM Started (Lifecycle Event)#033[00m
Jan 23 05:27:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:36.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.414 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:36 np0005593233 nova_compute[222017]: 2026-01-23 10:27:36.418 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:27:37 np0005593233 nova_compute[222017]: 2026-01-23 10:27:37.307 222021 INFO nova.compute.manager [None req-98b2d160-e936-449a-afa6-fb039844e31a 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Unrescuing#033[00m
Jan 23 05:27:37 np0005593233 nova_compute[222017]: 2026-01-23 10:27:37.308 222021 DEBUG oslo_concurrency.lockutils [None req-98b2d160-e936-449a-afa6-fb039844e31a 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:27:37 np0005593233 nova_compute[222017]: 2026-01-23 10:27:37.308 222021 DEBUG oslo_concurrency.lockutils [None req-98b2d160-e936-449a-afa6-fb039844e31a 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquired lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:27:37 np0005593233 nova_compute[222017]: 2026-01-23 10:27:37.308 222021 DEBUG nova.network.neutron [None req-98b2d160-e936-449a-afa6-fb039844e31a 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:27:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:38.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:38.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:39 np0005593233 nova_compute[222017]: 2026-01-23 10:27:39.540 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:40.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:40 np0005593233 nova_compute[222017]: 2026-01-23 10:27:40.281 222021 DEBUG nova.network.neutron [None req-98b2d160-e936-449a-afa6-fb039844e31a 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Updating instance_info_cache with network_info: [{"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:40 np0005593233 nova_compute[222017]: 2026-01-23 10:27:40.322 222021 DEBUG oslo_concurrency.lockutils [None req-98b2d160-e936-449a-afa6-fb039844e31a 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Releasing lock "refresh_cache-87859c15-e250-4a5d-aab0-8bc67aae1bc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:27:40 np0005593233 nova_compute[222017]: 2026-01-23 10:27:40.324 222021 DEBUG nova.objects.instance [None req-98b2d160-e936-449a-afa6-fb039844e31a 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'flavor' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:40.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:40 np0005593233 nova_compute[222017]: 2026-01-23 10:27:40.617 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:40 np0005593233 kernel: tapdff7c478-32 (unregistering): left promiscuous mode
Jan 23 05:27:40 np0005593233 NetworkManager[48871]: <info>  [1769164060.9140] device (tapdff7c478-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:27:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:40Z|00720|binding|INFO|Releasing lport dff7c478-3227-40b0-86ed-ff699a5e5ccf from this chassis (sb_readonly=0)
Jan 23 05:27:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:40Z|00721|binding|INFO|Setting lport dff7c478-3227-40b0-86ed-ff699a5e5ccf down in Southbound
Jan 23 05:27:40 np0005593233 nova_compute[222017]: 2026-01-23 10:27:40.927 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:40Z|00722|binding|INFO|Removing iface tapdff7c478-32 ovn-installed in OVS
Jan 23 05:27:40 np0005593233 nova_compute[222017]: 2026-01-23 10:27:40.931 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:40.937 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:a9:bf 10.100.0.12'], port_security=['fa:16:3e:3e:a9:bf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87859c15-e250-4a5d-aab0-8bc67aae1bc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'neutron:revision_number': '6', 'neutron:security_group_ids': '29028637-714b-453c-9e54-c753b1c8b7f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0dedc65-79e0-4ae8-b1b0-46423e11b58a, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=dff7c478-3227-40b0-86ed-ff699a5e5ccf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:40.939 140224 INFO neutron.agent.ovn.metadata.agent [-] Port dff7c478-3227-40b0-86ed-ff699a5e5ccf in datapath fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 unbound from our chassis#033[00m
Jan 23 05:27:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:40.942 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4#033[00m
Jan 23 05:27:40 np0005593233 nova_compute[222017]: 2026-01-23 10:27:40.962 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:40.977 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3b9872b4-130d-407b-9e46-eece6f5e9fff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:40 np0005593233 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Jan 23 05:27:40 np0005593233 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000aa.scope: Consumed 5.736s CPU time.
Jan 23 05:27:40 np0005593233 systemd-machined[190954]: Machine qemu-78-instance-000000aa terminated.
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.024 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a996be-c0ac-4d20-9321-cb9d3be9a945]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.029 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fb513750-50e0-48f9-9a5f-3e9bd72da824]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.070 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9649ee-b311-40c8-a25f-45368c7e34ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.090 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3631c4fb-2b93-49e0-870c-5acfe05658f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd64ab8-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:7c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775702, 'reachable_time': 26634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287025, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.105 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ae49068d-d9c8-4517-a970-0f6c05e73659]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775717, 'tstamp': 775717}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287026, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775721, 'tstamp': 775721}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287026, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.107 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd64ab8-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.108 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.114 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.115 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd64ab8-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.115 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.115 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd64ab8-90, col_values=(('external_ids', {'iface-id': 'b648300b-e46c-4d3b-b02e-94ff684c03ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.116 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.205 222021 INFO nova.virt.libvirt.driver [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance destroyed successfully.#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.206 222021 DEBUG nova.objects.instance [None req-98b2d160-e936-449a-afa6-fb039844e31a 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.309 222021 DEBUG nova.compute.manager [req-1c4dd4af-f69c-4345-bd64-302d3c740310 req-1620df7a-ab6d-4372-9632-721849bcfc32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-unplugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.310 222021 DEBUG oslo_concurrency.lockutils [req-1c4dd4af-f69c-4345-bd64-302d3c740310 req-1620df7a-ab6d-4372-9632-721849bcfc32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.312 222021 DEBUG oslo_concurrency.lockutils [req-1c4dd4af-f69c-4345-bd64-302d3c740310 req-1620df7a-ab6d-4372-9632-721849bcfc32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.313 222021 DEBUG oslo_concurrency.lockutils [req-1c4dd4af-f69c-4345-bd64-302d3c740310 req-1620df7a-ab6d-4372-9632-721849bcfc32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.313 222021 DEBUG nova.compute.manager [req-1c4dd4af-f69c-4345-bd64-302d3c740310 req-1620df7a-ab6d-4372-9632-721849bcfc32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-unplugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.313 222021 WARNING nova.compute.manager [req-1c4dd4af-f69c-4345-bd64-302d3c740310 req-1620df7a-ab6d-4372-9632-721849bcfc32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-unplugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:27:41 np0005593233 kernel: tapdff7c478-32: entered promiscuous mode
Jan 23 05:27:41 np0005593233 NetworkManager[48871]: <info>  [1769164061.8034] manager: (tapdff7c478-32): new Tun device (/org/freedesktop/NetworkManager/Devices/330)
Jan 23 05:27:41 np0005593233 systemd-udevd[287005]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:27:41 np0005593233 NetworkManager[48871]: <info>  [1769164061.8200] device (tapdff7c478-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:27:41 np0005593233 NetworkManager[48871]: <info>  [1769164061.8208] device (tapdff7c478-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.852 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:41 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:41Z|00723|binding|INFO|Claiming lport dff7c478-3227-40b0-86ed-ff699a5e5ccf for this chassis.
Jan 23 05:27:41 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:41Z|00724|binding|INFO|dff7c478-3227-40b0-86ed-ff699a5e5ccf: Claiming fa:16:3e:3e:a9:bf 10.100.0.12
Jan 23 05:27:41 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:41Z|00725|binding|INFO|Setting lport dff7c478-3227-40b0-86ed-ff699a5e5ccf ovn-installed in OVS
Jan 23 05:27:41 np0005593233 nova_compute[222017]: 2026-01-23 10:27:41.874 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:41 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:41Z|00726|binding|INFO|Setting lport dff7c478-3227-40b0-86ed-ff699a5e5ccf up in Southbound
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.909 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:a9:bf 10.100.0.12'], port_security=['fa:16:3e:3e:a9:bf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87859c15-e250-4a5d-aab0-8bc67aae1bc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'neutron:revision_number': '6', 'neutron:security_group_ids': '29028637-714b-453c-9e54-c753b1c8b7f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0dedc65-79e0-4ae8-b1b0-46423e11b58a, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=dff7c478-3227-40b0-86ed-ff699a5e5ccf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.910 140224 INFO neutron.agent.ovn.metadata.agent [-] Port dff7c478-3227-40b0-86ed-ff699a5e5ccf in datapath fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 bound to our chassis#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.912 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.930 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8535b573-40e4-4c47-8e12-6277a6f237b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.962 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ffbb16d2-a527-4f37-9a1b-32a5061383de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.964 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[46410dd0-49da-4f83-bc91-b79fdc3bfe25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:41.993 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[879497e8-271c-40aa-97fe-ef1d795bad24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.015 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[16802b65-64e2-4e36-8b8c-3c7b531d4b81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd64ab8-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:7c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 14, 'rx_bytes': 616, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 14, 'rx_bytes': 616, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775702, 'reachable_time': 26634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287055, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.038 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9117b324-34f1-4d7f-982f-a6fc572fffa4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775717, 'tstamp': 775717}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287056, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775721, 'tstamp': 775721}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287056, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:42.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.040 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd64ab8-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:42 np0005593233 nova_compute[222017]: 2026-01-23 10:27:42.042 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.043 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd64ab8-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.044 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.044 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd64ab8-90, col_values=(('external_ids', {'iface-id': 'b648300b-e46c-4d3b-b02e-94ff684c03ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.044 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:42 np0005593233 systemd-machined[190954]: New machine qemu-79-instance-000000aa.
Jan 23 05:27:42 np0005593233 systemd[1]: Started Virtual Machine qemu-79-instance-000000aa.
Jan 23 05:27:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:42.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:27:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:27:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.689 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.690 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:42.690 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:42 np0005593233 nova_compute[222017]: 2026-01-23 10:27:42.922 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for 87859c15-e250-4a5d-aab0-8bc67aae1bc3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:27:42 np0005593233 nova_compute[222017]: 2026-01-23 10:27:42.923 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164062.9222145, 87859c15-e250-4a5d-aab0-8bc67aae1bc3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:42 np0005593233 nova_compute[222017]: 2026-01-23 10:27:42.923 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:27:43 np0005593233 nova_compute[222017]: 2026-01-23 10:27:43.017 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:43 np0005593233 nova_compute[222017]: 2026-01-23 10:27:43.022 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:27:43 np0005593233 nova_compute[222017]: 2026-01-23 10:27:43.051 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:27:43 np0005593233 nova_compute[222017]: 2026-01-23 10:27:43.052 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164062.9245343, 87859c15-e250-4a5d-aab0-8bc67aae1bc3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:43 np0005593233 nova_compute[222017]: 2026-01-23 10:27:43.052 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] VM Started (Lifecycle Event)#033[00m
Jan 23 05:27:43 np0005593233 nova_compute[222017]: 2026-01-23 10:27:43.072 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:43 np0005593233 nova_compute[222017]: 2026-01-23 10:27:43.077 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:27:43 np0005593233 nova_compute[222017]: 2026-01-23 10:27:43.156 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:27:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.029 222021 DEBUG nova.compute.manager [None req-98b2d160-e936-449a-afa6-fb039844e31a 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:44.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.064 222021 DEBUG nova.compute.manager [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.065 222021 DEBUG oslo_concurrency.lockutils [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.066 222021 DEBUG oslo_concurrency.lockutils [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.067 222021 DEBUG oslo_concurrency.lockutils [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.067 222021 DEBUG nova.compute.manager [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.068 222021 WARNING nova.compute.manager [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.069 222021 DEBUG nova.compute.manager [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.069 222021 DEBUG oslo_concurrency.lockutils [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.070 222021 DEBUG oslo_concurrency.lockutils [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.071 222021 DEBUG oslo_concurrency.lockutils [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.072 222021 DEBUG nova.compute.manager [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.072 222021 WARNING nova.compute.manager [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.073 222021 DEBUG nova.compute.manager [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.074 222021 DEBUG oslo_concurrency.lockutils [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.075 222021 DEBUG oslo_concurrency.lockutils [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.075 222021 DEBUG oslo_concurrency.lockutils [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.076 222021 DEBUG nova.compute.manager [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.077 222021 WARNING nova.compute.manager [req-21e79be0-425e-4dc7-bef1-40ba3f9d8c20 req-df690494-5aa4-4c4c-b00f-29f34d7d112d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:27:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:44.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:44 np0005593233 nova_compute[222017]: 2026-01-23 10:27:44.576 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:45 np0005593233 nova_compute[222017]: 2026-01-23 10:27:45.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:46.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:46.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:47 np0005593233 podman[287127]: 2026-01-23 10:27:47.093794655 +0000 UTC m=+0.086789408 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 05:27:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:48.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.341 222021 DEBUG oslo_concurrency.lockutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.342 222021 DEBUG oslo_concurrency.lockutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.343 222021 DEBUG oslo_concurrency.lockutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.344 222021 DEBUG oslo_concurrency.lockutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.345 222021 DEBUG oslo_concurrency.lockutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.347 222021 INFO nova.compute.manager [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Terminating instance#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.349 222021 DEBUG nova.compute.manager [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:27:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:48.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:48 np0005593233 kernel: tapdff7c478-32 (unregistering): left promiscuous mode
Jan 23 05:27:48 np0005593233 NetworkManager[48871]: <info>  [1769164068.6026] device (tapdff7c478-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.617 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:48Z|00727|binding|INFO|Releasing lport dff7c478-3227-40b0-86ed-ff699a5e5ccf from this chassis (sb_readonly=0)
Jan 23 05:27:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:48Z|00728|binding|INFO|Setting lport dff7c478-3227-40b0-86ed-ff699a5e5ccf down in Southbound
Jan 23 05:27:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:27:48Z|00729|binding|INFO|Removing iface tapdff7c478-32 ovn-installed in OVS
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.620 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.631 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:a9:bf 10.100.0.12'], port_security=['fa:16:3e:3e:a9:bf 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '87859c15-e250-4a5d-aab0-8bc67aae1bc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'neutron:revision_number': '8', 'neutron:security_group_ids': '29028637-714b-453c-9e54-c753b1c8b7f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0dedc65-79e0-4ae8-b1b0-46423e11b58a, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=dff7c478-3227-40b0-86ed-ff699a5e5ccf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.634 140224 INFO neutron.agent.ovn.metadata.agent [-] Port dff7c478-3227-40b0-86ed-ff699a5e5ccf in datapath fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 unbound from our chassis#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.638 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.648 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.662 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0cde91ea-73f1-407a-a1ab-694b4372ea8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:48 np0005593233 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Jan 23 05:27:48 np0005593233 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000aa.scope: Consumed 6.176s CPU time.
Jan 23 05:27:48 np0005593233 systemd-machined[190954]: Machine qemu-79-instance-000000aa terminated.
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.709 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8598015f-0acb-4ec2-9719-bbf738f851b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.714 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3959c9a8-959a-4349-906f-06fde23013a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.760 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[90f35eaf-80ae-4ef1-bc50-57652e556471]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:48 np0005593233 kernel: tapdff7c478-32: entered promiscuous mode
Jan 23 05:27:48 np0005593233 kernel: tapdff7c478-32 (unregistering): left promiscuous mode
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.793 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.795 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5127d910-dab6-47a7-8afd-09b94620acb1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbd64ab8-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:7c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 16, 'rx_bytes': 616, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 16, 'rx_bytes': 616, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775702, 'reachable_time': 26634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287160, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.806 222021 INFO nova.virt.libvirt.driver [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Instance destroyed successfully.#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.807 222021 DEBUG nova.objects.instance [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'resources' on Instance uuid 87859c15-e250-4a5d-aab0-8bc67aae1bc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.828 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[37f37a60-7e7e-48aa-9794-51111a0a10c2]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775717, 'tstamp': 775717}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287169, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbd64ab8-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775721, 'tstamp': 775721}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287169, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.830 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd64ab8-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.830 222021 DEBUG nova.virt.libvirt.vif [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:26:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-240824956',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-240824956',id=170,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:27:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c27429e1d8f433a8a67ddb76f8798f1',ramdisk_id='',reservation_id='r-x0tjh5et',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',ima
ge_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:27:45Z,user_data=None,user_id='0d6a628e0dcb441fa41457bf719e65a0',uuid=87859c15-e250-4a5d-aab0-8bc67aae1bc3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.831 222021 DEBUG nova.network.os_vif_util [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converting VIF {"id": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "address": "fa:16:3e:3e:a9:bf", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdff7c478-32", "ovs_interfaceid": "dff7c478-3227-40b0-86ed-ff699a5e5ccf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.833 222021 DEBUG nova.network.os_vif_util [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:a9:bf,bridge_name='br-int',has_traffic_filtering=True,id=dff7c478-3227-40b0-86ed-ff699a5e5ccf,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdff7c478-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.833 222021 DEBUG os_vif [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:a9:bf,bridge_name='br-int',has_traffic_filtering=True,id=dff7c478-3227-40b0-86ed-ff699a5e5ccf,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdff7c478-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.836 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.837 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdff7c478-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.839 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.842 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbd64ab8-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.842 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.842 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbd64ab8-90, col_values=(('external_ids', {'iface-id': 'b648300b-e46c-4d3b-b02e-94ff684c03ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:27:48.843 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.845 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:48 np0005593233 nova_compute[222017]: 2026-01-23 10:27:48.850 222021 INFO os_vif [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:a9:bf,bridge_name='br-int',has_traffic_filtering=True,id=dff7c478-3227-40b0-86ed-ff699a5e5ccf,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdff7c478-32')#033[00m
Jan 23 05:27:49 np0005593233 nova_compute[222017]: 2026-01-23 10:27:49.562 222021 DEBUG nova.compute.manager [req-90fe62da-8551-4ff1-ab2a-68dccd079bb1 req-0df49963-c739-4749-99b6-cd63d461f10a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-unplugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:49 np0005593233 nova_compute[222017]: 2026-01-23 10:27:49.563 222021 DEBUG oslo_concurrency.lockutils [req-90fe62da-8551-4ff1-ab2a-68dccd079bb1 req-0df49963-c739-4749-99b6-cd63d461f10a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:49 np0005593233 nova_compute[222017]: 2026-01-23 10:27:49.563 222021 DEBUG oslo_concurrency.lockutils [req-90fe62da-8551-4ff1-ab2a-68dccd079bb1 req-0df49963-c739-4749-99b6-cd63d461f10a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:49 np0005593233 nova_compute[222017]: 2026-01-23 10:27:49.563 222021 DEBUG oslo_concurrency.lockutils [req-90fe62da-8551-4ff1-ab2a-68dccd079bb1 req-0df49963-c739-4749-99b6-cd63d461f10a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:49 np0005593233 nova_compute[222017]: 2026-01-23 10:27:49.564 222021 DEBUG nova.compute.manager [req-90fe62da-8551-4ff1-ab2a-68dccd079bb1 req-0df49963-c739-4749-99b6-cd63d461f10a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-unplugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:49 np0005593233 nova_compute[222017]: 2026-01-23 10:27:49.564 222021 DEBUG nova.compute.manager [req-90fe62da-8551-4ff1-ab2a-68dccd079bb1 req-0df49963-c739-4749-99b6-cd63d461f10a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-unplugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:27:49 np0005593233 nova_compute[222017]: 2026-01-23 10:27:49.579 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:49 np0005593233 nova_compute[222017]: 2026-01-23 10:27:49.903 222021 INFO nova.virt.libvirt.driver [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Deleting instance files /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3_del#033[00m
Jan 23 05:27:49 np0005593233 nova_compute[222017]: 2026-01-23 10:27:49.904 222021 INFO nova.virt.libvirt.driver [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Deletion of /var/lib/nova/instances/87859c15-e250-4a5d-aab0-8bc67aae1bc3_del complete#033[00m
Jan 23 05:27:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:50.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:50.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:51 np0005593233 nova_compute[222017]: 2026-01-23 10:27:51.319 222021 INFO nova.compute.manager [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Took 2.97 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:27:51 np0005593233 nova_compute[222017]: 2026-01-23 10:27:51.320 222021 DEBUG oslo.service.loopingcall [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:27:51 np0005593233 nova_compute[222017]: 2026-01-23 10:27:51.322 222021 DEBUG nova.compute.manager [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:27:51 np0005593233 nova_compute[222017]: 2026-01-23 10:27:51.322 222021 DEBUG nova.network.neutron [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:27:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:52.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:52.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:27:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:27:53 np0005593233 nova_compute[222017]: 2026-01-23 10:27:53.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:53 np0005593233 nova_compute[222017]: 2026-01-23 10:27:53.840 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:54.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:54 np0005593233 nova_compute[222017]: 2026-01-23 10:27:54.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:54 np0005593233 nova_compute[222017]: 2026-01-23 10:27:54.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:54.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:54 np0005593233 nova_compute[222017]: 2026-01-23 10:27:54.581 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:55 np0005593233 nova_compute[222017]: 2026-01-23 10:27:55.153 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:55 np0005593233 nova_compute[222017]: 2026-01-23 10:27:55.153 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:55 np0005593233 nova_compute[222017]: 2026-01-23 10:27:55.154 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:55 np0005593233 nova_compute[222017]: 2026-01-23 10:27:55.154 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:27:55 np0005593233 nova_compute[222017]: 2026-01-23 10:27:55.154 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:27:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1864475906' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:55 np0005593233 nova_compute[222017]: 2026-01-23 10:27:55.646 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:56.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:56.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:57 np0005593233 nova_compute[222017]: 2026-01-23 10:27:57.537 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:27:57 np0005593233 nova_compute[222017]: 2026-01-23 10:27:57.538 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000a3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:27:57 np0005593233 nova_compute[222017]: 2026-01-23 10:27:57.713 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:27:57 np0005593233 nova_compute[222017]: 2026-01-23 10:27:57.715 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4117MB free_disk=20.78516387939453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:27:57 np0005593233 nova_compute[222017]: 2026-01-23 10:27:57.716 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:57 np0005593233 nova_compute[222017]: 2026-01-23 10:27:57.716 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:27:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:58.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:27:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:27:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:27:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:58.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:27:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:58 np0005593233 nova_compute[222017]: 2026-01-23 10:27:58.842 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:58 np0005593233 nova_compute[222017]: 2026-01-23 10:27:58.974 222021 DEBUG nova.compute.manager [req-c9112ad0-3526-4938-8fb5-3a09816b0e0b req-06549f9c-795d-4713-bde2-989e53116efc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:58 np0005593233 nova_compute[222017]: 2026-01-23 10:27:58.975 222021 DEBUG oslo_concurrency.lockutils [req-c9112ad0-3526-4938-8fb5-3a09816b0e0b req-06549f9c-795d-4713-bde2-989e53116efc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:58 np0005593233 nova_compute[222017]: 2026-01-23 10:27:58.975 222021 DEBUG oslo_concurrency.lockutils [req-c9112ad0-3526-4938-8fb5-3a09816b0e0b req-06549f9c-795d-4713-bde2-989e53116efc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:58 np0005593233 nova_compute[222017]: 2026-01-23 10:27:58.976 222021 DEBUG oslo_concurrency.lockutils [req-c9112ad0-3526-4938-8fb5-3a09816b0e0b req-06549f9c-795d-4713-bde2-989e53116efc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:58 np0005593233 nova_compute[222017]: 2026-01-23 10:27:58.976 222021 DEBUG nova.compute.manager [req-c9112ad0-3526-4938-8fb5-3a09816b0e0b req-06549f9c-795d-4713-bde2-989e53116efc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] No waiting events found dispatching network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:58 np0005593233 nova_compute[222017]: 2026-01-23 10:27:58.976 222021 WARNING nova.compute.manager [req-c9112ad0-3526-4938-8fb5-3a09816b0e0b req-06549f9c-795d-4713-bde2-989e53116efc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received unexpected event network-vif-plugged-dff7c478-3227-40b0-86ed-ff699a5e5ccf for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:27:59 np0005593233 nova_compute[222017]: 2026-01-23 10:27:59.584 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:00.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:00.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:01 np0005593233 podman[287263]: 2026-01-23 10:28:01.124439461 +0000 UTC m=+0.117178931 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:28:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:02.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:02.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:03 np0005593233 nova_compute[222017]: 2026-01-23 10:28:03.805 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164068.8031995, 87859c15-e250-4a5d-aab0-8bc67aae1bc3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:28:03 np0005593233 nova_compute[222017]: 2026-01-23 10:28:03.806 222021 INFO nova.compute.manager [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:28:03 np0005593233 nova_compute[222017]: 2026-01-23 10:28:03.844 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:04.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:04.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:04 np0005593233 nova_compute[222017]: 2026-01-23 10:28:04.587 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:06.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:06.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:08.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:08.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:08 np0005593233 nova_compute[222017]: 2026-01-23 10:28:08.845 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:09 np0005593233 nova_compute[222017]: 2026-01-23 10:28:09.589 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:10.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:10 np0005593233 nova_compute[222017]: 2026-01-23 10:28:10.167 222021 DEBUG nova.compute.manager [None req-d89fe29b-47d2-4685-8ee7-19280868d453 - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:28:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:10.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:12.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:12.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:13 np0005593233 nova_compute[222017]: 2026-01-23 10:28:13.847 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:14.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:14.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:14 np0005593233 nova_compute[222017]: 2026-01-23 10:28:14.592 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:16.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:16.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:16 np0005593233 nova_compute[222017]: 2026-01-23 10:28:16.967 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:28:16 np0005593233 nova_compute[222017]: 2026-01-23 10:28:16.968 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 87859c15-e250-4a5d-aab0-8bc67aae1bc3 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:28:16 np0005593233 nova_compute[222017]: 2026-01-23 10:28:16.968 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:28:16 np0005593233 nova_compute[222017]: 2026-01-23 10:28:16.969 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:28:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:18.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:18 np0005593233 podman[287289]: 2026-01-23 10:28:18.110238717 +0000 UTC m=+0.106904208 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:28:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:18.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:18 np0005593233 nova_compute[222017]: 2026-01-23 10:28:18.849 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:19 np0005593233 nova_compute[222017]: 2026-01-23 10:28:19.596 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:20.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:20.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:22.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:22.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:23 np0005593233 nova_compute[222017]: 2026-01-23 10:28:23.852 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:24.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:24 np0005593233 nova_compute[222017]: 2026-01-23 10:28:24.182 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:24.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:24 np0005593233 nova_compute[222017]: 2026-01-23 10:28:24.598 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:28:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/574150867' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:24 np0005593233 nova_compute[222017]: 2026-01-23 10:28:24.719 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:24 np0005593233 nova_compute[222017]: 2026-01-23 10:28:24.726 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:28:26 np0005593233 nova_compute[222017]: 2026-01-23 10:28:26.013 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:28:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:26.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:26.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:26.626 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:28:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:26.628 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:28:26 np0005593233 nova_compute[222017]: 2026-01-23 10:28:26.761 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:26 np0005593233 nova_compute[222017]: 2026-01-23 10:28:26.785 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:28:26 np0005593233 nova_compute[222017]: 2026-01-23 10:28:26.785 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 29.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:27 np0005593233 nova_compute[222017]: 2026-01-23 10:28:27.024 222021 DEBUG nova.network.neutron [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:28:27 np0005593233 nova_compute[222017]: 2026-01-23 10:28:27.787 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:27 np0005593233 nova_compute[222017]: 2026-01-23 10:28:27.787 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:28 np0005593233 nova_compute[222017]: 2026-01-23 10:28:28.080 222021 INFO nova.compute.manager [-] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Took 36.76 seconds to deallocate network for instance.#033[00m
Jan 23 05:28:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:28.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:28.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:28 np0005593233 nova_compute[222017]: 2026-01-23 10:28:28.755 222021 DEBUG nova.compute.manager [req-16de82bf-0516-40b7-942f-e588f576bbeb req-d1a2c99d-85dc-499d-87a5-a5bffd3d5f8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Received event network-vif-deleted-dff7c478-3227-40b0-86ed-ff699a5e5ccf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:28:28 np0005593233 nova_compute[222017]: 2026-01-23 10:28:28.755 222021 INFO nova.compute.manager [req-16de82bf-0516-40b7-942f-e588f576bbeb req-d1a2c99d-85dc-499d-87a5-a5bffd3d5f8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Neutron deleted interface dff7c478-3227-40b0-86ed-ff699a5e5ccf; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:28:28 np0005593233 nova_compute[222017]: 2026-01-23 10:28:28.756 222021 DEBUG nova.network.neutron [req-16de82bf-0516-40b7-942f-e588f576bbeb req-d1a2c99d-85dc-499d-87a5-a5bffd3d5f8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:28:28 np0005593233 nova_compute[222017]: 2026-01-23 10:28:28.771 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:28 np0005593233 nova_compute[222017]: 2026-01-23 10:28:28.771 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:28:28 np0005593233 nova_compute[222017]: 2026-01-23 10:28:28.771 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:28:28 np0005593233 nova_compute[222017]: 2026-01-23 10:28:28.854 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:28 np0005593233 nova_compute[222017]: 2026-01-23 10:28:28.992 222021 INFO nova.compute.manager [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Took 0.91 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:28:29 np0005593233 nova_compute[222017]: 2026-01-23 10:28:29.094 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 23 05:28:29 np0005593233 nova_compute[222017]: 2026-01-23 10:28:29.105 222021 DEBUG nova.compute.manager [req-16de82bf-0516-40b7-942f-e588f576bbeb req-d1a2c99d-85dc-499d-87a5-a5bffd3d5f8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 87859c15-e250-4a5d-aab0-8bc67aae1bc3] Detach interface failed, port_id=dff7c478-3227-40b0-86ed-ff699a5e5ccf, reason: Instance 87859c15-e250-4a5d-aab0-8bc67aae1bc3 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:28:29 np0005593233 nova_compute[222017]: 2026-01-23 10:28:29.601 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.083 222021 DEBUG oslo_concurrency.lockutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.083 222021 DEBUG oslo_concurrency.lockutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:30.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.162 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.162 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.163 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.163 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.227 222021 DEBUG oslo_concurrency.processutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:30.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:28:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2487446529' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.761 222021 DEBUG oslo_concurrency.processutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.768 222021 DEBUG nova.compute.provider_tree [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:28:30 np0005593233 nova_compute[222017]: 2026-01-23 10:28:30.957 222021 DEBUG nova.scheduler.client.report [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:28:32 np0005593233 podman[287357]: 2026-01-23 10:28:32.106817744 +0000 UTC m=+0.108099293 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:28:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:32.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:32.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:32 np0005593233 nova_compute[222017]: 2026-01-23 10:28:32.770 222021 DEBUG oslo_concurrency.lockutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:32 np0005593233 nova_compute[222017]: 2026-01-23 10:28:32.888 222021 INFO nova.scheduler.client.report [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Deleted allocations for instance 87859c15-e250-4a5d-aab0-8bc67aae1bc3#033[00m
Jan 23 05:28:33 np0005593233 nova_compute[222017]: 2026-01-23 10:28:33.551 222021 DEBUG oslo_concurrency.lockutils [None req-9cf28ee7-8c3b-4c7c-b5d9-4fcefbf8911e 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "87859c15-e250-4a5d-aab0-8bc67aae1bc3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 45.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:33.630 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:28:33 np0005593233 nova_compute[222017]: 2026-01-23 10:28:33.856 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e329 e329: 3 total, 3 up, 3 in
Jan 23 05:28:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:34.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:34.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:34 np0005593233 nova_compute[222017]: 2026-01-23 10:28:34.605 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:36.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:36.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:38 np0005593233 nova_compute[222017]: 2026-01-23 10:28:38.060 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updating instance_info_cache with network_info: [{"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:28:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:38.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:38 np0005593233 nova_compute[222017]: 2026-01-23 10:28:38.147 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-4b43bf7c-8fc3-4ea4-9401-283826c9ed39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:28:38 np0005593233 nova_compute[222017]: 2026-01-23 10:28:38.147 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:28:38 np0005593233 nova_compute[222017]: 2026-01-23 10:28:38.148 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:38 np0005593233 nova_compute[222017]: 2026-01-23 10:28:38.149 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:38 np0005593233 nova_compute[222017]: 2026-01-23 10:28:38.149 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:38 np0005593233 nova_compute[222017]: 2026-01-23 10:28:38.149 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:38 np0005593233 nova_compute[222017]: 2026-01-23 10:28:38.150 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:28:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e330 e330: 3 total, 3 up, 3 in
Jan 23 05:28:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:38.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:38 np0005593233 nova_compute[222017]: 2026-01-23 10:28:38.858 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e330 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:39 np0005593233 nova_compute[222017]: 2026-01-23 10:28:39.607 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:40.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:40.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:42.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:42.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:42.690 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:42.691 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:42.692 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:43 np0005593233 nova_compute[222017]: 2026-01-23 10:28:43.859 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e330 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:44.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:44.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:44 np0005593233 nova_compute[222017]: 2026-01-23 10:28:44.610 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:46.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:46.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:48.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:48.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e331 e331: 3 total, 3 up, 3 in
Jan 23 05:28:48 np0005593233 nova_compute[222017]: 2026-01-23 10:28:48.862 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:49 np0005593233 podman[287384]: 2026-01-23 10:28:49.062542511 +0000 UTC m=+0.072962814 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.613 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.763 222021 DEBUG oslo_concurrency.lockutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.764 222021 DEBUG oslo_concurrency.lockutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.764 222021 DEBUG oslo_concurrency.lockutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.765 222021 DEBUG oslo_concurrency.lockutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.765 222021 DEBUG oslo_concurrency.lockutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.767 222021 INFO nova.compute.manager [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Terminating instance#033[00m
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.769 222021 DEBUG nova.compute.manager [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:28:49 np0005593233 kernel: tap45b1f068-97 (unregistering): left promiscuous mode
Jan 23 05:28:49 np0005593233 NetworkManager[48871]: <info>  [1769164129.9174] device (tap45b1f068-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:28:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:28:49Z|00730|binding|INFO|Releasing lport 45b1f068-9743-4164-a7d2-c1ab991c291f from this chassis (sb_readonly=0)
Jan 23 05:28:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:28:49Z|00731|binding|INFO|Setting lport 45b1f068-9743-4164-a7d2-c1ab991c291f down in Southbound
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.934 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:28:49Z|00732|binding|INFO|Removing iface tap45b1f068-97 ovn-installed in OVS
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.937 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:49.943 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:57:6c 10.100.0.10'], port_security=['fa:16:3e:1e:57:6c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4b43bf7c-8fc3-4ea4-9401-283826c9ed39', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c27429e1d8f433a8a67ddb76f8798f1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '29028637-714b-453c-9e54-c753b1c8b7f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0dedc65-79e0-4ae8-b1b0-46423e11b58a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=45b1f068-9743-4164-a7d2-c1ab991c291f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:28:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:49.945 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 45b1f068-9743-4164-a7d2-c1ab991c291f in datapath fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 unbound from our chassis#033[00m
Jan 23 05:28:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:49.948 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:28:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:49.950 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f6d495d5-ea0e-4f6a-9221-db6b0b0ceced]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:49.951 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 namespace which is not needed anymore#033[00m
Jan 23 05:28:49 np0005593233 nova_compute[222017]: 2026-01-23 10:28:49.960 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:50 np0005593233 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Jan 23 05:28:50 np0005593233 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d000000a3.scope: Consumed 28.373s CPU time.
Jan 23 05:28:50 np0005593233 systemd-machined[190954]: Machine qemu-73-instance-000000a3 terminated.
Jan 23 05:28:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:50.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:50 np0005593233 neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4[283468]: [NOTICE]   (283472) : haproxy version is 2.8.14-c23fe91
Jan 23 05:28:50 np0005593233 neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4[283468]: [NOTICE]   (283472) : path to executable is /usr/sbin/haproxy
Jan 23 05:28:50 np0005593233 neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4[283468]: [WARNING]  (283472) : Exiting Master process...
Jan 23 05:28:50 np0005593233 neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4[283468]: [ALERT]    (283472) : Current worker (283474) exited with code 143 (Terminated)
Jan 23 05:28:50 np0005593233 neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4[283468]: [WARNING]  (283472) : All workers exited. Exiting... (0)
Jan 23 05:28:50 np0005593233 systemd[1]: libpod-6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1.scope: Deactivated successfully.
Jan 23 05:28:50 np0005593233 podman[287427]: 2026-01-23 10:28:50.204102278 +0000 UTC m=+0.080615502 container died 6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.238 222021 INFO nova.virt.libvirt.driver [-] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Instance destroyed successfully.#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.239 222021 DEBUG nova.objects.instance [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lazy-loading 'resources' on Instance uuid 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:28:50 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1-userdata-shm.mount: Deactivated successfully.
Jan 23 05:28:50 np0005593233 systemd[1]: var-lib-containers-storage-overlay-27e5f96a86838744a9e791ac8d7aae98f6b42ca2b6c4f97343e3d203b9f9afc3-merged.mount: Deactivated successfully.
Jan 23 05:28:50 np0005593233 podman[287427]: 2026-01-23 10:28:50.2706854 +0000 UTC m=+0.147198624 container cleanup 6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:28:50 np0005593233 systemd[1]: libpod-conmon-6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1.scope: Deactivated successfully.
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.342 222021 DEBUG nova.virt.libvirt.vif [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-250274490',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverbootfromvolumestablerescuetest-server-250274490',id=163,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:24:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c27429e1d8f433a8a67ddb76f8798f1',ramdisk_id='',reservation_id='r-bwzrhbs1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',ima
ge_min_ram='0',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1351337832-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:24:14Z,user_data=None,user_id='0d6a628e0dcb441fa41457bf719e65a0',uuid=4b43bf7c-8fc3-4ea4-9401-283826c9ed39,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.343 222021 DEBUG nova.network.os_vif_util [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converting VIF {"id": "45b1f068-9743-4164-a7d2-c1ab991c291f", "address": "fa:16:3e:1e:57:6c", "network": {"id": "fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-596908432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c27429e1d8f433a8a67ddb76f8798f1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45b1f068-97", "ovs_interfaceid": "45b1f068-9743-4164-a7d2-c1ab991c291f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.344 222021 DEBUG nova.network.os_vif_util [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:57:6c,bridge_name='br-int',has_traffic_filtering=True,id=45b1f068-9743-4164-a7d2-c1ab991c291f,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b1f068-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.345 222021 DEBUG os_vif [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:57:6c,bridge_name='br-int',has_traffic_filtering=True,id=45b1f068-9743-4164-a7d2-c1ab991c291f,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b1f068-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.348 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.349 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45b1f068-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:28:50 np0005593233 podman[287467]: 2026-01-23 10:28:50.366110842 +0000 UTC m=+0.057365361 container remove 6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.389 222021 DEBUG nova.compute.manager [req-636008bb-fec5-4e3b-8ece-4bf215067613 req-ff93c8fc-71e3-4f49-aedb-74f515151b33 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Received event network-vif-unplugged-45b1f068-9743-4164-a7d2-c1ab991c291f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.390 222021 DEBUG oslo_concurrency.lockutils [req-636008bb-fec5-4e3b-8ece-4bf215067613 req-ff93c8fc-71e3-4f49-aedb-74f515151b33 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:50.410 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f49c54-4cb0-45f6-820c-d09e76f5c61b]: (4, ('Fri Jan 23 10:28:50 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 (6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1)\n6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1\nFri Jan 23 10:28:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 (6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1)\n6b8f84e521423559b9e60fad3702f2ff29f939c2c4898f21077ae9241dae5cd1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:50.413 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab918d8-13f0-4b1e-b330-502ececa0e3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:50.414 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbd64ab8-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:28:50 np0005593233 kernel: tapfbd64ab8-90: left promiscuous mode
Jan 23 05:28:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:50.452 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1a33a694-2eda-43ca-b02f-6217f8860a72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:50.478 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dac8651a-8ed2-456a-b0ba-bcce3a7019e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:50.480 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4f916620-baa7-4781-b209-0b3c1818c039]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:50.502 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4784a14b-5e43-49f4-a751-c9bb7bb1b18a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775693, 'reachable_time': 15372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287482, 'error': None, 'target': 'ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:50 np0005593233 systemd[1]: run-netns-ovnmeta\x2dfbd64ab8\x2d9e5b\x2d4300\x2d98d7\x2d50a5d6fbefc4.mount: Deactivated successfully.
Jan 23 05:28:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:50.509 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:28:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:28:50.509 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[79723cbe-fb85-49f3-b16c-eef425dd7428]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:50.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.568 222021 DEBUG oslo_concurrency.lockutils [req-636008bb-fec5-4e3b-8ece-4bf215067613 req-ff93c8fc-71e3-4f49-aedb-74f515151b33 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.569 222021 DEBUG oslo_concurrency.lockutils [req-636008bb-fec5-4e3b-8ece-4bf215067613 req-ff93c8fc-71e3-4f49-aedb-74f515151b33 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.569 222021 DEBUG nova.compute.manager [req-636008bb-fec5-4e3b-8ece-4bf215067613 req-ff93c8fc-71e3-4f49-aedb-74f515151b33 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] No waiting events found dispatching network-vif-unplugged-45b1f068-9743-4164-a7d2-c1ab991c291f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.569 222021 DEBUG nova.compute.manager [req-636008bb-fec5-4e3b-8ece-4bf215067613 req-ff93c8fc-71e3-4f49-aedb-74f515151b33 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Received event network-vif-unplugged-45b1f068-9743-4164-a7d2-c1ab991c291f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.569 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.571 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:28:50 np0005593233 nova_compute[222017]: 2026-01-23 10:28:50.573 222021 INFO os_vif [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:57:6c,bridge_name='br-int',has_traffic_filtering=True,id=45b1f068-9743-4164-a7d2-c1ab991c291f,network=Network(fbd64ab8-9e5b-4300-98d7-50a5d6fbefc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45b1f068-97')#033[00m
Jan 23 05:28:51 np0005593233 nova_compute[222017]: 2026-01-23 10:28:51.302 222021 INFO nova.virt.libvirt.driver [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Deleting instance files /var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39_del#033[00m
Jan 23 05:28:51 np0005593233 nova_compute[222017]: 2026-01-23 10:28:51.304 222021 INFO nova.virt.libvirt.driver [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Deletion of /var/lib/nova/instances/4b43bf7c-8fc3-4ea4-9401-283826c9ed39_del complete#033[00m
Jan 23 05:28:51 np0005593233 nova_compute[222017]: 2026-01-23 10:28:51.632 222021 INFO nova.compute.manager [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Took 1.86 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:28:51 np0005593233 nova_compute[222017]: 2026-01-23 10:28:51.633 222021 DEBUG oslo.service.loopingcall [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:28:51 np0005593233 nova_compute[222017]: 2026-01-23 10:28:51.634 222021 DEBUG nova.compute.manager [-] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:28:51 np0005593233 nova_compute[222017]: 2026-01-23 10:28:51.635 222021 DEBUG nova.network.neutron [-] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:28:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:52.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:28:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:52.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:28:52 np0005593233 nova_compute[222017]: 2026-01-23 10:28:52.542 222021 DEBUG nova.compute.manager [req-16bfde0a-9ec1-4b5e-af4b-827ababc5d13 req-ccb6a5c9-2645-41c4-842b-4cc76ff341a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Received event network-vif-plugged-45b1f068-9743-4164-a7d2-c1ab991c291f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:28:52 np0005593233 nova_compute[222017]: 2026-01-23 10:28:52.543 222021 DEBUG oslo_concurrency.lockutils [req-16bfde0a-9ec1-4b5e-af4b-827ababc5d13 req-ccb6a5c9-2645-41c4-842b-4cc76ff341a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:52 np0005593233 nova_compute[222017]: 2026-01-23 10:28:52.543 222021 DEBUG oslo_concurrency.lockutils [req-16bfde0a-9ec1-4b5e-af4b-827ababc5d13 req-ccb6a5c9-2645-41c4-842b-4cc76ff341a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:52 np0005593233 nova_compute[222017]: 2026-01-23 10:28:52.543 222021 DEBUG oslo_concurrency.lockutils [req-16bfde0a-9ec1-4b5e-af4b-827ababc5d13 req-ccb6a5c9-2645-41c4-842b-4cc76ff341a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:52 np0005593233 nova_compute[222017]: 2026-01-23 10:28:52.543 222021 DEBUG nova.compute.manager [req-16bfde0a-9ec1-4b5e-af4b-827ababc5d13 req-ccb6a5c9-2645-41c4-842b-4cc76ff341a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] No waiting events found dispatching network-vif-plugged-45b1f068-9743-4164-a7d2-c1ab991c291f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:28:52 np0005593233 nova_compute[222017]: 2026-01-23 10:28:52.544 222021 WARNING nova.compute.manager [req-16bfde0a-9ec1-4b5e-af4b-827ababc5d13 req-ccb6a5c9-2645-41c4-842b-4cc76ff341a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Received unexpected event network-vif-plugged-45b1f068-9743-4164-a7d2-c1ab991c291f for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:28:53 np0005593233 podman[287672]: 2026-01-23 10:28:53.414429381 +0000 UTC m=+0.088960779 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 23 05:28:53 np0005593233 podman[287672]: 2026-01-23 10:28:53.526430804 +0000 UTC m=+0.200962152 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:28:53 np0005593233 nova_compute[222017]: 2026-01-23 10:28:53.624 222021 DEBUG nova.network.neutron [-] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:28:53 np0005593233 nova_compute[222017]: 2026-01-23 10:28:53.674 222021 INFO nova.compute.manager [-] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Took 2.04 seconds to deallocate network for instance.#033[00m
Jan 23 05:28:53 np0005593233 nova_compute[222017]: 2026-01-23 10:28:53.727 222021 DEBUG nova.compute.manager [req-090d292e-6ddb-4cc0-9594-828bdf9c7a06 req-6e5f18da-4457-4d6a-bf24-56f6b6cc3a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Received event network-vif-deleted-45b1f068-9743-4164-a7d2-c1ab991c291f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:28:53 np0005593233 nova_compute[222017]: 2026-01-23 10:28:53.734 222021 DEBUG oslo_concurrency.lockutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:53 np0005593233 nova_compute[222017]: 2026-01-23 10:28:53.735 222021 DEBUG oslo_concurrency.lockutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:53 np0005593233 nova_compute[222017]: 2026-01-23 10:28:53.807 222021 DEBUG oslo_concurrency.processutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:28:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:54.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:28:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:28:54 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/958414920' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:54 np0005593233 nova_compute[222017]: 2026-01-23 10:28:54.330 222021 DEBUG oslo_concurrency.processutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:54 np0005593233 nova_compute[222017]: 2026-01-23 10:28:54.341 222021 DEBUG nova.compute.provider_tree [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:28:54 np0005593233 nova_compute[222017]: 2026-01-23 10:28:54.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:54 np0005593233 nova_compute[222017]: 2026-01-23 10:28:54.397 222021 DEBUG nova.scheduler.client.report [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:28:54 np0005593233 nova_compute[222017]: 2026-01-23 10:28:54.433 222021 DEBUG oslo_concurrency.lockutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:54 np0005593233 nova_compute[222017]: 2026-01-23 10:28:54.499 222021 INFO nova.scheduler.client.report [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Deleted allocations for instance 4b43bf7c-8fc3-4ea4-9401-283826c9ed39#033[00m
Jan 23 05:28:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:54.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:54 np0005593233 nova_compute[222017]: 2026-01-23 10:28:54.611 222021 DEBUG oslo_concurrency.lockutils [None req-bd2ab91e-f5c2-495c-8462-a58533347ca1 0d6a628e0dcb441fa41457bf719e65a0 5c27429e1d8f433a8a67ddb76f8798f1 - - default default] Lock "4b43bf7c-8fc3-4ea4-9401-283826c9ed39" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:54 np0005593233 nova_compute[222017]: 2026-01-23 10:28:54.615 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:28:55 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:28:55 np0005593233 nova_compute[222017]: 2026-01-23 10:28:55.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:55 np0005593233 nova_compute[222017]: 2026-01-23 10:28:55.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:55 np0005593233 nova_compute[222017]: 2026-01-23 10:28:55.404 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:56.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:28:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:28:56 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:28:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:56.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:57 np0005593233 nova_compute[222017]: 2026-01-23 10:28:57.309 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:57 np0005593233 nova_compute[222017]: 2026-01-23 10:28:57.309 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:57 np0005593233 nova_compute[222017]: 2026-01-23 10:28:57.310 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:57 np0005593233 nova_compute[222017]: 2026-01-23 10:28:57.310 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:28:57 np0005593233 nova_compute[222017]: 2026-01-23 10:28:57.311 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:28:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4031460086' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:57 np0005593233 nova_compute[222017]: 2026-01-23 10:28:57.817 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:58 np0005593233 nova_compute[222017]: 2026-01-23 10:28:58.065 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:28:58 np0005593233 nova_compute[222017]: 2026-01-23 10:28:58.067 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4329MB free_disk=20.897201538085938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:28:58 np0005593233 nova_compute[222017]: 2026-01-23 10:28:58.067 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:58 np0005593233 nova_compute[222017]: 2026-01-23 10:28:58.067 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:58.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 e332: 3 total, 3 up, 3 in
Jan 23 05:28:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:28:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:58.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:58 np0005593233 nova_compute[222017]: 2026-01-23 10:28:58.829 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:28:58 np0005593233 nova_compute[222017]: 2026-01-23 10:28:58.830 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:28:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:59 np0005593233 nova_compute[222017]: 2026-01-23 10:28:59.374 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:59 np0005593233 nova_compute[222017]: 2026-01-23 10:28:59.630 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:28:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/901553759' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:59 np0005593233 nova_compute[222017]: 2026-01-23 10:28:59.879 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:59 np0005593233 nova_compute[222017]: 2026-01-23 10:28:59.893 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:28:59 np0005593233 nova_compute[222017]: 2026-01-23 10:28:59.958 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:29:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:00.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:00 np0005593233 nova_compute[222017]: 2026-01-23 10:29:00.201 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:29:00 np0005593233 nova_compute[222017]: 2026-01-23 10:29:00.202 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:00 np0005593233 nova_compute[222017]: 2026-01-23 10:29:00.407 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:00.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:02.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:02 np0005593233 nova_compute[222017]: 2026-01-23 10:29:02.202 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:02 np0005593233 nova_compute[222017]: 2026-01-23 10:29:02.202 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:02 np0005593233 nova_compute[222017]: 2026-01-23 10:29:02.203 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:02 np0005593233 nova_compute[222017]: 2026-01-23 10:29:02.203 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:02 np0005593233 nova_compute[222017]: 2026-01-23 10:29:02.203 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:29:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:02.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:03 np0005593233 podman[287992]: 2026-01-23 10:29:03.161107985 +0000 UTC m=+0.154753669 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 05:29:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:29:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:29:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:04.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:04.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:04 np0005593233 nova_compute[222017]: 2026-01-23 10:29:04.671 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:05 np0005593233 nova_compute[222017]: 2026-01-23 10:29:05.232 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164130.2294116, 4b43bf7c-8fc3-4ea4-9401-283826c9ed39 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:29:05 np0005593233 nova_compute[222017]: 2026-01-23 10:29:05.232 222021 INFO nova.compute.manager [-] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:29:05 np0005593233 nova_compute[222017]: 2026-01-23 10:29:05.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:05 np0005593233 nova_compute[222017]: 2026-01-23 10:29:05.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:29:05 np0005593233 nova_compute[222017]: 2026-01-23 10:29:05.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:29:05 np0005593233 nova_compute[222017]: 2026-01-23 10:29:05.411 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:06.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:06.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:06 np0005593233 nova_compute[222017]: 2026-01-23 10:29:06.545 222021 DEBUG nova.compute.manager [None req-5614a6c8-4023-47cb-9edc-021cdc2387eb - - - - - -] [instance: 4b43bf7c-8fc3-4ea4-9401-283826c9ed39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:29:06 np0005593233 nova_compute[222017]: 2026-01-23 10:29:06.735 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:29:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:08.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:08.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:09 np0005593233 nova_compute[222017]: 2026-01-23 10:29:09.675 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:10.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:10 np0005593233 nova_compute[222017]: 2026-01-23 10:29:10.413 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:10.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:10 np0005593233 nova_compute[222017]: 2026-01-23 10:29:10.729 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:12.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:12.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:14.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:14 np0005593233 nova_compute[222017]: 2026-01-23 10:29:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:14.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:14 np0005593233 nova_compute[222017]: 2026-01-23 10:29:14.726 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:15 np0005593233 nova_compute[222017]: 2026-01-23 10:29:15.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:16.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:16.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:18.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:18.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:19 np0005593233 nova_compute[222017]: 2026-01-23 10:29:19.728 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:20 np0005593233 podman[288069]: 2026-01-23 10:29:20.075704836 +0000 UTC m=+0.083237106 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:29:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:20.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:20 np0005593233 nova_compute[222017]: 2026-01-23 10:29:20.428 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:20.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:22.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:22.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:24.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:24.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:24 np0005593233 nova_compute[222017]: 2026-01-23 10:29:24.730 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:25 np0005593233 nova_compute[222017]: 2026-01-23 10:29:25.430 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:26.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:26.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:28.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:28.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:29:28.887 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:29:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:29:28.887 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:29:28 np0005593233 nova_compute[222017]: 2026-01-23 10:29:28.888 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:29 np0005593233 nova_compute[222017]: 2026-01-23 10:29:29.063 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:29 np0005593233 nova_compute[222017]: 2026-01-23 10:29:29.733 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:30.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:30 np0005593233 nova_compute[222017]: 2026-01-23 10:29:30.432 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:30.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:30 np0005593233 nova_compute[222017]: 2026-01-23 10:29:30.861 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:30 np0005593233 nova_compute[222017]: 2026-01-23 10:29:30.862 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 05:29:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:32.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:32.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:34 np0005593233 podman[288091]: 2026-01-23 10:29:34.09014428 +0000 UTC m=+0.104591453 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:29:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:34.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:34.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:34 np0005593233 nova_compute[222017]: 2026-01-23 10:29:34.734 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:35 np0005593233 nova_compute[222017]: 2026-01-23 10:29:35.410 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:35 np0005593233 nova_compute[222017]: 2026-01-23 10:29:35.410 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 05:29:35 np0005593233 nova_compute[222017]: 2026-01-23 10:29:35.426 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 05:29:35 np0005593233 nova_compute[222017]: 2026-01-23 10:29:35.472 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:36.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:36.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:29:36.895 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:29:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:38.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:38.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:39 np0005593233 nova_compute[222017]: 2026-01-23 10:29:39.736 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:40.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:40 np0005593233 nova_compute[222017]: 2026-01-23 10:29:40.475 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:40.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:42.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:42.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:29:42.691 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:29:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:29:42.692 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:29:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:29:42.692 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:29:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:44.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:44.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:44 np0005593233 nova_compute[222017]: 2026-01-23 10:29:44.738 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:45 np0005593233 nova_compute[222017]: 2026-01-23 10:29:45.529 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:46.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:46.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:47 np0005593233 nova_compute[222017]: 2026-01-23 10:29:47.680 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:29:47 np0005593233 nova_compute[222017]: 2026-01-23 10:29:47.680 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:29:47 np0005593233 nova_compute[222017]: 2026-01-23 10:29:47.731 222021 DEBUG nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 05:29:47 np0005593233 nova_compute[222017]: 2026-01-23 10:29:47.835 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:29:47 np0005593233 nova_compute[222017]: 2026-01-23 10:29:47.836 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:29:47 np0005593233 nova_compute[222017]: 2026-01-23 10:29:47.849 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:29:47 np0005593233 nova_compute[222017]: 2026-01-23 10:29:47.850 222021 INFO nova.compute.claims [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Claim successful on node compute-1.ctlplane.example.com
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.035 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #148. Immutable memtables: 0.
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.149472) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 93] Flushing memtable with next log file: 148
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188149526, "job": 93, "event": "flush_started", "num_memtables": 1, "num_entries": 2376, "num_deletes": 255, "total_data_size": 5555795, "memory_usage": 5647808, "flush_reason": "Manual Compaction"}
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 93] Level-0 flush table #149: started
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188177642, "cf_name": "default", "job": 93, "event": "table_file_creation", "file_number": 149, "file_size": 3645647, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 70496, "largest_seqno": 72867, "table_properties": {"data_size": 3636033, "index_size": 6043, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20333, "raw_average_key_size": 20, "raw_value_size": 3616659, "raw_average_value_size": 3668, "num_data_blocks": 263, "num_entries": 986, "num_filter_entries": 986, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163986, "oldest_key_time": 1769163986, "file_creation_time": 1769164188, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 149, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 93] Flush lasted 28243 microseconds, and 14713 cpu microseconds.
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.177711) [db/flush_job.cc:967] [default] [JOB 93] Level-0 flush table #149: 3645647 bytes OK
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.177739) [db/memtable_list.cc:519] [default] Level-0 commit table #149 started
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.179604) [db/memtable_list.cc:722] [default] Level-0 commit table #149: memtable #1 done
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.179621) EVENT_LOG_v1 {"time_micros": 1769164188179615, "job": 93, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.179641) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 93] Try to delete WAL files size 5545148, prev total WAL file size 5545148, number of live WAL files 2.
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000145.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.181569) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036323735' seq:72057594037927935, type:22 .. '7061786F730036353237' seq:0, type:0; will stop at (end)
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 94] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 93 Base level 0, inputs: [149(3560KB)], [147(10071KB)]
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188181678, "job": 94, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [149], "files_L6": [147], "score": -1, "input_data_size": 13959329, "oldest_snapshot_seqno": -1}
Jan 23 05:29:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:48.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 94] Generated table #150: 9260 keys, 12060995 bytes, temperature: kUnknown
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188411430, "cf_name": "default", "job": 94, "event": "table_file_creation", "file_number": 150, "file_size": 12060995, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12001681, "index_size": 35073, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23173, "raw_key_size": 243581, "raw_average_key_size": 26, "raw_value_size": 11839686, "raw_average_value_size": 1278, "num_data_blocks": 1341, "num_entries": 9260, "num_filter_entries": 9260, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769164188, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 150, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3451758401' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.411838) [db/compaction/compaction_job.cc:1663] [default] [JOB 94] Compacted 1@0 + 1@6 files to L6 => 12060995 bytes
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.526666) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 60.7 rd, 52.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 9.8 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(7.1) write-amplify(3.3) OK, records in: 9788, records dropped: 528 output_compression: NoCompression
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.526743) EVENT_LOG_v1 {"time_micros": 1769164188526716, "job": 94, "event": "compaction_finished", "compaction_time_micros": 229860, "compaction_time_cpu_micros": 37565, "output_level": 6, "num_output_files": 1, "total_output_size": 12060995, "num_input_records": 9788, "num_output_records": 9260, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000149.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188528548, "job": 94, "event": "table_file_deletion", "file_number": 149}
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000147.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188533062, "job": 94, "event": "table_file_deletion", "file_number": 147}
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.181362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.533171) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.533182) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.533186) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.533190) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:29:48.533194) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.552 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.566 222021 DEBUG nova.compute.provider_tree [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.600 222021 DEBUG nova.scheduler.client.report [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:29:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:48.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.626 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.626 222021 DEBUG nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.685 222021 DEBUG nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.685 222021 DEBUG nova.network.neutron [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.715 222021 INFO nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.761 222021 DEBUG nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.869 222021 DEBUG nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.871 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.872 222021 INFO nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Creating image(s)#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.916 222021 DEBUG nova.storage.rbd_utils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a9112892-c55a-46f8-a5f2-6df7fac1510a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:29:48 np0005593233 nova_compute[222017]: 2026-01-23 10:29:48.954 222021 DEBUG nova.storage.rbd_utils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a9112892-c55a-46f8-a5f2-6df7fac1510a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:29:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.007 222021 DEBUG nova.storage.rbd_utils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a9112892-c55a-46f8-a5f2-6df7fac1510a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.015 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.060 222021 DEBUG nova.policy [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.106 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.107 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.108 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.108 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.138 222021 DEBUG nova.storage.rbd_utils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a9112892-c55a-46f8-a5f2-6df7fac1510a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.143 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a9112892-c55a-46f8-a5f2-6df7fac1510a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:49 np0005593233 nova_compute[222017]: 2026-01-23 10:29:49.743 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:50 np0005593233 nova_compute[222017]: 2026-01-23 10:29:50.075 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a9112892-c55a-46f8-a5f2-6df7fac1510a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.932s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:50 np0005593233 nova_compute[222017]: 2026-01-23 10:29:50.168 222021 DEBUG nova.storage.rbd_utils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] resizing rbd image a9112892-c55a-46f8-a5f2-6df7fac1510a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:29:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:50.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:50 np0005593233 nova_compute[222017]: 2026-01-23 10:29:50.409 222021 DEBUG nova.objects.instance [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'migration_context' on Instance uuid a9112892-c55a-46f8-a5f2-6df7fac1510a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:29:50 np0005593233 nova_compute[222017]: 2026-01-23 10:29:50.431 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:29:50 np0005593233 nova_compute[222017]: 2026-01-23 10:29:50.432 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Ensure instance console log exists: /var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:29:50 np0005593233 nova_compute[222017]: 2026-01-23 10:29:50.432 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:50 np0005593233 nova_compute[222017]: 2026-01-23 10:29:50.433 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:50 np0005593233 nova_compute[222017]: 2026-01-23 10:29:50.433 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:50 np0005593233 nova_compute[222017]: 2026-01-23 10:29:50.531 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:50.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:51 np0005593233 nova_compute[222017]: 2026-01-23 10:29:51.029 222021 DEBUG nova.network.neutron [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Successfully created port: 55e3e503-4e7f-4527-b7da-0242067d96b3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:29:51 np0005593233 podman[288306]: 2026-01-23 10:29:51.080497847 +0000 UTC m=+0.083304088 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:29:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:52.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:52.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:53 np0005593233 nova_compute[222017]: 2026-01-23 10:29:53.277 222021 DEBUG nova.network.neutron [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Successfully updated port: 55e3e503-4e7f-4527-b7da-0242067d96b3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:29:53 np0005593233 nova_compute[222017]: 2026-01-23 10:29:53.302 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:29:53 np0005593233 nova_compute[222017]: 2026-01-23 10:29:53.302 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:29:53 np0005593233 nova_compute[222017]: 2026-01-23 10:29:53.303 222021 DEBUG nova.network.neutron [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:29:53 np0005593233 nova_compute[222017]: 2026-01-23 10:29:53.906 222021 DEBUG nova.network.neutron [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:29:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:54.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:54.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:54 np0005593233 nova_compute[222017]: 2026-01-23 10:29:54.787 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.084 222021 DEBUG nova.compute.manager [req-7540ec79-3133-4f3e-8bde-a5f11fd407f0 req-0a6238c6-c26a-486d-b435-3b04123d5bf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.085 222021 DEBUG nova.compute.manager [req-7540ec79-3133-4f3e-8bde-a5f11fd407f0 req-0a6238c6-c26a-486d-b435-3b04123d5bf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing instance network info cache due to event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.085 222021 DEBUG oslo_concurrency.lockutils [req-7540ec79-3133-4f3e-8bde-a5f11fd407f0 req-0a6238c6-c26a-486d-b435-3b04123d5bf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.401 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.498 222021 DEBUG nova.network.neutron [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updating instance_info_cache with network_info: [{"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.521 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.521 222021 DEBUG nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Instance network_info: |[{"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.522 222021 DEBUG oslo_concurrency.lockutils [req-7540ec79-3133-4f3e-8bde-a5f11fd407f0 req-0a6238c6-c26a-486d-b435-3b04123d5bf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.522 222021 DEBUG nova.network.neutron [req-7540ec79-3133-4f3e-8bde-a5f11fd407f0 req-0a6238c6-c26a-486d-b435-3b04123d5bf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.526 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Start _get_guest_xml network_info=[{"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.532 222021 WARNING nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.534 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.538 222021 DEBUG nova.virt.libvirt.host [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.539 222021 DEBUG nova.virt.libvirt.host [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.546 222021 DEBUG nova.virt.libvirt.host [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.547 222021 DEBUG nova.virt.libvirt.host [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.549 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.550 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.551 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.551 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.552 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.552 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.552 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.553 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.553 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.554 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.554 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.555 222021 DEBUG nova.virt.hardware [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:29:55 np0005593233 nova_compute[222017]: 2026-01-23 10:29:55.560 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:29:56 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3241512567' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.055 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.085 222021 DEBUG nova.storage.rbd_utils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a9112892-c55a-46f8-a5f2-6df7fac1510a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.089 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:56.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:29:56 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/739981541' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.550 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.552 222021 DEBUG nova.virt.libvirt.vif [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:29:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1271596965',display_name='tempest-TestNetworkBasicOps-server-1271596965',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1271596965',id=174,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO2LI0UABNlDU3HpBqZ4TFdn/tyl+k7EkIfoy8J4Qrg3uVnEo4BKCEou7n9DUxH0pF9daXCxvV4DOAO3a7+2coTfCOHx8K9T8rhOLIf1u1NApLhksMZyj9YN8VZ26cJVCQ==',key_name='tempest-TestNetworkBasicOps-362963621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3chv5nxw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:29:48Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=a9112892-c55a-46f8-a5f2-6df7fac1510a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.552 222021 DEBUG nova.network.os_vif_util [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.553 222021 DEBUG nova.network.os_vif_util [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:9c:7e,bridge_name='br-int',has_traffic_filtering=True,id=55e3e503-4e7f-4527-b7da-0242067d96b3,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55e3e503-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:29:56 np0005593233 nova_compute[222017]: 2026-01-23 10:29:56.554 222021 DEBUG nova.objects.instance [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid a9112892-c55a-46f8-a5f2-6df7fac1510a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:29:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:29:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:56.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.308 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.308 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.309 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.309 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.310 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.536 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <uuid>a9112892-c55a-46f8-a5f2-6df7fac1510a</uuid>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <name>instance-000000ae</name>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestNetworkBasicOps-server-1271596965</nova:name>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:29:55</nova:creationTime>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <nova:port uuid="55e3e503-4e7f-4527-b7da-0242067d96b3">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <entry name="serial">a9112892-c55a-46f8-a5f2-6df7fac1510a</entry>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <entry name="uuid">a9112892-c55a-46f8-a5f2-6df7fac1510a</entry>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/a9112892-c55a-46f8-a5f2-6df7fac1510a_disk">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/a9112892-c55a-46f8-a5f2-6df7fac1510a_disk.config">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:2b:9c:7e"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <target dev="tap55e3e503-4e"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a/console.log" append="off"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:29:57 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:29:57 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:29:57 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:29:57 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.539 222021 DEBUG nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Preparing to wait for external event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.540 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.541 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.541 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.543 222021 DEBUG nova.virt.libvirt.vif [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:29:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1271596965',display_name='tempest-TestNetworkBasicOps-server-1271596965',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1271596965',id=174,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO2LI0UABNlDU3HpBqZ4TFdn/tyl+k7EkIfoy8J4Qrg3uVnEo4BKCEou7n9DUxH0pF9daXCxvV4DOAO3a7+2coTfCOHx8K9T8rhOLIf1u1NApLhksMZyj9YN8VZ26cJVCQ==',key_name='tempest-TestNetworkBasicOps-362963621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3chv5nxw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:29:48Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=a9112892-c55a-46f8-a5f2-6df7fac1510a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.544 222021 DEBUG nova.network.os_vif_util [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.545 222021 DEBUG nova.network.os_vif_util [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:9c:7e,bridge_name='br-int',has_traffic_filtering=True,id=55e3e503-4e7f-4527-b7da-0242067d96b3,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55e3e503-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.546 222021 DEBUG os_vif [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:9c:7e,bridge_name='br-int',has_traffic_filtering=True,id=55e3e503-4e7f-4527-b7da-0242067d96b3,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55e3e503-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.548 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.549 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.550 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.556 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.556 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55e3e503-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.557 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap55e3e503-4e, col_values=(('external_ids', {'iface-id': '55e3e503-4e7f-4527-b7da-0242067d96b3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:9c:7e', 'vm-uuid': 'a9112892-c55a-46f8-a5f2-6df7fac1510a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.560 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:57 np0005593233 NetworkManager[48871]: <info>  [1769164197.5617] manager: (tap55e3e503-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.582 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.584 222021 INFO os_vif [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:9c:7e,bridge_name='br-int',has_traffic_filtering=True,id=55e3e503-4e7f-4527-b7da-0242067d96b3,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55e3e503-4e')#033[00m
Jan 23 05:29:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:29:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/170332775' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:57 np0005593233 nova_compute[222017]: 2026-01-23 10:29:57.866 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:58.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:29:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:29:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:58.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:29:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.790 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.802 222021 DEBUG nova.network.neutron [req-7540ec79-3133-4f3e-8bde-a5f11fd407f0 req-0a6238c6-c26a-486d-b435-3b04123d5bf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updated VIF entry in instance network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.803 222021 DEBUG nova.network.neutron [req-7540ec79-3133-4f3e-8bde-a5f11fd407f0 req-0a6238c6-c26a-486d-b435-3b04123d5bf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updating instance_info_cache with network_info: [{"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.838 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000ae as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.838 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000ae as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.840 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.841 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.841 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:2b:9c:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.842 222021 INFO nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Using config drive#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.883 222021 DEBUG nova.storage.rbd_utils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a9112892-c55a-46f8-a5f2-6df7fac1510a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:29:59 np0005593233 nova_compute[222017]: 2026-01-23 10:29:59.911 222021 DEBUG oslo_concurrency.lockutils [req-7540ec79-3133-4f3e-8bde-a5f11fd407f0 req-0a6238c6-c26a-486d-b435-3b04123d5bf3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:30:00 np0005593233 nova_compute[222017]: 2026-01-23 10:30:00.116 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:30:00 np0005593233 nova_compute[222017]: 2026-01-23 10:30:00.118 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4402MB free_disk=20.946773529052734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:30:00 np0005593233 nova_compute[222017]: 2026-01-23 10:30:00.118 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:00 np0005593233 nova_compute[222017]: 2026-01-23 10:30:00.118 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:30:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:00.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:30:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:30:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:00.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:30:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 05:30:01 np0005593233 nova_compute[222017]: 2026-01-23 10:30:01.963 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance a9112892-c55a-46f8-a5f2-6df7fac1510a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:30:01 np0005593233 nova_compute[222017]: 2026-01-23 10:30:01.963 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:30:01 np0005593233 nova_compute[222017]: 2026-01-23 10:30:01.964 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.028 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.054 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.055 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.072 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.098 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.162 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:30:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:02.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.566 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:02.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:30:02 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1967628664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.669 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.680 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:30:02 np0005593233 nova_compute[222017]: 2026-01-23 10:30:02.912 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.060 222021 INFO nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Creating config drive at /var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a/disk.config#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.068 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8gwk_nqr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.126 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.127 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.232 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8gwk_nqr" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.288 222021 DEBUG nova.storage.rbd_utils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a9112892-c55a-46f8-a5f2-6df7fac1510a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.293 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a/disk.config a9112892-c55a-46f8-a5f2-6df7fac1510a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.464 222021 DEBUG oslo_concurrency.processutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a/disk.config a9112892-c55a-46f8-a5f2-6df7fac1510a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.465 222021 INFO nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Deleting local config drive /var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a/disk.config because it was imported into RBD.#033[00m
Jan 23 05:30:03 np0005593233 kernel: tap55e3e503-4e: entered promiscuous mode
Jan 23 05:30:03 np0005593233 NetworkManager[48871]: <info>  [1769164203.5307] manager: (tap55e3e503-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/332)
Jan 23 05:30:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:30:03Z|00733|binding|INFO|Claiming lport 55e3e503-4e7f-4527-b7da-0242067d96b3 for this chassis.
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.530 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:30:03Z|00734|binding|INFO|55e3e503-4e7f-4527-b7da-0242067d96b3: Claiming fa:16:3e:2b:9c:7e 10.100.0.11
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.540 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:03 np0005593233 systemd-udevd[288559]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:30:03 np0005593233 systemd-machined[190954]: New machine qemu-80-instance-000000ae.
Jan 23 05:30:03 np0005593233 NetworkManager[48871]: <info>  [1769164203.5962] device (tap55e3e503-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:30:03 np0005593233 NetworkManager[48871]: <info>  [1769164203.5971] device (tap55e3e503-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:30:03 np0005593233 systemd[1]: Started Virtual Machine qemu-80-instance-000000ae.
Jan 23 05:30:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:30:03Z|00735|binding|INFO|Setting lport 55e3e503-4e7f-4527-b7da-0242067d96b3 ovn-installed in OVS
Jan 23 05:30:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:30:03Z|00736|binding|INFO|Setting lport 55e3e503-4e7f-4527-b7da-0242067d96b3 up in Southbound
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.639 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:9c:7e 10.100.0.11'], port_security=['fa:16:3e:2b:9c:7e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a9112892-c55a-46f8-a5f2-6df7fac1510a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3128fa93-5584-4fd7-b8b2-100d4babba87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '92b696b2-abea-4273-befd-9b9d9b6c5bb3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7d423e3-a129-4092-a097-e9db38a84e9f, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=55e3e503-4e7f-4527-b7da-0242067d96b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.641 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 55e3e503-4e7f-4527-b7da-0242067d96b3 in datapath 3128fa93-5584-4fd7-b8b2-100d4babba87 bound to our chassis#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.642 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3128fa93-5584-4fd7-b8b2-100d4babba87#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.655 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[37bd639b-afb1-4daa-a131-90285b637091]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.656 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3128fa93-51 in ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.660 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.659 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3128fa93-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.660 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b756c3-7b27-49c8-a477-9c96cb58a228]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.661 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[409dccba-cc2e-47d8-b922-061add08e582]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.675 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[99de867a-f742-4b4c-bf45-05e9046e2176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.698 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[18c9a92c-5833-463a-aa48-d9907d4a5df5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.736 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[4b8e5de8-c002-4a12-9a03-f83e285561c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 NetworkManager[48871]: <info>  [1769164203.7434] manager: (tap3128fa93-50): new Veth device (/org/freedesktop/NetworkManager/Devices/333)
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.742 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fb2ec3fc-ccb5-4b02-81ca-e69e34fa86a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.778 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5a32ff4c-123e-4ada-a975-09a734d1789b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.782 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b5171ddf-604f-4392-8d4c-c388e61f6b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 NetworkManager[48871]: <info>  [1769164203.8046] device (tap3128fa93-50): carrier: link connected
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.810 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3351995a-35d4-47cc-a149-1d34b3a60a6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.829 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd36b9e-28f6-4d1c-968b-3f886ee35d66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3128fa93-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:9d:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 811643, 'reachable_time': 15476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288637, 'error': None, 'target': 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.845 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3461b1af-422d-4f06-9c42-dedd460ff940]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9e:9de0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 811643, 'tstamp': 811643}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288638, 'error': None, 'target': 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.864 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a004fe2b-c82a-4e6c-b924-7b25c882809a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3128fa93-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:9d:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 811643, 'reachable_time': 15476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 288640, 'error': None, 'target': 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.898 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f2fe1388-9fab-42f0-8aa1-a949bed71828]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.972 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3d2cc323-fea2-40cf-aab9-ac5dfaeeefdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.974 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3128fa93-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.974 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.975 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3128fa93-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.976 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:03 np0005593233 NetworkManager[48871]: <info>  [1769164203.9776] manager: (tap3128fa93-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Jan 23 05:30:03 np0005593233 kernel: tap3128fa93-50: entered promiscuous mode
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.978 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.979 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3128fa93-50, col_values=(('external_ids', {'iface-id': '40f3a1ed-d213-498b-a2eb-96feaa1eae36'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.980 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:30:03Z|00737|binding|INFO|Releasing lport 40f3a1ed-d213-498b-a2eb-96feaa1eae36 from this chassis (sb_readonly=0)
Jan 23 05:30:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:03 np0005593233 nova_compute[222017]: 2026-01-23 10:30:03.994 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.995 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3128fa93-5584-4fd7-b8b2-100d4babba87.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3128fa93-5584-4fd7-b8b2-100d4babba87.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.996 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[37d1ad47-3fb1-4b8e-bf76-a8d101f65770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.998 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-3128fa93-5584-4fd7-b8b2-100d4babba87
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/3128fa93-5584-4fd7-b8b2-100d4babba87.pid.haproxy
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 3128fa93-5584-4fd7-b8b2-100d4babba87
Jan 23 05:30:03 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:30:04 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:03.998 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'env', 'PROCESS_TAG=haproxy-3128fa93-5584-4fd7-b8b2-100d4babba87', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3128fa93-5584-4fd7-b8b2-100d4babba87.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.129 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.129 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.129 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.129 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.130 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.156 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164204.15527, a9112892-c55a-46f8-a5f2-6df7fac1510a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.156 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] VM Started (Lifecycle Event)#033[00m
Jan 23 05:30:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:04.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:04 np0005593233 podman[288743]: 2026-01-23 10:30:04.460993188 +0000 UTC m=+0.077499822 container create 6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:30:04 np0005593233 systemd[1]: Started libpod-conmon-6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4.scope.
Jan 23 05:30:04 np0005593233 podman[288743]: 2026-01-23 10:30:04.418151951 +0000 UTC m=+0.034658605 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:30:04 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:30:04 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e697d54862fd9a31eefcf18d811b09dc1273fb2b5f7c675865980321fef4b90/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:30:04 np0005593233 podman[288743]: 2026-01-23 10:30:04.56311875 +0000 UTC m=+0.179625404 container init 6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:30:04 np0005593233 podman[288743]: 2026-01-23 10:30:04.570828939 +0000 UTC m=+0.187335553 container start 6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.575 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.584 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164204.1557105, a9112892-c55a-46f8-a5f2-6df7fac1510a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.585 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:30:04 np0005593233 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[288759]: [NOTICE]   (288781) : New worker (288788) forked
Jan 23 05:30:04 np0005593233 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[288759]: [NOTICE]   (288781) : Loading success.
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.627 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.631 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:30:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:04.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:04 np0005593233 podman[288756]: 2026-01-23 10:30:04.645337456 +0000 UTC m=+0.141245184 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.681 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:30:04 np0005593233 nova_compute[222017]: 2026-01-23 10:30:04.792 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:05 np0005593233 nova_compute[222017]: 2026-01-23 10:30:05.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:05 np0005593233 nova_compute[222017]: 2026-01-23 10:30:05.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:30:05 np0005593233 nova_compute[222017]: 2026-01-23 10:30:05.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:30:05 np0005593233 nova_compute[222017]: 2026-01-23 10:30:05.623 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:30:05 np0005593233 nova_compute[222017]: 2026-01-23 10:30:05.624 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:30:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:06.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:30:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:06.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:30:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.569 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.575 222021 DEBUG nova.compute.manager [req-439fdcad-d98a-43af-a5c5-7f1954dabf4e req-c6240ed8-b8ac-49f2-9617-3077bc1c0292 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.576 222021 DEBUG oslo_concurrency.lockutils [req-439fdcad-d98a-43af-a5c5-7f1954dabf4e req-c6240ed8-b8ac-49f2-9617-3077bc1c0292 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.576 222021 DEBUG oslo_concurrency.lockutils [req-439fdcad-d98a-43af-a5c5-7f1954dabf4e req-c6240ed8-b8ac-49f2-9617-3077bc1c0292 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.576 222021 DEBUG oslo_concurrency.lockutils [req-439fdcad-d98a-43af-a5c5-7f1954dabf4e req-c6240ed8-b8ac-49f2-9617-3077bc1c0292 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.576 222021 DEBUG nova.compute.manager [req-439fdcad-d98a-43af-a5c5-7f1954dabf4e req-c6240ed8-b8ac-49f2-9617-3077bc1c0292 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Processing event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.577 222021 DEBUG nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.583 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164207.5830195, a9112892-c55a-46f8-a5f2-6df7fac1510a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.583 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.586 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.590 222021 INFO nova.virt.libvirt.driver [-] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Instance spawned successfully.#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.590 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.853 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.861 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.862 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.862 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.862 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.863 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.863 222021 DEBUG nova.virt.libvirt.driver [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.867 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:30:07 np0005593233 nova_compute[222017]: 2026-01-23 10:30:07.955 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:30:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:07 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:30:08 np0005593233 nova_compute[222017]: 2026-01-23 10:30:08.101 222021 INFO nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Took 19.23 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:30:08 np0005593233 nova_compute[222017]: 2026-01-23 10:30:08.101 222021 DEBUG nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:08.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:08 np0005593233 nova_compute[222017]: 2026-01-23 10:30:08.409 222021 INFO nova.compute.manager [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Took 20.61 seconds to build instance.#033[00m
Jan 23 05:30:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:08.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:08 np0005593233 nova_compute[222017]: 2026-01-23 10:30:08.832 222021 DEBUG oslo_concurrency.lockutils [None req-8df1f79a-1502-4292-8329-39d2ae783f1a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:30:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:09 np0005593233 nova_compute[222017]: 2026-01-23 10:30:09.617 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:09 np0005593233 nova_compute[222017]: 2026-01-23 10:30:09.706 222021 DEBUG nova.compute.manager [req-f5bac6e5-22ae-48b6-bd0c-758f32da9477 req-45267de5-a7ad-4394-9efc-cbc859d07fec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:30:09 np0005593233 nova_compute[222017]: 2026-01-23 10:30:09.706 222021 DEBUG oslo_concurrency.lockutils [req-f5bac6e5-22ae-48b6-bd0c-758f32da9477 req-45267de5-a7ad-4394-9efc-cbc859d07fec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:30:09 np0005593233 nova_compute[222017]: 2026-01-23 10:30:09.707 222021 DEBUG oslo_concurrency.lockutils [req-f5bac6e5-22ae-48b6-bd0c-758f32da9477 req-45267de5-a7ad-4394-9efc-cbc859d07fec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:30:09 np0005593233 nova_compute[222017]: 2026-01-23 10:30:09.707 222021 DEBUG oslo_concurrency.lockutils [req-f5bac6e5-22ae-48b6-bd0c-758f32da9477 req-45267de5-a7ad-4394-9efc-cbc859d07fec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:30:09 np0005593233 nova_compute[222017]: 2026-01-23 10:30:09.707 222021 DEBUG nova.compute.manager [req-f5bac6e5-22ae-48b6-bd0c-758f32da9477 req-45267de5-a7ad-4394-9efc-cbc859d07fec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] No waiting events found dispatching network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:30:09 np0005593233 nova_compute[222017]: 2026-01-23 10:30:09.707 222021 WARNING nova.compute.manager [req-f5bac6e5-22ae-48b6-bd0c-758f32da9477 req-45267de5-a7ad-4394-9efc-cbc859d07fec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received unexpected event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 for instance with vm_state active and task_state None.
Jan 23 05:30:09 np0005593233 nova_compute[222017]: 2026-01-23 10:30:09.795 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:10.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:10.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:30:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:12.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:30:12 np0005593233 nova_compute[222017]: 2026-01-23 10:30:12.572 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:12.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:14.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:14.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:14 np0005593233 nova_compute[222017]: 2026-01-23 10:30:14.797 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:16.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:16.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:17 np0005593233 NetworkManager[48871]: <info>  [1769164217.5531] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 23 05:30:17 np0005593233 NetworkManager[48871]: <info>  [1769164217.5545] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/336)
Jan 23 05:30:17 np0005593233 nova_compute[222017]: 2026-01-23 10:30:17.552 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:17 np0005593233 nova_compute[222017]: 2026-01-23 10:30:17.574 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:17 np0005593233 nova_compute[222017]: 2026-01-23 10:30:17.780 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:17 np0005593233 ovn_controller[130653]: 2026-01-23T10:30:17Z|00738|binding|INFO|Releasing lport 40f3a1ed-d213-498b-a2eb-96feaa1eae36 from this chassis (sb_readonly=0)
Jan 23 05:30:17 np0005593233 nova_compute[222017]: 2026-01-23 10:30:17.805 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:18.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:18.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:19 np0005593233 nova_compute[222017]: 2026-01-23 10:30:19.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:20.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:20 np0005593233 nova_compute[222017]: 2026-01-23 10:30:20.461 222021 DEBUG nova.compute.manager [req-2e4759ad-ae8d-481b-8adf-3fc6bd64189c req-19366662-edbe-4277-ab78-138db1f99881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:30:20 np0005593233 nova_compute[222017]: 2026-01-23 10:30:20.463 222021 DEBUG nova.compute.manager [req-2e4759ad-ae8d-481b-8adf-3fc6bd64189c req-19366662-edbe-4277-ab78-138db1f99881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing instance network info cache due to event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:30:20 np0005593233 nova_compute[222017]: 2026-01-23 10:30:20.464 222021 DEBUG oslo_concurrency.lockutils [req-2e4759ad-ae8d-481b-8adf-3fc6bd64189c req-19366662-edbe-4277-ab78-138db1f99881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:30:20 np0005593233 nova_compute[222017]: 2026-01-23 10:30:20.465 222021 DEBUG oslo_concurrency.lockutils [req-2e4759ad-ae8d-481b-8adf-3fc6bd64189c req-19366662-edbe-4277-ab78-138db1f99881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:30:20 np0005593233 nova_compute[222017]: 2026-01-23 10:30:20.465 222021 DEBUG nova.network.neutron [req-2e4759ad-ae8d-481b-8adf-3fc6bd64189c req-19366662-edbe-4277-ab78-138db1f99881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:30:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:20.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:22 np0005593233 podman[288856]: 2026-01-23 10:30:22.109497936 +0000 UTC m=+0.107964828 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:30:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:22.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:22 np0005593233 ovn_controller[130653]: 2026-01-23T10:30:22Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:9c:7e 10.100.0.11
Jan 23 05:30:22 np0005593233 ovn_controller[130653]: 2026-01-23T10:30:22Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:9c:7e 10.100.0.11
Jan 23 05:30:22 np0005593233 nova_compute[222017]: 2026-01-23 10:30:22.576 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:22.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e333 e333: 3 total, 3 up, 3 in
Jan 23 05:30:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:24.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:24 np0005593233 nova_compute[222017]: 2026-01-23 10:30:24.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:24.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:24 np0005593233 nova_compute[222017]: 2026-01-23 10:30:24.847 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:26.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e334 e334: 3 total, 3 up, 3 in
Jan 23 05:30:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:26.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e335 e335: 3 total, 3 up, 3 in
Jan 23 05:30:27 np0005593233 nova_compute[222017]: 2026-01-23 10:30:27.578 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:27 np0005593233 nova_compute[222017]: 2026-01-23 10:30:27.959 222021 DEBUG nova.network.neutron [req-2e4759ad-ae8d-481b-8adf-3fc6bd64189c req-19366662-edbe-4277-ab78-138db1f99881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updated VIF entry in instance network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:30:27 np0005593233 nova_compute[222017]: 2026-01-23 10:30:27.960 222021 DEBUG nova.network.neutron [req-2e4759ad-ae8d-481b-8adf-3fc6bd64189c req-19366662-edbe-4277-ab78-138db1f99881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updating instance_info_cache with network_info: [{"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:30:27 np0005593233 nova_compute[222017]: 2026-01-23 10:30:27.994 222021 DEBUG oslo_concurrency.lockutils [req-2e4759ad-ae8d-481b-8adf-3fc6bd64189c req-19366662-edbe-4277-ab78-138db1f99881 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:30:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:28.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:28.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:29 np0005593233 nova_compute[222017]: 2026-01-23 10:30:29.022 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:29.025 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:30:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:29.027 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:30:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:29 np0005593233 nova_compute[222017]: 2026-01-23 10:30:29.850 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:30.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:30.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:32.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:32 np0005593233 nova_compute[222017]: 2026-01-23 10:30:32.579 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:32.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e336 e336: 3 total, 3 up, 3 in
Jan 23 05:30:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:34.029 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:30:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e336 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:34.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e337 e337: 3 total, 3 up, 3 in
Jan 23 05:30:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:30:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:34.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:30:34 np0005593233 nova_compute[222017]: 2026-01-23 10:30:34.853 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:35 np0005593233 podman[288879]: 2026-01-23 10:30:35.115429952 +0000 UTC m=+0.121983097 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:30:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e338 e338: 3 total, 3 up, 3 in
Jan 23 05:30:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:36.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:36.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:37 np0005593233 nova_compute[222017]: 2026-01-23 10:30:37.581 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:38.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:38.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:39 np0005593233 nova_compute[222017]: 2026-01-23 10:30:39.856 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:40.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:40.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:42.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:42 np0005593233 nova_compute[222017]: 2026-01-23 10:30:42.583 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:42.693 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:30:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:42.693 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:30:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:30:42.694 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:30:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:42.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:30:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:44.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:30:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 e339: 3 total, 3 up, 3 in
Jan 23 05:30:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:44.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:44 np0005593233 nova_compute[222017]: 2026-01-23 10:30:44.859 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:30:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:46.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:30:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:46.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:47 np0005593233 nova_compute[222017]: 2026-01-23 10:30:47.585 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:48.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:48.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:49 np0005593233 nova_compute[222017]: 2026-01-23 10:30:49.861 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:50.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:50.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:30:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:52.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:30:52 np0005593233 nova_compute[222017]: 2026-01-23 10:30:52.588 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:52.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:53 np0005593233 podman[288908]: 2026-01-23 10:30:53.082996388 +0000 UTC m=+0.084246655 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:30:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:30:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:54.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:30:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:54.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:54 np0005593233 nova_compute[222017]: 2026-01-23 10:30:54.864 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:56.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:56.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.417 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.418 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.419 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.419 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.420 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.590 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:30:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2896242075' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.887 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.981 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000ae as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:30:57 np0005593233 nova_compute[222017]: 2026-01-23 10:30:57.981 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000ae as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.163 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.164 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4148MB free_disk=20.80990982055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.164 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.164 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.247 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance a9112892-c55a-46f8-a5f2-6df7fac1510a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.247 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.247 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:30:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:58.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.342 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:30:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:30:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:30:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:58.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.861 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.871 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.935 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.976 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:30:58 np0005593233 nova_compute[222017]: 2026-01-23 10:30:58.977 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.812s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:30:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:59 np0005593233 nova_compute[222017]: 2026-01-23 10:30:59.866 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:59 np0005593233 nova_compute[222017]: 2026-01-23 10:30:59.976 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:31:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:31:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:00.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:31:00 np0005593233 nova_compute[222017]: 2026-01-23 10:31:00.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:31:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:31:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:00.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:31:01 np0005593233 nova_compute[222017]: 2026-01-23 10:31:01.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:31:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:02.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:02 np0005593233 nova_compute[222017]: 2026-01-23 10:31:02.591 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:31:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:02.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:31:03 np0005593233 nova_compute[222017]: 2026-01-23 10:31:03.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:31:03 np0005593233 nova_compute[222017]: 2026-01-23 10:31:03.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:31:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:04.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:04.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:04 np0005593233 nova_compute[222017]: 2026-01-23 10:31:04.870 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:06 np0005593233 podman[288975]: 2026-01-23 10:31:06.166425148 +0000 UTC m=+0.140065221 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 05:31:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:06.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:06.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:07 np0005593233 nova_compute[222017]: 2026-01-23 10:31:07.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:07 np0005593233 nova_compute[222017]: 2026-01-23 10:31:07.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:31:07 np0005593233 nova_compute[222017]: 2026-01-23 10:31:07.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:31:07 np0005593233 nova_compute[222017]: 2026-01-23 10:31:07.594 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:08.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:08.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:09 np0005593233 nova_compute[222017]: 2026-01-23 10:31:09.878 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:09 np0005593233 nova_compute[222017]: 2026-01-23 10:31:09.935 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:31:09 np0005593233 nova_compute[222017]: 2026-01-23 10:31:09.936 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:31:09 np0005593233 nova_compute[222017]: 2026-01-23 10:31:09.936 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:31:09 np0005593233 nova_compute[222017]: 2026-01-23 10:31:09.937 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a9112892-c55a-46f8-a5f2-6df7fac1510a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:31:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:31:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:10.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:31:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:31:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:10.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:31:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:12.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:12 np0005593233 nova_compute[222017]: 2026-01-23 10:31:12.709 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:12.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:14.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:14.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:14 np0005593233 nova_compute[222017]: 2026-01-23 10:31:14.880 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:15 np0005593233 nova_compute[222017]: 2026-01-23 10:31:15.895 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:15 np0005593233 nova_compute[222017]: 2026-01-23 10:31:15.896 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:15 np0005593233 nova_compute[222017]: 2026-01-23 10:31:15.935 222021 DEBUG nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:31:16 np0005593233 nova_compute[222017]: 2026-01-23 10:31:16.071 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:16 np0005593233 nova_compute[222017]: 2026-01-23 10:31:16.073 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:16 np0005593233 nova_compute[222017]: 2026-01-23 10:31:16.089 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:31:16 np0005593233 nova_compute[222017]: 2026-01-23 10:31:16.089 222021 INFO nova.compute.claims [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:31:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:16.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:31:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:16.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:31:16 np0005593233 nova_compute[222017]: 2026-01-23 10:31:16.922 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:17 np0005593233 nova_compute[222017]: 2026-01-23 10:31:17.047 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updating instance_info_cache with network_info: [{"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:17 np0005593233 nova_compute[222017]: 2026-01-23 10:31:17.074 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:17 np0005593233 nova_compute[222017]: 2026-01-23 10:31:17.075 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:31:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:31:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3934087115' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:17 np0005593233 nova_compute[222017]: 2026-01-23 10:31:17.414 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:17 np0005593233 nova_compute[222017]: 2026-01-23 10:31:17.422 222021 DEBUG nova.compute.provider_tree [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:31:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:31:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:31:17 np0005593233 nova_compute[222017]: 2026-01-23 10:31:17.715 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:31:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.8 total, 600.0 interval#012Cumulative writes: 52K writes, 203K keys, 52K commit groups, 1.0 writes per commit group, ingest: 0.19 GB, 0.04 MB/s#012Cumulative WAL: 52K writes, 19K syncs, 2.68 writes per sync, written: 0.19 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5903 writes, 21K keys, 5903 commit groups, 1.0 writes per commit group, ingest: 23.10 MB, 0.04 MB/s#012Interval WAL: 5903 writes, 2382 syncs, 2.48 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:31:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:31:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:18.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:31:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:18.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:19 np0005593233 nova_compute[222017]: 2026-01-23 10:31:19.489 222021 DEBUG nova.scheduler.client.report [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:31:19 np0005593233 nova_compute[222017]: 2026-01-23 10:31:19.531 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:19 np0005593233 nova_compute[222017]: 2026-01-23 10:31:19.533 222021 DEBUG nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:31:19 np0005593233 nova_compute[222017]: 2026-01-23 10:31:19.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.060 222021 DEBUG nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.061 222021 DEBUG nova.network.neutron [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.069 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.109 222021 INFO nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.152 222021 DEBUG nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:31:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:31:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:20.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.403 222021 DEBUG nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.405 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.405 222021 INFO nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Creating image(s)#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.443 222021 DEBUG nova.storage.rbd_utils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.479 222021 DEBUG nova.storage.rbd_utils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.517 222021 DEBUG nova.storage.rbd_utils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.522 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.576 222021 DEBUG nova.policy [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1629a4b14764dddaabcadd16f3e1c1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.582 222021 INFO nova.compute.manager [None req-55e725eb-5ecf-42d5-a943-cb12579d031a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Get console output#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.591 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.623 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.624 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.625 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.625 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.655 222021 DEBUG nova.storage.rbd_utils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:31:20 np0005593233 nova_compute[222017]: 2026-01-23 10:31:20.660 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:31:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:20.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:21 np0005593233 nova_compute[222017]: 2026-01-23 10:31:21.051 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.390s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:31:21 np0005593233 nova_compute[222017]: 2026-01-23 10:31:21.137 222021 DEBUG nova.storage.rbd_utils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] resizing rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 05:31:21 np0005593233 nova_compute[222017]: 2026-01-23 10:31:21.266 222021 DEBUG nova.objects.instance [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'migration_context' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:31:21 np0005593233 nova_compute[222017]: 2026-01-23 10:31:21.352 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 05:31:21 np0005593233 nova_compute[222017]: 2026-01-23 10:31:21.353 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Ensure instance console log exists: /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 05:31:21 np0005593233 nova_compute[222017]: 2026-01-23 10:31:21.354 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:31:21 np0005593233 nova_compute[222017]: 2026-01-23 10:31:21.355 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:31:21 np0005593233 nova_compute[222017]: 2026-01-23 10:31:21.355 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:31:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:22.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:22 np0005593233 nova_compute[222017]: 2026-01-23 10:31:22.720 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:31:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:22.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #151. Immutable memtables: 0.
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.564321) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 95] Flushing memtable with next log file: 151
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283564375, "job": 95, "event": "flush_started", "num_memtables": 1, "num_entries": 1350, "num_deletes": 260, "total_data_size": 2824898, "memory_usage": 2877736, "flush_reason": "Manual Compaction"}
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 95] Level-0 flush table #152: started
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283584655, "cf_name": "default", "job": 95, "event": "table_file_creation", "file_number": 152, "file_size": 1842297, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 72872, "largest_seqno": 74217, "table_properties": {"data_size": 1836455, "index_size": 3108, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13115, "raw_average_key_size": 20, "raw_value_size": 1824385, "raw_average_value_size": 2789, "num_data_blocks": 136, "num_entries": 654, "num_filter_entries": 654, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164189, "oldest_key_time": 1769164189, "file_creation_time": 1769164283, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 152, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 95] Flush lasted 20363 microseconds, and 6259 cpu microseconds.
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.584686) [db/flush_job.cc:967] [default] [JOB 95] Level-0 flush table #152: 1842297 bytes OK
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.584705) [db/memtable_list.cc:519] [default] Level-0 commit table #152 started
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.586598) [db/memtable_list.cc:722] [default] Level-0 commit table #152: memtable #1 done
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.586615) EVENT_LOG_v1 {"time_micros": 1769164283586609, "job": 95, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.586635) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 95] Try to delete WAL files size 2818332, prev total WAL file size 2820258, number of live WAL files 2.
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000148.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.591336) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032373632' seq:72057594037927935, type:22 .. '6C6F676D0033303136' seq:0, type:0; will stop at (end)
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 96] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 95 Base level 0, inputs: [152(1799KB)], [150(11MB)]
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283591375, "job": 96, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [152], "files_L6": [150], "score": -1, "input_data_size": 13903292, "oldest_snapshot_seqno": -1}
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 96] Generated table #153: 9376 keys, 13758959 bytes, temperature: kUnknown
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283742156, "cf_name": "default", "job": 96, "event": "table_file_creation", "file_number": 153, "file_size": 13758959, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13696790, "index_size": 37608, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23493, "raw_key_size": 247092, "raw_average_key_size": 26, "raw_value_size": 13530821, "raw_average_value_size": 1443, "num_data_blocks": 1446, "num_entries": 9376, "num_filter_entries": 9376, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769164283, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 153, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.743035) [db/compaction/compaction_job.cc:1663] [default] [JOB 96] Compacted 1@0 + 1@6 files to L6 => 13758959 bytes
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.744581) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 92.1 rd, 91.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 11.5 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(15.0) write-amplify(7.5) OK, records in: 9914, records dropped: 538 output_compression: NoCompression
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.744615) EVENT_LOG_v1 {"time_micros": 1769164283744600, "job": 96, "event": "compaction_finished", "compaction_time_micros": 150891, "compaction_time_cpu_micros": 34437, "output_level": 6, "num_output_files": 1, "total_output_size": 13758959, "num_input_records": 9914, "num_output_records": 9376, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000152.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283745905, "job": 96, "event": "table_file_deletion", "file_number": 152}
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000150.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283751627, "job": 96, "event": "table_file_deletion", "file_number": 150}
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.591209) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.751730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.751737) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.751740) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.751743) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:31:23.751745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593233 podman[289463]: 2026-01-23 10:31:23.988644284 +0000 UTC m=+0.086272812 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:31:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:24.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:24 np0005593233 nova_compute[222017]: 2026-01-23 10:31:24.451 222021 DEBUG nova.compute.manager [req-e5d0a2a5-b132-4d3a-8033-eba1da01206f req-dfc9fa6e-cc3c-4779-8311-026d22bbadfe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:31:24 np0005593233 nova_compute[222017]: 2026-01-23 10:31:24.452 222021 DEBUG nova.compute.manager [req-e5d0a2a5-b132-4d3a-8033-eba1da01206f req-dfc9fa6e-cc3c-4779-8311-026d22bbadfe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing instance network info cache due to event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:31:24 np0005593233 nova_compute[222017]: 2026-01-23 10:31:24.453 222021 DEBUG oslo_concurrency.lockutils [req-e5d0a2a5-b132-4d3a-8033-eba1da01206f req-dfc9fa6e-cc3c-4779-8311-026d22bbadfe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:31:24 np0005593233 nova_compute[222017]: 2026-01-23 10:31:24.453 222021 DEBUG oslo_concurrency.lockutils [req-e5d0a2a5-b132-4d3a-8033-eba1da01206f req-dfc9fa6e-cc3c-4779-8311-026d22bbadfe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:31:24 np0005593233 nova_compute[222017]: 2026-01-23 10:31:24.453 222021 DEBUG nova.network.neutron [req-e5d0a2a5-b132-4d3a-8033-eba1da01206f req-dfc9fa6e-cc3c-4779-8311-026d22bbadfe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:31:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:24 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:24.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:24 np0005593233 nova_compute[222017]: 2026-01-23 10:31:24.872 222021 INFO nova.compute.manager [None req-70574fa8-cc71-4fe3-b1bd-ff77fef50fca 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Get console output
Jan 23 05:31:24 np0005593233 nova_compute[222017]: 2026-01-23 10:31:24.880 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 05:31:24 np0005593233 nova_compute[222017]: 2026-01-23 10:31:24.886 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:25 np0005593233 nova_compute[222017]: 2026-01-23 10:31:25.176 222021 DEBUG nova.network.neutron [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Successfully created port: 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:31:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:31:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:26.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:31:26 np0005593233 nova_compute[222017]: 2026-01-23 10:31:26.730 222021 DEBUG nova.compute.manager [req-3688af07-b39b-4a9e-bbc2-1c6f4bba9333 req-4e1f7440-a490-4473-a982-e0bb0bd6002f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-vif-unplugged-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:31:26 np0005593233 nova_compute[222017]: 2026-01-23 10:31:26.732 222021 DEBUG oslo_concurrency.lockutils [req-3688af07-b39b-4a9e-bbc2-1c6f4bba9333 req-4e1f7440-a490-4473-a982-e0bb0bd6002f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:31:26 np0005593233 nova_compute[222017]: 2026-01-23 10:31:26.733 222021 DEBUG oslo_concurrency.lockutils [req-3688af07-b39b-4a9e-bbc2-1c6f4bba9333 req-4e1f7440-a490-4473-a982-e0bb0bd6002f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:31:26 np0005593233 nova_compute[222017]: 2026-01-23 10:31:26.733 222021 DEBUG oslo_concurrency.lockutils [req-3688af07-b39b-4a9e-bbc2-1c6f4bba9333 req-4e1f7440-a490-4473-a982-e0bb0bd6002f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:31:26 np0005593233 nova_compute[222017]: 2026-01-23 10:31:26.734 222021 DEBUG nova.compute.manager [req-3688af07-b39b-4a9e-bbc2-1c6f4bba9333 req-4e1f7440-a490-4473-a982-e0bb0bd6002f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] No waiting events found dispatching network-vif-unplugged-55e3e503-4e7f-4527-b7da-0242067d96b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:31:26 np0005593233 nova_compute[222017]: 2026-01-23 10:31:26.734 222021 WARNING nova.compute.manager [req-3688af07-b39b-4a9e-bbc2-1c6f4bba9333 req-4e1f7440-a490-4473-a982-e0bb0bd6002f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received unexpected event network-vif-unplugged-55e3e503-4e7f-4527-b7da-0242067d96b3 for instance with vm_state active and task_state None.
Jan 23 05:31:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:31:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:26.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:31:27 np0005593233 nova_compute[222017]: 2026-01-23 10:31:27.725 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:28.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:28.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:29 np0005593233 nova_compute[222017]: 2026-01-23 10:31:29.890 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:31:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:30.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:31:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:30.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:32.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:32 np0005593233 nova_compute[222017]: 2026-01-23 10:31:32.730 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:32.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:33 np0005593233 nova_compute[222017]: 2026-01-23 10:31:33.284 222021 DEBUG nova.compute.manager [req-467bd386-979d-46c0-a40e-456437ddff37 req-eb5ffff0-e36d-468b-a1a8-c85b386bfe3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:31:33 np0005593233 nova_compute[222017]: 2026-01-23 10:31:33.285 222021 DEBUG oslo_concurrency.lockutils [req-467bd386-979d-46c0-a40e-456437ddff37 req-eb5ffff0-e36d-468b-a1a8-c85b386bfe3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:31:33 np0005593233 nova_compute[222017]: 2026-01-23 10:31:33.285 222021 DEBUG oslo_concurrency.lockutils [req-467bd386-979d-46c0-a40e-456437ddff37 req-eb5ffff0-e36d-468b-a1a8-c85b386bfe3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:31:33 np0005593233 nova_compute[222017]: 2026-01-23 10:31:33.285 222021 DEBUG oslo_concurrency.lockutils [req-467bd386-979d-46c0-a40e-456437ddff37 req-eb5ffff0-e36d-468b-a1a8-c85b386bfe3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:31:33 np0005593233 nova_compute[222017]: 2026-01-23 10:31:33.285 222021 DEBUG nova.compute.manager [req-467bd386-979d-46c0-a40e-456437ddff37 req-eb5ffff0-e36d-468b-a1a8-c85b386bfe3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] No waiting events found dispatching network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:31:33 np0005593233 nova_compute[222017]: 2026-01-23 10:31:33.286 222021 WARNING nova.compute.manager [req-467bd386-979d-46c0-a40e-456437ddff37 req-eb5ffff0-e36d-468b-a1a8-c85b386bfe3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received unexpected event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 for instance with vm_state active and task_state None.
Jan 23 05:31:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:33.350 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:31:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:33.351 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:31:33 np0005593233 nova_compute[222017]: 2026-01-23 10:31:33.352 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:31:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:34.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:31:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:31:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:34.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:31:34 np0005593233 nova_compute[222017]: 2026-01-23 10:31:34.894 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:35 np0005593233 nova_compute[222017]: 2026-01-23 10:31:35.556 222021 INFO nova.compute.manager [None req-0fe29ef6-151c-43f2-a8c7-ccfb6e4d60cb 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Get console output#033[00m
Jan 23 05:31:35 np0005593233 nova_compute[222017]: 2026-01-23 10:31:35.561 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:31:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:36.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:36.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:37 np0005593233 podman[289512]: 2026-01-23 10:31:37.170869831 +0000 UTC m=+0.168249662 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true)
Jan 23 05:31:37 np0005593233 nova_compute[222017]: 2026-01-23 10:31:37.703 222021 DEBUG nova.network.neutron [req-e5d0a2a5-b132-4d3a-8033-eba1da01206f req-dfc9fa6e-cc3c-4779-8311-026d22bbadfe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updated VIF entry in instance network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:31:37 np0005593233 nova_compute[222017]: 2026-01-23 10:31:37.703 222021 DEBUG nova.network.neutron [req-e5d0a2a5-b132-4d3a-8033-eba1da01206f req-dfc9fa6e-cc3c-4779-8311-026d22bbadfe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updating instance_info_cache with network_info: [{"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:37 np0005593233 nova_compute[222017]: 2026-01-23 10:31:37.727 222021 DEBUG oslo_concurrency.lockutils [req-e5d0a2a5-b132-4d3a-8033-eba1da01206f req-dfc9fa6e-cc3c-4779-8311-026d22bbadfe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:37 np0005593233 nova_compute[222017]: 2026-01-23 10:31:37.733 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.334 222021 DEBUG nova.compute.manager [req-3689c8d0-8c30-43a4-b5ea-1ad9edcf17ab req-0968bcbe-9d0c-409d-8ca5-b563f51db51a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.334 222021 DEBUG nova.compute.manager [req-3689c8d0-8c30-43a4-b5ea-1ad9edcf17ab req-0968bcbe-9d0c-409d-8ca5-b563f51db51a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing instance network info cache due to event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.335 222021 DEBUG oslo_concurrency.lockutils [req-3689c8d0-8c30-43a4-b5ea-1ad9edcf17ab req-0968bcbe-9d0c-409d-8ca5-b563f51db51a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.335 222021 DEBUG oslo_concurrency.lockutils [req-3689c8d0-8c30-43a4-b5ea-1ad9edcf17ab req-0968bcbe-9d0c-409d-8ca5-b563f51db51a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.335 222021 DEBUG nova.network.neutron [req-3689c8d0-8c30-43a4-b5ea-1ad9edcf17ab req-0968bcbe-9d0c-409d-8ca5-b563f51db51a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:31:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:31:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:38.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:31:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e340 e340: 3 total, 3 up, 3 in
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.500 222021 DEBUG nova.compute.manager [req-0df2b169-8d99-490e-9bd4-3b7ee73f3fae req-bd22de58-128f-4781-a207-9b3cc190c6ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.500 222021 DEBUG oslo_concurrency.lockutils [req-0df2b169-8d99-490e-9bd4-3b7ee73f3fae req-bd22de58-128f-4781-a207-9b3cc190c6ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.501 222021 DEBUG oslo_concurrency.lockutils [req-0df2b169-8d99-490e-9bd4-3b7ee73f3fae req-bd22de58-128f-4781-a207-9b3cc190c6ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.501 222021 DEBUG oslo_concurrency.lockutils [req-0df2b169-8d99-490e-9bd4-3b7ee73f3fae req-bd22de58-128f-4781-a207-9b3cc190c6ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.502 222021 DEBUG nova.compute.manager [req-0df2b169-8d99-490e-9bd4-3b7ee73f3fae req-bd22de58-128f-4781-a207-9b3cc190c6ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] No waiting events found dispatching network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.502 222021 WARNING nova.compute.manager [req-0df2b169-8d99-490e-9bd4-3b7ee73f3fae req-bd22de58-128f-4781-a207-9b3cc190c6ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received unexpected event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:31:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:31:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:38.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:31:38 np0005593233 nova_compute[222017]: 2026-01-23 10:31:38.992 222021 DEBUG nova.network.neutron [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Successfully updated port: 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:31:39 np0005593233 nova_compute[222017]: 2026-01-23 10:31:39.046 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:31:39 np0005593233 nova_compute[222017]: 2026-01-23 10:31:39.047 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquired lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:31:39 np0005593233 nova_compute[222017]: 2026-01-23 10:31:39.047 222021 DEBUG nova.network.neutron [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:31:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e340 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:39 np0005593233 nova_compute[222017]: 2026-01-23 10:31:39.482 222021 DEBUG nova.network.neutron [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:31:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e341 e341: 3 total, 3 up, 3 in
Jan 23 05:31:39 np0005593233 nova_compute[222017]: 2026-01-23 10:31:39.898 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:40.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e342 e342: 3 total, 3 up, 3 in
Jan 23 05:31:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:40.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:40 np0005593233 nova_compute[222017]: 2026-01-23 10:31:40.901 222021 DEBUG nova.compute.manager [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:40 np0005593233 nova_compute[222017]: 2026-01-23 10:31:40.902 222021 DEBUG oslo_concurrency.lockutils [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:40 np0005593233 nova_compute[222017]: 2026-01-23 10:31:40.903 222021 DEBUG oslo_concurrency.lockutils [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:40 np0005593233 nova_compute[222017]: 2026-01-23 10:31:40.903 222021 DEBUG oslo_concurrency.lockutils [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:40 np0005593233 nova_compute[222017]: 2026-01-23 10:31:40.904 222021 DEBUG nova.compute.manager [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] No waiting events found dispatching network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:40 np0005593233 nova_compute[222017]: 2026-01-23 10:31:40.904 222021 WARNING nova.compute.manager [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received unexpected event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:31:40 np0005593233 nova_compute[222017]: 2026-01-23 10:31:40.904 222021 DEBUG nova.compute.manager [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-changed-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:40 np0005593233 nova_compute[222017]: 2026-01-23 10:31:40.905 222021 DEBUG nova.compute.manager [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Refreshing instance network info cache due to event network-changed-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:31:40 np0005593233 nova_compute[222017]: 2026-01-23 10:31:40.905 222021 DEBUG oslo_concurrency.lockutils [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:31:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:42.354 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:42.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:42.693 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:42.694 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:42.694 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:42 np0005593233 nova_compute[222017]: 2026-01-23 10:31:42.735 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:42.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.725 222021 DEBUG nova.network.neutron [req-3689c8d0-8c30-43a4-b5ea-1ad9edcf17ab req-0968bcbe-9d0c-409d-8ca5-b563f51db51a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updated VIF entry in instance network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.725 222021 DEBUG nova.network.neutron [req-3689c8d0-8c30-43a4-b5ea-1ad9edcf17ab req-0968bcbe-9d0c-409d-8ca5-b563f51db51a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updating instance_info_cache with network_info: [{"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.732 222021 DEBUG nova.network.neutron [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updating instance_info_cache with network_info: [{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.743 222021 DEBUG oslo_concurrency.lockutils [req-3689c8d0-8c30-43a4-b5ea-1ad9edcf17ab req-0968bcbe-9d0c-409d-8ca5-b563f51db51a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.775 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Releasing lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.776 222021 DEBUG nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance network_info: |[{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.776 222021 DEBUG oslo_concurrency.lockutils [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.777 222021 DEBUG nova.network.neutron [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Refreshing network info cache for port 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.781 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Start _get_guest_xml network_info=[{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.790 222021 WARNING nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.798 222021 DEBUG nova.virt.libvirt.host [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.799 222021 DEBUG nova.virt.libvirt.host [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.802 222021 DEBUG nova.virt.libvirt.host [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.803 222021 DEBUG nova.virt.libvirt.host [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.804 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.805 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.805 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.806 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.806 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.806 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.807 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.807 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.807 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.808 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.808 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.808 222021 DEBUG nova.virt.hardware [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:31:43 np0005593233 nova_compute[222017]: 2026-01-23 10:31:43.812 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:31:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2993202824' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.266 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.312 222021 DEBUG nova.storage.rbd_utils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.317 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:44.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:31:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3332271874' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.781 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.783 222021 DEBUG nova.virt.libvirt.vif [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:31:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-833764657',display_name='tempest-ServerStableDeviceRescueTest-server-833764657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-833764657',id=179,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-l34bmfpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_user_name='t
empest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:31:20Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=23f7c54d-ed5d-404f-8517-b5cd21d0c282,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.783 222021 DEBUG nova.network.os_vif_util [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.784 222021 DEBUG nova.network.os_vif_util [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:42:72,bridge_name='br-int',has_traffic_filtering=True,id=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66bbd2d4-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.785 222021 DEBUG nova.objects.instance [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:31:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:44.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.824 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <uuid>23f7c54d-ed5d-404f-8517-b5cd21d0c282</uuid>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <name>instance-000000b3</name>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-833764657</nova:name>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:31:43</nova:creationTime>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <nova:user uuid="e1629a4b14764dddaabcadd16f3e1c1c">tempest-ServerStableDeviceRescueTest-1802220041-project-member</nova:user>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <nova:project uuid="815b71acf60d4ed8933ebd05228fa0c0">tempest-ServerStableDeviceRescueTest-1802220041</nova:project>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <nova:port uuid="66bbd2d4-1733-4a5d-a84b-8d41c36dd82d">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <entry name="serial">23f7c54d-ed5d-404f-8517-b5cd21d0c282</entry>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <entry name="uuid">23f7c54d-ed5d-404f-8517-b5cd21d0c282</entry>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:e4:42:72"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <target dev="tap66bbd2d4-17"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/console.log" append="off"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:31:44 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:31:44 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:31:44 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:31:44 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.824 222021 DEBUG nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Preparing to wait for external event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.825 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.825 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.825 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.826 222021 DEBUG nova.virt.libvirt.vif [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:31:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-833764657',display_name='tempest-ServerStableDeviceRescueTest-server-833764657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-833764657',id=179,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-l34bmfpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_us
er_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:31:20Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=23f7c54d-ed5d-404f-8517-b5cd21d0c282,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.826 222021 DEBUG nova.network.os_vif_util [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.827 222021 DEBUG nova.network.os_vif_util [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:42:72,bridge_name='br-int',has_traffic_filtering=True,id=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66bbd2d4-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.827 222021 DEBUG os_vif [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:42:72,bridge_name='br-int',has_traffic_filtering=True,id=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66bbd2d4-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.828 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.828 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.828 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.832 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.832 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66bbd2d4-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.832 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66bbd2d4-17, col_values=(('external_ids', {'iface-id': '66bbd2d4-1733-4a5d-a84b-8d41c36dd82d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:42:72', 'vm-uuid': '23f7c54d-ed5d-404f-8517-b5cd21d0c282'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.834 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:44 np0005593233 NetworkManager[48871]: <info>  [1769164304.8360] manager: (tap66bbd2d4-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/337)
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.837 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.846 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.847 222021 INFO os_vif [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:42:72,bridge_name='br-int',has_traffic_filtering=True,id=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66bbd2d4-17')#033[00m
Jan 23 05:31:44 np0005593233 nova_compute[222017]: 2026-01-23 10:31:44.900 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:45 np0005593233 nova_compute[222017]: 2026-01-23 10:31:45.119 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:31:45 np0005593233 nova_compute[222017]: 2026-01-23 10:31:45.120 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:31:45 np0005593233 nova_compute[222017]: 2026-01-23 10:31:45.120 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No VIF found with MAC fa:16:3e:e4:42:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:31:45 np0005593233 nova_compute[222017]: 2026-01-23 10:31:45.121 222021 INFO nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Using config drive#033[00m
Jan 23 05:31:45 np0005593233 nova_compute[222017]: 2026-01-23 10:31:45.156 222021 DEBUG nova.storage.rbd_utils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:31:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:46.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e343 e343: 3 total, 3 up, 3 in
Jan 23 05:31:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:46.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:46 np0005593233 nova_compute[222017]: 2026-01-23 10:31:46.804 222021 INFO nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Creating config drive at /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config#033[00m
Jan 23 05:31:46 np0005593233 nova_compute[222017]: 2026-01-23 10:31:46.813 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzrjlvxgi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:46 np0005593233 nova_compute[222017]: 2026-01-23 10:31:46.964 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzrjlvxgi" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.004 222021 DEBUG nova.storage.rbd_utils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.010 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.178 222021 DEBUG oslo_concurrency.processutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.179 222021 INFO nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Deleting local config drive /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config because it was imported into RBD.#033[00m
Jan 23 05:31:47 np0005593233 kernel: tap66bbd2d4-17: entered promiscuous mode
Jan 23 05:31:47 np0005593233 NetworkManager[48871]: <info>  [1769164307.2344] manager: (tap66bbd2d4-17): new Tun device (/org/freedesktop/NetworkManager/Devices/338)
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.235 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:31:47Z|00739|binding|INFO|Claiming lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for this chassis.
Jan 23 05:31:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:31:47Z|00740|binding|INFO|66bbd2d4-1733-4a5d-a84b-8d41c36dd82d: Claiming fa:16:3e:e4:42:72 10.100.0.13
Jan 23 05:31:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:31:47Z|00741|binding|INFO|Setting lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d ovn-installed in OVS
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.255 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 systemd-udevd[289672]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:31:47 np0005593233 NetworkManager[48871]: <info>  [1769164307.3002] device (tap66bbd2d4-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:31:47 np0005593233 NetworkManager[48871]: <info>  [1769164307.3011] device (tap66bbd2d4-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.379 222021 DEBUG nova.network.neutron [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updated VIF entry in instance network info cache for port 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.380 222021 DEBUG nova.network.neutron [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updating instance_info_cache with network_info: [{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:31:47Z|00742|binding|INFO|Setting lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d up in Southbound
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.405 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:42:72 10.100.0.13'], port_security=['fa:16:3e:e4:42:72 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '23f7c54d-ed5d-404f-8517-b5cd21d0c282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.407 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 bound to our chassis#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.410 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:31:47 np0005593233 systemd-machined[190954]: New machine qemu-81-instance-000000b3.
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.427 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ac21b61f-95d9-475b-8fbb-ad2cef796eac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.429 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7d5530f-51 in ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.432 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7d5530f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.432 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1e8825-000b-4dc9-aa25-6f3a9749c9c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.433 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a3d0ca-bc9d-407c-942b-2834f6ecbb1b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.446 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[69d6df89-7a2f-47e8-8ecf-5b76f5688d0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 systemd[1]: Started Virtual Machine qemu-81-instance-000000b3.
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.462 222021 DEBUG oslo_concurrency.lockutils [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.462 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e177e0-c948-468d-978b-365efb463c01]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.516 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[91a66c85-9b7b-4975-af4b-c229281e61dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 NetworkManager[48871]: <info>  [1769164307.5255] manager: (tapd7d5530f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/339)
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.524 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1efe68ee-a33d-4e3f-bede-3f24a9d8171e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.574 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7341d8d1-f0f1-4456-b56d-9a76a1868630]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.579 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[151f8e83-e89e-4958-9890-765fdb01640d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.595 222021 DEBUG nova.compute.manager [req-f4e024fc-302f-448f-8752-ea8e3eda8675 req-8ac5aeb2-d255-41d7-a8ea-d9b81bb24b6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.596 222021 DEBUG nova.compute.manager [req-f4e024fc-302f-448f-8752-ea8e3eda8675 req-8ac5aeb2-d255-41d7-a8ea-d9b81bb24b6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing instance network info cache due to event network-changed-55e3e503-4e7f-4527-b7da-0242067d96b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.596 222021 DEBUG oslo_concurrency.lockutils [req-f4e024fc-302f-448f-8752-ea8e3eda8675 req-8ac5aeb2-d255-41d7-a8ea-d9b81bb24b6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.597 222021 DEBUG oslo_concurrency.lockutils [req-f4e024fc-302f-448f-8752-ea8e3eda8675 req-8ac5aeb2-d255-41d7-a8ea-d9b81bb24b6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.597 222021 DEBUG nova.network.neutron [req-f4e024fc-302f-448f-8752-ea8e3eda8675 req-8ac5aeb2-d255-41d7-a8ea-d9b81bb24b6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Refreshing network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:31:47 np0005593233 NetworkManager[48871]: <info>  [1769164307.6175] device (tapd7d5530f-50): carrier: link connected
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.625 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e61289d8-4ee8-43d9-9ed4-a9df45eccd01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.657 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4419b8-2dfa-469f-a739-5248c70b99da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 822024, 'reachable_time': 18042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289708, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.685 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f8df2304-14e1-44a2-827c-0b62c668ee79]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:67cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 822024, 'tstamp': 822024}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289709, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.774 222021 DEBUG oslo_concurrency.lockutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.774 222021 DEBUG oslo_concurrency.lockutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.774 222021 DEBUG oslo_concurrency.lockutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.774 222021 DEBUG oslo_concurrency.lockutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.775 222021 DEBUG oslo_concurrency.lockutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.776 222021 INFO nova.compute.manager [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Terminating instance#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.777 222021 DEBUG nova.compute.manager [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.791 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[39cc927f-f6ee-4a9e-9044-58f4b750ad7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 822024, 'reachable_time': 18042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 289717, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.830 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fc94a45b-3e87-449a-a795-ce92dd8ee5d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 kernel: tap55e3e503-4e (unregistering): left promiscuous mode
Jan 23 05:31:47 np0005593233 NetworkManager[48871]: <info>  [1769164307.8372] device (tap55e3e503-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.851 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:31:47Z|00743|binding|INFO|Releasing lport 55e3e503-4e7f-4527-b7da-0242067d96b3 from this chassis (sb_readonly=0)
Jan 23 05:31:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:31:47Z|00744|binding|INFO|Setting lport 55e3e503-4e7f-4527-b7da-0242067d96b3 down in Southbound
Jan 23 05:31:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:31:47Z|00745|binding|INFO|Removing iface tap55e3e503-4e ovn-installed in OVS
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.854 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.862 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:9c:7e 10.100.0.11'], port_security=['fa:16:3e:2b:9c:7e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a9112892-c55a-46f8-a5f2-6df7fac1510a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3128fa93-5584-4fd7-b8b2-100d4babba87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '8', 'neutron:security_group_ids': '92b696b2-abea-4273-befd-9b9d9b6c5bb3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7d423e3-a129-4092-a097-e9db38a84e9f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=55e3e503-4e7f-4527-b7da-0242067d96b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.872 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Jan 23 05:31:47 np0005593233 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000ae.scope: Consumed 19.313s CPU time.
Jan 23 05:31:47 np0005593233 systemd-machined[190954]: Machine qemu-80-instance-000000ae terminated.
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.916 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e64ac1c4-9a16-45f3-b5c0-db62c0f4ff7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.917 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.918 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.918 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.920 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 NetworkManager[48871]: <info>  [1769164307.9214] manager: (tapd7d5530f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Jan 23 05:31:47 np0005593233 kernel: tapd7d5530f-50: entered promiscuous mode
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.925 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.926 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.928 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:31:47Z|00746|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.941 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 nova_compute[222017]: 2026-01-23 10:31:47.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.954 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.955 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[634b2a0a-5a58-4f80-8b39-a4d67d86ff0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.956 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:31:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:47.956 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'env', 'PROCESS_TAG=haproxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7d5530f-5227-4f75-bac0-2604bb3d68e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:31:47 np0005593233 NetworkManager[48871]: <info>  [1769164307.9986] manager: (tap55e3e503-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/341)
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.000 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.002 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164308.0021465, 23f7c54d-ed5d-404f-8517-b5cd21d0c282 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.002 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] VM Started (Lifecycle Event)#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.006 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.010 222021 INFO nova.virt.libvirt.driver [-] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Instance destroyed successfully.#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.011 222021 DEBUG nova.objects.instance [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'resources' on Instance uuid a9112892-c55a-46f8-a5f2-6df7fac1510a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.034 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.040 222021 DEBUG nova.virt.libvirt.vif [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:29:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1271596965',display_name='tempest-TestNetworkBasicOps-server-1271596965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1271596965',id=174,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO2LI0UABNlDU3HpBqZ4TFdn/tyl+k7EkIfoy8J4Qrg3uVnEo4BKCEou7n9DUxH0pF9daXCxvV4DOAO3a7+2coTfCOHx8K9T8rhOLIf1u1NApLhksMZyj9YN8VZ26cJVCQ==',key_name='tempest-TestNetworkBasicOps-362963621',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:30:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3chv5nxw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:30:08Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=a9112892-c55a-46f8-a5f2-6df7fac1510a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.041 222021 DEBUG nova.network.os_vif_util [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.041 222021 DEBUG nova.network.os_vif_util [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:9c:7e,bridge_name='br-int',has_traffic_filtering=True,id=55e3e503-4e7f-4527-b7da-0242067d96b3,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55e3e503-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.042 222021 DEBUG os_vif [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:9c:7e,bridge_name='br-int',has_traffic_filtering=True,id=55e3e503-4e7f-4527-b7da-0242067d96b3,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55e3e503-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.044 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.044 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55e3e503-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.046 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.048 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.050 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164308.002269, 23f7c54d-ed5d-404f-8517-b5cd21d0c282 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.050 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.051 222021 INFO os_vif [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:9c:7e,bridge_name='br-int',has_traffic_filtering=True,id=55e3e503-4e7f-4527-b7da-0242067d96b3,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55e3e503-4e')#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.096 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.099 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.127 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:31:48 np0005593233 podman[289821]: 2026-01-23 10:31:48.377743715 +0000 UTC m=+0.076503455 container create 27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:31:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:48.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:48 np0005593233 podman[289821]: 2026-01-23 10:31:48.342324719 +0000 UTC m=+0.041084549 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:31:48 np0005593233 systemd[1]: Started libpod-conmon-27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c.scope.
Jan 23 05:31:48 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:31:48 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2bb7ffa9402c02b1d39c11ceb691dacfa98e307ba2f07cd2f768bd665233cd1b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.475 222021 INFO nova.virt.libvirt.driver [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Deleting instance files /var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a_del#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.476 222021 INFO nova.virt.libvirt.driver [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Deletion of /var/lib/nova/instances/a9112892-c55a-46f8-a5f2-6df7fac1510a_del complete#033[00m
Jan 23 05:31:48 np0005593233 podman[289821]: 2026-01-23 10:31:48.483887461 +0000 UTC m=+0.182647221 container init 27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:31:48 np0005593233 podman[289821]: 2026-01-23 10:31:48.489633245 +0000 UTC m=+0.188392995 container start 27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:31:48 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[289838]: [NOTICE]   (289842) : New worker (289844) forked
Jan 23 05:31:48 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[289838]: [NOTICE]   (289842) : Loading success.
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.550 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 55e3e503-4e7f-4527-b7da-0242067d96b3 in datapath 3128fa93-5584-4fd7-b8b2-100d4babba87 unbound from our chassis#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.553 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3128fa93-5584-4fd7-b8b2-100d4babba87, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.555 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3c69a6d9-d587-41eb-adb7-73009a75d030]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.555 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 namespace which is not needed anymore#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.561 222021 INFO nova.compute.manager [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.561 222021 DEBUG oslo.service.loopingcall [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.562 222021 DEBUG nova.compute.manager [-] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.562 222021 DEBUG nova.network.neutron [-] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:31:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e344 e344: 3 total, 3 up, 3 in
Jan 23 05:31:48 np0005593233 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[288759]: [NOTICE]   (288781) : haproxy version is 2.8.14-c23fe91
Jan 23 05:31:48 np0005593233 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[288759]: [NOTICE]   (288781) : path to executable is /usr/sbin/haproxy
Jan 23 05:31:48 np0005593233 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[288759]: [WARNING]  (288781) : Exiting Master process...
Jan 23 05:31:48 np0005593233 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[288759]: [WARNING]  (288781) : Exiting Master process...
Jan 23 05:31:48 np0005593233 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[288759]: [ALERT]    (288781) : Current worker (288788) exited with code 143 (Terminated)
Jan 23 05:31:48 np0005593233 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[288759]: [WARNING]  (288781) : All workers exited. Exiting... (0)
Jan 23 05:31:48 np0005593233 systemd[1]: libpod-6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4.scope: Deactivated successfully.
Jan 23 05:31:48 np0005593233 podman[289870]: 2026-01-23 10:31:48.725187848 +0000 UTC m=+0.050654520 container died 6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:31:48 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4-userdata-shm.mount: Deactivated successfully.
Jan 23 05:31:48 np0005593233 systemd[1]: var-lib-containers-storage-overlay-1e697d54862fd9a31eefcf18d811b09dc1273fb2b5f7c675865980321fef4b90-merged.mount: Deactivated successfully.
Jan 23 05:31:48 np0005593233 podman[289870]: 2026-01-23 10:31:48.779227984 +0000 UTC m=+0.104694626 container cleanup 6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:31:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:48.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:48 np0005593233 systemd[1]: libpod-conmon-6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4.scope: Deactivated successfully.
Jan 23 05:31:48 np0005593233 podman[289898]: 2026-01-23 10:31:48.885984587 +0000 UTC m=+0.067121918 container remove 6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.892 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0a01b2-abce-4c76-883c-1b63317d4aa5]: (4, ('Fri Jan 23 10:31:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 (6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4)\n6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4\nFri Jan 23 10:31:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 (6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4)\n6d0a69876d596c28931af5ed50160b9fcf6fc3968cbd42638c48d9c42fc1fee4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.895 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ff7b1c-98bf-4745-869b-3946cbd6ec62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.897 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3128fa93-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.900 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:48 np0005593233 kernel: tap3128fa93-50: left promiscuous mode
Jan 23 05:31:48 np0005593233 nova_compute[222017]: 2026-01-23 10:31:48.928 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.933 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[acc171dd-cdd9-4b26-8c06-4183d045eb8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.951 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9503e80a-13e4-480e-a870-c4c8593e196d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.953 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3acbad31-7233-4357-9602-fced3ad27f14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.982 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9d020bc5-5c40-489b-ae90-902531b70afd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 811636, 'reachable_time': 23721, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289916, 'error': None, 'target': 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.987 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:31:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:31:48.987 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[57037683-8bca-4ba6-8ab2-2af33ae0f7a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:48 np0005593233 systemd[1]: run-netns-ovnmeta\x2d3128fa93\x2d5584\x2d4fd7\x2db8b2\x2d100d4babba87.mount: Deactivated successfully.
Jan 23 05:31:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.458 222021 DEBUG nova.compute.manager [req-35a82389-6934-4e3a-999a-bb7d9096be9d req-7c5379a5-a7a1-4160-8bf7-139dd91b27c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.459 222021 DEBUG oslo_concurrency.lockutils [req-35a82389-6934-4e3a-999a-bb7d9096be9d req-7c5379a5-a7a1-4160-8bf7-139dd91b27c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.459 222021 DEBUG oslo_concurrency.lockutils [req-35a82389-6934-4e3a-999a-bb7d9096be9d req-7c5379a5-a7a1-4160-8bf7-139dd91b27c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.460 222021 DEBUG oslo_concurrency.lockutils [req-35a82389-6934-4e3a-999a-bb7d9096be9d req-7c5379a5-a7a1-4160-8bf7-139dd91b27c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.460 222021 DEBUG nova.compute.manager [req-35a82389-6934-4e3a-999a-bb7d9096be9d req-7c5379a5-a7a1-4160-8bf7-139dd91b27c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Processing event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.461 222021 DEBUG nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.469 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164309.4696233, 23f7c54d-ed5d-404f-8517-b5cd21d0c282 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.470 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.473 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.478 222021 INFO nova.virt.libvirt.driver [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance spawned successfully.#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.479 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.529 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.539 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.546 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.547 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.548 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.549 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.550 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.551 222021 DEBUG nova.virt.libvirt.driver [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.603 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.678 222021 INFO nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Took 29.27 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.679 222021 DEBUG nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.807 222021 INFO nova.compute.manager [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Took 33.80 seconds to build instance.#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.836 222021 DEBUG oslo_concurrency.lockutils [None req-43905ebc-0aac-4787-a45d-f5345b86064e e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 33.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:49 np0005593233 nova_compute[222017]: 2026-01-23 10:31:49.905 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:50.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:31:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:50.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.332 222021 DEBUG nova.network.neutron [req-f4e024fc-302f-448f-8752-ea8e3eda8675 req-8ac5aeb2-d255-41d7-a8ea-d9b81bb24b6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updated VIF entry in instance network info cache for port 55e3e503-4e7f-4527-b7da-0242067d96b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.334 222021 DEBUG nova.network.neutron [req-f4e024fc-302f-448f-8752-ea8e3eda8675 req-8ac5aeb2-d255-41d7-a8ea-d9b81bb24b6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updating instance_info_cache with network_info: [{"id": "55e3e503-4e7f-4527-b7da-0242067d96b3", "address": "fa:16:3e:2b:9c:7e", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55e3e503-4e", "ovs_interfaceid": "55e3e503-4e7f-4527-b7da-0242067d96b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.366 222021 DEBUG oslo_concurrency.lockutils [req-f4e024fc-302f-448f-8752-ea8e3eda8675 req-8ac5aeb2-d255-41d7-a8ea-d9b81bb24b6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a9112892-c55a-46f8-a5f2-6df7fac1510a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.394 222021 DEBUG nova.network.neutron [-] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.477 222021 INFO nova.compute.manager [-] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Took 2.92 seconds to deallocate network for instance.#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.571 222021 DEBUG oslo_concurrency.lockutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.572 222021 DEBUG oslo_concurrency.lockutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.594 222021 DEBUG nova.compute.manager [req-79842881-dc3a-4c3d-a727-f906d69ab5cc req-4ed387e0-5897-43cc-bfcf-c8240c5b3ce2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.595 222021 DEBUG oslo_concurrency.lockutils [req-79842881-dc3a-4c3d-a727-f906d69ab5cc req-4ed387e0-5897-43cc-bfcf-c8240c5b3ce2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.595 222021 DEBUG oslo_concurrency.lockutils [req-79842881-dc3a-4c3d-a727-f906d69ab5cc req-4ed387e0-5897-43cc-bfcf-c8240c5b3ce2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.595 222021 DEBUG oslo_concurrency.lockutils [req-79842881-dc3a-4c3d-a727-f906d69ab5cc req-4ed387e0-5897-43cc-bfcf-c8240c5b3ce2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.596 222021 DEBUG nova.compute.manager [req-79842881-dc3a-4c3d-a727-f906d69ab5cc req-4ed387e0-5897-43cc-bfcf-c8240c5b3ce2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.596 222021 WARNING nova.compute.manager [req-79842881-dc3a-4c3d-a727-f906d69ab5cc req-4ed387e0-5897-43cc-bfcf-c8240c5b3ce2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:31:51 np0005593233 nova_compute[222017]: 2026-01-23 10:31:51.976 222021 DEBUG oslo_concurrency.processutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.264 222021 DEBUG nova.compute.manager [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-vif-unplugged-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.265 222021 DEBUG oslo_concurrency.lockutils [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.266 222021 DEBUG oslo_concurrency.lockutils [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.266 222021 DEBUG oslo_concurrency.lockutils [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.267 222021 DEBUG nova.compute.manager [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] No waiting events found dispatching network-vif-unplugged-55e3e503-4e7f-4527-b7da-0242067d96b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.268 222021 WARNING nova.compute.manager [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received unexpected event network-vif-unplugged-55e3e503-4e7f-4527-b7da-0242067d96b3 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.268 222021 DEBUG nova.compute.manager [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.269 222021 DEBUG oslo_concurrency.lockutils [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.269 222021 DEBUG oslo_concurrency.lockutils [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.270 222021 DEBUG oslo_concurrency.lockutils [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.271 222021 DEBUG nova.compute.manager [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] No waiting events found dispatching network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.271 222021 WARNING nova.compute.manager [req-f9949630-ae23-4a2c-b3b5-2bb6df8d73bd req-9ccd00cf-b2d8-45c3-95d9-55c0da72a560 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received unexpected event network-vif-plugged-55e3e503-4e7f-4527-b7da-0242067d96b3 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:31:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:52.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:31:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2320594349' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.452 222021 DEBUG oslo_concurrency.processutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.464 222021 DEBUG nova.compute.provider_tree [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.485 222021 DEBUG nova.scheduler.client.report [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.526 222021 DEBUG oslo_concurrency.lockutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.592 222021 INFO nova.scheduler.client.report [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Deleted allocations for instance a9112892-c55a-46f8-a5f2-6df7fac1510a#033[00m
Jan 23 05:31:52 np0005593233 nova_compute[222017]: 2026-01-23 10:31:52.737 222021 DEBUG oslo_concurrency.lockutils [None req-23992a87-bdd6-448b-b117-2cc2ea5387d3 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a9112892-c55a-46f8-a5f2-6df7fac1510a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:52.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:53 np0005593233 nova_compute[222017]: 2026-01-23 10:31:53.099 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:53 np0005593233 nova_compute[222017]: 2026-01-23 10:31:53.302 222021 DEBUG nova.compute.manager [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e345 e345: 3 total, 3 up, 3 in
Jan 23 05:31:53 np0005593233 nova_compute[222017]: 2026-01-23 10:31:53.775 222021 INFO nova.compute.manager [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] instance snapshotting#033[00m
Jan 23 05:31:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e345 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:31:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:54.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:31:54 np0005593233 nova_compute[222017]: 2026-01-23 10:31:54.738 222021 INFO nova.virt.libvirt.driver [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Beginning live snapshot process#033[00m
Jan 23 05:31:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:54.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:54 np0005593233 nova_compute[222017]: 2026-01-23 10:31:54.846 222021 DEBUG nova.compute.manager [req-2c965f32-8f09-4b82-a4f2-ebb89ad4e83c req-ec54defa-5c84-40ff-9baf-be5d53679802 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Received event network-vif-deleted-55e3e503-4e7f-4527-b7da-0242067d96b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:54 np0005593233 nova_compute[222017]: 2026-01-23 10:31:54.908 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:55 np0005593233 podman[289941]: 2026-01-23 10:31:55.088854755 +0000 UTC m=+0.087083215 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:31:55 np0005593233 nova_compute[222017]: 2026-01-23 10:31:55.130 222021 DEBUG nova.virt.libvirt.imagebackend [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 05:31:56 np0005593233 nova_compute[222017]: 2026-01-23 10:31:56.133 222021 DEBUG nova.storage.rbd_utils [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] creating snapshot(96cd025ba77e4bbda006689d73daa949) on rbd image(23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:31:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:56.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:31:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.0 total, 600.0 interval#012Cumulative writes: 14K writes, 74K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.03 MB/s#012Cumulative WAL: 14K writes, 14K syncs, 1.00 writes per sync, written: 0.15 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1556 writes, 7927 keys, 1556 commit groups, 1.0 writes per commit group, ingest: 16.10 MB, 0.03 MB/s#012Interval WAL: 1556 writes, 1556 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     51.3      1.81              0.43        48    0.038       0      0       0.0       0.0#012  L6      1/0   13.12 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   5.0     76.2     64.9      7.09              1.54        47    0.151    330K    25K       0.0       0.0#012 Sum      1/0   13.12 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   6.0     60.7     62.1      8.91              1.96        95    0.094    330K    25K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.5     44.6     46.4      1.67              0.29        12    0.139     56K   3136       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0     76.2     64.9      7.09              1.54        47    0.151    330K    25K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     51.4      1.81              0.43        47    0.039       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 5400.0 total, 600.0 interval#012Flush(GB): cumulative 0.091, interval 0.012#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.54 GB write, 0.10 MB/s write, 0.53 GB read, 0.10 MB/s read, 8.9 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 59.60 MB table_size: 0 occupancy: 18446744073709551615 collections: 10 last_copies: 0 last_secs: 0.000567 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(3402,57.21 MB,18.8175%) FilterBlock(95,930.17 KB,0.298806%) IndexBlock(95,1.49 MB,0.489079%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:31:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:56.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e346 e346: 3 total, 3 up, 3 in
Jan 23 05:31:56 np0005593233 nova_compute[222017]: 2026-01-23 10:31:56.958 222021 DEBUG nova.storage.rbd_utils [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] cloning vms/23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk@96cd025ba77e4bbda006689d73daa949 to images/e43f34d9-2c82-4fa9-94ac-56998bf2dd4a clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:31:57 np0005593233 nova_compute[222017]: 2026-01-23 10:31:57.130 222021 DEBUG nova.storage.rbd_utils [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] flattening images/e43f34d9-2c82-4fa9-94ac-56998bf2dd4a flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:31:57 np0005593233 nova_compute[222017]: 2026-01-23 10:31:57.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:57 np0005593233 nova_compute[222017]: 2026-01-23 10:31:57.492 222021 DEBUG nova.storage.rbd_utils [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] removing snapshot(96cd025ba77e4bbda006689d73daa949) on rbd image(23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:31:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e347 e347: 3 total, 3 up, 3 in
Jan 23 05:31:57 np0005593233 nova_compute[222017]: 2026-01-23 10:31:57.968 222021 DEBUG nova.storage.rbd_utils [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] creating snapshot(snap) on rbd image(e43f34d9-2c82-4fa9-94ac-56998bf2dd4a) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 23 05:31:58 np0005593233 nova_compute[222017]: 2026-01-23 10:31:58.102 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:58 np0005593233 nova_compute[222017]: 2026-01-23 10:31:58.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:31:58 np0005593233 nova_compute[222017]: 2026-01-23 10:31:58.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:31:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:58.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:58 np0005593233 nova_compute[222017]: 2026-01-23 10:31:58.420 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:31:58 np0005593233 nova_compute[222017]: 2026-01-23 10:31:58.420 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:31:58 np0005593233 nova_compute[222017]: 2026-01-23 10:31:58.420 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:31:58 np0005593233 nova_compute[222017]: 2026-01-23 10:31:58.421 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 05:31:58 np0005593233 nova_compute[222017]: 2026-01-23 10:31:58.421 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:31:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:31:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:58.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e348 e348: 3 total, 3 up, 3 in
Jan 23 05:31:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:31:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2079207847' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:58.999 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.578s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:31:59 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #53. Immutable memtables: 9.
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.207 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.207 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:31:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e348 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.380 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.382 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4088MB free_disk=20.809696197509766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.382 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.383 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.598 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 23f7c54d-ed5d-404f-8517-b5cd21d0c282 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.600 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.600 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:31:59 np0005593233 nova_compute[222017]: 2026-01-23 10:31:59.909 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:32:00 np0005593233 nova_compute[222017]: 2026-01-23 10:32:00.177 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:32:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:00.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:32:00 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3701836877' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:32:00 np0005593233 nova_compute[222017]: 2026-01-23 10:32:00.685 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:32:00 np0005593233 nova_compute[222017]: 2026-01-23 10:32:00.696 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:32:00 np0005593233 nova_compute[222017]: 2026-01-23 10:32:00.721 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:32:00 np0005593233 nova_compute[222017]: 2026-01-23 10:32:00.788 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:32:00 np0005593233 nova_compute[222017]: 2026-01-23 10:32:00.788 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.406s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:32:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:00.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e349 e349: 3 total, 3 up, 3 in
Jan 23 05:32:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:02.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:02 np0005593233 nova_compute[222017]: 2026-01-23 10:32:02.789 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:32:02 np0005593233 nova_compute[222017]: 2026-01-23 10:32:02.790 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:32:02 np0005593233 nova_compute[222017]: 2026-01-23 10:32:02.791 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:32:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:02.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:03 np0005593233 nova_compute[222017]: 2026-01-23 10:32:03.104 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:32:03 np0005593233 nova_compute[222017]: 2026-01-23 10:32:03.128 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164308.0084379, a9112892-c55a-46f8-a5f2-6df7fac1510a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:32:03 np0005593233 nova_compute[222017]: 2026-01-23 10:32:03.129 222021 INFO nova.compute.manager [-] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] VM Stopped (Lifecycle Event)
Jan 23 05:32:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:03Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e4:42:72 10.100.0.13
Jan 23 05:32:03 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:03Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e4:42:72 10.100.0.13
Jan 23 05:32:03 np0005593233 nova_compute[222017]: 2026-01-23 10:32:03.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:32:03 np0005593233 nova_compute[222017]: 2026-01-23 10:32:03.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:32:03 np0005593233 nova_compute[222017]: 2026-01-23 10:32:03.527 222021 DEBUG nova.compute.manager [None req-b3952339-f646-47a4-9126-7bd0f8883233 - - - - - -] [instance: a9112892-c55a-46f8-a5f2-6df7fac1510a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:32:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:04Z|00747|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:32:04 np0005593233 nova_compute[222017]: 2026-01-23 10:32:04.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:32:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:04.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:04Z|00748|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:32:04 np0005593233 nova_compute[222017]: 2026-01-23 10:32:04.623 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:32:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:04.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:04 np0005593233 nova_compute[222017]: 2026-01-23 10:32:04.911 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:32:05 np0005593233 nova_compute[222017]: 2026-01-23 10:32:05.969 222021 INFO nova.virt.libvirt.driver [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Snapshot image upload complete
Jan 23 05:32:05 np0005593233 nova_compute[222017]: 2026-01-23 10:32:05.970 222021 INFO nova.compute.manager [None req-fb39e814-3b86-4f7d-a56a-87a0f00a0c39 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Took 12.19 seconds to snapshot the instance on the hypervisor.
Jan 23 05:32:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:06.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:06.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:08 np0005593233 nova_compute[222017]: 2026-01-23 10:32:08.107 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:32:08 np0005593233 podman[290150]: 2026-01-23 10:32:08.146403638 +0000 UTC m=+0.141104591 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:32:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:32:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:08.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:32:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 e350: 3 total, 3 up, 3 in
Jan 23 05:32:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:08.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:09 np0005593233 nova_compute[222017]: 2026-01-23 10:32:09.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:32:09 np0005593233 nova_compute[222017]: 2026-01-23 10:32:09.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:32:09 np0005593233 nova_compute[222017]: 2026-01-23 10:32:09.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:32:09 np0005593233 nova_compute[222017]: 2026-01-23 10:32:09.915 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:32:10 np0005593233 nova_compute[222017]: 2026-01-23 10:32:10.194 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:32:10 np0005593233 nova_compute[222017]: 2026-01-23 10:32:10.194 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:32:10 np0005593233 nova_compute[222017]: 2026-01-23 10:32:10.195 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:32:10 np0005593233 nova_compute[222017]: 2026-01-23 10:32:10.195 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:32:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:10.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:10.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:12.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:12.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:13 np0005593233 nova_compute[222017]: 2026-01-23 10:32:13.109 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:32:13 np0005593233 nova_compute[222017]: 2026-01-23 10:32:13.209 222021 INFO nova.compute.manager [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Rescuing
Jan 23 05:32:13 np0005593233 nova_compute[222017]: 2026-01-23 10:32:13.210 222021 DEBUG oslo_concurrency.lockutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:32:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:14.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:14.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:14 np0005593233 nova_compute[222017]: 2026-01-23 10:32:14.919 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:32:15 np0005593233 nova_compute[222017]: 2026-01-23 10:32:15.236 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updating instance_info_cache with network_info: [{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:32:15 np0005593233 nova_compute[222017]: 2026-01-23 10:32:15.296 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:32:15 np0005593233 nova_compute[222017]: 2026-01-23 10:32:15.297 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 05:32:15 np0005593233 nova_compute[222017]: 2026-01-23 10:32:15.298 222021 DEBUG oslo_concurrency.lockutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquired lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:32:15 np0005593233 nova_compute[222017]: 2026-01-23 10:32:15.298 222021 DEBUG nova.network.neutron [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:32:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:16.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:16.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:18 np0005593233 nova_compute[222017]: 2026-01-23 10:32:18.112 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:18 np0005593233 nova_compute[222017]: 2026-01-23 10:32:18.294 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:18.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:18.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:19 np0005593233 nova_compute[222017]: 2026-01-23 10:32:19.921 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:20.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:20 np0005593233 nova_compute[222017]: 2026-01-23 10:32:20.746 222021 DEBUG nova.network.neutron [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updating instance_info_cache with network_info: [{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:32:20 np0005593233 nova_compute[222017]: 2026-01-23 10:32:20.787 222021 DEBUG oslo_concurrency.lockutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Releasing lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:32:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:20.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:21 np0005593233 nova_compute[222017]: 2026-01-23 10:32:21.423 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:32:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:22.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:22.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:23 np0005593233 nova_compute[222017]: 2026-01-23 10:32:23.114 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:23 np0005593233 kernel: tap66bbd2d4-17 (unregistering): left promiscuous mode
Jan 23 05:32:23 np0005593233 NetworkManager[48871]: <info>  [1769164343.9860] device (tap66bbd2d4-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:32:24 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:24Z|00749|binding|INFO|Releasing lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d from this chassis (sb_readonly=0)
Jan 23 05:32:24 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:24Z|00750|binding|INFO|Setting lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d down in Southbound
Jan 23 05:32:24 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:24Z|00751|binding|INFO|Removing iface tap66bbd2d4-17 ovn-installed in OVS
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.023 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.040 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.042 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:42:72 10.100.0.13'], port_security=['fa:16:3e:e4:42:72 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '23f7c54d-ed5d-404f-8517-b5cd21d0c282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.044 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.045 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7d5530f-5227-4f75-bac0-2604bb3d68e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.047 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb1e46c-0dec-421e-9c85-9d0212335b82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.047 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace which is not needed anymore#033[00m
Jan 23 05:32:24 np0005593233 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Jan 23 05:32:24 np0005593233 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000b3.scope: Consumed 15.778s CPU time.
Jan 23 05:32:24 np0005593233 systemd-machined[190954]: Machine qemu-81-instance-000000b3 terminated.
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.214 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.222 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:24 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[289838]: [NOTICE]   (289842) : haproxy version is 2.8.14-c23fe91
Jan 23 05:32:24 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[289838]: [NOTICE]   (289842) : path to executable is /usr/sbin/haproxy
Jan 23 05:32:24 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[289838]: [WARNING]  (289842) : Exiting Master process...
Jan 23 05:32:24 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[289838]: [ALERT]    (289842) : Current worker (289844) exited with code 143 (Terminated)
Jan 23 05:32:24 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[289838]: [WARNING]  (289842) : All workers exited. Exiting... (0)
Jan 23 05:32:24 np0005593233 systemd[1]: libpod-27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c.scope: Deactivated successfully.
Jan 23 05:32:24 np0005593233 podman[290218]: 2026-01-23 10:32:24.256504215 +0000 UTC m=+0.066513311 container died 27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:32:24 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c-userdata-shm.mount: Deactivated successfully.
Jan 23 05:32:24 np0005593233 systemd[1]: var-lib-containers-storage-overlay-2bb7ffa9402c02b1d39c11ceb691dacfa98e307ba2f07cd2f768bd665233cd1b-merged.mount: Deactivated successfully.
Jan 23 05:32:24 np0005593233 podman[290218]: 2026-01-23 10:32:24.312079004 +0000 UTC m=+0.122088050 container cleanup 27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 05:32:24 np0005593233 systemd[1]: libpod-conmon-27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c.scope: Deactivated successfully.
Jan 23 05:32:24 np0005593233 podman[290289]: 2026-01-23 10:32:24.391738398 +0000 UTC m=+0.050644880 container remove 27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.398 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[34ab419f-2203-4aba-ad2c-217343ee92ef]: (4, ('Fri Jan 23 10:32:24 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c)\n27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c\nFri Jan 23 10:32:24 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c)\n27e66c61276fb9f4ea8cca8582d821242e5090220cc1149a9032a1881fc7819c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.401 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3dffd55d-84bb-4ba6-bf29-9c9cb46a6d32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.403 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.405 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:24 np0005593233 kernel: tapd7d5530f-50: left promiscuous mode
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.425 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.429 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0282d31d-0a79-417d-bd7b-88b2e8bcfcae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.445 222021 INFO nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.447 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[61b0fd54-0ce7-4f8d-b7ff-8254cb4cf179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.449 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e5a8ed93-bdfe-4876-8cf0-844ab95ebb81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.451 222021 INFO nova.virt.libvirt.driver [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance destroyed successfully.#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.452 222021 DEBUG nova.objects.instance [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.471 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[27d9cb88-2408-4574-9e6c-d6bb7d5a90ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 822013, 'reachable_time': 28228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290353, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:32:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:24.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:32:24 np0005593233 systemd[1]: run-netns-ovnmeta\x2dd7d5530f\x2d5227\x2d4f75\x2dbac0\x2d2604bb3d68e2.mount: Deactivated successfully.
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.475 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:32:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:24.475 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[743cac3c-3d50-4a82-840f-ecdd0137723b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.509 222021 INFO nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Attempting a stable device rescue#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.535 222021 DEBUG nova.compute.manager [req-5fa77602-67ba-407a-b171-21af5d4a986c req-06511234-bd71-4381-b6c2-4de8ed2eddb6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-unplugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.536 222021 DEBUG oslo_concurrency.lockutils [req-5fa77602-67ba-407a-b171-21af5d4a986c req-06511234-bd71-4381-b6c2-4de8ed2eddb6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.537 222021 DEBUG oslo_concurrency.lockutils [req-5fa77602-67ba-407a-b171-21af5d4a986c req-06511234-bd71-4381-b6c2-4de8ed2eddb6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.537 222021 DEBUG oslo_concurrency.lockutils [req-5fa77602-67ba-407a-b171-21af5d4a986c req-06511234-bd71-4381-b6c2-4de8ed2eddb6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.538 222021 DEBUG nova.compute.manager [req-5fa77602-67ba-407a-b171-21af5d4a986c req-06511234-bd71-4381-b6c2-4de8ed2eddb6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-unplugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.538 222021 WARNING nova.compute.manager [req-5fa77602-67ba-407a-b171-21af5d4a986c req-06511234-bd71-4381-b6c2-4de8ed2eddb6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-unplugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:32:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:24.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:24 np0005593233 nova_compute[222017]: 2026-01-23 10:32:24.925 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.215 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.224 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.225 222021 INFO nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Creating image(s)#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.274 222021 DEBUG nova.storage.rbd_utils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.279 222021 DEBUG nova.objects.instance [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:25 np0005593233 podman[290410]: 2026-01-23 10:32:25.296772236 +0000 UTC m=+0.107646080 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.367 222021 DEBUG nova.storage.rbd_utils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.407 222021 DEBUG nova.storage.rbd_utils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.415 222021 DEBUG oslo_concurrency.lockutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "bedf2533bbbf1c9f576dc0402dee984bc3cee36c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.416 222021 DEBUG oslo_concurrency.lockutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "bedf2533bbbf1c9f576dc0402dee984bc3cee36c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.419 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.844 222021 DEBUG nova.virt.libvirt.imagebackend [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/e43f34d9-2c82-4fa9-94ac-56998bf2dd4a/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/e43f34d9-2c82-4fa9-94ac-56998bf2dd4a/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 05:32:25 np0005593233 podman[290596]: 2026-01-23 10:32:25.892303078 +0000 UTC m=+0.047010087 container create 4464aa6a27accc2b18d3dff2e931ddb9c0a05013d79b96c68debadf31fc5a1ba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_mirzakhani, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.913 222021 DEBUG nova.virt.libvirt.imagebackend [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/e43f34d9-2c82-4fa9-94ac-56998bf2dd4a/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 05:32:25 np0005593233 nova_compute[222017]: 2026-01-23 10:32:25.914 222021 DEBUG nova.storage.rbd_utils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] cloning images/e43f34d9-2c82-4fa9-94ac-56998bf2dd4a@snap to None/23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:32:25 np0005593233 systemd[1]: Started libpod-conmon-4464aa6a27accc2b18d3dff2e931ddb9c0a05013d79b96c68debadf31fc5a1ba.scope.
Jan 23 05:32:25 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:32:25 np0005593233 podman[290596]: 2026-01-23 10:32:25.873601627 +0000 UTC m=+0.028308666 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 05:32:25 np0005593233 podman[290596]: 2026-01-23 10:32:25.990277982 +0000 UTC m=+0.144985071 container init 4464aa6a27accc2b18d3dff2e931ddb9c0a05013d79b96c68debadf31fc5a1ba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_mirzakhani, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 23 05:32:26 np0005593233 podman[290596]: 2026-01-23 10:32:26.003829387 +0000 UTC m=+0.158536406 container start 4464aa6a27accc2b18d3dff2e931ddb9c0a05013d79b96c68debadf31fc5a1ba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_mirzakhani, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Jan 23 05:32:26 np0005593233 podman[290596]: 2026-01-23 10:32:26.009794217 +0000 UTC m=+0.164501316 container attach 4464aa6a27accc2b18d3dff2e931ddb9c0a05013d79b96c68debadf31fc5a1ba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_mirzakhani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default)
Jan 23 05:32:26 np0005593233 blissful_mirzakhani[290664]: 167 167
Jan 23 05:32:26 np0005593233 systemd[1]: libpod-4464aa6a27accc2b18d3dff2e931ddb9c0a05013d79b96c68debadf31fc5a1ba.scope: Deactivated successfully.
Jan 23 05:32:26 np0005593233 podman[290596]: 2026-01-23 10:32:26.015475668 +0000 UTC m=+0.170182717 container died 4464aa6a27accc2b18d3dff2e931ddb9c0a05013d79b96c68debadf31fc5a1ba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_mirzakhani, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:32:26 np0005593233 systemd[1]: var-lib-containers-storage-overlay-25cba24a4677a1dda9928690bb07ab23efa7d2b7e2bb5b73e7a21f82b438cc5c-merged.mount: Deactivated successfully.
Jan 23 05:32:26 np0005593233 podman[290596]: 2026-01-23 10:32:26.061198247 +0000 UTC m=+0.215905266 container remove 4464aa6a27accc2b18d3dff2e931ddb9c0a05013d79b96c68debadf31fc5a1ba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=blissful_mirzakhani, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Jan 23 05:32:26 np0005593233 systemd[1]: libpod-conmon-4464aa6a27accc2b18d3dff2e931ddb9c0a05013d79b96c68debadf31fc5a1ba.scope: Deactivated successfully.
Jan 23 05:32:26 np0005593233 podman[290701]: 2026-01-23 10:32:26.229441458 +0000 UTC m=+0.043613520 container create 73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_wright, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:32:26 np0005593233 systemd[1]: Started libpod-conmon-73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e.scope.
Jan 23 05:32:26 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:32:26 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1af6da4166764ab13e0aa5efec80b2494df2b591757719aefd977436f9ce973/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 05:32:26 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1af6da4166764ab13e0aa5efec80b2494df2b591757719aefd977436f9ce973/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 05:32:26 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1af6da4166764ab13e0aa5efec80b2494df2b591757719aefd977436f9ce973/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:32:26 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1af6da4166764ab13e0aa5efec80b2494df2b591757719aefd977436f9ce973/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 05:32:26 np0005593233 podman[290701]: 2026-01-23 10:32:26.212573739 +0000 UTC m=+0.026745821 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 05:32:26 np0005593233 podman[290701]: 2026-01-23 10:32:26.319370684 +0000 UTC m=+0.133542846 container init 73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_wright, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 05:32:26 np0005593233 podman[290701]: 2026-01-23 10:32:26.327174875 +0000 UTC m=+0.141346977 container start 73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_wright, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 05:32:26 np0005593233 podman[290701]: 2026-01-23 10:32:26.332164597 +0000 UTC m=+0.146336679 container attach 73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_wright, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.368 222021 DEBUG oslo_concurrency.lockutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "bedf2533bbbf1c9f576dc0402dee984bc3cee36c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.437 222021 DEBUG nova.objects.instance [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'migration_context' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.461 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.465 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Start _get_guest_xml network_info=[{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "vif_mac": "fa:16:3e:e4:42:72"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'e43f34d9-2c82-4fa9-94ac-56998bf2dd4a', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.466 222021 DEBUG nova.objects.instance [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'resources' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:26.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.491 222021 WARNING nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.496 222021 DEBUG nova.virt.libvirt.host [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.497 222021 DEBUG nova.virt.libvirt.host [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.500 222021 DEBUG nova.virt.libvirt.host [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.501 222021 DEBUG nova.virt.libvirt.host [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.502 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.502 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.503 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.503 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.503 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.504 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.504 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.504 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.504 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.505 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.505 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.505 222021 DEBUG nova.virt.hardware [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.506 222021 DEBUG nova.objects.instance [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.526 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.841 222021 DEBUG nova.compute.manager [req-2e8211f7-0229-45e7-87c7-1b221a35a149 req-9b4cf951-e745-4f86-8ad3-e345e7f922b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.842 222021 DEBUG oslo_concurrency.lockutils [req-2e8211f7-0229-45e7-87c7-1b221a35a149 req-9b4cf951-e745-4f86-8ad3-e345e7f922b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.843 222021 DEBUG oslo_concurrency.lockutils [req-2e8211f7-0229-45e7-87c7-1b221a35a149 req-9b4cf951-e745-4f86-8ad3-e345e7f922b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.843 222021 DEBUG oslo_concurrency.lockutils [req-2e8211f7-0229-45e7-87c7-1b221a35a149 req-9b4cf951-e745-4f86-8ad3-e345e7f922b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.844 222021 DEBUG nova.compute.manager [req-2e8211f7-0229-45e7-87c7-1b221a35a149 req-9b4cf951-e745-4f86-8ad3-e345e7f922b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:32:26 np0005593233 nova_compute[222017]: 2026-01-23 10:32:26.844 222021 WARNING nova.compute.manager [req-2e8211f7-0229-45e7-87c7-1b221a35a149 req-9b4cf951-e745-4f86-8ad3-e345e7f922b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:32:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:26.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:32:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3153472946' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:32:27 np0005593233 nova_compute[222017]: 2026-01-23 10:32:27.022 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:27 np0005593233 nova_compute[222017]: 2026-01-23 10:32:27.078 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:32:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2250550051' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:32:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:27 np0005593233 nova_compute[222017]: 2026-01-23 10:32:27.637 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:27 np0005593233 nova_compute[222017]: 2026-01-23 10:32:27.639 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:27 np0005593233 youthful_wright[290720]: [
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:    {
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        "available": false,
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        "ceph_device": false,
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        "lsm_data": {},
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        "lvs": [],
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        "path": "/dev/sr0",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        "rejected_reasons": [
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "Has a FileSystem",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "Insufficient space (<5GB)"
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        ],
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        "sys_api": {
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "actuators": null,
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "device_nodes": "sr0",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "devname": "sr0",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "human_readable_size": "482.00 KB",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "id_bus": "ata",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "model": "QEMU DVD-ROM",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "nr_requests": "2",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "parent": "/dev/sr0",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "partitions": {},
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "path": "/dev/sr0",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "removable": "1",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "rev": "2.5+",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "ro": "0",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "rotational": "1",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "sas_address": "",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "sas_device_handle": "",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "scheduler_mode": "mq-deadline",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "sectors": 0,
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "sectorsize": "2048",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "size": 493568.0,
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "support_discard": "2048",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "type": "disk",
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:            "vendor": "QEMU"
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:        }
Jan 23 05:32:27 np0005593233 youthful_wright[290720]:    }
Jan 23 05:32:27 np0005593233 youthful_wright[290720]: ]
Jan 23 05:32:27 np0005593233 systemd[1]: libpod-73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e.scope: Deactivated successfully.
Jan 23 05:32:27 np0005593233 podman[290701]: 2026-01-23 10:32:27.885195097 +0000 UTC m=+1.699367179 container died 73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_wright, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:32:27 np0005593233 systemd[1]: libpod-73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e.scope: Consumed 1.556s CPU time.
Jan 23 05:32:27 np0005593233 systemd[1]: var-lib-containers-storage-overlay-a1af6da4166764ab13e0aa5efec80b2494df2b591757719aefd977436f9ce973-merged.mount: Deactivated successfully.
Jan 23 05:32:27 np0005593233 podman[290701]: 2026-01-23 10:32:27.954239329 +0000 UTC m=+1.768411401 container remove 73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=youthful_wright, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Jan 23 05:32:27 np0005593233 systemd[1]: libpod-conmon-73f70c8b591036329e50682d100995d026767ceb62ae627788038cb5cbc71e1e.scope: Deactivated successfully.
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.116 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/499980552' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.198 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.559s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.201 222021 DEBUG nova.virt.libvirt.vif [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:31:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-833764657',display_name='tempest-ServerStableDeviceRescueTest-server-833764657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-833764657',id=179,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:31:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-l34bmfpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_user_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:32:06Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=23f7c54d-ed5d-404f-8517-b5cd21d0c282,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "vif_mac": "fa:16:3e:e4:42:72"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.201 222021 DEBUG nova.network.os_vif_util [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "vif_mac": "fa:16:3e:e4:42:72"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.203 222021 DEBUG nova.network.os_vif_util [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:42:72,bridge_name='br-int',has_traffic_filtering=True,id=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66bbd2d4-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.204 222021 DEBUG nova.objects.instance [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.248 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <uuid>23f7c54d-ed5d-404f-8517-b5cd21d0c282</uuid>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <name>instance-000000b3</name>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-833764657</nova:name>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:32:26</nova:creationTime>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <nova:user uuid="e1629a4b14764dddaabcadd16f3e1c1c">tempest-ServerStableDeviceRescueTest-1802220041-project-member</nova:user>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <nova:project uuid="815b71acf60d4ed8933ebd05228fa0c0">tempest-ServerStableDeviceRescueTest-1802220041</nova:project>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <nova:port uuid="66bbd2d4-1733-4a5d-a84b-8d41c36dd82d">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <entry name="serial">23f7c54d-ed5d-404f-8517-b5cd21d0c282</entry>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <entry name="uuid">23f7c54d-ed5d-404f-8517-b5cd21d0c282</entry>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.rescue">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <target dev="sdb" bus="usb"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <boot order="1"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:e4:42:72"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <target dev="tap66bbd2d4-17"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/console.log" append="off"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:32:28 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:32:28 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:32:28 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:32:28 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.259 222021 INFO nova.virt.libvirt.driver [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance destroyed successfully.#033[00m
Jan 23 05:32:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:28.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.687 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.687 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.687 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.688 222021 DEBUG nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No VIF found with MAC fa:16:3e:e4:42:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.689 222021 INFO nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Using config drive#033[00m
Jan 23 05:32:28 np0005593233 nova_compute[222017]: 2026-01-23 10:32:28.728 222021 DEBUG nova.storage.rbd_utils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #154. Immutable memtables: 0.
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.756604) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 97] Flushing memtable with next log file: 154
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348756697, "job": 97, "event": "flush_started", "num_memtables": 1, "num_entries": 1101, "num_deletes": 253, "total_data_size": 2232058, "memory_usage": 2268776, "flush_reason": "Manual Compaction"}
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 97] Level-0 flush table #155: started
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348765964, "cf_name": "default", "job": 97, "event": "table_file_creation", "file_number": 155, "file_size": 1052623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 74222, "largest_seqno": 75318, "table_properties": {"data_size": 1048141, "index_size": 2005, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11611, "raw_average_key_size": 21, "raw_value_size": 1038658, "raw_average_value_size": 1930, "num_data_blocks": 86, "num_entries": 538, "num_filter_entries": 538, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164283, "oldest_key_time": 1769164283, "file_creation_time": 1769164348, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 155, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 97] Flush lasted 9404 microseconds, and 4832 cpu microseconds.
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.766022) [db/flush_job.cc:967] [default] [JOB 97] Level-0 flush table #155: 1052623 bytes OK
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.766047) [db/memtable_list.cc:519] [default] Level-0 commit table #155 started
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.768617) [db/memtable_list.cc:722] [default] Level-0 commit table #155: memtable #1 done
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.768636) EVENT_LOG_v1 {"time_micros": 1769164348768628, "job": 97, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.768659) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 97] Try to delete WAL files size 2226522, prev total WAL file size 2226522, number of live WAL files 2.
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000151.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.769499) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032353038' seq:72057594037927935, type:22 .. '6D6772737461740032373539' seq:0, type:0; will stop at (end)
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 98] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 97 Base level 0, inputs: [155(1027KB)], [153(13MB)]
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348769571, "job": 98, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [155], "files_L6": [153], "score": -1, "input_data_size": 14811582, "oldest_snapshot_seqno": -1}
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 98] Generated table #156: 9414 keys, 11351370 bytes, temperature: kUnknown
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348859995, "cf_name": "default", "job": 98, "event": "table_file_creation", "file_number": 156, "file_size": 11351370, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11292560, "index_size": 34185, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23557, "raw_key_size": 248236, "raw_average_key_size": 26, "raw_value_size": 11129401, "raw_average_value_size": 1182, "num_data_blocks": 1305, "num_entries": 9414, "num_filter_entries": 9414, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769164348, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 156, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:28.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.860859) [db/compaction/compaction_job.cc:1663] [default] [JOB 98] Compacted 1@0 + 1@6 files to L6 => 11351370 bytes
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.868130) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 163.6 rd, 125.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 13.1 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(24.9) write-amplify(10.8) OK, records in: 9914, records dropped: 500 output_compression: NoCompression
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.868165) EVENT_LOG_v1 {"time_micros": 1769164348868150, "job": 98, "event": "compaction_finished", "compaction_time_micros": 90546, "compaction_time_cpu_micros": 34634, "output_level": 6, "num_output_files": 1, "total_output_size": 11351370, "num_input_records": 9914, "num_output_records": 9414, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000155.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348868683, "job": 98, "event": "table_file_deletion", "file_number": 155}
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000153.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348873298, "job": 98, "event": "table_file_deletion", "file_number": 153}
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.769375) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.873379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.873388) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.873390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.873392) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:28.873394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:29 np0005593233 nova_compute[222017]: 2026-01-23 10:32:29.632 222021 DEBUG nova.objects.instance [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:29 np0005593233 nova_compute[222017]: 2026-01-23 10:32:29.927 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:30.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:30.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:31 np0005593233 nova_compute[222017]: 2026-01-23 10:32:31.421 222021 DEBUG nova.objects.instance [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'keypairs' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:32.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:32.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:33 np0005593233 nova_compute[222017]: 2026-01-23 10:32:33.084 222021 INFO nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Creating config drive at /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config.rescue#033[00m
Jan 23 05:32:33 np0005593233 nova_compute[222017]: 2026-01-23 10:32:33.091 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv1nbi6q8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:33 np0005593233 nova_compute[222017]: 2026-01-23 10:32:33.121 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:33 np0005593233 nova_compute[222017]: 2026-01-23 10:32:33.237 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv1nbi6q8" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:33 np0005593233 nova_compute[222017]: 2026-01-23 10:32:33.274 222021 DEBUG nova.storage.rbd_utils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:33 np0005593233 nova_compute[222017]: 2026-01-23 10:32:33.279 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config.rescue 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:33.615 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:32:33 np0005593233 nova_compute[222017]: 2026-01-23 10:32:33.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:33.618 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:32:33 np0005593233 nova_compute[222017]: 2026-01-23 10:32:33.914 222021 DEBUG oslo_concurrency.processutils [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config.rescue 23f7c54d-ed5d-404f-8517-b5cd21d0c282_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.635s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:33 np0005593233 nova_compute[222017]: 2026-01-23 10:32:33.915 222021 INFO nova.virt.libvirt.driver [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Deleting local config drive /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282/disk.config.rescue because it was imported into RBD.#033[00m
Jan 23 05:32:34 np0005593233 kernel: tap66bbd2d4-17: entered promiscuous mode
Jan 23 05:32:34 np0005593233 NetworkManager[48871]: <info>  [1769164354.0042] manager: (tap66bbd2d4-17): new Tun device (/org/freedesktop/NetworkManager/Devices/342)
Jan 23 05:32:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:34Z|00752|binding|INFO|Claiming lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for this chassis.
Jan 23 05:32:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:34Z|00753|binding|INFO|66bbd2d4-1733-4a5d-a84b-8d41c36dd82d: Claiming fa:16:3e:e4:42:72 10.100.0.13
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:34Z|00754|binding|INFO|Setting lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d ovn-installed in OVS
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.029 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:34 np0005593233 systemd-udevd[292082]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:32:34 np0005593233 systemd-machined[190954]: New machine qemu-82-instance-000000b3.
Jan 23 05:32:34 np0005593233 NetworkManager[48871]: <info>  [1769164354.0645] device (tap66bbd2d4-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:32:34 np0005593233 NetworkManager[48871]: <info>  [1769164354.0654] device (tap66bbd2d4-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:32:34 np0005593233 systemd[1]: Started Virtual Machine qemu-82-instance-000000b3.
Jan 23 05:32:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:34Z|00755|binding|INFO|Setting lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d up in Southbound
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.397 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:42:72 10.100.0.13'], port_security=['fa:16:3e:e4:42:72 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '23f7c54d-ed5d-404f-8517-b5cd21d0c282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.400 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 bound to our chassis#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.402 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.424 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[18dfd62c-f42b-4789-b0b4-6cb332fb2f5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.426 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7d5530f-51 in ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.428 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7d5530f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.429 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7860fa27-8af0-4e4d-a60c-de81e03e90f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.431 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[79a70233-46f7-4c79-aefb-13067098a7d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.452 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[13f26930-7f91-4b9f-89c5-9f2013157e2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.487 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a250e78c-d6a8-4a68-a3e5-a828afcf289c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:32:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:34.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.540 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[acb9aace-c8b4-47b6-b909-8c3b17b9b7a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.551 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[378682af-838e-4e5f-a72c-fe74a495af0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 NetworkManager[48871]: <info>  [1769164354.5521] manager: (tapd7d5530f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/343)
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.605 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fe17ba95-f950-4002-aaea-34b1d99675bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.613 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[43ea03e1-29d0-4e31-b421-ffe464da3674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 NetworkManager[48871]: <info>  [1769164354.6569] device (tapd7d5530f-50): carrier: link connected
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.668 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[58ab2a5a-1612-43df-b501-b479a884cee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.703 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9f368801-c476-4f62-a68f-c6f3a91559f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 826728, 'reachable_time': 35158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292176, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.728 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f82790d9-8542-4a46-bdf3-34bd103b7df6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:67cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 826728, 'tstamp': 826728}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292178, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.757 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0d4bb4-8583-4251-933e-f71e4697c8bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 826728, 'reachable_time': 35158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 292179, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.768 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for 23f7c54d-ed5d-404f-8517-b5cd21d0c282 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.769 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164354.7674668, 23f7c54d-ed5d-404f-8517-b5cd21d0c282 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.770 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.779 222021 DEBUG nova.compute.manager [None req-26358c17-733f-4462-9651-90e7a61ce95d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.819 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c239d5b7-8d7f-4dfc-a405-539ab6c6d62a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.825 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.830 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:32:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:34.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.927 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[36a8a06c-931c-46f1-8c58-71119e1ab7e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.929 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.929 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.930 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.930 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.932 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:34 np0005593233 kernel: tapd7d5530f-50: entered promiscuous mode
Jan 23 05:32:34 np0005593233 NetworkManager[48871]: <info>  [1769164354.9334] manager: (tapd7d5530f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.937 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:34Z|00756|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.941 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.942 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f9b804-df52-4f45-ad66-82c662464fe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.943 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:32:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:34.944 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'env', 'PROCESS_TAG=haproxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7d5530f-5227-4f75-bac0-2604bb3d68e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:32:34 np0005593233 nova_compute[222017]: 2026-01-23 10:32:34.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.084 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164354.7712839, 23f7c54d-ed5d-404f-8517-b5cd21d0c282 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.086 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] VM Started (Lifecycle Event)#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.229 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.235 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:32:35 np0005593233 podman[292211]: 2026-01-23 10:32:35.471854185 +0000 UTC m=+0.094029093 container create 899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:32:35 np0005593233 podman[292211]: 2026-01-23 10:32:35.428229476 +0000 UTC m=+0.050404424 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:32:35 np0005593233 systemd[1]: Started libpod-conmon-899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c.scope.
Jan 23 05:32:35 np0005593233 ceph-mgr[81930]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 05:32:35 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:32:35 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76d77c313b55f51fe7fead41a4eb4f531a2102a6c9e8032820d63b964f6d4014/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:32:35 np0005593233 podman[292211]: 2026-01-23 10:32:35.598857634 +0000 UTC m=+0.221032542 container init 899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:32:35 np0005593233 podman[292211]: 2026-01-23 10:32:35.606063719 +0000 UTC m=+0.228238587 container start 899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:32:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:35.621 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.628 222021 DEBUG nova.compute.manager [req-c092996f-8404-4295-a039-5cfcdaf0005c req-ebfcf36b-3020-43c7-a3c3-134b8023ea4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.629 222021 DEBUG oslo_concurrency.lockutils [req-c092996f-8404-4295-a039-5cfcdaf0005c req-ebfcf36b-3020-43c7-a3c3-134b8023ea4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.630 222021 DEBUG oslo_concurrency.lockutils [req-c092996f-8404-4295-a039-5cfcdaf0005c req-ebfcf36b-3020-43c7-a3c3-134b8023ea4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.630 222021 DEBUG oslo_concurrency.lockutils [req-c092996f-8404-4295-a039-5cfcdaf0005c req-ebfcf36b-3020-43c7-a3c3-134b8023ea4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.631 222021 DEBUG nova.compute.manager [req-c092996f-8404-4295-a039-5cfcdaf0005c req-ebfcf36b-3020-43c7-a3c3-134b8023ea4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:32:35 np0005593233 nova_compute[222017]: 2026-01-23 10:32:35.632 222021 WARNING nova.compute.manager [req-c092996f-8404-4295-a039-5cfcdaf0005c req-ebfcf36b-3020-43c7-a3c3-134b8023ea4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state rescued and task_state None.#033[00m
Jan 23 05:32:35 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292226]: [NOTICE]   (292230) : New worker (292232) forked
Jan 23 05:32:35 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292226]: [NOTICE]   (292230) : Loading success.
Jan 23 05:32:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:36.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:36.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:37 np0005593233 nova_compute[222017]: 2026-01-23 10:32:37.016 222021 INFO nova.compute.manager [None req-f3268c53-3a56-48d9-8fb6-ecadc4a9ce44 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Unrescuing#033[00m
Jan 23 05:32:37 np0005593233 nova_compute[222017]: 2026-01-23 10:32:37.017 222021 DEBUG oslo_concurrency.lockutils [None req-f3268c53-3a56-48d9-8fb6-ecadc4a9ce44 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:32:37 np0005593233 nova_compute[222017]: 2026-01-23 10:32:37.017 222021 DEBUG oslo_concurrency.lockutils [None req-f3268c53-3a56-48d9-8fb6-ecadc4a9ce44 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquired lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:32:37 np0005593233 nova_compute[222017]: 2026-01-23 10:32:37.018 222021 DEBUG nova.network.neutron [None req-f3268c53-3a56-48d9-8fb6-ecadc4a9ce44 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:32:38 np0005593233 nova_compute[222017]: 2026-01-23 10:32:38.124 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:38.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:38 np0005593233 nova_compute[222017]: 2026-01-23 10:32:38.515 222021 DEBUG nova.compute.manager [req-b8bb9225-34ca-456e-bf91-edb8741afe1f req-7edf7cb0-cf97-4c4a-850b-26e93ec24858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:32:38 np0005593233 nova_compute[222017]: 2026-01-23 10:32:38.516 222021 DEBUG oslo_concurrency.lockutils [req-b8bb9225-34ca-456e-bf91-edb8741afe1f req-7edf7cb0-cf97-4c4a-850b-26e93ec24858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:38 np0005593233 nova_compute[222017]: 2026-01-23 10:32:38.516 222021 DEBUG oslo_concurrency.lockutils [req-b8bb9225-34ca-456e-bf91-edb8741afe1f req-7edf7cb0-cf97-4c4a-850b-26e93ec24858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:38 np0005593233 nova_compute[222017]: 2026-01-23 10:32:38.517 222021 DEBUG oslo_concurrency.lockutils [req-b8bb9225-34ca-456e-bf91-edb8741afe1f req-7edf7cb0-cf97-4c4a-850b-26e93ec24858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:38 np0005593233 nova_compute[222017]: 2026-01-23 10:32:38.517 222021 DEBUG nova.compute.manager [req-b8bb9225-34ca-456e-bf91-edb8741afe1f req-7edf7cb0-cf97-4c4a-850b-26e93ec24858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:32:38 np0005593233 nova_compute[222017]: 2026-01-23 10:32:38.518 222021 WARNING nova.compute.manager [req-b8bb9225-34ca-456e-bf91-edb8741afe1f req-7edf7cb0-cf97-4c4a-850b-26e93ec24858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:32:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:38.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:39 np0005593233 podman[292291]: 2026-01-23 10:32:39.116044896 +0000 UTC m=+0.120720512 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 05:32:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:39 np0005593233 nova_compute[222017]: 2026-01-23 10:32:39.932 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:40.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:40.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:41 np0005593233 nova_compute[222017]: 2026-01-23 10:32:41.305 222021 DEBUG nova.network.neutron [None req-f3268c53-3a56-48d9-8fb6-ecadc4a9ce44 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updating instance_info_cache with network_info: [{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:32:41 np0005593233 nova_compute[222017]: 2026-01-23 10:32:41.342 222021 DEBUG oslo_concurrency.lockutils [None req-f3268c53-3a56-48d9-8fb6-ecadc4a9ce44 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Releasing lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:32:41 np0005593233 nova_compute[222017]: 2026-01-23 10:32:41.344 222021 DEBUG nova.objects.instance [None req-f3268c53-3a56-48d9-8fb6-ecadc4a9ce44 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'flavor' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:41 np0005593233 kernel: tap66bbd2d4-17 (unregistering): left promiscuous mode
Jan 23 05:32:41 np0005593233 NetworkManager[48871]: <info>  [1769164361.6983] device (tap66bbd2d4-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:32:41 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:41Z|00757|binding|INFO|Releasing lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d from this chassis (sb_readonly=0)
Jan 23 05:32:41 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:41Z|00758|binding|INFO|Setting lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d down in Southbound
Jan 23 05:32:41 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:41Z|00759|binding|INFO|Removing iface tap66bbd2d4-17 ovn-installed in OVS
Jan 23 05:32:41 np0005593233 nova_compute[222017]: 2026-01-23 10:32:41.717 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:41 np0005593233 nova_compute[222017]: 2026-01-23 10:32:41.720 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:41 np0005593233 nova_compute[222017]: 2026-01-23 10:32:41.750 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:41 np0005593233 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Jan 23 05:32:41 np0005593233 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b3.scope: Consumed 7.715s CPU time.
Jan 23 05:32:41 np0005593233 systemd-machined[190954]: Machine qemu-82-instance-000000b3 terminated.
Jan 23 05:32:41 np0005593233 nova_compute[222017]: 2026-01-23 10:32:41.884 222021 INFO nova.virt.libvirt.driver [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance destroyed successfully.#033[00m
Jan 23 05:32:41 np0005593233 nova_compute[222017]: 2026-01-23 10:32:41.886 222021 DEBUG nova.objects.instance [None req-f3268c53-3a56-48d9-8fb6-ecadc4a9ce44 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:41.905 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:42:72 10.100.0.13'], port_security=['fa:16:3e:e4:42:72 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '23f7c54d-ed5d-404f-8517-b5cd21d0c282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:32:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:41.908 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:32:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:41.911 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7d5530f-5227-4f75-bac0-2604bb3d68e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:32:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:41.913 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[703362f1-53a0-416e-8013-db3314cfdb47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:41 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:41.914 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace which is not needed anymore#033[00m
Jan 23 05:32:42 np0005593233 kernel: tap66bbd2d4-17: entered promiscuous mode
Jan 23 05:32:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:42Z|00760|binding|INFO|Claiming lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for this chassis.
Jan 23 05:32:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:42Z|00761|binding|INFO|66bbd2d4-1733-4a5d-a84b-8d41c36dd82d: Claiming fa:16:3e:e4:42:72 10.100.0.13
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.143 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:42 np0005593233 systemd-udevd[292322]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:32:42 np0005593233 NetworkManager[48871]: <info>  [1769164362.1480] manager: (tap66bbd2d4-17): new Tun device (/org/freedesktop/NetworkManager/Devices/345)
Jan 23 05:32:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:42Z|00762|binding|INFO|Setting lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d ovn-installed in OVS
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.167 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.170 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:42 np0005593233 NetworkManager[48871]: <info>  [1769164362.1724] device (tap66bbd2d4-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:32:42 np0005593233 NetworkManager[48871]: <info>  [1769164362.1743] device (tap66bbd2d4-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:32:42 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292226]: [NOTICE]   (292230) : haproxy version is 2.8.14-c23fe91
Jan 23 05:32:42 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292226]: [NOTICE]   (292230) : path to executable is /usr/sbin/haproxy
Jan 23 05:32:42 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292226]: [WARNING]  (292230) : Exiting Master process...
Jan 23 05:32:42 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292226]: [ALERT]    (292230) : Current worker (292232) exited with code 143 (Terminated)
Jan 23 05:32:42 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292226]: [WARNING]  (292230) : All workers exited. Exiting... (0)
Jan 23 05:32:42 np0005593233 systemd[1]: libpod-899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c.scope: Deactivated successfully.
Jan 23 05:32:42 np0005593233 podman[292364]: 2026-01-23 10:32:42.20819763 +0000 UTC m=+0.053741908 container died 899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 05:32:42 np0005593233 systemd-machined[190954]: New machine qemu-83-instance-000000b3.
Jan 23 05:32:42 np0005593233 systemd[1]: Started Virtual Machine qemu-83-instance-000000b3.
Jan 23 05:32:42 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c-userdata-shm.mount: Deactivated successfully.
Jan 23 05:32:42 np0005593233 systemd[1]: var-lib-containers-storage-overlay-76d77c313b55f51fe7fead41a4eb4f531a2102a6c9e8032820d63b964f6d4014-merged.mount: Deactivated successfully.
Jan 23 05:32:42 np0005593233 podman[292364]: 2026-01-23 10:32:42.266476616 +0000 UTC m=+0.112020914 container cleanup 899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:32:42 np0005593233 systemd[1]: libpod-conmon-899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c.scope: Deactivated successfully.
Jan 23 05:32:42 np0005593233 podman[292399]: 2026-01-23 10:32:42.368390212 +0000 UTC m=+0.063844915 container remove 899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.379 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[022b08fb-25f0-489a-9b70-7065a8169bd7]: (4, ('Fri Jan 23 10:32:42 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c)\n899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c\nFri Jan 23 10:32:42 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c)\n899e957bb6305ad8eb805a5dfe80ce8b1311ba3958108ec5940c2d917c2fff6c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.382 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5f55f0-cd8c-469d-a8e5-21cd30cbc5b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.384 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.386 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:42 np0005593233 kernel: tapd7d5530f-50: left promiscuous mode
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.405 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.409 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a85a367d-ad38-45e1-ab66-9816e87b58be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.427 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:42:72 10.100.0.13'], port_security=['fa:16:3e:e4:42:72 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '23f7c54d-ed5d-404f-8517-b5cd21d0c282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:32:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:42Z|00763|binding|INFO|Setting lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d up in Southbound
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.429 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2dada540-377a-4420-bb23-a3d784af3392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.430 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a41ff0a0-86b7-401f-89d0-dd3de0e76e18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.454 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[22781d1a-c9a3-4906-a0e9-5abc2427132a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 826716, 'reachable_time': 16166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292420, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.459 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.459 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[196cd440-6da9-4020-a750-37a033d5dbdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 systemd[1]: run-netns-ovnmeta\x2dd7d5530f\x2d5227\x2d4f75\x2dbac0\x2d2604bb3d68e2.mount: Deactivated successfully.
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.459 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 bound to our chassis#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.461 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.479 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a02e4d-f876-4191-a399-726fcab30700]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.480 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7d5530f-51 in ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.483 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7d5530f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.483 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[14ca835c-7fc0-41a9-adab-82edf83aa004]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.484 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[695b7783-aa7b-4dfe-a084-d32a455ecae3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:32:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:42.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.504 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[6b007dbb-799a-4339-8073-ae20879dc063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.534 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9fb05a-9fbd-45dc-b453-4fc3ba9a5d11]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.589 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a4aff2-3039-493b-98e4-da25fba373fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 NetworkManager[48871]: <info>  [1769164362.6043] manager: (tapd7d5530f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/346)
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.599 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[300b0ff5-d81f-4219-aea1-0337e163a2a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.647 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2997f917-a98d-4818-b41b-1bab8d85658a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.652 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8c9d6d84-caf2-45ea-b700-e8cc0ae1bbc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.694 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.695 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.695 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:42 np0005593233 NetworkManager[48871]: <info>  [1769164362.7012] device (tapd7d5530f-50): carrier: link connected
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.712 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[52a6f1cd-5e02-4cef-946b-138e01aace25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.737 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6ccd4f-1122-4604-82c2-35c4ff392b48]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 827533, 'reachable_time': 40736, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292476, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.761 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[098307a7-37b3-43c2-beea-e3fb61d42503]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:67cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 827533, 'tstamp': 827533}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292479, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.783 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c6b1168b-9431-4b33-ac99-20f8268f1e6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 827533, 'reachable_time': 40736, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 292483, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.834 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[df7d8466-43b2-450f-bb41-2c36dcb04dc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:42.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.885 222021 DEBUG nova.compute.manager [req-628a5da8-3c2a-4e36-bc0f-8230d457ff40 req-d750196b-6f80-49ab-9aeb-a94a51124e3a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-unplugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.886 222021 DEBUG oslo_concurrency.lockutils [req-628a5da8-3c2a-4e36-bc0f-8230d457ff40 req-d750196b-6f80-49ab-9aeb-a94a51124e3a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.886 222021 DEBUG oslo_concurrency.lockutils [req-628a5da8-3c2a-4e36-bc0f-8230d457ff40 req-d750196b-6f80-49ab-9aeb-a94a51124e3a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.886 222021 DEBUG oslo_concurrency.lockutils [req-628a5da8-3c2a-4e36-bc0f-8230d457ff40 req-d750196b-6f80-49ab-9aeb-a94a51124e3a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.887 222021 DEBUG nova.compute.manager [req-628a5da8-3c2a-4e36-bc0f-8230d457ff40 req-d750196b-6f80-49ab-9aeb-a94a51124e3a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-unplugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.887 222021 WARNING nova.compute.manager [req-628a5da8-3c2a-4e36-bc0f-8230d457ff40 req-d750196b-6f80-49ab-9aeb-a94a51124e3a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-unplugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.899 222021 DEBUG nova.virt.libvirt.host [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Removed pending event for 23f7c54d-ed5d-404f-8517-b5cd21d0c282 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.900 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164362.8993225, 23f7c54d-ed5d-404f-8517-b5cd21d0c282 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.900 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.935 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c41105-ff4d-4cad-8b23-8ae6aac0b650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.936 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.937 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.937 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.939 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:32:42 np0005593233 NetworkManager[48871]: <info>  [1769164362.9406] manager: (tapd7d5530f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Jan 23 05:32:42 np0005593233 kernel: tapd7d5530f-50: entered promiscuous mode
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.940 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.942 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:42 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:42Z|00764|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.946 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.946 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.948 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5dfe5b44-91d0-4208-9cc5-c689620417da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.949 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:32:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:32:42.954 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'env', 'PROCESS_TAG=haproxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7d5530f-5227-4f75-bac0-2604bb3d68e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.957 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:32:42 np0005593233 nova_compute[222017]: 2026-01-23 10:32:42.961 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:43 np0005593233 nova_compute[222017]: 2026-01-23 10:32:43.127 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:43 np0005593233 nova_compute[222017]: 2026-01-23 10:32:43.237 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:32:43 np0005593233 nova_compute[222017]: 2026-01-23 10:32:43.238 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164362.9023497, 23f7c54d-ed5d-404f-8517-b5cd21d0c282 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:32:43 np0005593233 nova_compute[222017]: 2026-01-23 10:32:43.239 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] VM Started (Lifecycle Event)#033[00m
Jan 23 05:32:43 np0005593233 nova_compute[222017]: 2026-01-23 10:32:43.365 222021 DEBUG nova.compute.manager [None req-f3268c53-3a56-48d9-8fb6-ecadc4a9ce44 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:32:43 np0005593233 podman[292536]: 2026-01-23 10:32:43.465876547 +0000 UTC m=+0.086670043 container create 7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:32:43 np0005593233 podman[292536]: 2026-01-23 10:32:43.418580173 +0000 UTC m=+0.039373729 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:32:43 np0005593233 systemd[1]: Started libpod-conmon-7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64.scope.
Jan 23 05:32:43 np0005593233 nova_compute[222017]: 2026-01-23 10:32:43.548 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:32:43 np0005593233 nova_compute[222017]: 2026-01-23 10:32:43.555 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:32:43 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:32:43 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1316b78a5ec41ea0fcd0e6fff88669f947612cec308b83abdb58f8292ed772c0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:32:43 np0005593233 nova_compute[222017]: 2026-01-23 10:32:43.598 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:32:43 np0005593233 podman[292536]: 2026-01-23 10:32:43.613202784 +0000 UTC m=+0.233996300 container init 7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:32:43 np0005593233 podman[292536]: 2026-01-23 10:32:43.625289587 +0000 UTC m=+0.246083093 container start 7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:32:43 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292552]: [NOTICE]   (292556) : New worker (292558) forked
Jan 23 05:32:43 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292552]: [NOTICE]   (292556) : Loading success.
Jan 23 05:32:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:44.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:32:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3095900524' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:32:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:32:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3095900524' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:32:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:32:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:44.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:32:44 np0005593233 nova_compute[222017]: 2026-01-23 10:32:44.934 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.249 222021 DEBUG nova.compute.manager [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.250 222021 DEBUG oslo_concurrency.lockutils [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.250 222021 DEBUG oslo_concurrency.lockutils [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.250 222021 DEBUG oslo_concurrency.lockutils [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.250 222021 DEBUG nova.compute.manager [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.250 222021 WARNING nova.compute.manager [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.250 222021 DEBUG nova.compute.manager [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.251 222021 DEBUG oslo_concurrency.lockutils [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.251 222021 DEBUG oslo_concurrency.lockutils [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.251 222021 DEBUG oslo_concurrency.lockutils [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.251 222021 DEBUG nova.compute.manager [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:32:45 np0005593233 nova_compute[222017]: 2026-01-23 10:32:45.251 222021 WARNING nova.compute.manager [req-bfbf39d1-194a-479a-a144-166a71aa5528 req-6389e80b-bd21-4896-a974-639e8c75714c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:32:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:46.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:46.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:47 np0005593233 nova_compute[222017]: 2026-01-23 10:32:47.459 222021 DEBUG nova.compute.manager [req-841fc946-8573-4cd2-8fba-f1150574b56c req-1e63cea4-737e-444a-bbad-f404fb180d7e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:32:47 np0005593233 nova_compute[222017]: 2026-01-23 10:32:47.460 222021 DEBUG oslo_concurrency.lockutils [req-841fc946-8573-4cd2-8fba-f1150574b56c req-1e63cea4-737e-444a-bbad-f404fb180d7e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:47 np0005593233 nova_compute[222017]: 2026-01-23 10:32:47.460 222021 DEBUG oslo_concurrency.lockutils [req-841fc946-8573-4cd2-8fba-f1150574b56c req-1e63cea4-737e-444a-bbad-f404fb180d7e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:47 np0005593233 nova_compute[222017]: 2026-01-23 10:32:47.460 222021 DEBUG oslo_concurrency.lockutils [req-841fc946-8573-4cd2-8fba-f1150574b56c req-1e63cea4-737e-444a-bbad-f404fb180d7e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:47 np0005593233 nova_compute[222017]: 2026-01-23 10:32:47.461 222021 DEBUG nova.compute.manager [req-841fc946-8573-4cd2-8fba-f1150574b56c req-1e63cea4-737e-444a-bbad-f404fb180d7e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:32:47 np0005593233 nova_compute[222017]: 2026-01-23 10:32:47.461 222021 WARNING nova.compute.manager [req-841fc946-8573-4cd2-8fba-f1150574b56c req-1e63cea4-737e-444a-bbad-f404fb180d7e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:32:48 np0005593233 nova_compute[222017]: 2026-01-23 10:32:48.130 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:48.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:48.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:49 np0005593233 nova_compute[222017]: 2026-01-23 10:32:49.937 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:50.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:50.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:52.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:32:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:52.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:32:53 np0005593233 nova_compute[222017]: 2026-01-23 10:32:53.133 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:54.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:54.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:54 np0005593233 nova_compute[222017]: 2026-01-23 10:32:54.939 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:54 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #157. Immutable memtables: 0.
Jan 23 05:32:54 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:54.998180) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:32:54 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 99] Flushing memtable with next log file: 157
Jan 23 05:32:54 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164374998552, "job": 99, "event": "flush_started", "num_memtables": 1, "num_entries": 512, "num_deletes": 251, "total_data_size": 775915, "memory_usage": 786840, "flush_reason": "Manual Compaction"}
Jan 23 05:32:54 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 99] Level-0 flush table #158: started
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375005745, "cf_name": "default", "job": 99, "event": "table_file_creation", "file_number": 158, "file_size": 512617, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 75323, "largest_seqno": 75830, "table_properties": {"data_size": 509827, "index_size": 825, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6568, "raw_average_key_size": 19, "raw_value_size": 504300, "raw_average_value_size": 1461, "num_data_blocks": 36, "num_entries": 345, "num_filter_entries": 345, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164349, "oldest_key_time": 1769164349, "file_creation_time": 1769164374, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 158, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 99] Flush lasted 7581 microseconds, and 2637 cpu microseconds.
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.005790) [db/flush_job.cc:967] [default] [JOB 99] Level-0 flush table #158: 512617 bytes OK
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.005810) [db/memtable_list.cc:519] [default] Level-0 commit table #158 started
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.009091) [db/memtable_list.cc:722] [default] Level-0 commit table #158: memtable #1 done
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.009105) EVENT_LOG_v1 {"time_micros": 1769164375009101, "job": 99, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.009124) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 99] Try to delete WAL files size 772872, prev total WAL file size 772872, number of live WAL files 2.
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000154.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.009744) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036353236' seq:72057594037927935, type:22 .. '7061786F730036373738' seq:0, type:0; will stop at (end)
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 100] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 99 Base level 0, inputs: [158(500KB)], [156(10MB)]
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375009906, "job": 100, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [158], "files_L6": [156], "score": -1, "input_data_size": 11863987, "oldest_snapshot_seqno": -1}
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 100] Generated table #159: 9245 keys, 9912421 bytes, temperature: kUnknown
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375130386, "cf_name": "default", "job": 100, "event": "table_file_creation", "file_number": 159, "file_size": 9912421, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9855898, "index_size": 32282, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23173, "raw_key_size": 245393, "raw_average_key_size": 26, "raw_value_size": 9696878, "raw_average_value_size": 1048, "num_data_blocks": 1218, "num_entries": 9245, "num_filter_entries": 9245, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769164375, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 159, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.130794) [db/compaction/compaction_job.cc:1663] [default] [JOB 100] Compacted 1@0 + 1@6 files to L6 => 9912421 bytes
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.132814) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 98.4 rd, 82.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 10.8 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(42.5) write-amplify(19.3) OK, records in: 9759, records dropped: 514 output_compression: NoCompression
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.132845) EVENT_LOG_v1 {"time_micros": 1769164375132831, "job": 100, "event": "compaction_finished", "compaction_time_micros": 120583, "compaction_time_cpu_micros": 41850, "output_level": 6, "num_output_files": 1, "total_output_size": 9912421, "num_input_records": 9759, "num_output_records": 9245, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000158.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375133176, "job": 100, "event": "table_file_deletion", "file_number": 158}
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000156.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375136049, "job": 100, "event": "table_file_deletion", "file_number": 156}
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.009553) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.136171) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.136204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.136209) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.136213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:32:55.136217) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:56 np0005593233 podman[292568]: 2026-01-23 10:32:56.104025053 +0000 UTC m=+0.097621565 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:32:56 np0005593233 ovn_controller[130653]: 2026-01-23T10:32:56Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e4:42:72 10.100.0.13
Jan 23 05:32:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:56.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:56.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:58 np0005593233 nova_compute[222017]: 2026-01-23 10:32:58.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:58 np0005593233 nova_compute[222017]: 2026-01-23 10:32:58.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:58 np0005593233 nova_compute[222017]: 2026-01-23 10:32:58.506 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:58 np0005593233 nova_compute[222017]: 2026-01-23 10:32:58.507 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:58 np0005593233 nova_compute[222017]: 2026-01-23 10:32:58.508 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:58 np0005593233 nova_compute[222017]: 2026-01-23 10:32:58.508 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:32:58 np0005593233 nova_compute[222017]: 2026-01-23 10:32:58.510 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:58.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:32:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:32:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:58.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:32:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:32:58 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4262452173' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.014 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.121 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.122 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:32:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.381 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.383 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4110MB free_disk=20.89715576171875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.383 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.384 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.584 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 23f7c54d-ed5d-404f-8517-b5cd21d0c282 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.585 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.585 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.716 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:32:59 np0005593233 nova_compute[222017]: 2026-01-23 10:32:59.941 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:33:00 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2124341944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:00 np0005593233 nova_compute[222017]: 2026-01-23 10:33:00.249 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:33:00 np0005593233 nova_compute[222017]: 2026-01-23 10:33:00.258 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:33:00 np0005593233 nova_compute[222017]: 2026-01-23 10:33:00.292 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:33:00 np0005593233 nova_compute[222017]: 2026-01-23 10:33:00.359 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:33:00 np0005593233 nova_compute[222017]: 2026-01-23 10:33:00.360 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:33:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:00.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:00.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:01 np0005593233 nova_compute[222017]: 2026-01-23 10:33:01.360 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:33:01 np0005593233 nova_compute[222017]: 2026-01-23 10:33:01.361 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:33:01 np0005593233 nova_compute[222017]: 2026-01-23 10:33:01.361 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:33:01 np0005593233 nova_compute[222017]: 2026-01-23 10:33:01.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:33:02 np0005593233 nova_compute[222017]: 2026-01-23 10:33:02.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:33:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:02.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:02.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:03 np0005593233 nova_compute[222017]: 2026-01-23 10:33:03.136 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:03 np0005593233 nova_compute[222017]: 2026-01-23 10:33:03.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:33:03 np0005593233 nova_compute[222017]: 2026-01-23 10:33:03.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:33:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:04.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:04.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:04 np0005593233 nova_compute[222017]: 2026-01-23 10:33:04.945 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:06.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:06.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:08 np0005593233 nova_compute[222017]: 2026-01-23 10:33:08.138 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:08.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:08.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.908 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "94779b4e-014b-463a-ae92-67157067665f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.909 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.915 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.916 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.916 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.917 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.942 222021 DEBUG nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 05:33:09 np0005593233 nova_compute[222017]: 2026-01-23 10:33:09.949 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:10 np0005593233 nova_compute[222017]: 2026-01-23 10:33:10.058 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:33:10 np0005593233 nova_compute[222017]: 2026-01-23 10:33:10.058 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:33:10 np0005593233 nova_compute[222017]: 2026-01-23 10:33:10.068 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:33:10 np0005593233 nova_compute[222017]: 2026-01-23 10:33:10.069 222021 INFO nova.compute.claims [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Claim successful on node compute-1.ctlplane.example.com
Jan 23 05:33:10 np0005593233 podman[292635]: 2026-01-23 10:33:10.097435949 +0000 UTC m=+0.105466707 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:33:10 np0005593233 nova_compute[222017]: 2026-01-23 10:33:10.394 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:33:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:10.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:33:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1812544972' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:10.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:10 np0005593233 nova_compute[222017]: 2026-01-23 10:33:10.952 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.557s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:33:10 np0005593233 nova_compute[222017]: 2026-01-23 10:33:10.960 222021 DEBUG nova.compute.provider_tree [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:33:10 np0005593233 nova_compute[222017]: 2026-01-23 10:33:10.978 222021 DEBUG nova.scheduler.client.report [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.061 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.063 222021 DEBUG nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.120 222021 DEBUG nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.120 222021 DEBUG nova.network.neutron [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.158 222021 INFO nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.212 222021 DEBUG nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.351 222021 DEBUG nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.353 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.354 222021 INFO nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Creating image(s)
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.396 222021 DEBUG nova.storage.rbd_utils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 94779b4e-014b-463a-ae92-67157067665f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.434 222021 DEBUG nova.storage.rbd_utils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 94779b4e-014b-463a-ae92-67157067665f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.467 222021 DEBUG nova.storage.rbd_utils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 94779b4e-014b-463a-ae92-67157067665f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.472 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.560 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.562 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.563 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.563 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.590 222021 DEBUG nova.storage.rbd_utils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 94779b4e-014b-463a-ae92-67157067665f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.595 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 94779b4e-014b-463a-ae92-67157067665f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.641 222021 DEBUG nova.policy [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a3cd8c3758e14f9c8e4ad1a9a94a9995', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b27af793a8cc42259216fbeaa302ba03', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:33:11 np0005593233 nova_compute[222017]: 2026-01-23 10:33:11.952 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 94779b4e-014b-463a-ae92-67157067665f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.357s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:12 np0005593233 nova_compute[222017]: 2026-01-23 10:33:12.027 222021 DEBUG nova.storage.rbd_utils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] resizing rbd image 94779b4e-014b-463a-ae92-67157067665f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:33:12 np0005593233 nova_compute[222017]: 2026-01-23 10:33:12.291 222021 DEBUG nova.objects.instance [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'migration_context' on Instance uuid 94779b4e-014b-463a-ae92-67157067665f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:12 np0005593233 nova_compute[222017]: 2026-01-23 10:33:12.319 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:33:12 np0005593233 nova_compute[222017]: 2026-01-23 10:33:12.320 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Ensure instance console log exists: /var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:33:12 np0005593233 nova_compute[222017]: 2026-01-23 10:33:12.320 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:12 np0005593233 nova_compute[222017]: 2026-01-23 10:33:12.321 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:12 np0005593233 nova_compute[222017]: 2026-01-23 10:33:12.321 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:12.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:12.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:13 np0005593233 nova_compute[222017]: 2026-01-23 10:33:13.140 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:13 np0005593233 nova_compute[222017]: 2026-01-23 10:33:13.923 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updating instance_info_cache with network_info: [{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:13 np0005593233 nova_compute[222017]: 2026-01-23 10:33:13.955 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:13 np0005593233 nova_compute[222017]: 2026-01-23 10:33:13.956 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:33:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 23 05:33:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:14.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:14 np0005593233 nova_compute[222017]: 2026-01-23 10:33:14.589 222021 DEBUG nova.network.neutron [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Successfully created port: 1330df91-d1d9-411b-afea-ee9187f21149 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:33:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:14.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:14 np0005593233 nova_compute[222017]: 2026-01-23 10:33:14.949 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e351 e351: 3 total, 3 up, 3 in
Jan 23 05:33:16 np0005593233 nova_compute[222017]: 2026-01-23 10:33:16.503 222021 DEBUG nova.network.neutron [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Successfully updated port: 1330df91-d1d9-411b-afea-ee9187f21149 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:33:16 np0005593233 nova_compute[222017]: 2026-01-23 10:33:16.522 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:16 np0005593233 nova_compute[222017]: 2026-01-23 10:33:16.522 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquired lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:16 np0005593233 nova_compute[222017]: 2026-01-23 10:33:16.522 222021 DEBUG nova.network.neutron [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:33:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:16.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e352 e352: 3 total, 3 up, 3 in
Jan 23 05:33:16 np0005593233 nova_compute[222017]: 2026-01-23 10:33:16.881 222021 DEBUG nova.network.neutron [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:33:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:16.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:16 np0005593233 nova_compute[222017]: 2026-01-23 10:33:16.950 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e353 e353: 3 total, 3 up, 3 in
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.071 222021 DEBUG nova.compute.manager [req-77159f60-83e8-455c-83bb-59f5e617f56c req-74a527df-eda6-4c07-8b96-7db081f6a996 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received event network-changed-1330df91-d1d9-411b-afea-ee9187f21149 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.072 222021 DEBUG nova.compute.manager [req-77159f60-83e8-455c-83bb-59f5e617f56c req-74a527df-eda6-4c07-8b96-7db081f6a996 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Refreshing instance network info cache due to event network-changed-1330df91-d1d9-411b-afea-ee9187f21149. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.073 222021 DEBUG oslo_concurrency.lockutils [req-77159f60-83e8-455c-83bb-59f5e617f56c req-74a527df-eda6-4c07-8b96-7db081f6a996 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.143 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.506 222021 DEBUG nova.network.neutron [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Updating instance_info_cache with network_info: [{"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:18.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.556 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Releasing lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.557 222021 DEBUG nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Instance network_info: |[{"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.558 222021 DEBUG oslo_concurrency.lockutils [req-77159f60-83e8-455c-83bb-59f5e617f56c req-74a527df-eda6-4c07-8b96-7db081f6a996 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.558 222021 DEBUG nova.network.neutron [req-77159f60-83e8-455c-83bb-59f5e617f56c req-74a527df-eda6-4c07-8b96-7db081f6a996 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Refreshing network info cache for port 1330df91-d1d9-411b-afea-ee9187f21149 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.566 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Start _get_guest_xml network_info=[{"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.572 222021 WARNING nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.577 222021 DEBUG nova.virt.libvirt.host [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.578 222021 DEBUG nova.virt.libvirt.host [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.581 222021 DEBUG nova.virt.libvirt.host [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.582 222021 DEBUG nova.virt.libvirt.host [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.583 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.583 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.584 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.584 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.585 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.585 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.585 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.585 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.586 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.586 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.586 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.586 222021 DEBUG nova.virt.hardware [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:33:18 np0005593233 nova_compute[222017]: 2026-01-23 10:33:18.589 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:18.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:33:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2766938558' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.132 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.176 222021 DEBUG nova.storage.rbd_utils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 94779b4e-014b-463a-ae92-67157067665f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.185 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:33:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:33:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3351872690' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.677 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.680 222021 DEBUG nova.virt.libvirt.vif [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:33:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-1672119897',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-1672119897',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=183,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM1ltrnE7PZHJGSfhgOoMItJuzuSO8BMu/7or3Sm3xqUJpbfXng2myVoTHi8tXqQq8tN45e6cz9sNqzYdHdsAQpXkf89L8KtoodzDy4Xtv+6pZh8F+deIOOF1ul1nwPV4Q==',key_name='tempest-TestSecurityGroupsBasicOps-1657112076',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-tawbyx3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:33:11Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=94779b4e-014b-463a-ae92-67157067665f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.680 222021 DEBUG nova.network.os_vif_util [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.682 222021 DEBUG nova.network.os_vif_util [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:bc:67,bridge_name='br-int',has_traffic_filtering=True,id=1330df91-d1d9-411b-afea-ee9187f21149,network=Network(ed4554dd-9a44-4785-9a4e-a211fe7b4949),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1330df91-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.683 222021 DEBUG nova.objects.instance [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'pci_devices' on Instance uuid 94779b4e-014b-463a-ae92-67157067665f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.720 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <uuid>94779b4e-014b-463a-ae92-67157067665f</uuid>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <name>instance-000000b7</name>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-1672119897</nova:name>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:33:18</nova:creationTime>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <nova:user uuid="a3cd8c3758e14f9c8e4ad1a9a94a9995">tempest-TestSecurityGroupsBasicOps-622349977-project-member</nova:user>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <nova:project uuid="b27af793a8cc42259216fbeaa302ba03">tempest-TestSecurityGroupsBasicOps-622349977</nova:project>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <nova:port uuid="1330df91-d1d9-411b-afea-ee9187f21149">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <entry name="serial">94779b4e-014b-463a-ae92-67157067665f</entry>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <entry name="uuid">94779b4e-014b-463a-ae92-67157067665f</entry>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/94779b4e-014b-463a-ae92-67157067665f_disk">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/94779b4e-014b-463a-ae92-67157067665f_disk.config">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:22:bc:67"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <target dev="tap1330df91-d1"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f/console.log" append="off"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:33:19 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:33:19 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:33:19 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:33:19 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.722 222021 DEBUG nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Preparing to wait for external event network-vif-plugged-1330df91-d1d9-411b-afea-ee9187f21149 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.723 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "94779b4e-014b-463a-ae92-67157067665f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.724 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.724 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.725 222021 DEBUG nova.virt.libvirt.vif [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:33:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-1672119897',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-1672119897',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=183,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM1ltrnE7PZHJGSfhgOoMItJuzuSO8BMu/7or3Sm3xqUJpbfXng2myVoTHi8tXqQq8tN45e6cz9sNqzYdHdsAQpXkf89L8KtoodzDy4Xtv+6pZh8F+deIOOF1ul1nwPV4Q==',key_name='tempest-TestSecurityGroupsBasicOps-1657112076',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-tawbyx3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:33:11Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=94779b4e-014b-463a-ae92-67157067665f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.726 222021 DEBUG nova.network.os_vif_util [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.727 222021 DEBUG nova.network.os_vif_util [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:bc:67,bridge_name='br-int',has_traffic_filtering=True,id=1330df91-d1d9-411b-afea-ee9187f21149,network=Network(ed4554dd-9a44-4785-9a4e-a211fe7b4949),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1330df91-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.729 222021 DEBUG os_vif [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:bc:67,bridge_name='br-int',has_traffic_filtering=True,id=1330df91-d1d9-411b-afea-ee9187f21149,network=Network(ed4554dd-9a44-4785-9a4e-a211fe7b4949),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1330df91-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.730 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.731 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.732 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.738 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.738 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1330df91-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.739 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1330df91-d1, col_values=(('external_ids', {'iface-id': '1330df91-d1d9-411b-afea-ee9187f21149', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:bc:67', 'vm-uuid': '94779b4e-014b-463a-ae92-67157067665f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.741 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:19 np0005593233 NetworkManager[48871]: <info>  [1769164399.7423] manager: (tap1330df91-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.750 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.751 222021 INFO os_vif [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:bc:67,bridge_name='br-int',has_traffic_filtering=True,id=1330df91-d1d9-411b-afea-ee9187f21149,network=Network(ed4554dd-9a44-4785-9a4e-a211fe7b4949),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1330df91-d1')#033[00m
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.841 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.842 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.842 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No VIF found with MAC fa:16:3e:22:bc:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.843 222021 INFO nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Using config drive#033[00m
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.876 222021 DEBUG nova.storage.rbd_utils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 94779b4e-014b-463a-ae92-67157067665f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:19 np0005593233 nova_compute[222017]: 2026-01-23 10:33:19.950 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:20 np0005593233 nova_compute[222017]: 2026-01-23 10:33:20.546 222021 INFO nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Creating config drive at /var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f/disk.config#033[00m
Jan 23 05:33:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:33:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:20.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:33:20 np0005593233 nova_compute[222017]: 2026-01-23 10:33:20.557 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuqzt3k7f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:20 np0005593233 nova_compute[222017]: 2026-01-23 10:33:20.720 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuqzt3k7f" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:20 np0005593233 nova_compute[222017]: 2026-01-23 10:33:20.763 222021 DEBUG nova.storage.rbd_utils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 94779b4e-014b-463a-ae92-67157067665f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:20 np0005593233 nova_compute[222017]: 2026-01-23 10:33:20.769 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f/disk.config 94779b4e-014b-463a-ae92-67157067665f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:20.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:20 np0005593233 nova_compute[222017]: 2026-01-23 10:33:20.987 222021 DEBUG oslo_concurrency.processutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f/disk.config 94779b4e-014b-463a-ae92-67157067665f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.218s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:20 np0005593233 nova_compute[222017]: 2026-01-23 10:33:20.989 222021 INFO nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Deleting local config drive /var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f/disk.config because it was imported into RBD.#033[00m
Jan 23 05:33:21 np0005593233 kernel: tap1330df91-d1: entered promiscuous mode
Jan 23 05:33:21 np0005593233 NetworkManager[48871]: <info>  [1769164401.0615] manager: (tap1330df91-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/349)
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.063 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:21Z|00765|binding|INFO|Claiming lport 1330df91-d1d9-411b-afea-ee9187f21149 for this chassis.
Jan 23 05:33:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:21Z|00766|binding|INFO|1330df91-d1d9-411b-afea-ee9187f21149: Claiming fa:16:3e:22:bc:67 10.100.0.8
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.069 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.072 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.098 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:bc:67 10.100.0.8'], port_security=['fa:16:3e:22:bc:67 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '94779b4e-014b-463a-ae92-67157067665f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed4554dd-9a44-4785-9a4e-a211fe7b4949', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a58da180-17cd-4554-a7dd-fdd1057faf2d a861f02d-6ba6-4abf-a226-939d7b8444ef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e345ee25-e71d-44f1-88c3-b2014cfeb099, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1330df91-d1d9-411b-afea-ee9187f21149) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:33:21 np0005593233 systemd-udevd[292987]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:33:21 np0005593233 systemd-machined[190954]: New machine qemu-84-instance-000000b7.
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.102 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1330df91-d1d9-411b-afea-ee9187f21149 in datapath ed4554dd-9a44-4785-9a4e-a211fe7b4949 bound to our chassis#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.105 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed4554dd-9a44-4785-9a4e-a211fe7b4949#033[00m
Jan 23 05:33:21 np0005593233 systemd[1]: Started Virtual Machine qemu-84-instance-000000b7.
Jan 23 05:33:21 np0005593233 NetworkManager[48871]: <info>  [1769164401.1227] device (tap1330df91-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.123 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5286dd24-4909-463a-b70d-99c4340181c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.124 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped4554dd-91 in ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:33:21 np0005593233 NetworkManager[48871]: <info>  [1769164401.1258] device (tap1330df91-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.127 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped4554dd-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.127 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9a139a-b3ef-4aaf-b60e-1daa473155a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.128 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[822221b0-dd75-4ebe-9ffc-8c00bcc69465]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:21Z|00767|binding|INFO|Setting lport 1330df91-d1d9-411b-afea-ee9187f21149 ovn-installed in OVS
Jan 23 05:33:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:21Z|00768|binding|INFO|Setting lport 1330df91-d1d9-411b-afea-ee9187f21149 up in Southbound
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.138 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.147 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[e30e00c0-ed65-4502-ba2e-6af6e87ed48d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.165 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1658eb46-8566-44d2-9a59-1dae53fb5b81]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.212 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[93aeba3d-1970-49be-948c-8e971af25d3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.221 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[86322957-56f5-4c1c-9767-29b8ed5ba5ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 systemd-udevd[292990]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:33:21 np0005593233 NetworkManager[48871]: <info>  [1769164401.2230] manager: (taped4554dd-90): new Veth device (/org/freedesktop/NetworkManager/Devices/350)
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.271 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a9d9c8-d79d-4caa-9590-6a4bd8444c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.275 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e28411ba-1e42-499c-9dee-3fc8cb19dfee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 NetworkManager[48871]: <info>  [1769164401.3149] device (taped4554dd-90): carrier: link connected
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.324 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[29a02435-944f-42c3-8d43-50b79b4c0754]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.358 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc0f6bb-0c5d-43e5-9c99-bad14ef2b2e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped4554dd-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:f9:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831394, 'reachable_time': 28882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293020, 'error': None, 'target': 'ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.379 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a416045a-91b6-441e-bc9b-d365645964b8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed0:f93a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 831394, 'tstamp': 831394}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 293021, 'error': None, 'target': 'ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.403 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[451e0b31-5410-4d49-8650-35cff7caeeaa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped4554dd-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:f9:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831394, 'reachable_time': 28882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 293022, 'error': None, 'target': 'ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.449 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[890bd051-f1a2-40e8-bae3-dbd4f4cc9652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.545 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[051cc837-9178-48c9-9a65-499d81a0fde1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.549 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped4554dd-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.550 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.551 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped4554dd-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:21 np0005593233 kernel: taped4554dd-90: entered promiscuous mode
Jan 23 05:33:21 np0005593233 NetworkManager[48871]: <info>  [1769164401.5553] manager: (taped4554dd-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.559 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped4554dd-90, col_values=(('external_ids', {'iface-id': '4316bb5d-ce04-4eff-ada1-59984534f30d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.560 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:21Z|00769|binding|INFO|Releasing lport 4316bb5d-ce04-4eff-ada1-59984534f30d from this chassis (sb_readonly=0)
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.563 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed4554dd-9a44-4785-9a4e-a211fe7b4949.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed4554dd-9a44-4785-9a4e-a211fe7b4949.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.564 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[db99d749-98ec-4b3c-8ea0-f40d7fb35d6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.566 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-ed4554dd-9a44-4785-9a4e-a211fe7b4949
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/ed4554dd-9a44-4785-9a4e-a211fe7b4949.pid.haproxy
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID ed4554dd-9a44-4785-9a4e-a211fe7b4949
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:33:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:21.567 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949', 'env', 'PROCESS_TAG=haproxy-ed4554dd-9a44-4785-9a4e-a211fe7b4949', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed4554dd-9a44-4785-9a4e-a211fe7b4949.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.579 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.837 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164401.8365664, 94779b4e-014b-463a-ae92-67157067665f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.838 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] VM Started (Lifecycle Event)#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.866 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.871 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164401.8367555, 94779b4e-014b-463a-ae92-67157067665f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.871 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.904 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.907 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:33:21 np0005593233 nova_compute[222017]: 2026-01-23 10:33:21.933 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:33:21 np0005593233 podman[293097]: 2026-01-23 10:33:21.957361731 +0000 UTC m=+0.052864723 container create 64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:33:22 np0005593233 systemd[1]: Started libpod-conmon-64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff.scope.
Jan 23 05:33:22 np0005593233 podman[293097]: 2026-01-23 10:33:21.934036338 +0000 UTC m=+0.029539350 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:33:22 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:33:22 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82a845e115bb97d574469adc3f298767bc74f3c4964319826801018e5059813d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:33:22 np0005593233 podman[293097]: 2026-01-23 10:33:22.053532034 +0000 UTC m=+0.149035096 container init 64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 05:33:22 np0005593233 podman[293097]: 2026-01-23 10:33:22.062246252 +0000 UTC m=+0.157749284 container start 64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.064 222021 DEBUG nova.compute.manager [req-ecdaa194-d7a9-4b75-a965-310efca3a223 req-3bf5b8e0-6034-475a-9b98-dfb62cf73d55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received event network-vif-plugged-1330df91-d1d9-411b-afea-ee9187f21149 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.064 222021 DEBUG oslo_concurrency.lockutils [req-ecdaa194-d7a9-4b75-a965-310efca3a223 req-3bf5b8e0-6034-475a-9b98-dfb62cf73d55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "94779b4e-014b-463a-ae92-67157067665f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.065 222021 DEBUG oslo_concurrency.lockutils [req-ecdaa194-d7a9-4b75-a965-310efca3a223 req-3bf5b8e0-6034-475a-9b98-dfb62cf73d55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.065 222021 DEBUG oslo_concurrency.lockutils [req-ecdaa194-d7a9-4b75-a965-310efca3a223 req-3bf5b8e0-6034-475a-9b98-dfb62cf73d55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.065 222021 DEBUG nova.compute.manager [req-ecdaa194-d7a9-4b75-a965-310efca3a223 req-3bf5b8e0-6034-475a-9b98-dfb62cf73d55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Processing event network-vif-plugged-1330df91-d1d9-411b-afea-ee9187f21149 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.066 222021 DEBUG nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.070 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164402.0705845, 94779b4e-014b-463a-ae92-67157067665f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.071 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.073 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.077 222021 INFO nova.virt.libvirt.driver [-] [instance: 94779b4e-014b-463a-ae92-67157067665f] Instance spawned successfully.#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.077 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:33:22 np0005593233 neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949[293113]: [NOTICE]   (293117) : New worker (293119) forked
Jan 23 05:33:22 np0005593233 neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949[293113]: [NOTICE]   (293117) : Loading success.
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.124 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.128 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.136 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.136 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.137 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.137 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.137 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.138 222021 DEBUG nova.virt.libvirt.driver [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.178 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.270 222021 INFO nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Took 10.92 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.271 222021 DEBUG nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.375 222021 INFO nova.compute.manager [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Took 12.36 seconds to build instance.#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.401 222021 DEBUG oslo_concurrency.lockutils [None req-e0c81ecf-eef5-462f-bb36-54cea3739b5e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.414 222021 DEBUG nova.network.neutron [req-77159f60-83e8-455c-83bb-59f5e617f56c req-74a527df-eda6-4c07-8b96-7db081f6a996 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Updated VIF entry in instance network info cache for port 1330df91-d1d9-411b-afea-ee9187f21149. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.414 222021 DEBUG nova.network.neutron [req-77159f60-83e8-455c-83bb-59f5e617f56c req-74a527df-eda6-4c07-8b96-7db081f6a996 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Updating instance_info_cache with network_info: [{"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:22 np0005593233 nova_compute[222017]: 2026-01-23 10:33:22.452 222021 DEBUG oslo_concurrency.lockutils [req-77159f60-83e8-455c-83bb-59f5e617f56c req-74a527df-eda6-4c07-8b96-7db081f6a996 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:22.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:22.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 e354: 3 total, 3 up, 3 in
Jan 23 05:33:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:24 np0005593233 nova_compute[222017]: 2026-01-23 10:33:24.287 222021 DEBUG nova.compute.manager [req-50e5b41f-2408-474d-9687-b5a6b88d98a6 req-0dd515a3-bc59-48e9-ad26-11f00ca52069 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received event network-vif-plugged-1330df91-d1d9-411b-afea-ee9187f21149 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:24 np0005593233 nova_compute[222017]: 2026-01-23 10:33:24.289 222021 DEBUG oslo_concurrency.lockutils [req-50e5b41f-2408-474d-9687-b5a6b88d98a6 req-0dd515a3-bc59-48e9-ad26-11f00ca52069 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "94779b4e-014b-463a-ae92-67157067665f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:24 np0005593233 nova_compute[222017]: 2026-01-23 10:33:24.290 222021 DEBUG oslo_concurrency.lockutils [req-50e5b41f-2408-474d-9687-b5a6b88d98a6 req-0dd515a3-bc59-48e9-ad26-11f00ca52069 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:24 np0005593233 nova_compute[222017]: 2026-01-23 10:33:24.290 222021 DEBUG oslo_concurrency.lockutils [req-50e5b41f-2408-474d-9687-b5a6b88d98a6 req-0dd515a3-bc59-48e9-ad26-11f00ca52069 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:24 np0005593233 nova_compute[222017]: 2026-01-23 10:33:24.290 222021 DEBUG nova.compute.manager [req-50e5b41f-2408-474d-9687-b5a6b88d98a6 req-0dd515a3-bc59-48e9-ad26-11f00ca52069 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] No waiting events found dispatching network-vif-plugged-1330df91-d1d9-411b-afea-ee9187f21149 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:24 np0005593233 nova_compute[222017]: 2026-01-23 10:33:24.291 222021 WARNING nova.compute.manager [req-50e5b41f-2408-474d-9687-b5a6b88d98a6 req-0dd515a3-bc59-48e9-ad26-11f00ca52069 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received unexpected event network-vif-plugged-1330df91-d1d9-411b-afea-ee9187f21149 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:33:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:24.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:24 np0005593233 nova_compute[222017]: 2026-01-23 10:33:24.742 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:24.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:24 np0005593233 nova_compute[222017]: 2026-01-23 10:33:24.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.004000114s ======
Jan 23 05:33:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:26.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000114s
Jan 23 05:33:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:26.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:27 np0005593233 podman[293129]: 2026-01-23 10:33:27.083688878 +0000 UTC m=+0.082989370 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 23 05:33:28 np0005593233 NetworkManager[48871]: <info>  [1769164408.2882] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Jan 23 05:33:28 np0005593233 NetworkManager[48871]: <info>  [1769164408.2890] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 23 05:33:28 np0005593233 nova_compute[222017]: 2026-01-23 10:33:28.290 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:28 np0005593233 nova_compute[222017]: 2026-01-23 10:33:28.404 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:28Z|00770|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:33:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:28Z|00771|binding|INFO|Releasing lport 4316bb5d-ce04-4eff-ada1-59984534f30d from this chassis (sb_readonly=0)
Jan 23 05:33:28 np0005593233 nova_compute[222017]: 2026-01-23 10:33:28.420 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:28.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:28.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:29 np0005593233 nova_compute[222017]: 2026-01-23 10:33:29.006 222021 DEBUG nova.compute.manager [req-0e2d7dc7-c7e4-4bc2-9f76-bf66867cb691 req-b1f5f7b5-4ac6-4026-92a5-41ab17e66b6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received event network-changed-1330df91-d1d9-411b-afea-ee9187f21149 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:29 np0005593233 nova_compute[222017]: 2026-01-23 10:33:29.006 222021 DEBUG nova.compute.manager [req-0e2d7dc7-c7e4-4bc2-9f76-bf66867cb691 req-b1f5f7b5-4ac6-4026-92a5-41ab17e66b6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Refreshing instance network info cache due to event network-changed-1330df91-d1d9-411b-afea-ee9187f21149. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:33:29 np0005593233 nova_compute[222017]: 2026-01-23 10:33:29.007 222021 DEBUG oslo_concurrency.lockutils [req-0e2d7dc7-c7e4-4bc2-9f76-bf66867cb691 req-b1f5f7b5-4ac6-4026-92a5-41ab17e66b6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:29 np0005593233 nova_compute[222017]: 2026-01-23 10:33:29.007 222021 DEBUG oslo_concurrency.lockutils [req-0e2d7dc7-c7e4-4bc2-9f76-bf66867cb691 req-b1f5f7b5-4ac6-4026-92a5-41ab17e66b6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:29 np0005593233 nova_compute[222017]: 2026-01-23 10:33:29.008 222021 DEBUG nova.network.neutron [req-0e2d7dc7-c7e4-4bc2-9f76-bf66867cb691 req-b1f5f7b5-4ac6-4026-92a5-41ab17e66b6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Refreshing network info cache for port 1330df91-d1d9-411b-afea-ee9187f21149 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:33:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:29 np0005593233 nova_compute[222017]: 2026-01-23 10:33:29.745 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:29 np0005593233 nova_compute[222017]: 2026-01-23 10:33:29.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:30.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:30.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:32.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:32 np0005593233 nova_compute[222017]: 2026-01-23 10:33:32.614 222021 DEBUG nova.network.neutron [req-0e2d7dc7-c7e4-4bc2-9f76-bf66867cb691 req-b1f5f7b5-4ac6-4026-92a5-41ab17e66b6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Updated VIF entry in instance network info cache for port 1330df91-d1d9-411b-afea-ee9187f21149. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:33:32 np0005593233 nova_compute[222017]: 2026-01-23 10:33:32.614 222021 DEBUG nova.network.neutron [req-0e2d7dc7-c7e4-4bc2-9f76-bf66867cb691 req-b1f5f7b5-4ac6-4026-92a5-41ab17e66b6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Updating instance_info_cache with network_info: [{"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:32 np0005593233 nova_compute[222017]: 2026-01-23 10:33:32.649 222021 DEBUG oslo_concurrency.lockutils [req-0e2d7dc7-c7e4-4bc2-9f76-bf66867cb691 req-b1f5f7b5-4ac6-4026-92a5-41ab17e66b6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:32.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:34.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:34 np0005593233 nova_compute[222017]: 2026-01-23 10:33:34.748 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:34 np0005593233 nova_compute[222017]: 2026-01-23 10:33:34.961 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:34.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:36.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:36 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:36Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:bc:67 10.100.0.8
Jan 23 05:33:36 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:36Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:bc:67 10.100.0.8
Jan 23 05:33:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:36.952 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:33:36 np0005593233 nova_compute[222017]: 2026-01-23 10:33:36.954 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:36 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:36.955 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:33:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:36.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:38 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:38Z|00772|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:33:38 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:38Z|00773|binding|INFO|Releasing lport 4316bb5d-ce04-4eff-ada1-59984534f30d from this chassis (sb_readonly=0)
Jan 23 05:33:38 np0005593233 nova_compute[222017]: 2026-01-23 10:33:38.284 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:38.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:38.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:39 np0005593233 nova_compute[222017]: 2026-01-23 10:33:39.751 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:33:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:33:39 np0005593233 nova_compute[222017]: 2026-01-23 10:33:39.965 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:40 np0005593233 nova_compute[222017]: 2026-01-23 10:33:40.389 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:40 np0005593233 nova_compute[222017]: 2026-01-23 10:33:40.420 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:33:40 np0005593233 nova_compute[222017]: 2026-01-23 10:33:40.421 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 94779b4e-014b-463a-ae92-67157067665f _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:33:40 np0005593233 nova_compute[222017]: 2026-01-23 10:33:40.421 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:40 np0005593233 nova_compute[222017]: 2026-01-23 10:33:40.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:40 np0005593233 nova_compute[222017]: 2026-01-23 10:33:40.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "94779b4e-014b-463a-ae92-67157067665f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:40 np0005593233 nova_compute[222017]: 2026-01-23 10:33:40.424 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "94779b4e-014b-463a-ae92-67157067665f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:40 np0005593233 nova_compute[222017]: 2026-01-23 10:33:40.462 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:40 np0005593233 nova_compute[222017]: 2026-01-23 10:33:40.465 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "94779b4e-014b-463a-ae92-67157067665f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:40.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:40.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:41 np0005593233 podman[293287]: 2026-01-23 10:33:41.160030673 +0000 UTC m=+0.151744623 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:33:41 np0005593233 nova_compute[222017]: 2026-01-23 10:33:41.441 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:42.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:42.695 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:42.696 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:42.697 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:42.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:44.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:44 np0005593233 nova_compute[222017]: 2026-01-23 10:33:44.756 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:44.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:45 np0005593233 nova_compute[222017]: 2026-01-23 10:33:45.000 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:46.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:46.957 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:47.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:48.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:49.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:49 np0005593233 nova_compute[222017]: 2026-01-23 10:33:49.758 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:50 np0005593233 nova_compute[222017]: 2026-01-23 10:33:50.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:50.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:51.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:52.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:53.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:54.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:54 np0005593233 nova_compute[222017]: 2026-01-23 10:33:54.813 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:54 np0005593233 nova_compute[222017]: 2026-01-23 10:33:54.953 222021 DEBUG nova.compute.manager [req-d01610a1-f4bc-4160-92fd-bb83aecc47eb req-843c7d9f-5435-4897-aa29-b62dbcfd53bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received event network-changed-1330df91-d1d9-411b-afea-ee9187f21149 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:54 np0005593233 nova_compute[222017]: 2026-01-23 10:33:54.954 222021 DEBUG nova.compute.manager [req-d01610a1-f4bc-4160-92fd-bb83aecc47eb req-843c7d9f-5435-4897-aa29-b62dbcfd53bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Refreshing instance network info cache due to event network-changed-1330df91-d1d9-411b-afea-ee9187f21149. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:33:54 np0005593233 nova_compute[222017]: 2026-01-23 10:33:54.955 222021 DEBUG oslo_concurrency.lockutils [req-d01610a1-f4bc-4160-92fd-bb83aecc47eb req-843c7d9f-5435-4897-aa29-b62dbcfd53bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:54 np0005593233 nova_compute[222017]: 2026-01-23 10:33:54.955 222021 DEBUG oslo_concurrency.lockutils [req-d01610a1-f4bc-4160-92fd-bb83aecc47eb req-843c7d9f-5435-4897-aa29-b62dbcfd53bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:54 np0005593233 nova_compute[222017]: 2026-01-23 10:33:54.955 222021 DEBUG nova.network.neutron [req-d01610a1-f4bc-4160-92fd-bb83aecc47eb req-843c7d9f-5435-4897-aa29-b62dbcfd53bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Refreshing network info cache for port 1330df91-d1d9-411b-afea-ee9187f21149 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:33:55 np0005593233 nova_compute[222017]: 2026-01-23 10:33:55.008 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:55.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:55 np0005593233 nova_compute[222017]: 2026-01-23 10:33:55.331 222021 DEBUG oslo_concurrency.lockutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "94779b4e-014b-463a-ae92-67157067665f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:33:55 np0005593233 nova_compute[222017]: 2026-01-23 10:33:55.332 222021 DEBUG oslo_concurrency.lockutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:33:55 np0005593233 nova_compute[222017]: 2026-01-23 10:33:55.332 222021 DEBUG oslo_concurrency.lockutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "94779b4e-014b-463a-ae92-67157067665f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:33:55 np0005593233 nova_compute[222017]: 2026-01-23 10:33:55.333 222021 DEBUG oslo_concurrency.lockutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:33:55 np0005593233 nova_compute[222017]: 2026-01-23 10:33:55.334 222021 DEBUG oslo_concurrency.lockutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:33:55 np0005593233 nova_compute[222017]: 2026-01-23 10:33:55.336 222021 INFO nova.compute.manager [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Terminating instance
Jan 23 05:33:55 np0005593233 nova_compute[222017]: 2026-01-23 10:33:55.339 222021 DEBUG nova.compute.manager [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 05:33:55 np0005593233 kernel: tap1330df91-d1 (unregistering): left promiscuous mode
Jan 23 05:33:55 np0005593233 NetworkManager[48871]: <info>  [1769164435.9786] device (tap1330df91-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:33:56 np0005593233 nova_compute[222017]: 2026-01-23 10:33:56.000 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:56 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:56Z|00774|binding|INFO|Releasing lport 1330df91-d1d9-411b-afea-ee9187f21149 from this chassis (sb_readonly=0)
Jan 23 05:33:56 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:56Z|00775|binding|INFO|Setting lport 1330df91-d1d9-411b-afea-ee9187f21149 down in Southbound
Jan 23 05:33:56 np0005593233 ovn_controller[130653]: 2026-01-23T10:33:56Z|00776|binding|INFO|Removing iface tap1330df91-d1 ovn-installed in OVS
Jan 23 05:33:56 np0005593233 nova_compute[222017]: 2026-01-23 10:33:56.004 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:56 np0005593233 nova_compute[222017]: 2026-01-23 10:33:56.036 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:56 np0005593233 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Jan 23 05:33:56 np0005593233 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000b7.scope: Consumed 15.495s CPU time.
Jan 23 05:33:56 np0005593233 systemd-machined[190954]: Machine qemu-84-instance-000000b7 terminated.
Jan 23 05:33:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:56.207 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:bc:67 10.100.0.8'], port_security=['fa:16:3e:22:bc:67 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '94779b4e-014b-463a-ae92-67157067665f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed4554dd-9a44-4785-9a4e-a211fe7b4949', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a58da180-17cd-4554-a7dd-fdd1057faf2d a861f02d-6ba6-4abf-a226-939d7b8444ef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e345ee25-e71d-44f1-88c3-b2014cfeb099, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1330df91-d1d9-411b-afea-ee9187f21149) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:33:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:56.209 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1330df91-d1d9-411b-afea-ee9187f21149 in datapath ed4554dd-9a44-4785-9a4e-a211fe7b4949 unbound from our chassis
Jan 23 05:33:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:56.211 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed4554dd-9a44-4785-9a4e-a211fe7b4949, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 05:33:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:56.213 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ff5562-a4f5-4ca3-99b5-0e10e6e71954]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:33:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:56.215 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949 namespace which is not needed anymore
Jan 23 05:33:56 np0005593233 nova_compute[222017]: 2026-01-23 10:33:56.599 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:56 np0005593233 nova_compute[222017]: 2026-01-23 10:33:56.606 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:33:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:56.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:33:56 np0005593233 nova_compute[222017]: 2026-01-23 10:33:56.623 222021 INFO nova.virt.libvirt.driver [-] [instance: 94779b4e-014b-463a-ae92-67157067665f] Instance destroyed successfully.
Jan 23 05:33:56 np0005593233 nova_compute[222017]: 2026-01-23 10:33:56.624 222021 DEBUG nova.objects.instance [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'resources' on Instance uuid 94779b4e-014b-463a-ae92-67157067665f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:33:56 np0005593233 neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949[293113]: [NOTICE]   (293117) : haproxy version is 2.8.14-c23fe91
Jan 23 05:33:56 np0005593233 neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949[293113]: [NOTICE]   (293117) : path to executable is /usr/sbin/haproxy
Jan 23 05:33:56 np0005593233 neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949[293113]: [WARNING]  (293117) : Exiting Master process...
Jan 23 05:33:56 np0005593233 neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949[293113]: [ALERT]    (293117) : Current worker (293119) exited with code 143 (Terminated)
Jan 23 05:33:56 np0005593233 neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949[293113]: [WARNING]  (293117) : All workers exited. Exiting... (0)
Jan 23 05:33:56 np0005593233 systemd[1]: libpod-64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff.scope: Deactivated successfully.
Jan 23 05:33:56 np0005593233 podman[293392]: 2026-01-23 10:33:56.739780327 +0000 UTC m=+0.363384167 container died 64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:33:56 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff-userdata-shm.mount: Deactivated successfully.
Jan 23 05:33:56 np0005593233 systemd[1]: var-lib-containers-storage-overlay-82a845e115bb97d574469adc3f298767bc74f3c4964319826801018e5059813d-merged.mount: Deactivated successfully.
Jan 23 05:33:56 np0005593233 podman[293392]: 2026-01-23 10:33:56.85708204 +0000 UTC m=+0.480685880 container cleanup 64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:33:56 np0005593233 systemd[1]: libpod-conmon-64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff.scope: Deactivated successfully.
Jan 23 05:33:56 np0005593233 podman[293427]: 2026-01-23 10:33:56.953603653 +0000 UTC m=+0.060346396 container remove 64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:33:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:56.965 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b3efd3d7-c308-4fe0-847a-18529c1c4a8f]: (4, ('Fri Jan 23 10:33:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949 (64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff)\n64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff\nFri Jan 23 10:33:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949 (64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff)\n64c964acc31f9949a495be5f3d75c6ad34ff236c92ae8cbe472cc18c1c4b1aff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:33:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:56.967 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f64b45af-aa76-4e02-ac4b-1cc715174da5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:33:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:56.969 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped4554dd-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:33:56 np0005593233 nova_compute[222017]: 2026-01-23 10:33:56.972 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:56 np0005593233 kernel: taped4554dd-90: left promiscuous mode
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.006 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:57.010 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5b91de6c-d87c-4fc2-ab05-4efb52ef6c3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.011 222021 DEBUG nova.virt.libvirt.vif [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:33:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-1672119897',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-1672119897',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=183,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM1ltrnE7PZHJGSfhgOoMItJuzuSO8BMu/7or3Sm3xqUJpbfXng2myVoTHi8tXqQq8tN45e6cz9sNqzYdHdsAQpXkf89L8KtoodzDy4Xtv+6pZh8F+deIOOF1ul1nwPV4Q==',key_name='tempest-TestSecurityGroupsBasicOps-1657112076',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:33:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-tawbyx3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:33:22Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=94779b4e-014b-463a-ae92-67157067665f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.012 222021 DEBUG nova.network.os_vif_util [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.013 222021 DEBUG nova.network.os_vif_util [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:bc:67,bridge_name='br-int',has_traffic_filtering=True,id=1330df91-d1d9-411b-afea-ee9187f21149,network=Network(ed4554dd-9a44-4785-9a4e-a211fe7b4949),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1330df91-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.013 222021 DEBUG os_vif [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:bc:67,bridge_name='br-int',has_traffic_filtering=True,id=1330df91-d1d9-411b-afea-ee9187f21149,network=Network(ed4554dd-9a44-4785-9a4e-a211fe7b4949),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1330df91-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 05:33:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:33:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:57.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.020 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.021 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1330df91-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.076 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.078 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:33:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:57.078 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bb989f49-4c32-40f9-bb1c-1e10d9539fb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:33:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:57.080 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7aab11d6-bef3-4efc-a7aa-522afeed7ce6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.084 222021 INFO os_vif [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:bc:67,bridge_name='br-int',has_traffic_filtering=True,id=1330df91-d1d9-411b-afea-ee9187f21149,network=Network(ed4554dd-9a44-4785-9a4e-a211fe7b4949),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1330df91-d1')
Jan 23 05:33:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:57.098 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5aea338b-0d87-4b21-8c71-ee268f307d28]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 831383, 'reachable_time': 33491, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293446, 'error': None, 'target': 'ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:33:57 np0005593233 systemd[1]: run-netns-ovnmeta\x2ded4554dd\x2d9a44\x2d4785\x2d9a4e\x2da211fe7b4949.mount: Deactivated successfully.
Jan 23 05:33:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:57.103 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed4554dd-9a44-4785-9a4e-a211fe7b4949 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 05:33:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:33:57.103 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[d8708e99-c3f8-46a7-ae96-6e76780e26ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:33:57 np0005593233 podman[293462]: 2026-01-23 10:33:57.218059086 +0000 UTC m=+0.071782200 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.541 222021 INFO nova.virt.libvirt.driver [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Deleting instance files /var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f_del
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.543 222021 INFO nova.virt.libvirt.driver [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Deletion of /var/lib/nova/instances/94779b4e-014b-463a-ae92-67157067665f_del complete
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.639 222021 INFO nova.compute.manager [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Took 2.30 seconds to destroy the instance on the hypervisor.
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.640 222021 DEBUG oslo.service.loopingcall [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.641 222021 DEBUG nova.compute.manager [-] [instance: 94779b4e-014b-463a-ae92-67157067665f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 05:33:57 np0005593233 nova_compute[222017]: 2026-01-23 10:33:57.642 222021 DEBUG nova.network.neutron [-] [instance: 94779b4e-014b-463a-ae92-67157067665f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 05:33:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:58.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:58 np0005593233 nova_compute[222017]: 2026-01-23 10:33:58.865 222021 DEBUG nova.compute.manager [req-633da862-3ea4-4b40-a283-1b43cae28da9 req-198b3cdc-e4ce-4d55-bf29-473502437e43 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received event network-vif-unplugged-1330df91-d1d9-411b-afea-ee9187f21149 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:58 np0005593233 nova_compute[222017]: 2026-01-23 10:33:58.865 222021 DEBUG oslo_concurrency.lockutils [req-633da862-3ea4-4b40-a283-1b43cae28da9 req-198b3cdc-e4ce-4d55-bf29-473502437e43 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "94779b4e-014b-463a-ae92-67157067665f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:58 np0005593233 nova_compute[222017]: 2026-01-23 10:33:58.866 222021 DEBUG oslo_concurrency.lockutils [req-633da862-3ea4-4b40-a283-1b43cae28da9 req-198b3cdc-e4ce-4d55-bf29-473502437e43 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:58 np0005593233 nova_compute[222017]: 2026-01-23 10:33:58.866 222021 DEBUG oslo_concurrency.lockutils [req-633da862-3ea4-4b40-a283-1b43cae28da9 req-198b3cdc-e4ce-4d55-bf29-473502437e43 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:58 np0005593233 nova_compute[222017]: 2026-01-23 10:33:58.867 222021 DEBUG nova.compute.manager [req-633da862-3ea4-4b40-a283-1b43cae28da9 req-198b3cdc-e4ce-4d55-bf29-473502437e43 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] No waiting events found dispatching network-vif-unplugged-1330df91-d1d9-411b-afea-ee9187f21149 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:58 np0005593233 nova_compute[222017]: 2026-01-23 10:33:58.867 222021 DEBUG nova.compute.manager [req-633da862-3ea4-4b40-a283-1b43cae28da9 req-198b3cdc-e4ce-4d55-bf29-473502437e43 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received event network-vif-unplugged-1330df91-d1d9-411b-afea-ee9187f21149 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:33:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:33:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:59.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:59 np0005593233 nova_compute[222017]: 2026-01-23 10:33:59.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:59 np0005593233 nova_compute[222017]: 2026-01-23 10:33:59.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:59 np0005593233 nova_compute[222017]: 2026-01-23 10:33:59.430 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:59 np0005593233 nova_compute[222017]: 2026-01-23 10:33:59.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:59 np0005593233 nova_compute[222017]: 2026-01-23 10:33:59.432 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:59 np0005593233 nova_compute[222017]: 2026-01-23 10:33:59.432 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:33:59 np0005593233 nova_compute[222017]: 2026-01-23 10:33:59.433 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:33:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/291011937' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:59 np0005593233 nova_compute[222017]: 2026-01-23 10:33:59.964 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.158 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.158 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.384 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.385 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4116MB free_disk=20.837467193603516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.386 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.386 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:00.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.725 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 23f7c54d-ed5d-404f-8517-b5cd21d0c282 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.726 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 94779b4e-014b-463a-ae92-67157067665f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.726 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.727 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:34:00 np0005593233 nova_compute[222017]: 2026-01-23 10:34:00.846 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:01.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.076 222021 DEBUG nova.compute.manager [req-f81865d9-3c8a-4a10-a53d-e88212010b24 req-f5e1d056-4670-4966-8368-94a9acbd4581 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received event network-vif-plugged-1330df91-d1d9-411b-afea-ee9187f21149 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.077 222021 DEBUG oslo_concurrency.lockutils [req-f81865d9-3c8a-4a10-a53d-e88212010b24 req-f5e1d056-4670-4966-8368-94a9acbd4581 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "94779b4e-014b-463a-ae92-67157067665f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.077 222021 DEBUG oslo_concurrency.lockutils [req-f81865d9-3c8a-4a10-a53d-e88212010b24 req-f5e1d056-4670-4966-8368-94a9acbd4581 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.078 222021 DEBUG oslo_concurrency.lockutils [req-f81865d9-3c8a-4a10-a53d-e88212010b24 req-f5e1d056-4670-4966-8368-94a9acbd4581 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.078 222021 DEBUG nova.compute.manager [req-f81865d9-3c8a-4a10-a53d-e88212010b24 req-f5e1d056-4670-4966-8368-94a9acbd4581 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] No waiting events found dispatching network-vif-plugged-1330df91-d1d9-411b-afea-ee9187f21149 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.078 222021 WARNING nova.compute.manager [req-f81865d9-3c8a-4a10-a53d-e88212010b24 req-f5e1d056-4670-4966-8368-94a9acbd4581 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received unexpected event network-vif-plugged-1330df91-d1d9-411b-afea-ee9187f21149 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.340 222021 DEBUG nova.network.neutron [-] [instance: 94779b4e-014b-463a-ae92-67157067665f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:34:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:34:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3188114651' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.366 222021 INFO nova.compute.manager [-] [instance: 94779b4e-014b-463a-ae92-67157067665f] Took 3.72 seconds to deallocate network for instance.#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.376 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.385 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.416 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.427 222021 DEBUG nova.network.neutron [req-d01610a1-f4bc-4160-92fd-bb83aecc47eb req-843c7d9f-5435-4897-aa29-b62dbcfd53bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Updated VIF entry in instance network info cache for port 1330df91-d1d9-411b-afea-ee9187f21149. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.428 222021 DEBUG nova.network.neutron [req-d01610a1-f4bc-4160-92fd-bb83aecc47eb req-843c7d9f-5435-4897-aa29-b62dbcfd53bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Updating instance_info_cache with network_info: [{"id": "1330df91-d1d9-411b-afea-ee9187f21149", "address": "fa:16:3e:22:bc:67", "network": {"id": "ed4554dd-9a44-4785-9a4e-a211fe7b4949", "bridge": "br-int", "label": "tempest-network-smoke--137712454", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1330df91-d1", "ovs_interfaceid": "1330df91-d1d9-411b-afea-ee9187f21149", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.492 222021 DEBUG oslo_concurrency.lockutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.506 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.506 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.506 222021 DEBUG oslo_concurrency.lockutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.510 222021 DEBUG oslo_concurrency.lockutils [req-d01610a1-f4bc-4160-92fd-bb83aecc47eb req-843c7d9f-5435-4897-aa29-b62dbcfd53bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-94779b4e-014b-463a-ae92-67157067665f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.529 222021 DEBUG nova.compute.manager [req-b57a3cd9-0b32-48b3-b4ee-9ac6ef9b7818 req-ac6ded5d-f6ee-4748-adef-9a4f80256960 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Received event network-vif-deleted-1330df91-d1d9-411b-afea-ee9187f21149 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.530 222021 INFO nova.compute.manager [req-b57a3cd9-0b32-48b3-b4ee-9ac6ef9b7818 req-ac6ded5d-f6ee-4748-adef-9a4f80256960 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Neutron deleted interface 1330df91-d1d9-411b-afea-ee9187f21149; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.530 222021 DEBUG nova.network.neutron [req-b57a3cd9-0b32-48b3-b4ee-9ac6ef9b7818 req-ac6ded5d-f6ee-4748-adef-9a4f80256960 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.568 222021 DEBUG nova.compute.manager [req-b57a3cd9-0b32-48b3-b4ee-9ac6ef9b7818 req-ac6ded5d-f6ee-4748-adef-9a4f80256960 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 94779b4e-014b-463a-ae92-67157067665f] Detach interface failed, port_id=1330df91-d1d9-411b-afea-ee9187f21149, reason: Instance 94779b4e-014b-463a-ae92-67157067665f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:34:01 np0005593233 nova_compute[222017]: 2026-01-23 10:34:01.657 222021 DEBUG oslo_concurrency.processutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:02 np0005593233 nova_compute[222017]: 2026-01-23 10:34:02.076 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:34:02 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2050743168' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:02 np0005593233 nova_compute[222017]: 2026-01-23 10:34:02.171 222021 DEBUG oslo_concurrency.processutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:02 np0005593233 nova_compute[222017]: 2026-01-23 10:34:02.181 222021 DEBUG nova.compute.provider_tree [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:34:02 np0005593233 nova_compute[222017]: 2026-01-23 10:34:02.200 222021 DEBUG nova.scheduler.client.report [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:34:02 np0005593233 nova_compute[222017]: 2026-01-23 10:34:02.237 222021 DEBUG oslo_concurrency.lockutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:02 np0005593233 nova_compute[222017]: 2026-01-23 10:34:02.301 222021 INFO nova.scheduler.client.report [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Deleted allocations for instance 94779b4e-014b-463a-ae92-67157067665f#033[00m
Jan 23 05:34:02 np0005593233 nova_compute[222017]: 2026-01-23 10:34:02.400 222021 DEBUG oslo_concurrency.lockutils [None req-33900cd2-d2fe-41f7-a8da-1dc69d55d3a6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "94779b4e-014b-463a-ae92-67157067665f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:02.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:03.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:03 np0005593233 nova_compute[222017]: 2026-01-23 10:34:03.506 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:03 np0005593233 nova_compute[222017]: 2026-01-23 10:34:03.508 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:03 np0005593233 nova_compute[222017]: 2026-01-23 10:34:03.508 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:03 np0005593233 nova_compute[222017]: 2026-01-23 10:34:03.508 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:04 np0005593233 nova_compute[222017]: 2026-01-23 10:34:04.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:04 np0005593233 nova_compute[222017]: 2026-01-23 10:34:04.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:34:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:04.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:05 np0005593233 nova_compute[222017]: 2026-01-23 10:34:05.011 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:34:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:05.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:34:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:34:06Z|00777|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:34:06 np0005593233 nova_compute[222017]: 2026-01-23 10:34:06.173 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:34:06Z|00778|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:34:06 np0005593233 nova_compute[222017]: 2026-01-23 10:34:06.328 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:06.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:07.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:07 np0005593233 nova_compute[222017]: 2026-01-23 10:34:07.079 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:08.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:09.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:10 np0005593233 nova_compute[222017]: 2026-01-23 10:34:10.013 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:10.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:11.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:11 np0005593233 nova_compute[222017]: 2026-01-23 10:34:11.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:11 np0005593233 nova_compute[222017]: 2026-01-23 10:34:11.388 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:34:11 np0005593233 nova_compute[222017]: 2026-01-23 10:34:11.388 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:34:11 np0005593233 nova_compute[222017]: 2026-01-23 10:34:11.621 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164436.6198199, 94779b4e-014b-463a-ae92-67157067665f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:34:11 np0005593233 nova_compute[222017]: 2026-01-23 10:34:11.621 222021 INFO nova.compute.manager [-] [instance: 94779b4e-014b-463a-ae92-67157067665f] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:34:11 np0005593233 nova_compute[222017]: 2026-01-23 10:34:11.718 222021 DEBUG nova.compute.manager [None req-3e70cccf-0eb2-4582-baf0-ae8316b59cfa - - - - - -] [instance: 94779b4e-014b-463a-ae92-67157067665f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:34:12 np0005593233 nova_compute[222017]: 2026-01-23 10:34:12.081 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:12 np0005593233 podman[293555]: 2026-01-23 10:34:12.134668995 +0000 UTC m=+0.137146468 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 05:34:12 np0005593233 nova_compute[222017]: 2026-01-23 10:34:12.397 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:34:12 np0005593233 nova_compute[222017]: 2026-01-23 10:34:12.397 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:34:12 np0005593233 nova_compute[222017]: 2026-01-23 10:34:12.397 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:34:12 np0005593233 nova_compute[222017]: 2026-01-23 10:34:12.397 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:34:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:34:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:12.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:34:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:34:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:13.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:34:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:14.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:15 np0005593233 nova_compute[222017]: 2026-01-23 10:34:15.016 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:15.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:16 np0005593233 nova_compute[222017]: 2026-01-23 10:34:16.442 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updating instance_info_cache with network_info: [{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:34:16 np0005593233 nova_compute[222017]: 2026-01-23 10:34:16.465 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:34:16 np0005593233 nova_compute[222017]: 2026-01-23 10:34:16.465 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:34:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:16.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:17.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:17 np0005593233 nova_compute[222017]: 2026-01-23 10:34:17.083 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:18 np0005593233 nova_compute[222017]: 2026-01-23 10:34:18.458 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:18.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:34:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:19.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:34:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:20 np0005593233 nova_compute[222017]: 2026-01-23 10:34:20.019 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:20.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:21.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:22 np0005593233 nova_compute[222017]: 2026-01-23 10:34:22.085 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:34:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:22.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:34:22 np0005593233 nova_compute[222017]: 2026-01-23 10:34:22.902 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:22 np0005593233 NetworkManager[48871]: <info>  [1769164462.9037] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 23 05:34:22 np0005593233 NetworkManager[48871]: <info>  [1769164462.9056] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/355)
Jan 23 05:34:23 np0005593233 nova_compute[222017]: 2026-01-23 10:34:23.011 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:34:23Z|00779|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:34:23 np0005593233 nova_compute[222017]: 2026-01-23 10:34:23.019 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:23.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:34:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:24.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:34:25 np0005593233 nova_compute[222017]: 2026-01-23 10:34:25.022 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:25.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:25 np0005593233 nova_compute[222017]: 2026-01-23 10:34:25.378 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:26.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:34:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:27.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:34:27 np0005593233 nova_compute[222017]: 2026-01-23 10:34:27.088 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:27 np0005593233 nova_compute[222017]: 2026-01-23 10:34:27.098 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:27 np0005593233 nova_compute[222017]: 2026-01-23 10:34:27.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:28 np0005593233 podman[293586]: 2026-01-23 10:34:28.065143934 +0000 UTC m=+0.067879740 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:34:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:34:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:28.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:34:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:29.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:29 np0005593233 nova_compute[222017]: 2026-01-23 10:34:29.209 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:30 np0005593233 nova_compute[222017]: 2026-01-23 10:34:30.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:34:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:30.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:34:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:31.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:32 np0005593233 nova_compute[222017]: 2026-01-23 10:34:32.090 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:32.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:33.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:34.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:35 np0005593233 nova_compute[222017]: 2026-01-23 10:34:35.030 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:34:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:35.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:34:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:34:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:36.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:34:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:37.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:37 np0005593233 nova_compute[222017]: 2026-01-23 10:34:37.092 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:37 np0005593233 nova_compute[222017]: 2026-01-23 10:34:37.409 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:37 np0005593233 nova_compute[222017]: 2026-01-23 10:34:37.410 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:34:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:37.688 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:34:37 np0005593233 nova_compute[222017]: 2026-01-23 10:34:37.689 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:37.689 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:34:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:34:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:38.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:34:38 np0005593233 nova_compute[222017]: 2026-01-23 10:34:38.904 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "0d582649-9400-4246-ae39-a06921a27f40" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:38 np0005593233 nova_compute[222017]: 2026-01-23 10:34:38.904 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:38 np0005593233 nova_compute[222017]: 2026-01-23 10:34:38.924 222021 DEBUG nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:34:39 np0005593233 nova_compute[222017]: 2026-01-23 10:34:39.025 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:39 np0005593233 nova_compute[222017]: 2026-01-23 10:34:39.026 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:39 np0005593233 nova_compute[222017]: 2026-01-23 10:34:39.040 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:34:39 np0005593233 nova_compute[222017]: 2026-01-23 10:34:39.041 222021 INFO nova.compute.claims [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:34:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:34:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:39.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:34:39 np0005593233 nova_compute[222017]: 2026-01-23 10:34:39.286 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:34:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3341607489' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:39 np0005593233 nova_compute[222017]: 2026-01-23 10:34:39.777 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:39 np0005593233 nova_compute[222017]: 2026-01-23 10:34:39.785 222021 DEBUG nova.compute.provider_tree [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.034 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.183 222021 DEBUG nova.scheduler.client.report [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.330 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.331 222021 DEBUG nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.422 222021 DEBUG nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.422 222021 DEBUG nova.network.neutron [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.459 222021 INFO nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.482 222021 DEBUG nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.636 222021 DEBUG nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.639 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.639 222021 INFO nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Creating image(s)#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.684 222021 DEBUG nova.storage.rbd_utils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 0d582649-9400-4246-ae39-a06921a27f40_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:34:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:40.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.728 222021 DEBUG nova.storage.rbd_utils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 0d582649-9400-4246-ae39-a06921a27f40_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.775 222021 DEBUG nova.storage.rbd_utils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 0d582649-9400-4246-ae39-a06921a27f40_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.780 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.882 222021 DEBUG nova.policy [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a3cd8c3758e14f9c8e4ad1a9a94a9995', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b27af793a8cc42259216fbeaa302ba03', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.886 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.886 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.887 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.887 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.920 222021 DEBUG nova.storage.rbd_utils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 0d582649-9400-4246-ae39-a06921a27f40_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:34:40 np0005593233 nova_compute[222017]: 2026-01-23 10:34:40.925 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0d582649-9400-4246-ae39-a06921a27f40_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:34:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:41.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:34:41 np0005593233 nova_compute[222017]: 2026-01-23 10:34:41.349 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0d582649-9400-4246-ae39-a06921a27f40_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:41 np0005593233 nova_compute[222017]: 2026-01-23 10:34:41.449 222021 DEBUG nova.storage.rbd_utils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] resizing rbd image 0d582649-9400-4246-ae39-a06921a27f40_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:34:41 np0005593233 nova_compute[222017]: 2026-01-23 10:34:41.591 222021 DEBUG nova.objects.instance [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'migration_context' on Instance uuid 0d582649-9400-4246-ae39-a06921a27f40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:34:41 np0005593233 nova_compute[222017]: 2026-01-23 10:34:41.632 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:34:41 np0005593233 nova_compute[222017]: 2026-01-23 10:34:41.633 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Ensure instance console log exists: /var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:34:41 np0005593233 nova_compute[222017]: 2026-01-23 10:34:41.634 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:41 np0005593233 nova_compute[222017]: 2026-01-23 10:34:41.634 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:41 np0005593233 nova_compute[222017]: 2026-01-23 10:34:41.635 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:42 np0005593233 nova_compute[222017]: 2026-01-23 10:34:42.012 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:42 np0005593233 nova_compute[222017]: 2026-01-23 10:34:42.093 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e355 e355: 3 total, 3 up, 3 in
Jan 23 05:34:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:42.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:42.696 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:42.696 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:42.697 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:43.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:43 np0005593233 podman[293799]: 2026-01-23 10:34:43.122257337 +0000 UTC m=+0.124066166 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 05:34:43 np0005593233 nova_compute[222017]: 2026-01-23 10:34:43.413 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:43 np0005593233 nova_compute[222017]: 2026-01-23 10:34:43.414 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:34:43 np0005593233 nova_compute[222017]: 2026-01-23 10:34:43.445 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:34:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e356 e356: 3 total, 3 up, 3 in
Jan 23 05:34:43 np0005593233 nova_compute[222017]: 2026-01-23 10:34:43.529 222021 DEBUG nova.network.neutron [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Successfully created port: 1dd58203-38ce-47f5-939b-2da425defa40 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:34:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e356 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e357 e357: 3 total, 3 up, 3 in
Jan 23 05:34:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:44.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:45 np0005593233 nova_compute[222017]: 2026-01-23 10:34:45.037 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:45.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:46 np0005593233 nova_compute[222017]: 2026-01-23 10:34:46.004 222021 DEBUG nova.network.neutron [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Successfully updated port: 1dd58203-38ce-47f5-939b-2da425defa40 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:34:46 np0005593233 nova_compute[222017]: 2026-01-23 10:34:46.045 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:34:46 np0005593233 nova_compute[222017]: 2026-01-23 10:34:46.045 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquired lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:34:46 np0005593233 nova_compute[222017]: 2026-01-23 10:34:46.046 222021 DEBUG nova.network.neutron [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:34:46 np0005593233 nova_compute[222017]: 2026-01-23 10:34:46.273 222021 DEBUG nova.compute.manager [req-be68628d-222d-4ba8-ae8e-dcc2ee4aca52 req-027f1efd-a231-45c5-835a-9b1ac6ff6fe4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received event network-changed-1dd58203-38ce-47f5-939b-2da425defa40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:34:46 np0005593233 nova_compute[222017]: 2026-01-23 10:34:46.274 222021 DEBUG nova.compute.manager [req-be68628d-222d-4ba8-ae8e-dcc2ee4aca52 req-027f1efd-a231-45c5-835a-9b1ac6ff6fe4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Refreshing instance network info cache due to event network-changed-1dd58203-38ce-47f5-939b-2da425defa40. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:34:46 np0005593233 nova_compute[222017]: 2026-01-23 10:34:46.275 222021 DEBUG oslo_concurrency.lockutils [req-be68628d-222d-4ba8-ae8e-dcc2ee4aca52 req-027f1efd-a231-45c5-835a-9b1ac6ff6fe4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:34:46 np0005593233 nova_compute[222017]: 2026-01-23 10:34:46.471 222021 DEBUG nova.network.neutron [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:34:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:46.691 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:34:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:46.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:47 np0005593233 nova_compute[222017]: 2026-01-23 10:34:47.095 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:47.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:34:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:48.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:34:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 e358: 3 total, 3 up, 3 in
Jan 23 05:34:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:34:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:49.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.148 222021 DEBUG nova.network.neutron [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Updating instance_info_cache with network_info: [{"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:34:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.222 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Releasing lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.222 222021 DEBUG nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Instance network_info: |[{"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.224 222021 DEBUG oslo_concurrency.lockutils [req-be68628d-222d-4ba8-ae8e-dcc2ee4aca52 req-027f1efd-a231-45c5-835a-9b1ac6ff6fe4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.224 222021 DEBUG nova.network.neutron [req-be68628d-222d-4ba8-ae8e-dcc2ee4aca52 req-027f1efd-a231-45c5-835a-9b1ac6ff6fe4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Refreshing network info cache for port 1dd58203-38ce-47f5-939b-2da425defa40 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.230 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Start _get_guest_xml network_info=[{"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.238 222021 WARNING nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.245 222021 DEBUG nova.virt.libvirt.host [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.246 222021 DEBUG nova.virt.libvirt.host [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.251 222021 DEBUG nova.virt.libvirt.host [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.252 222021 DEBUG nova.virt.libvirt.host [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.254 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.254 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.255 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.256 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.257 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.258 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.258 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.259 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.259 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.260 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.260 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.261 222021 DEBUG nova.virt.hardware [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.266 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:34:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/459163866' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.736 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.783 222021 DEBUG nova.storage.rbd_utils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 0d582649-9400-4246-ae39-a06921a27f40_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:34:49 np0005593233 nova_compute[222017]: 2026-01-23 10:34:49.791 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.041 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.188 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:34:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:34:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:34:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1185119209' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.276 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.279 222021 DEBUG nova.virt.libvirt.vif [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:34:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-831936106',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-831936106',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=185,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqWyBajPk9lxnk3oOggmREf6c3e8Z1l3YtHwaQZ/cnGBO7I0r8ucErB7vOiK0qJNtSaWtRbC9tywTqzZwRuaNd5WHVZcbelBNDiChh7UcxXYMS1JZ3Qp0q7oG5MXquw/g==',key_name='tempest-TestSecurityGroupsBasicOps-910788169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-2eu0ytuq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:34:40Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=0d582649-9400-4246-ae39-a06921a27f40,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.279 222021 DEBUG nova.network.os_vif_util [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.281 222021 DEBUG nova.network.os_vif_util [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:c1:0c,bridge_name='br-int',has_traffic_filtering=True,id=1dd58203-38ce-47f5-939b-2da425defa40,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dd58203-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.283 222021 DEBUG nova.objects.instance [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0d582649-9400-4246-ae39-a06921a27f40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.317 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <uuid>0d582649-9400-4246-ae39-a06921a27f40</uuid>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <name>instance-000000b9</name>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-831936106</nova:name>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:34:49</nova:creationTime>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <nova:user uuid="a3cd8c3758e14f9c8e4ad1a9a94a9995">tempest-TestSecurityGroupsBasicOps-622349977-project-member</nova:user>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <nova:project uuid="b27af793a8cc42259216fbeaa302ba03">tempest-TestSecurityGroupsBasicOps-622349977</nova:project>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <nova:port uuid="1dd58203-38ce-47f5-939b-2da425defa40">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <entry name="serial">0d582649-9400-4246-ae39-a06921a27f40</entry>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <entry name="uuid">0d582649-9400-4246-ae39-a06921a27f40</entry>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0d582649-9400-4246-ae39-a06921a27f40_disk">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0d582649-9400-4246-ae39-a06921a27f40_disk.config">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:b1:c1:0c"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <target dev="tap1dd58203-38"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40/console.log" append="off"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:34:50 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:34:50 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:34:50 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:34:50 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.319 222021 DEBUG nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Preparing to wait for external event network-vif-plugged-1dd58203-38ce-47f5-939b-2da425defa40 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.320 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "0d582649-9400-4246-ae39-a06921a27f40-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.321 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.322 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.324 222021 DEBUG nova.virt.libvirt.vif [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:34:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-831936106',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-831936106',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=185,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqWyBajPk9lxnk3oOggmREf6c3e8Z1l3YtHwaQZ/cnGBO7I0r8ucErB7vOiK0qJNtSaWtRbC9tywTqzZwRuaNd5WHVZcbelBNDiChh7UcxXYMS1JZ3Qp0q7oG5MXquw/g==',key_name='tempest-TestSecurityGroupsBasicOps-910788169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-2eu0ytuq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:34:40Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=0d582649-9400-4246-ae39-a06921a27f40,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.325 222021 DEBUG nova.network.os_vif_util [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.327 222021 DEBUG nova.network.os_vif_util [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:c1:0c,bridge_name='br-int',has_traffic_filtering=True,id=1dd58203-38ce-47f5-939b-2da425defa40,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dd58203-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.328 222021 DEBUG os_vif [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:c1:0c,bridge_name='br-int',has_traffic_filtering=True,id=1dd58203-38ce-47f5-939b-2da425defa40,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dd58203-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.331 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.332 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.333 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.342 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.342 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1dd58203-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.344 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1dd58203-38, col_values=(('external_ids', {'iface-id': '1dd58203-38ce-47f5-939b-2da425defa40', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:c1:0c', 'vm-uuid': '0d582649-9400-4246-ae39-a06921a27f40'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.346 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:34:50 np0005593233 NetworkManager[48871]: <info>  [1769164490.3473] manager: (tap1dd58203-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.349 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.359 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.360 222021 INFO os_vif [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:c1:0c,bridge_name='br-int',has_traffic_filtering=True,id=1dd58203-38ce-47f5-939b-2da425defa40,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dd58203-38')
Jan 23 05:34:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:50.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.731 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.731 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.731 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No VIF found with MAC fa:16:3e:b1:c1:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.732 222021 INFO nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Using config drive
Jan 23 05:34:50 np0005593233 nova_compute[222017]: 2026-01-23 10:34:50.768 222021 DEBUG nova.storage.rbd_utils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 0d582649-9400-4246-ae39-a06921a27f40_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:34:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:51.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:51 np0005593233 nova_compute[222017]: 2026-01-23 10:34:51.838 222021 INFO nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Creating config drive at /var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40/disk.config
Jan 23 05:34:51 np0005593233 nova_compute[222017]: 2026-01-23 10:34:51.847 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe75wcsst execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.015 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe75wcsst" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.066 222021 DEBUG nova.storage.rbd_utils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 0d582649-9400-4246-ae39-a06921a27f40_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.072 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40/disk.config 0d582649-9400-4246-ae39-a06921a27f40_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.305 222021 DEBUG oslo_concurrency.processutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40/disk.config 0d582649-9400-4246-ae39-a06921a27f40_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.232s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.306 222021 INFO nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Deleting local config drive /var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40/disk.config because it was imported into RBD.
Jan 23 05:34:52 np0005593233 kernel: tap1dd58203-38: entered promiscuous mode
Jan 23 05:34:52 np0005593233 NetworkManager[48871]: <info>  [1769164492.4066] manager: (tap1dd58203-38): new Tun device (/org/freedesktop/NetworkManager/Devices/357)
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.405 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:34:52 np0005593233 ovn_controller[130653]: 2026-01-23T10:34:52Z|00780|binding|INFO|Claiming lport 1dd58203-38ce-47f5-939b-2da425defa40 for this chassis.
Jan 23 05:34:52 np0005593233 ovn_controller[130653]: 2026-01-23T10:34:52Z|00781|binding|INFO|1dd58203-38ce-47f5-939b-2da425defa40: Claiming fa:16:3e:b1:c1:0c 10.100.0.13
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.421 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:c1:0c 10.100.0.13'], port_security=['fa:16:3e:b1:c1:0c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0d582649-9400-4246-ae39-a06921a27f40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4342db0d-36b2-4fd6-ba6a-f76b2638107c db5a0f5f-1e93-4510-8b21-d546f6b70a95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb326908-39bf-4364-9c9c-7086bd6a3074, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1dd58203-38ce-47f5-939b-2da425defa40) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.423 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1dd58203-38ce-47f5-939b-2da425defa40 in datapath bd89ca51-fcdd-4a45-8aab-c59b92ee76f3 bound to our chassis
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.426 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd89ca51-fcdd-4a45-8aab-c59b92ee76f3
Jan 23 05:34:52 np0005593233 ovn_controller[130653]: 2026-01-23T10:34:52Z|00782|binding|INFO|Setting lport 1dd58203-38ce-47f5-939b-2da425defa40 up in Southbound
Jan 23 05:34:52 np0005593233 ovn_controller[130653]: 2026-01-23T10:34:52Z|00783|binding|INFO|Setting lport 1dd58203-38ce-47f5-939b-2da425defa40 ovn-installed in OVS
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.443 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.450 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[27005949-01cb-467a-bdc9-0afd3651c43b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.452 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbd89ca51-f1 in ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.452 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.454 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbd89ca51-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.454 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[993493d4-0752-4920-863f-a32afcafaf0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.457 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a6897970-05aa-4cdc-928c-9afefa8cf12f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:34:52 np0005593233 systemd-machined[190954]: New machine qemu-85-instance-000000b9.
Jan 23 05:34:52 np0005593233 systemd-udevd[294217]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.478 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[049713da-d21e-4ee6-b060-fe001db5028e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 systemd[1]: Started Virtual Machine qemu-85-instance-000000b9.
Jan 23 05:34:52 np0005593233 NetworkManager[48871]: <info>  [1769164492.4971] device (tap1dd58203-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:34:52 np0005593233 NetworkManager[48871]: <info>  [1769164492.4987] device (tap1dd58203-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.504 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c089f747-c991-4f30-b1c4-cd85d6cb985a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.562 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e089f060-75ec-4d60-95cc-9c68e3bef4a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.570 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[645a0b10-9267-4b90-8374-7c2057cbadf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 NetworkManager[48871]: <info>  [1769164492.5733] manager: (tapbd89ca51-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/358)
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.624 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[eade0eb1-3166-4a7d-8603-e11545ef9dce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.630 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad561a5-b26f-4155-a8d5-fb2717be0e94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 NetworkManager[48871]: <info>  [1769164492.6656] device (tapbd89ca51-f0): carrier: link connected
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.672 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[734859e7-65e5-48b6-91f9-72a1dcbd425b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.695 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0808026f-85e9-44cb-8f00-141bc19a7703]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd89ca51-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:85:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 840529, 'reachable_time': 19361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294249, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:52.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.714 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[592226c5-0455-4595-91d0-82ae4e0e04e6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4c:8529'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 840529, 'tstamp': 840529}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294250, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.739 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[db06b4a1-fb93-4c82-be0f-1b8bca84fd53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd89ca51-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:85:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 840529, 'reachable_time': 19361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294258, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.785 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0f508b7e-066a-4038-a486-58a764807e2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.881 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a0cd8c66-96c4-4fbb-954f-5a6a8587fe8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.884 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd89ca51-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.884 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.885 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd89ca51-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.900 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:52 np0005593233 NetworkManager[48871]: <info>  [1769164492.9020] manager: (tapbd89ca51-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Jan 23 05:34:52 np0005593233 kernel: tapbd89ca51-f0: entered promiscuous mode
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.904 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.913 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd89ca51-f0, col_values=(('external_ids', {'iface-id': 'fcb9e31a-a2d9-4eea-8d7c-3d1e3c5567cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:34:52 np0005593233 ovn_controller[130653]: 2026-01-23T10:34:52Z|00784|binding|INFO|Releasing lport fcb9e31a-a2d9-4eea-8d7c-3d1e3c5567cd from this chassis (sb_readonly=0)
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.915 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.920 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bd89ca51-fcdd-4a45-8aab-c59b92ee76f3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bd89ca51-fcdd-4a45-8aab-c59b92ee76f3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.921 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4d407cba-295e-4f9d-aadc-7ced951a2151]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.923 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/bd89ca51-fcdd-4a45-8aab-c59b92ee76f3.pid.haproxy
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID bd89ca51-fcdd-4a45-8aab-c59b92ee76f3
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:34:52 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:34:52.924 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'env', 'PROCESS_TAG=haproxy-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bd89ca51-fcdd-4a45-8aab-c59b92ee76f3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.942 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.956 222021 DEBUG nova.network.neutron [req-be68628d-222d-4ba8-ae8e-dcc2ee4aca52 req-027f1efd-a231-45c5-835a-9b1ac6ff6fe4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Updated VIF entry in instance network info cache for port 1dd58203-38ce-47f5-939b-2da425defa40. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.957 222021 DEBUG nova.network.neutron [req-be68628d-222d-4ba8-ae8e-dcc2ee4aca52 req-027f1efd-a231-45c5-835a-9b1ac6ff6fe4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Updating instance_info_cache with network_info: [{"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.990 222021 DEBUG oslo_concurrency.lockutils [req-be68628d-222d-4ba8-ae8e-dcc2ee4aca52 req-027f1efd-a231-45c5-835a-9b1ac6ff6fe4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.996 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164492.9957964, 0d582649-9400-4246-ae39-a06921a27f40 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:34:52 np0005593233 nova_compute[222017]: 2026-01-23 10:34:52.997 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] VM Started (Lifecycle Event)#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.028 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.035 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164492.9961307, 0d582649-9400-4246-ae39-a06921a27f40 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.035 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.042 222021 DEBUG nova.compute.manager [req-df49e04f-3d76-4b8e-9ed2-d557a0eafe70 req-e3d592f5-61ab-4d6c-bdae-aa3dedb3fcfc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received event network-vif-plugged-1dd58203-38ce-47f5-939b-2da425defa40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.042 222021 DEBUG oslo_concurrency.lockutils [req-df49e04f-3d76-4b8e-9ed2-d557a0eafe70 req-e3d592f5-61ab-4d6c-bdae-aa3dedb3fcfc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0d582649-9400-4246-ae39-a06921a27f40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.043 222021 DEBUG oslo_concurrency.lockutils [req-df49e04f-3d76-4b8e-9ed2-d557a0eafe70 req-e3d592f5-61ab-4d6c-bdae-aa3dedb3fcfc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.043 222021 DEBUG oslo_concurrency.lockutils [req-df49e04f-3d76-4b8e-9ed2-d557a0eafe70 req-e3d592f5-61ab-4d6c-bdae-aa3dedb3fcfc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.043 222021 DEBUG nova.compute.manager [req-df49e04f-3d76-4b8e-9ed2-d557a0eafe70 req-e3d592f5-61ab-4d6c-bdae-aa3dedb3fcfc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Processing event network-vif-plugged-1dd58203-38ce-47f5-939b-2da425defa40 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.044 222021 DEBUG nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.060 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.065 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.071 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164493.0646803, 0d582649-9400-4246-ae39-a06921a27f40 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.072 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.079 222021 INFO nova.virt.libvirt.driver [-] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Instance spawned successfully.#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.081 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.104 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.111 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:34:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:34:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:53.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.111 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.113 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.113 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.114 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.115 222021 DEBUG nova.virt.libvirt.driver [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.121 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.169 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.254 222021 INFO nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Took 12.62 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.255 222021 DEBUG nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.345 222021 INFO nova.compute.manager [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Took 14.36 seconds to build instance.#033[00m
Jan 23 05:34:53 np0005593233 nova_compute[222017]: 2026-01-23 10:34:53.386 222021 DEBUG oslo_concurrency.lockutils [None req-9dc84cbb-6cee-4b65-aeb4-5aebc4eed98a a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.482s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:53 np0005593233 podman[294323]: 2026-01-23 10:34:53.500845331 +0000 UTC m=+0.100547568 container create 24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:34:53 np0005593233 podman[294323]: 2026-01-23 10:34:53.454776522 +0000 UTC m=+0.054478799 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:34:53 np0005593233 systemd[1]: Started libpod-conmon-24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417.scope.
Jan 23 05:34:53 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:34:53 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b462144ee00e8822a20c381b77b654374a4ced91032fde4ce0764b295453a4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:34:53 np0005593233 podman[294323]: 2026-01-23 10:34:53.622707884 +0000 UTC m=+0.222410181 container init 24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:34:53 np0005593233 podman[294323]: 2026-01-23 10:34:53.630626179 +0000 UTC m=+0.230328396 container start 24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:34:53 np0005593233 neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3[294339]: [NOTICE]   (294343) : New worker (294345) forked
Jan 23 05:34:53 np0005593233 neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3[294339]: [NOTICE]   (294343) : Loading success.
Jan 23 05:34:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:54.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:55 np0005593233 nova_compute[222017]: 2026-01-23 10:34:55.044 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:34:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:55.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:34:55 np0005593233 nova_compute[222017]: 2026-01-23 10:34:55.237 222021 DEBUG nova.compute.manager [req-d40a946d-b5a4-401a-a08a-2c067fda079b req-0c74c616-1957-4f43-a8ed-1a2c2021d8ab 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received event network-vif-plugged-1dd58203-38ce-47f5-939b-2da425defa40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:34:55 np0005593233 nova_compute[222017]: 2026-01-23 10:34:55.238 222021 DEBUG oslo_concurrency.lockutils [req-d40a946d-b5a4-401a-a08a-2c067fda079b req-0c74c616-1957-4f43-a8ed-1a2c2021d8ab 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0d582649-9400-4246-ae39-a06921a27f40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:55 np0005593233 nova_compute[222017]: 2026-01-23 10:34:55.238 222021 DEBUG oslo_concurrency.lockutils [req-d40a946d-b5a4-401a-a08a-2c067fda079b req-0c74c616-1957-4f43-a8ed-1a2c2021d8ab 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:55 np0005593233 nova_compute[222017]: 2026-01-23 10:34:55.238 222021 DEBUG oslo_concurrency.lockutils [req-d40a946d-b5a4-401a-a08a-2c067fda079b req-0c74c616-1957-4f43-a8ed-1a2c2021d8ab 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:55 np0005593233 nova_compute[222017]: 2026-01-23 10:34:55.239 222021 DEBUG nova.compute.manager [req-d40a946d-b5a4-401a-a08a-2c067fda079b req-0c74c616-1957-4f43-a8ed-1a2c2021d8ab 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] No waiting events found dispatching network-vif-plugged-1dd58203-38ce-47f5-939b-2da425defa40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:34:55 np0005593233 nova_compute[222017]: 2026-01-23 10:34:55.239 222021 WARNING nova.compute.manager [req-d40a946d-b5a4-401a-a08a-2c067fda079b req-0c74c616-1957-4f43-a8ed-1a2c2021d8ab 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received unexpected event network-vif-plugged-1dd58203-38ce-47f5-939b-2da425defa40 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:34:55 np0005593233 nova_compute[222017]: 2026-01-23 10:34:55.346 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:56.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:57.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:57 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:58.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:59 np0005593233 podman[294407]: 2026-01-23 10:34:59.087245409 +0000 UTC m=+0.083635917 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 23 05:34:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:34:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:59.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:59 np0005593233 nova_compute[222017]: 2026-01-23 10:34:59.418 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:59 np0005593233 nova_compute[222017]: 2026-01-23 10:34:59.418 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:59 np0005593233 nova_compute[222017]: 2026-01-23 10:34:59.446 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:59 np0005593233 nova_compute[222017]: 2026-01-23 10:34:59.447 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:59 np0005593233 nova_compute[222017]: 2026-01-23 10:34:59.448 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:59 np0005593233 nova_compute[222017]: 2026-01-23 10:34:59.448 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:34:59 np0005593233 nova_compute[222017]: 2026-01-23 10:34:59.449 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:34:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1751588255' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:59 np0005593233 nova_compute[222017]: 2026-01-23 10:34:59.976 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.048 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.143 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.144 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.150 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.151 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.347 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.423 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.425 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3951MB free_disk=20.78510284423828GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.425 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.426 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:00.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.733 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 23f7c54d-ed5d-404f-8517-b5cd21d0c282 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.734 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0d582649-9400-4246-ae39-a06921a27f40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.734 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.734 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:35:00 np0005593233 nova_compute[222017]: 2026-01-23 10:35:00.865 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:01.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.154 222021 DEBUG nova.compute.manager [req-67057288-1114-4f19-bdd1-25e25dd3c9c5 req-14287eda-ada2-4e33-aab7-2aa5be08978a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received event network-changed-1dd58203-38ce-47f5-939b-2da425defa40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.156 222021 DEBUG nova.compute.manager [req-67057288-1114-4f19-bdd1-25e25dd3c9c5 req-14287eda-ada2-4e33-aab7-2aa5be08978a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Refreshing instance network info cache due to event network-changed-1dd58203-38ce-47f5-939b-2da425defa40. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.156 222021 DEBUG oslo_concurrency.lockutils [req-67057288-1114-4f19-bdd1-25e25dd3c9c5 req-14287eda-ada2-4e33-aab7-2aa5be08978a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.157 222021 DEBUG oslo_concurrency.lockutils [req-67057288-1114-4f19-bdd1-25e25dd3c9c5 req-14287eda-ada2-4e33-aab7-2aa5be08978a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.158 222021 DEBUG nova.network.neutron [req-67057288-1114-4f19-bdd1-25e25dd3c9c5 req-14287eda-ada2-4e33-aab7-2aa5be08978a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Refreshing network info cache for port 1dd58203-38ce-47f5-939b-2da425defa40 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:35:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:35:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/432471583' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.447 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.583s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.456 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.488 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.538 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:35:01 np0005593233 nova_compute[222017]: 2026-01-23 10:35:01.539 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:02.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:03.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:03 np0005593233 nova_compute[222017]: 2026-01-23 10:35:03.506 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:03 np0005593233 nova_compute[222017]: 2026-01-23 10:35:03.507 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:03 np0005593233 nova_compute[222017]: 2026-01-23 10:35:03.507 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:03 np0005593233 nova_compute[222017]: 2026-01-23 10:35:03.508 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:04 np0005593233 nova_compute[222017]: 2026-01-23 10:35:04.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:04 np0005593233 nova_compute[222017]: 2026-01-23 10:35:04.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:35:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:04 np0005593233 nova_compute[222017]: 2026-01-23 10:35:04.636 222021 DEBUG nova.network.neutron [req-67057288-1114-4f19-bdd1-25e25dd3c9c5 req-14287eda-ada2-4e33-aab7-2aa5be08978a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Updated VIF entry in instance network info cache for port 1dd58203-38ce-47f5-939b-2da425defa40. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:35:04 np0005593233 nova_compute[222017]: 2026-01-23 10:35:04.637 222021 DEBUG nova.network.neutron [req-67057288-1114-4f19-bdd1-25e25dd3c9c5 req-14287eda-ada2-4e33-aab7-2aa5be08978a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Updating instance_info_cache with network_info: [{"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:04.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:04 np0005593233 nova_compute[222017]: 2026-01-23 10:35:04.752 222021 DEBUG oslo_concurrency.lockutils [req-67057288-1114-4f19-bdd1-25e25dd3c9c5 req-14287eda-ada2-4e33-aab7-2aa5be08978a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:35:05 np0005593233 nova_compute[222017]: 2026-01-23 10:35:05.051 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:35:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:05.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:35:05 np0005593233 nova_compute[222017]: 2026-01-23 10:35:05.396 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:06.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:07.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:08 np0005593233 ovn_controller[130653]: 2026-01-23T10:35:08Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:c1:0c 10.100.0.13
Jan 23 05:35:08 np0005593233 ovn_controller[130653]: 2026-01-23T10:35:08Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:c1:0c 10.100.0.13
Jan 23 05:35:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:08.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:09.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:10 np0005593233 nova_compute[222017]: 2026-01-23 10:35:10.055 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:10 np0005593233 nova_compute[222017]: 2026-01-23 10:35:10.399 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:10.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:11.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:12 np0005593233 nova_compute[222017]: 2026-01-23 10:35:12.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:12 np0005593233 nova_compute[222017]: 2026-01-23 10:35:12.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:35:12 np0005593233 nova_compute[222017]: 2026-01-23 10:35:12.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:35:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:12.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:12 np0005593233 nova_compute[222017]: 2026-01-23 10:35:12.963 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:35:12 np0005593233 nova_compute[222017]: 2026-01-23 10:35:12.963 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:35:12 np0005593233 nova_compute[222017]: 2026-01-23 10:35:12.964 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:35:12 np0005593233 nova_compute[222017]: 2026-01-23 10:35:12.964 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:35:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:13.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:14 np0005593233 podman[294473]: 2026-01-23 10:35:14.151339749 +0000 UTC m=+0.143580521 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:35:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:14.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:15 np0005593233 nova_compute[222017]: 2026-01-23 10:35:15.057 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:15.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:15 np0005593233 nova_compute[222017]: 2026-01-23 10:35:15.403 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593233 nova_compute[222017]: 2026-01-23 10:35:15.526 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updating instance_info_cache with network_info: [{"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:15 np0005593233 nova_compute[222017]: 2026-01-23 10:35:15.548 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-23f7c54d-ed5d-404f-8517-b5cd21d0c282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:35:15 np0005593233 nova_compute[222017]: 2026-01-23 10:35:15.548 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:35:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:16.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:17.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:17 np0005593233 nova_compute[222017]: 2026-01-23 10:35:17.542 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:17.971 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:35:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:17.973 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:35:17 np0005593233 nova_compute[222017]: 2026-01-23 10:35:17.972 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:35:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:18.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:35:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:35:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:19.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:35:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:20 np0005593233 nova_compute[222017]: 2026-01-23 10:35:20.060 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:20 np0005593233 nova_compute[222017]: 2026-01-23 10:35:20.406 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:20.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:21.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:22.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:35:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:23.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.298 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.299 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.335 222021 DEBUG nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.497 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.498 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.509 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.510 222021 INFO nova.compute.claims [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.643 222021 DEBUG nova.scheduler.client.report [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.667 222021 DEBUG nova.scheduler.client.report [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.667 222021 DEBUG nova.compute.provider_tree [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.705 222021 DEBUG nova.scheduler.client.report [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.738 222021 DEBUG nova.scheduler.client.report [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:35:23 np0005593233 nova_compute[222017]: 2026-01-23 10:35:23.864 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:35:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2716554116' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.402 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.410 222021 DEBUG nova.compute.provider_tree [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:35:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.451 222021 DEBUG nova.scheduler.client.report [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.519 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.521 222021 DEBUG nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.602 222021 DEBUG nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.603 222021 DEBUG nova.network.neutron [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.650 222021 INFO nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.684 222021 DEBUG nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:35:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:24.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.831 222021 DEBUG nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.834 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.835 222021 INFO nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Creating image(s)#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.892 222021 DEBUG nova.storage.rbd_utils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.939 222021 DEBUG nova.storage.rbd_utils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.988 222021 DEBUG nova.storage.rbd_utils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:24 np0005593233 nova_compute[222017]: 2026-01-23 10:35:24.995 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.062 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.109 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.110 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.110 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.111 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.139 222021 DEBUG nova.storage.rbd_utils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.144 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:25.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.407 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.638 222021 DEBUG nova.policy [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a3cd8c3758e14f9c8e4ad1a9a94a9995', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b27af793a8cc42259216fbeaa302ba03', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.649 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.737 222021 DEBUG nova.storage.rbd_utils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] resizing rbd image 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.849 222021 DEBUG nova.objects.instance [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'migration_context' on Instance uuid 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.872 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.872 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Ensure instance console log exists: /var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.872 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.873 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:25 np0005593233 nova_compute[222017]: 2026-01-23 10:35:25.873 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:26.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:26.975 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:27.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:28 np0005593233 nova_compute[222017]: 2026-01-23 10:35:28.506 222021 DEBUG nova.network.neutron [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Successfully created port: a2c5fade-2999-4b9c-99b1-5e28b3ae3712 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:35:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:28.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:29.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:30 np0005593233 nova_compute[222017]: 2026-01-23 10:35:30.065 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:30 np0005593233 podman[294692]: 2026-01-23 10:35:30.077147815 +0000 UTC m=+0.084513523 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:35:30 np0005593233 nova_compute[222017]: 2026-01-23 10:35:30.411 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:30.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:31.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:31 np0005593233 nova_compute[222017]: 2026-01-23 10:35:31.598 222021 DEBUG nova.network.neutron [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Successfully updated port: a2c5fade-2999-4b9c-99b1-5e28b3ae3712 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:35:31 np0005593233 nova_compute[222017]: 2026-01-23 10:35:31.636 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:35:31 np0005593233 nova_compute[222017]: 2026-01-23 10:35:31.636 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquired lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:35:31 np0005593233 nova_compute[222017]: 2026-01-23 10:35:31.636 222021 DEBUG nova.network.neutron [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:35:31 np0005593233 nova_compute[222017]: 2026-01-23 10:35:31.834 222021 DEBUG nova.compute.manager [req-059ec179-2190-4515-bab1-32a46df7d597 req-f659ade0-1a49-4aa4-902a-6d7b4bccd658 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received event network-changed-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:35:31 np0005593233 nova_compute[222017]: 2026-01-23 10:35:31.835 222021 DEBUG nova.compute.manager [req-059ec179-2190-4515-bab1-32a46df7d597 req-f659ade0-1a49-4aa4-902a-6d7b4bccd658 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Refreshing instance network info cache due to event network-changed-a2c5fade-2999-4b9c-99b1-5e28b3ae3712. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:35:31 np0005593233 nova_compute[222017]: 2026-01-23 10:35:31.836 222021 DEBUG oslo_concurrency.lockutils [req-059ec179-2190-4515-bab1-32a46df7d597 req-f659ade0-1a49-4aa4-902a-6d7b4bccd658 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:35:32 np0005593233 nova_compute[222017]: 2026-01-23 10:35:32.547 222021 DEBUG nova.network.neutron [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:35:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:32.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:33.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.618 222021 DEBUG nova.network.neutron [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Updating instance_info_cache with network_info: [{"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:34.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.777 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Releasing lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.778 222021 DEBUG nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Instance network_info: |[{"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.778 222021 DEBUG oslo_concurrency.lockutils [req-059ec179-2190-4515-bab1-32a46df7d597 req-f659ade0-1a49-4aa4-902a-6d7b4bccd658 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.779 222021 DEBUG nova.network.neutron [req-059ec179-2190-4515-bab1-32a46df7d597 req-f659ade0-1a49-4aa4-902a-6d7b4bccd658 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Refreshing network info cache for port a2c5fade-2999-4b9c-99b1-5e28b3ae3712 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.782 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Start _get_guest_xml network_info=[{"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.789 222021 WARNING nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.798 222021 DEBUG nova.virt.libvirt.host [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.799 222021 DEBUG nova.virt.libvirt.host [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.807 222021 DEBUG nova.virt.libvirt.host [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.807 222021 DEBUG nova.virt.libvirt.host [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.809 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.809 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.810 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.810 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.810 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.810 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.810 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.811 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.811 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.811 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.811 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.811 222021 DEBUG nova.virt.hardware [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:35:34 np0005593233 nova_compute[222017]: 2026-01-23 10:35:34.815 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.068 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:35.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:35:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2296530039' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.324 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.365 222021 DEBUG nova.storage.rbd_utils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.372 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.485 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:35:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3793167494' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.888 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.889 222021 DEBUG nova.virt.libvirt.vif [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:35:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-gen-1-979705574',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-gen-1-979705574',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-gen',id=187,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqWyBajPk9lxnk3oOggmREf6c3e8Z1l3YtHwaQZ/cnGBO7I0r8ucErB7vOiK0qJNtSaWtRbC9tywTqzZwRuaNd5WHVZcbelBNDiChh7UcxXYMS1JZ3Qp0q7oG5MXquw/g==',key_name='tempest-TestSecurityGroupsBasicOps-910788169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-fk9rklan',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:35:24Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=3a8663ba-8fc7-40f6-bb82-33fb50a5edd4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.890 222021 DEBUG nova.network.os_vif_util [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.891 222021 DEBUG nova.network.os_vif_util [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:04:85,bridge_name='br-int',has_traffic_filtering=True,id=a2c5fade-2999-4b9c-99b1-5e28b3ae3712,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c5fade-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.892 222021 DEBUG nova.objects.instance [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.927 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <uuid>3a8663ba-8fc7-40f6-bb82-33fb50a5edd4</uuid>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <name>instance-000000bb</name>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-gen-1-979705574</nova:name>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:35:34</nova:creationTime>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <nova:user uuid="a3cd8c3758e14f9c8e4ad1a9a94a9995">tempest-TestSecurityGroupsBasicOps-622349977-project-member</nova:user>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <nova:project uuid="b27af793a8cc42259216fbeaa302ba03">tempest-TestSecurityGroupsBasicOps-622349977</nova:project>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <nova:port uuid="a2c5fade-2999-4b9c-99b1-5e28b3ae3712">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <entry name="serial">3a8663ba-8fc7-40f6-bb82-33fb50a5edd4</entry>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <entry name="uuid">3a8663ba-8fc7-40f6-bb82-33fb50a5edd4</entry>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk.config">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:66:04:85"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <target dev="tapa2c5fade-29"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4/console.log" append="off"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:35:35 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:35:35 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:35:35 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:35:35 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.928 222021 DEBUG nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Preparing to wait for external event network-vif-plugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.929 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.929 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.929 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.931 222021 DEBUG nova.virt.libvirt.vif [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:35:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-gen-1-979705574',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-gen-1-979705574',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-gen',id=187,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqWyBajPk9lxnk3oOggmREf6c3e8Z1l3YtHwaQZ/cnGBO7I0r8ucErB7vOiK0qJNtSaWtRbC9tywTqzZwRuaNd5WHVZcbelBNDiChh7UcxXYMS1JZ3Qp0q7oG5MXquw/g==',key_name='tempest-TestSecurityGroupsBasicOps-910788169',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-fk9rklan',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:35:24Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=3a8663ba-8fc7-40f6-bb82-33fb50a5edd4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.932 222021 DEBUG nova.network.os_vif_util [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.933 222021 DEBUG nova.network.os_vif_util [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:04:85,bridge_name='br-int',has_traffic_filtering=True,id=a2c5fade-2999-4b9c-99b1-5e28b3ae3712,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c5fade-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.934 222021 DEBUG os_vif [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:04:85,bridge_name='br-int',has_traffic_filtering=True,id=a2c5fade-2999-4b9c-99b1-5e28b3ae3712,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c5fade-29') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.935 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.935 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.936 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.942 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.943 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2c5fade-29, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.943 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2c5fade-29, col_values=(('external_ids', {'iface-id': 'a2c5fade-2999-4b9c-99b1-5e28b3ae3712', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:04:85', 'vm-uuid': '3a8663ba-8fc7-40f6-bb82-33fb50a5edd4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.946 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593233 NetworkManager[48871]: <info>  [1769164535.9478] manager: (tapa2c5fade-29): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.951 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.956 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593233 nova_compute[222017]: 2026-01-23 10:35:35.957 222021 INFO os_vif [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:04:85,bridge_name='br-int',has_traffic_filtering=True,id=a2c5fade-2999-4b9c-99b1-5e28b3ae3712,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c5fade-29')#033[00m
Jan 23 05:35:36 np0005593233 nova_compute[222017]: 2026-01-23 10:35:36.062 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:35:36 np0005593233 nova_compute[222017]: 2026-01-23 10:35:36.062 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:35:36 np0005593233 nova_compute[222017]: 2026-01-23 10:35:36.062 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No VIF found with MAC fa:16:3e:66:04:85, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:35:36 np0005593233 nova_compute[222017]: 2026-01-23 10:35:36.063 222021 INFO nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Using config drive#033[00m
Jan 23 05:35:36 np0005593233 nova_compute[222017]: 2026-01-23 10:35:36.096 222021 DEBUG nova.storage.rbd_utils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:36.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:36 np0005593233 nova_compute[222017]: 2026-01-23 10:35:36.917 222021 INFO nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Creating config drive at /var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4/disk.config#033[00m
Jan 23 05:35:36 np0005593233 nova_compute[222017]: 2026-01-23 10:35:36.926 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvh9x6h91 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.091 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvh9x6h91" returned: 0 in 0.165s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.131 222021 DEBUG nova.storage.rbd_utils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.137 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4/disk.config 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:37.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.253 222021 DEBUG nova.network.neutron [req-059ec179-2190-4515-bab1-32a46df7d597 req-f659ade0-1a49-4aa4-902a-6d7b4bccd658 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Updated VIF entry in instance network info cache for port a2c5fade-2999-4b9c-99b1-5e28b3ae3712. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.255 222021 DEBUG nova.network.neutron [req-059ec179-2190-4515-bab1-32a46df7d597 req-f659ade0-1a49-4aa4-902a-6d7b4bccd658 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Updating instance_info_cache with network_info: [{"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.279 222021 DEBUG oslo_concurrency.lockutils [req-059ec179-2190-4515-bab1-32a46df7d597 req-f659ade0-1a49-4aa4-902a-6d7b4bccd658 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.396 222021 DEBUG oslo_concurrency.processutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4/disk.config 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.259s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.398 222021 INFO nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Deleting local config drive /var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4/disk.config because it was imported into RBD.#033[00m
Jan 23 05:35:37 np0005593233 kernel: tapa2c5fade-29: entered promiscuous mode
Jan 23 05:35:37 np0005593233 NetworkManager[48871]: <info>  [1769164537.4817] manager: (tapa2c5fade-29): new Tun device (/org/freedesktop/NetworkManager/Devices/361)
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.484 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:35:37Z|00785|binding|INFO|Claiming lport a2c5fade-2999-4b9c-99b1-5e28b3ae3712 for this chassis.
Jan 23 05:35:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:35:37Z|00786|binding|INFO|a2c5fade-2999-4b9c-99b1-5e28b3ae3712: Claiming fa:16:3e:66:04:85 10.100.0.6
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.500 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:04:85 10.100.0.6'], port_security=['fa:16:3e:66:04:85 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3a8663ba-8fc7-40f6-bb82-33fb50a5edd4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4342db0d-36b2-4fd6-ba6a-f76b2638107c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb326908-39bf-4364-9c9c-7086bd6a3074, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=a2c5fade-2999-4b9c-99b1-5e28b3ae3712) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.503 140224 INFO neutron.agent.ovn.metadata.agent [-] Port a2c5fade-2999-4b9c-99b1-5e28b3ae3712 in datapath bd89ca51-fcdd-4a45-8aab-c59b92ee76f3 bound to our chassis#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.506 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd89ca51-fcdd-4a45-8aab-c59b92ee76f3#033[00m
Jan 23 05:35:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:35:37Z|00787|binding|INFO|Setting lport a2c5fade-2999-4b9c-99b1-5e28b3ae3712 ovn-installed in OVS
Jan 23 05:35:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:35:37Z|00788|binding|INFO|Setting lport a2c5fade-2999-4b9c-99b1-5e28b3ae3712 up in Southbound
Jan 23 05:35:37 np0005593233 systemd-udevd[294848]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.521 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.525 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.529 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f6fdd5e6-7c7b-4aa8-83cb-ef12a556cc02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:37 np0005593233 NetworkManager[48871]: <info>  [1769164537.5371] device (tapa2c5fade-29): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:35:37 np0005593233 NetworkManager[48871]: <info>  [1769164537.5381] device (tapa2c5fade-29): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:35:37 np0005593233 systemd-machined[190954]: New machine qemu-86-instance-000000bb.
Jan 23 05:35:37 np0005593233 systemd[1]: Started Virtual Machine qemu-86-instance-000000bb.
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.566 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f17eed3a-1c19-4451-b37d-06ebd18407a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.569 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[924790ab-8569-4799-99d3-08c4ab2f2abf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.603 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[4f88f82d-8bc1-41de-a207-1cf922a0b757]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.622 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8e3c49fd-6c11-4afd-8f6d-c681096bb809]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd89ca51-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:85:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 840529, 'reachable_time': 16257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294863, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.641 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5896f2f1-0113-41c2-9518-a77ad48d3211]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbd89ca51-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 840545, 'tstamp': 840545}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294864, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapbd89ca51-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 840550, 'tstamp': 840550}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294864, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.644 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd89ca51-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.692 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:37 np0005593233 nova_compute[222017]: 2026-01-23 10:35:37.694 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.694 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd89ca51-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.695 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.695 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd89ca51-f0, col_values=(('external_ids', {'iface-id': 'fcb9e31a-a2d9-4eea-8d7c-3d1e3c5567cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:37.695 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.420 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164538.4191873, 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.420 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] VM Started (Lifecycle Event)#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.457 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.466 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164538.4223588, 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.466 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.502 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.507 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.543 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.626 222021 DEBUG nova.compute.manager [req-a450ca48-6611-4e22-bec9-368939ecbdb8 req-aae03336-a048-4838-a2c2-2c8f0115c6cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received event network-vif-plugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.628 222021 DEBUG oslo_concurrency.lockutils [req-a450ca48-6611-4e22-bec9-368939ecbdb8 req-aae03336-a048-4838-a2c2-2c8f0115c6cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.629 222021 DEBUG oslo_concurrency.lockutils [req-a450ca48-6611-4e22-bec9-368939ecbdb8 req-aae03336-a048-4838-a2c2-2c8f0115c6cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.629 222021 DEBUG oslo_concurrency.lockutils [req-a450ca48-6611-4e22-bec9-368939ecbdb8 req-aae03336-a048-4838-a2c2-2c8f0115c6cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.630 222021 DEBUG nova.compute.manager [req-a450ca48-6611-4e22-bec9-368939ecbdb8 req-aae03336-a048-4838-a2c2-2c8f0115c6cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Processing event network-vif-plugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.631 222021 DEBUG nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.636 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164538.6357605, 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.637 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.640 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.647 222021 INFO nova.virt.libvirt.driver [-] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Instance spawned successfully.#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.648 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.679 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.687 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.688 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.689 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.690 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.691 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.692 222021 DEBUG nova.virt.libvirt.driver [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.699 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.762 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:35:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:38.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.785 222021 INFO nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Took 13.95 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.786 222021 DEBUG nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.868 222021 INFO nova.compute.manager [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Took 15.45 seconds to build instance.#033[00m
Jan 23 05:35:38 np0005593233 nova_compute[222017]: 2026-01-23 10:35:38.902 222021 DEBUG oslo_concurrency.lockutils [None req-b6a00dcd-eabb-41e2-b59c-aac8d61bc852 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:39.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:40 np0005593233 nova_compute[222017]: 2026-01-23 10:35:40.070 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:40.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:40 np0005593233 nova_compute[222017]: 2026-01-23 10:35:40.894 222021 DEBUG nova.compute.manager [req-fa4799e5-5352-44da-a1a9-4de115afac46 req-2b9057f2-a937-4afc-8a8b-a52fc442177e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received event network-vif-plugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:35:40 np0005593233 nova_compute[222017]: 2026-01-23 10:35:40.895 222021 DEBUG oslo_concurrency.lockutils [req-fa4799e5-5352-44da-a1a9-4de115afac46 req-2b9057f2-a937-4afc-8a8b-a52fc442177e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:40 np0005593233 nova_compute[222017]: 2026-01-23 10:35:40.895 222021 DEBUG oslo_concurrency.lockutils [req-fa4799e5-5352-44da-a1a9-4de115afac46 req-2b9057f2-a937-4afc-8a8b-a52fc442177e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:40 np0005593233 nova_compute[222017]: 2026-01-23 10:35:40.895 222021 DEBUG oslo_concurrency.lockutils [req-fa4799e5-5352-44da-a1a9-4de115afac46 req-2b9057f2-a937-4afc-8a8b-a52fc442177e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:40 np0005593233 nova_compute[222017]: 2026-01-23 10:35:40.896 222021 DEBUG nova.compute.manager [req-fa4799e5-5352-44da-a1a9-4de115afac46 req-2b9057f2-a937-4afc-8a8b-a52fc442177e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] No waiting events found dispatching network-vif-plugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:35:40 np0005593233 nova_compute[222017]: 2026-01-23 10:35:40.896 222021 WARNING nova.compute.manager [req-fa4799e5-5352-44da-a1a9-4de115afac46 req-2b9057f2-a937-4afc-8a8b-a52fc442177e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received unexpected event network-vif-plugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:35:40 np0005593233 nova_compute[222017]: 2026-01-23 10:35:40.946 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:41.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:42.696 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:42.697 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:35:42.698 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:42.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:43.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:44.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:45 np0005593233 nova_compute[222017]: 2026-01-23 10:35:45.076 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:45 np0005593233 podman[294909]: 2026-01-23 10:35:45.126033562 +0000 UTC m=+0.126067333 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 23 05:35:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:45.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:45 np0005593233 nova_compute[222017]: 2026-01-23 10:35:45.792 222021 DEBUG nova.compute.manager [req-ba9e87bf-c8e8-4d56-a7dc-34c3c3ba39a8 req-cb289372-a8e9-4224-8b65-426f5d4fddbe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received event network-changed-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:35:45 np0005593233 nova_compute[222017]: 2026-01-23 10:35:45.793 222021 DEBUG nova.compute.manager [req-ba9e87bf-c8e8-4d56-a7dc-34c3c3ba39a8 req-cb289372-a8e9-4224-8b65-426f5d4fddbe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Refreshing instance network info cache due to event network-changed-a2c5fade-2999-4b9c-99b1-5e28b3ae3712. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:35:45 np0005593233 nova_compute[222017]: 2026-01-23 10:35:45.793 222021 DEBUG oslo_concurrency.lockutils [req-ba9e87bf-c8e8-4d56-a7dc-34c3c3ba39a8 req-cb289372-a8e9-4224-8b65-426f5d4fddbe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:35:45 np0005593233 nova_compute[222017]: 2026-01-23 10:35:45.794 222021 DEBUG oslo_concurrency.lockutils [req-ba9e87bf-c8e8-4d56-a7dc-34c3c3ba39a8 req-cb289372-a8e9-4224-8b65-426f5d4fddbe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:35:45 np0005593233 nova_compute[222017]: 2026-01-23 10:35:45.794 222021 DEBUG nova.network.neutron [req-ba9e87bf-c8e8-4d56-a7dc-34c3c3ba39a8 req-cb289372-a8e9-4224-8b65-426f5d4fddbe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Refreshing network info cache for port a2c5fade-2999-4b9c-99b1-5e28b3ae3712 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:35:45 np0005593233 nova_compute[222017]: 2026-01-23 10:35:45.949 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:46.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:35:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:47.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:35:47 np0005593233 nova_compute[222017]: 2026-01-23 10:35:47.920 222021 DEBUG nova.compute.manager [req-43a18a40-760f-444d-a709-927f3e0ef458 req-8f974c86-ef73-4c5a-878d-b5056d12f04f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received event network-changed-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:35:47 np0005593233 nova_compute[222017]: 2026-01-23 10:35:47.921 222021 DEBUG nova.compute.manager [req-43a18a40-760f-444d-a709-927f3e0ef458 req-8f974c86-ef73-4c5a-878d-b5056d12f04f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Refreshing instance network info cache due to event network-changed-a2c5fade-2999-4b9c-99b1-5e28b3ae3712. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:35:47 np0005593233 nova_compute[222017]: 2026-01-23 10:35:47.921 222021 DEBUG oslo_concurrency.lockutils [req-43a18a40-760f-444d-a709-927f3e0ef458 req-8f974c86-ef73-4c5a-878d-b5056d12f04f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:35:48 np0005593233 nova_compute[222017]: 2026-01-23 10:35:48.623 222021 DEBUG nova.network.neutron [req-ba9e87bf-c8e8-4d56-a7dc-34c3c3ba39a8 req-cb289372-a8e9-4224-8b65-426f5d4fddbe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Updated VIF entry in instance network info cache for port a2c5fade-2999-4b9c-99b1-5e28b3ae3712. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:35:48 np0005593233 nova_compute[222017]: 2026-01-23 10:35:48.625 222021 DEBUG nova.network.neutron [req-ba9e87bf-c8e8-4d56-a7dc-34c3c3ba39a8 req-cb289372-a8e9-4224-8b65-426f5d4fddbe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Updating instance_info_cache with network_info: [{"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:48 np0005593233 nova_compute[222017]: 2026-01-23 10:35:48.660 222021 DEBUG oslo_concurrency.lockutils [req-ba9e87bf-c8e8-4d56-a7dc-34c3c3ba39a8 req-cb289372-a8e9-4224-8b65-426f5d4fddbe 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:35:48 np0005593233 nova_compute[222017]: 2026-01-23 10:35:48.661 222021 DEBUG oslo_concurrency.lockutils [req-43a18a40-760f-444d-a709-927f3e0ef458 req-8f974c86-ef73-4c5a-878d-b5056d12f04f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:35:48 np0005593233 nova_compute[222017]: 2026-01-23 10:35:48.662 222021 DEBUG nova.network.neutron [req-43a18a40-760f-444d-a709-927f3e0ef458 req-8f974c86-ef73-4c5a-878d-b5056d12f04f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Refreshing network info cache for port a2c5fade-2999-4b9c-99b1-5e28b3ae3712 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:35:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:48.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:49.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:50 np0005593233 nova_compute[222017]: 2026-01-23 10:35:50.081 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e359 e359: 3 total, 3 up, 3 in
Jan 23 05:35:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:50.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:50 np0005593233 nova_compute[222017]: 2026-01-23 10:35:50.951 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:51 np0005593233 nova_compute[222017]: 2026-01-23 10:35:51.021 222021 DEBUG nova.network.neutron [req-43a18a40-760f-444d-a709-927f3e0ef458 req-8f974c86-ef73-4c5a-878d-b5056d12f04f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Updated VIF entry in instance network info cache for port a2c5fade-2999-4b9c-99b1-5e28b3ae3712. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:35:51 np0005593233 nova_compute[222017]: 2026-01-23 10:35:51.022 222021 DEBUG nova.network.neutron [req-43a18a40-760f-444d-a709-927f3e0ef458 req-8f974c86-ef73-4c5a-878d-b5056d12f04f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Updating instance_info_cache with network_info: [{"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:51 np0005593233 nova_compute[222017]: 2026-01-23 10:35:51.038 222021 DEBUG oslo_concurrency.lockutils [req-43a18a40-760f-444d-a709-927f3e0ef458 req-8f974c86-ef73-4c5a-878d-b5056d12f04f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:35:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:51.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:35:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:52.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:35:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:53.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:54.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:55 np0005593233 nova_compute[222017]: 2026-01-23 10:35:55.078 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:55.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e360 e360: 3 total, 3 up, 3 in
Jan 23 05:35:55 np0005593233 nova_compute[222017]: 2026-01-23 10:35:55.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:56.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:57.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e361 e361: 3 total, 3 up, 3 in
Jan 23 05:35:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:58.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e362 e362: 3 total, 3 up, 3 in
Jan 23 05:35:59 np0005593233 ovn_controller[130653]: 2026-01-23T10:35:59Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:66:04:85 10.100.0.6
Jan 23 05:35:59 np0005593233 ovn_controller[130653]: 2026-01-23T10:35:59Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:66:04:85 10.100.0.6
Jan 23 05:35:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:35:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:35:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:59.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:35:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e362 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:35:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:35:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:35:59 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:36:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e363 e363: 3 total, 3 up, 3 in
Jan 23 05:36:00 np0005593233 nova_compute[222017]: 2026-01-23 10:36:00.125 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:00 np0005593233 nova_compute[222017]: 2026-01-23 10:36:00.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:00.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:00 np0005593233 nova_compute[222017]: 2026-01-23 10:36:00.956 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:36:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:36:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:36:01 np0005593233 podman[295068]: 2026-01-23 10:36:01.076005519 +0000 UTC m=+0.072283135 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 05:36:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:01.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:01 np0005593233 nova_compute[222017]: 2026-01-23 10:36:01.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:01 np0005593233 nova_compute[222017]: 2026-01-23 10:36:01.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:01 np0005593233 nova_compute[222017]: 2026-01-23 10:36:01.442 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:01 np0005593233 nova_compute[222017]: 2026-01-23 10:36:01.442 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:01 np0005593233 nova_compute[222017]: 2026-01-23 10:36:01.443 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:01 np0005593233 nova_compute[222017]: 2026-01-23 10:36:01.443 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:36:01 np0005593233 nova_compute[222017]: 2026-01-23 10:36:01.444 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2400054296' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:01 np0005593233 nova_compute[222017]: 2026-01-23 10:36:01.951 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.068 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.068 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.072 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.073 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000bb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.077 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.077 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000b9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.295 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.296 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3741MB free_disk=20.67003631591797GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.297 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.297 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.421 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 23f7c54d-ed5d-404f-8517-b5cd21d0c282 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.421 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0d582649-9400-4246-ae39-a06921a27f40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.421 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.422 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.422 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:36:02 np0005593233 nova_compute[222017]: 2026-01-23 10:36:02.519 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:02.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:02 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4286081051' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:03 np0005593233 nova_compute[222017]: 2026-01-23 10:36:03.000 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:03 np0005593233 nova_compute[222017]: 2026-01-23 10:36:03.013 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:03 np0005593233 nova_compute[222017]: 2026-01-23 10:36:03.035 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:03 np0005593233 nova_compute[222017]: 2026-01-23 10:36:03.064 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:36:03 np0005593233 nova_compute[222017]: 2026-01-23 10:36:03.064 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:03.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.065 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.066 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:36:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:04.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.926 222021 DEBUG oslo_concurrency.lockutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.927 222021 DEBUG oslo_concurrency.lockutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.927 222021 DEBUG oslo_concurrency.lockutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.928 222021 DEBUG oslo_concurrency.lockutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.929 222021 DEBUG oslo_concurrency.lockutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.931 222021 INFO nova.compute.manager [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Terminating instance#033[00m
Jan 23 05:36:04 np0005593233 nova_compute[222017]: 2026-01-23 10:36:04.933 222021 DEBUG nova.compute.manager [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:36:05 np0005593233 kernel: tapa2c5fade-29 (unregistering): left promiscuous mode
Jan 23 05:36:05 np0005593233 NetworkManager[48871]: <info>  [1769164565.1227] device (tapa2c5fade-29): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:36:05 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:05Z|00789|binding|INFO|Releasing lport a2c5fade-2999-4b9c-99b1-5e28b3ae3712 from this chassis (sb_readonly=0)
Jan 23 05:36:05 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:05Z|00790|binding|INFO|Setting lport a2c5fade-2999-4b9c-99b1-5e28b3ae3712 down in Southbound
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.147 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:05Z|00791|binding|INFO|Removing iface tapa2c5fade-29 ovn-installed in OVS
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.160 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:04:85 10.100.0.6', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3a8663ba-8fc7-40f6-bb82-33fb50a5edd4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb326908-39bf-4364-9c9c-7086bd6a3074, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=a2c5fade-2999-4b9c-99b1-5e28b3ae3712) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.162 140224 INFO neutron.agent.ovn.metadata.agent [-] Port a2c5fade-2999-4b9c-99b1-5e28b3ae3712 in datapath bd89ca51-fcdd-4a45-8aab-c59b92ee76f3 unbound from our chassis#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.163 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd89ca51-fcdd-4a45-8aab-c59b92ee76f3#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.170 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.181 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[138dd38d-5cd4-4bde-893f-03cb31105f1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:05 np0005593233 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000bb.scope: Deactivated successfully.
Jan 23 05:36:05 np0005593233 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000bb.scope: Consumed 17.281s CPU time.
Jan 23 05:36:05 np0005593233 systemd-machined[190954]: Machine qemu-86-instance-000000bb terminated.
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.215 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e1dcad87-db75-4736-a51f-fd4bc5a654e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.219 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ced70a76-4348-4cb1-9aaf-52416a9a67f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:05.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.264 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c17101a7-6955-4a42-8631-f15ef4766ba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.290 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[852798af-6c43-4f4a-a383-ab196f855ef0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd89ca51-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:85:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 19, 'tx_packets': 7, 'rx_bytes': 1370, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 19, 'tx_packets': 7, 'rx_bytes': 1370, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 840529, 'reachable_time': 16257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 13, 'inoctets': 936, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 13, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 936, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 13, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295145, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.314 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3e83326a-f6a7-4a17-983c-40c833de79a4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbd89ca51-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 840545, 'tstamp': 840545}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295146, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapbd89ca51-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 840550, 'tstamp': 840550}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295146, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.316 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd89ca51-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.318 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.324 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.325 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd89ca51-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.325 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.325 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd89ca51-f0, col_values=(('external_ids', {'iface-id': 'fcb9e31a-a2d9-4eea-8d7c-3d1e3c5567cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.326 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.387 222021 INFO nova.virt.libvirt.driver [-] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Instance destroyed successfully.#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.388 222021 DEBUG nova.objects.instance [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'resources' on Instance uuid 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.443 222021 DEBUG nova.virt.libvirt.vif [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:35:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-gen-1-979705574',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-gen-1-979705574',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-gen',id=187,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqWyBajPk9lxnk3oOggmREf6c3e8Z1l3YtHwaQZ/cnGBO7I0r8ucErB7vOiK0qJNtSaWtRbC9tywTqzZwRuaNd5WHVZcbelBNDiChh7UcxXYMS1JZ3Qp0q7oG5MXquw/g==',key_name='tempest-TestSecurityGroupsBasicOps-910788169',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:35:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-fk9rklan',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:35:38Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=3a8663ba-8fc7-40f6-bb82-33fb50a5edd4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.444 222021 DEBUG nova.network.os_vif_util [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "address": "fa:16:3e:66:04:85", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c5fade-29", "ovs_interfaceid": "a2c5fade-2999-4b9c-99b1-5e28b3ae3712", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.446 222021 DEBUG nova.network.os_vif_util [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:04:85,bridge_name='br-int',has_traffic_filtering=True,id=a2c5fade-2999-4b9c-99b1-5e28b3ae3712,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c5fade-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.447 222021 DEBUG os_vif [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:04:85,bridge_name='br-int',has_traffic_filtering=True,id=a2c5fade-2999-4b9c-99b1-5e28b3ae3712,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c5fade-29') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.452 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.452 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2c5fade-29, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.455 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.458 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.464 222021 INFO os_vif [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:04:85,bridge_name='br-int',has_traffic_filtering=True,id=a2c5fade-2999-4b9c-99b1-5e28b3ae3712,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c5fade-29')#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.779 222021 DEBUG nova.compute.manager [req-a2ef2036-0127-42b0-9bab-fcf0cb6bc617 req-ae11e013-f515-40ba-87a2-7aea62b8df8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received event network-vif-unplugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.779 222021 DEBUG oslo_concurrency.lockutils [req-a2ef2036-0127-42b0-9bab-fcf0cb6bc617 req-ae11e013-f515-40ba-87a2-7aea62b8df8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.780 222021 DEBUG oslo_concurrency.lockutils [req-a2ef2036-0127-42b0-9bab-fcf0cb6bc617 req-ae11e013-f515-40ba-87a2-7aea62b8df8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.780 222021 DEBUG oslo_concurrency.lockutils [req-a2ef2036-0127-42b0-9bab-fcf0cb6bc617 req-ae11e013-f515-40ba-87a2-7aea62b8df8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.780 222021 DEBUG nova.compute.manager [req-a2ef2036-0127-42b0-9bab-fcf0cb6bc617 req-ae11e013-f515-40ba-87a2-7aea62b8df8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] No waiting events found dispatching network-vif-unplugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.781 222021 DEBUG nova.compute.manager [req-a2ef2036-0127-42b0-9bab-fcf0cb6bc617 req-ae11e013-f515-40ba-87a2-7aea62b8df8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received event network-vif-unplugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.820 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:36:05 np0005593233 nova_compute[222017]: 2026-01-23 10:36:05.820 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:05.822 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:36:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:06.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:06.825 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:07.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.578 222021 INFO nova.virt.libvirt.driver [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Deleting instance files /var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_del#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.579 222021 INFO nova.virt.libvirt.driver [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Deletion of /var/lib/nova/instances/3a8663ba-8fc7-40f6-bb82-33fb50a5edd4_del complete#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.661 222021 INFO nova.compute.manager [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Took 2.73 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.662 222021 DEBUG oslo.service.loopingcall [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.663 222021 DEBUG nova.compute.manager [-] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.664 222021 DEBUG nova.network.neutron [-] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.917 222021 DEBUG nova.compute.manager [req-44ed8c5a-c763-4413-a75a-3b3e1c8eba87 req-d85104ba-2e48-46b3-9818-f2f0fc80fc12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received event network-vif-plugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.918 222021 DEBUG oslo_concurrency.lockutils [req-44ed8c5a-c763-4413-a75a-3b3e1c8eba87 req-d85104ba-2e48-46b3-9818-f2f0fc80fc12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.919 222021 DEBUG oslo_concurrency.lockutils [req-44ed8c5a-c763-4413-a75a-3b3e1c8eba87 req-d85104ba-2e48-46b3-9818-f2f0fc80fc12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.919 222021 DEBUG oslo_concurrency.lockutils [req-44ed8c5a-c763-4413-a75a-3b3e1c8eba87 req-d85104ba-2e48-46b3-9818-f2f0fc80fc12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.920 222021 DEBUG nova.compute.manager [req-44ed8c5a-c763-4413-a75a-3b3e1c8eba87 req-d85104ba-2e48-46b3-9818-f2f0fc80fc12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] No waiting events found dispatching network-vif-plugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:07 np0005593233 nova_compute[222017]: 2026-01-23 10:36:07.920 222021 WARNING nova.compute.manager [req-44ed8c5a-c763-4413-a75a-3b3e1c8eba87 req-d85104ba-2e48-46b3-9818-f2f0fc80fc12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received unexpected event network-vif-plugged-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:36:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:08.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:08 np0005593233 nova_compute[222017]: 2026-01-23 10:36:08.868 222021 DEBUG nova.network.neutron [-] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e364 e364: 3 total, 3 up, 3 in
Jan 23 05:36:08 np0005593233 nova_compute[222017]: 2026-01-23 10:36:08.954 222021 INFO nova.compute.manager [-] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Took 1.29 seconds to deallocate network for instance.#033[00m
Jan 23 05:36:09 np0005593233 nova_compute[222017]: 2026-01-23 10:36:09.051 222021 DEBUG oslo_concurrency.lockutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:09 np0005593233 nova_compute[222017]: 2026-01-23 10:36:09.052 222021 DEBUG oslo_concurrency.lockutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:09 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:36:09 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:36:09 np0005593233 nova_compute[222017]: 2026-01-23 10:36:09.224 222021 DEBUG oslo_concurrency.processutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:09.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:09 np0005593233 nova_compute[222017]: 2026-01-23 10:36:09.322 222021 DEBUG nova.compute.manager [req-0dc72f07-34f6-4088-b4db-5eece13fd9c5 req-46976e6f-f2f8-41cb-97b6-a3579d8da0b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Received event network-vif-deleted-a2c5fade-2999-4b9c-99b1-5e28b3ae3712 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e364 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3887533845' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:09 np0005593233 nova_compute[222017]: 2026-01-23 10:36:09.727 222021 DEBUG oslo_concurrency.processutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:09 np0005593233 nova_compute[222017]: 2026-01-23 10:36:09.736 222021 DEBUG nova.compute.provider_tree [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:09 np0005593233 nova_compute[222017]: 2026-01-23 10:36:09.781 222021 DEBUG nova.scheduler.client.report [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:09 np0005593233 nova_compute[222017]: 2026-01-23 10:36:09.818 222021 DEBUG oslo_concurrency.lockutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.766s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:09 np0005593233 nova_compute[222017]: 2026-01-23 10:36:09.906 222021 INFO nova.scheduler.client.report [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Deleted allocations for instance 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4#033[00m
Jan 23 05:36:10 np0005593233 nova_compute[222017]: 2026-01-23 10:36:10.008 222021 DEBUG oslo_concurrency.lockutils [None req-a92f3bc3-1563-4a43-886c-56ae20e92a4e a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "3a8663ba-8fc7-40f6-bb82-33fb50a5edd4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:10 np0005593233 nova_compute[222017]: 2026-01-23 10:36:10.171 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:10 np0005593233 nova_compute[222017]: 2026-01-23 10:36:10.454 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:10.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:11.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.585 222021 DEBUG nova.compute.manager [req-c01a3c6b-ed49-492b-93b0-c3ae0167e9d8 req-db64b1a3-d311-465c-8b8d-f68c8023fef0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received event network-changed-1dd58203-38ce-47f5-939b-2da425defa40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.586 222021 DEBUG nova.compute.manager [req-c01a3c6b-ed49-492b-93b0-c3ae0167e9d8 req-db64b1a3-d311-465c-8b8d-f68c8023fef0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Refreshing instance network info cache due to event network-changed-1dd58203-38ce-47f5-939b-2da425defa40. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.586 222021 DEBUG oslo_concurrency.lockutils [req-c01a3c6b-ed49-492b-93b0-c3ae0167e9d8 req-db64b1a3-d311-465c-8b8d-f68c8023fef0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.586 222021 DEBUG oslo_concurrency.lockutils [req-c01a3c6b-ed49-492b-93b0-c3ae0167e9d8 req-db64b1a3-d311-465c-8b8d-f68c8023fef0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.587 222021 DEBUG nova.network.neutron [req-c01a3c6b-ed49-492b-93b0-c3ae0167e9d8 req-db64b1a3-d311-465c-8b8d-f68c8023fef0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Refreshing network info cache for port 1dd58203-38ce-47f5-939b-2da425defa40 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.795 222021 DEBUG oslo_concurrency.lockutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "0d582649-9400-4246-ae39-a06921a27f40" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.796 222021 DEBUG oslo_concurrency.lockutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.797 222021 DEBUG oslo_concurrency.lockutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "0d582649-9400-4246-ae39-a06921a27f40-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.798 222021 DEBUG oslo_concurrency.lockutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.798 222021 DEBUG oslo_concurrency.lockutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.800 222021 INFO nova.compute.manager [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Terminating instance#033[00m
Jan 23 05:36:11 np0005593233 nova_compute[222017]: 2026-01-23 10:36:11.801 222021 DEBUG nova.compute.manager [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:36:12 np0005593233 kernel: tap1dd58203-38 (unregistering): left promiscuous mode
Jan 23 05:36:12 np0005593233 NetworkManager[48871]: <info>  [1769164572.0987] device (tap1dd58203-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:36:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:12Z|00792|binding|INFO|Releasing lport 1dd58203-38ce-47f5-939b-2da425defa40 from this chassis (sb_readonly=0)
Jan 23 05:36:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:12Z|00793|binding|INFO|Setting lport 1dd58203-38ce-47f5-939b-2da425defa40 down in Southbound
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.106 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:12Z|00794|binding|INFO|Removing iface tap1dd58203-38 ovn-installed in OVS
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.109 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.130 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:12 np0005593233 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b9.scope: Deactivated successfully.
Jan 23 05:36:12 np0005593233 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b9.scope: Consumed 18.910s CPU time.
Jan 23 05:36:12 np0005593233 systemd-machined[190954]: Machine qemu-85-instance-000000b9 terminated.
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.176 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:c1:0c 10.100.0.13'], port_security=['fa:16:3e:b1:c1:0c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0d582649-9400-4246-ae39-a06921a27f40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4342db0d-36b2-4fd6-ba6a-f76b2638107c db5a0f5f-1e93-4510-8b21-d546f6b70a95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb326908-39bf-4364-9c9c-7086bd6a3074, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=1dd58203-38ce-47f5-939b-2da425defa40) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.177 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 1dd58203-38ce-47f5-939b-2da425defa40 in datapath bd89ca51-fcdd-4a45-8aab-c59b92ee76f3 unbound from our chassis#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.179 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd89ca51-fcdd-4a45-8aab-c59b92ee76f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.180 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ef749c-8097-478d-a9b2-2a5857efa1ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.181 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3 namespace which is not needed anymore#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.242 222021 INFO nova.virt.libvirt.driver [-] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Instance destroyed successfully.#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.243 222021 DEBUG nova.objects.instance [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'resources' on Instance uuid 0d582649-9400-4246-ae39-a06921a27f40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.263 222021 DEBUG nova.virt.libvirt.vif [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:34:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-831936106',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-831936106',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=185,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqWyBajPk9lxnk3oOggmREf6c3e8Z1l3YtHwaQZ/cnGBO7I0r8ucErB7vOiK0qJNtSaWtRbC9tywTqzZwRuaNd5WHVZcbelBNDiChh7UcxXYMS1JZ3Qp0q7oG5MXquw/g==',key_name='tempest-TestSecurityGroupsBasicOps-910788169',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:34:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-2eu0ytuq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:34:53Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=0d582649-9400-4246-ae39-a06921a27f40,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.264 222021 DEBUG nova.network.os_vif_util [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.265 222021 DEBUG nova.network.os_vif_util [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:c1:0c,bridge_name='br-int',has_traffic_filtering=True,id=1dd58203-38ce-47f5-939b-2da425defa40,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dd58203-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.266 222021 DEBUG os_vif [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:c1:0c,bridge_name='br-int',has_traffic_filtering=True,id=1dd58203-38ce-47f5-939b-2da425defa40,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dd58203-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.268 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.268 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1dd58203-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.275 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.280 222021 INFO os_vif [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:c1:0c,bridge_name='br-int',has_traffic_filtering=True,id=1dd58203-38ce-47f5-939b-2da425defa40,network=Network(bd89ca51-fcdd-4a45-8aab-c59b92ee76f3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dd58203-38')#033[00m
Jan 23 05:36:12 np0005593233 neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3[294339]: [NOTICE]   (294343) : haproxy version is 2.8.14-c23fe91
Jan 23 05:36:12 np0005593233 neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3[294339]: [NOTICE]   (294343) : path to executable is /usr/sbin/haproxy
Jan 23 05:36:12 np0005593233 neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3[294339]: [WARNING]  (294343) : Exiting Master process...
Jan 23 05:36:12 np0005593233 neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3[294339]: [WARNING]  (294343) : Exiting Master process...
Jan 23 05:36:12 np0005593233 neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3[294339]: [ALERT]    (294343) : Current worker (294345) exited with code 143 (Terminated)
Jan 23 05:36:12 np0005593233 neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3[294339]: [WARNING]  (294343) : All workers exited. Exiting... (0)
Jan 23 05:36:12 np0005593233 systemd[1]: libpod-24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417.scope: Deactivated successfully.
Jan 23 05:36:12 np0005593233 podman[295292]: 2026-01-23 10:36:12.35122807 +0000 UTC m=+0.046675658 container died 24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:36:12 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417-userdata-shm.mount: Deactivated successfully.
Jan 23 05:36:12 np0005593233 systemd[1]: var-lib-containers-storage-overlay-8b462144ee00e8822a20c381b77b654374a4ced91032fde4ce0764b295453a4a-merged.mount: Deactivated successfully.
Jan 23 05:36:12 np0005593233 podman[295292]: 2026-01-23 10:36:12.389126117 +0000 UTC m=+0.084573705 container cleanup 24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 05:36:12 np0005593233 systemd[1]: libpod-conmon-24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417.scope: Deactivated successfully.
Jan 23 05:36:12 np0005593233 podman[295334]: 2026-01-23 10:36:12.463464439 +0000 UTC m=+0.047012567 container remove 24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.471 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c66e2640-ca17-4002-b18a-790e9b8380b3]: (4, ('Fri Jan 23 10:36:12 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3 (24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417)\n24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417\nFri Jan 23 10:36:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3 (24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417)\n24153c48edf0ce895731d078872b9700e22abfdf755cd99a9eb1b24614630417\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.474 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[059a0f08-1554-4709-9e3e-d2ed5d221be6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.476 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd89ca51-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.479 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:12 np0005593233 kernel: tapbd89ca51-f0: left promiscuous mode
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.493 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.495 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[58123f22-15a4-405f-b127-ae217bb4a957]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.514 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[544c7028-b5fc-4e5d-9dca-4060a10ebcf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.515 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ceadbd69-570f-485d-9f8f-1ccf00b1b920]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.531 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e8dd373d-c491-4816-b220-7a8570e118e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 840518, 'reachable_time': 22610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295349, 'error': None, 'target': 'ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.534 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bd89ca51-fcdd-4a45-8aab-c59b92ee76f3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:36:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:12.534 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea1d345-fa19-4c06-b4f8-edfc638669fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:12 np0005593233 systemd[1]: run-netns-ovnmeta\x2dbd89ca51\x2dfcdd\x2d4a45\x2d8aab\x2dc59b92ee76f3.mount: Deactivated successfully.
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.628 222021 DEBUG nova.compute.manager [req-00c57ecf-7c28-4f3b-b956-5b46685999d4 req-906ce0eb-16da-4c96-a9df-418d5eb21773 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received event network-vif-unplugged-1dd58203-38ce-47f5-939b-2da425defa40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.629 222021 DEBUG oslo_concurrency.lockutils [req-00c57ecf-7c28-4f3b-b956-5b46685999d4 req-906ce0eb-16da-4c96-a9df-418d5eb21773 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0d582649-9400-4246-ae39-a06921a27f40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.629 222021 DEBUG oslo_concurrency.lockutils [req-00c57ecf-7c28-4f3b-b956-5b46685999d4 req-906ce0eb-16da-4c96-a9df-418d5eb21773 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.629 222021 DEBUG oslo_concurrency.lockutils [req-00c57ecf-7c28-4f3b-b956-5b46685999d4 req-906ce0eb-16da-4c96-a9df-418d5eb21773 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.630 222021 DEBUG nova.compute.manager [req-00c57ecf-7c28-4f3b-b956-5b46685999d4 req-906ce0eb-16da-4c96-a9df-418d5eb21773 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] No waiting events found dispatching network-vif-unplugged-1dd58203-38ce-47f5-939b-2da425defa40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:12 np0005593233 nova_compute[222017]: 2026-01-23 10:36:12.630 222021 DEBUG nova.compute.manager [req-00c57ecf-7c28-4f3b-b956-5b46685999d4 req-906ce0eb-16da-4c96-a9df-418d5eb21773 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received event network-vif-unplugged-1dd58203-38ce-47f5-939b-2da425defa40 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:36:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:12.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #160. Immutable memtables: 0.
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.071539) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 101] Flushing memtable with next log file: 160
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573071608, "job": 101, "event": "flush_started", "num_memtables": 1, "num_entries": 2428, "num_deletes": 254, "total_data_size": 5800027, "memory_usage": 5900704, "flush_reason": "Manual Compaction"}
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 101] Level-0 flush table #161: started
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573116835, "cf_name": "default", "job": 101, "event": "table_file_creation", "file_number": 161, "file_size": 3784504, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 75835, "largest_seqno": 78258, "table_properties": {"data_size": 3774477, "index_size": 6392, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20991, "raw_average_key_size": 20, "raw_value_size": 3754358, "raw_average_value_size": 3709, "num_data_blocks": 276, "num_entries": 1012, "num_filter_entries": 1012, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164376, "oldest_key_time": 1769164376, "file_creation_time": 1769164573, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 161, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 101] Flush lasted 45409 microseconds, and 11016 cpu microseconds.
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.116906) [db/flush_job.cc:967] [default] [JOB 101] Level-0 flush table #161: 3784504 bytes OK
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.116971) [db/memtable_list.cc:519] [default] Level-0 commit table #161 started
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.119617) [db/memtable_list.cc:722] [default] Level-0 commit table #161: memtable #1 done
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.119653) EVENT_LOG_v1 {"time_micros": 1769164573119641, "job": 101, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.119682) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 101] Try to delete WAL files size 5789089, prev total WAL file size 5789089, number of live WAL files 2.
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000157.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.121996) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036373737' seq:72057594037927935, type:22 .. '7061786F730037303239' seq:0, type:0; will stop at (end)
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 102] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 101 Base level 0, inputs: [161(3695KB)], [159(9680KB)]
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573122215, "job": 102, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [161], "files_L6": [159], "score": -1, "input_data_size": 13696925, "oldest_snapshot_seqno": -1}
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/885398843' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/885398843' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:36:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:13.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 102] Generated table #162: 9730 keys, 11804847 bytes, temperature: kUnknown
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573289280, "cf_name": "default", "job": 102, "event": "table_file_creation", "file_number": 162, "file_size": 11804847, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11743418, "index_size": 35972, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24389, "raw_key_size": 256410, "raw_average_key_size": 26, "raw_value_size": 11574316, "raw_average_value_size": 1189, "num_data_blocks": 1369, "num_entries": 9730, "num_filter_entries": 9730, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769164573, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 162, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.289692) [db/compaction/compaction_job.cc:1663] [default] [JOB 102] Compacted 1@0 + 1@6 files to L6 => 11804847 bytes
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.291188) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 81.9 rd, 70.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 9.5 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(6.7) write-amplify(3.1) OK, records in: 10257, records dropped: 527 output_compression: NoCompression
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.291215) EVENT_LOG_v1 {"time_micros": 1769164573291200, "job": 102, "event": "compaction_finished", "compaction_time_micros": 167175, "compaction_time_cpu_micros": 62649, "output_level": 6, "num_output_files": 1, "total_output_size": 11804847, "num_input_records": 10257, "num_output_records": 9730, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000161.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573292397, "job": 102, "event": "table_file_deletion", "file_number": 161}
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000159.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573295380, "job": 102, "event": "table_file_deletion", "file_number": 159}
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.121723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.295579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.295597) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.295601) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.295604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:36:13.295607) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.428 222021 INFO nova.virt.libvirt.driver [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Deleting instance files /var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40_del#033[00m
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.429 222021 INFO nova.virt.libvirt.driver [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Deletion of /var/lib/nova/instances/0d582649-9400-4246-ae39-a06921a27f40_del complete#033[00m
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.439 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9907#033[00m
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.439 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.565 222021 INFO nova.compute.manager [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Took 1.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.566 222021 DEBUG oslo.service.loopingcall [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.567 222021 DEBUG nova.compute.manager [-] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:36:13 np0005593233 nova_compute[222017]: 2026-01-23 10:36:13.567 222021 DEBUG nova.network.neutron [-] [instance: 0d582649-9400-4246-ae39-a06921a27f40] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:36:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e364 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:14 np0005593233 nova_compute[222017]: 2026-01-23 10:36:14.434 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:14.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.118 222021 DEBUG nova.network.neutron [req-c01a3c6b-ed49-492b-93b0-c3ae0167e9d8 req-db64b1a3-d311-465c-8b8d-f68c8023fef0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Updated VIF entry in instance network info cache for port 1dd58203-38ce-47f5-939b-2da425defa40. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.119 222021 DEBUG nova.network.neutron [req-c01a3c6b-ed49-492b-93b0-c3ae0167e9d8 req-db64b1a3-d311-465c-8b8d-f68c8023fef0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Updating instance_info_cache with network_info: [{"id": "1dd58203-38ce-47f5-939b-2da425defa40", "address": "fa:16:3e:b1:c1:0c", "network": {"id": "bd89ca51-fcdd-4a45-8aab-c59b92ee76f3", "bridge": "br-int", "label": "tempest-network-smoke--275511545", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dd58203-38", "ovs_interfaceid": "1dd58203-38ce-47f5-939b-2da425defa40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.156 222021 DEBUG nova.network.neutron [-] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.174 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.183 222021 DEBUG oslo_concurrency.lockutils [req-c01a3c6b-ed49-492b-93b0-c3ae0167e9d8 req-db64b1a3-d311-465c-8b8d-f68c8023fef0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0d582649-9400-4246-ae39-a06921a27f40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.207 222021 INFO nova.compute.manager [-] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Took 1.64 seconds to deallocate network for instance.#033[00m
Jan 23 05:36:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:15.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.278 222021 DEBUG oslo_concurrency.lockutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.279 222021 DEBUG oslo_concurrency.lockutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.285 222021 DEBUG nova.compute.manager [req-b71f4505-30db-4d73-857b-b37359339e89 req-f122caae-8971-4cdd-a014-da39dbc01ce8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received event network-vif-plugged-1dd58203-38ce-47f5-939b-2da425defa40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.285 222021 DEBUG oslo_concurrency.lockutils [req-b71f4505-30db-4d73-857b-b37359339e89 req-f122caae-8971-4cdd-a014-da39dbc01ce8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0d582649-9400-4246-ae39-a06921a27f40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.286 222021 DEBUG oslo_concurrency.lockutils [req-b71f4505-30db-4d73-857b-b37359339e89 req-f122caae-8971-4cdd-a014-da39dbc01ce8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.286 222021 DEBUG oslo_concurrency.lockutils [req-b71f4505-30db-4d73-857b-b37359339e89 req-f122caae-8971-4cdd-a014-da39dbc01ce8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.286 222021 DEBUG nova.compute.manager [req-b71f4505-30db-4d73-857b-b37359339e89 req-f122caae-8971-4cdd-a014-da39dbc01ce8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] No waiting events found dispatching network-vif-plugged-1dd58203-38ce-47f5-939b-2da425defa40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.286 222021 WARNING nova.compute.manager [req-b71f4505-30db-4d73-857b-b37359339e89 req-f122caae-8971-4cdd-a014-da39dbc01ce8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received unexpected event network-vif-plugged-1dd58203-38ce-47f5-939b-2da425defa40 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.297 222021 DEBUG nova.compute.manager [req-d73af15d-dced-4dbc-95f7-9087bbf4181b req-d669186b-97d9-4792-9667-44ad8d120494 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Received event network-vif-deleted-1dd58203-38ce-47f5-939b-2da425defa40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.452 222021 DEBUG oslo_concurrency.processutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e365 e365: 3 total, 3 up, 3 in
Jan 23 05:36:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1912944071' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.890 222021 DEBUG oslo_concurrency.processutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.898 222021 DEBUG nova.compute.provider_tree [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.926 222021 DEBUG nova.scheduler.client.report [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:15 np0005593233 nova_compute[222017]: 2026-01-23 10:36:15.976 222021 DEBUG oslo_concurrency.lockutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:16 np0005593233 nova_compute[222017]: 2026-01-23 10:36:16.049 222021 INFO nova.scheduler.client.report [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Deleted allocations for instance 0d582649-9400-4246-ae39-a06921a27f40#033[00m
Jan 23 05:36:16 np0005593233 podman[295375]: 2026-01-23 10:36:16.130300583 +0000 UTC m=+0.133484404 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 23 05:36:16 np0005593233 nova_compute[222017]: 2026-01-23 10:36:16.197 222021 DEBUG oslo_concurrency.lockutils [None req-280f35aa-7749-467a-bf02-79f7324ca632 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "0d582649-9400-4246-ae39-a06921a27f40" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:16.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:17.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:17 np0005593233 nova_compute[222017]: 2026-01-23 10:36:17.272 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:18.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e366 e366: 3 total, 3 up, 3 in
Jan 23 05:36:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:19.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:20 np0005593233 nova_compute[222017]: 2026-01-23 10:36:20.175 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:20 np0005593233 nova_compute[222017]: 2026-01-23 10:36:20.384 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164565.3832684, 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:36:20 np0005593233 nova_compute[222017]: 2026-01-23 10:36:20.384 222021 INFO nova.compute.manager [-] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:36:20 np0005593233 nova_compute[222017]: 2026-01-23 10:36:20.425 222021 DEBUG nova.compute.manager [None req-c6e339c4-01e1-4f76-abcf-bf549b0314ba - - - - - -] [instance: 3a8663ba-8fc7-40f6-bb82-33fb50a5edd4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:36:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:20.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:21.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:22 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:22Z|00795|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:36:22 np0005593233 nova_compute[222017]: 2026-01-23 10:36:22.191 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:22 np0005593233 nova_compute[222017]: 2026-01-23 10:36:22.283 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:22 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:22Z|00796|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:36:22 np0005593233 nova_compute[222017]: 2026-01-23 10:36:22.386 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:22.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:23.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:24.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e367 e367: 3 total, 3 up, 3 in
Jan 23 05:36:25 np0005593233 nova_compute[222017]: 2026-01-23 10:36:25.179 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:25.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:26.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:27 np0005593233 nova_compute[222017]: 2026-01-23 10:36:27.239 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164572.2387547, 0d582649-9400-4246-ae39-a06921a27f40 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:36:27 np0005593233 nova_compute[222017]: 2026-01-23 10:36:27.240 222021 INFO nova.compute.manager [-] [instance: 0d582649-9400-4246-ae39-a06921a27f40] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:36:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:27.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:27 np0005593233 nova_compute[222017]: 2026-01-23 10:36:27.274 222021 DEBUG nova.compute.manager [None req-f50e37da-886b-4b10-8863-82fe2f47c22b - - - - - -] [instance: 0d582649-9400-4246-ae39-a06921a27f40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:36:27 np0005593233 nova_compute[222017]: 2026-01-23 10:36:27.287 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:27 np0005593233 nova_compute[222017]: 2026-01-23 10:36:27.378 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:28.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:29.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.483 222021 DEBUG oslo_concurrency.lockutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.484 222021 DEBUG oslo_concurrency.lockutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.484 222021 DEBUG oslo_concurrency.lockutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.485 222021 DEBUG oslo_concurrency.lockutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.485 222021 DEBUG oslo_concurrency.lockutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.487 222021 INFO nova.compute.manager [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Terminating instance#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.489 222021 DEBUG nova.compute.manager [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:36:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:29 np0005593233 kernel: tap66bbd2d4-17 (unregistering): left promiscuous mode
Jan 23 05:36:29 np0005593233 NetworkManager[48871]: <info>  [1769164589.5704] device (tap66bbd2d4-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.580 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:29Z|00797|binding|INFO|Releasing lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d from this chassis (sb_readonly=0)
Jan 23 05:36:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:29Z|00798|binding|INFO|Setting lport 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d down in Southbound
Jan 23 05:36:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:29Z|00799|binding|INFO|Removing iface tap66bbd2d4-17 ovn-installed in OVS
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.586 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:29.590 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:42:72 10.100.0.13'], port_security=['fa:16:3e:e4:42:72 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '23f7c54d-ed5d-404f-8517-b5cd21d0c282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:36:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:29.593 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 66bbd2d4-1733-4a5d-a84b-8d41c36dd82d in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:36:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:29.596 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7d5530f-5227-4f75-bac0-2604bb3d68e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:36:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:29.597 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[592b7a22-076f-4403-b165-af77cd034665]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:29.598 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace which is not needed anymore#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.608 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:29 np0005593233 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Jan 23 05:36:29 np0005593233 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000b3.scope: Consumed 26.406s CPU time.
Jan 23 05:36:29 np0005593233 systemd-machined[190954]: Machine qemu-83-instance-000000b3 terminated.
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.734 222021 INFO nova.virt.libvirt.driver [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Instance destroyed successfully.#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.735 222021 DEBUG nova.objects.instance [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'resources' on Instance uuid 23f7c54d-ed5d-404f-8517-b5cd21d0c282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.760 222021 DEBUG nova.virt.libvirt.vif [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:31:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-833764657',display_name='tempest-ServerStableDeviceRescueTest-server-833764657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-833764657',id=179,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:32:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-l34bmfpa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_user_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:32:43Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=23f7c54d-ed5d-404f-8517-b5cd21d0c282,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.761 222021 DEBUG nova.network.os_vif_util [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "address": "fa:16:3e:e4:42:72", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66bbd2d4-17", "ovs_interfaceid": "66bbd2d4-1733-4a5d-a84b-8d41c36dd82d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.762 222021 DEBUG nova.network.os_vif_util [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:42:72,bridge_name='br-int',has_traffic_filtering=True,id=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66bbd2d4-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.763 222021 DEBUG os_vif [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:42:72,bridge_name='br-int',has_traffic_filtering=True,id=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66bbd2d4-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.767 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.767 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66bbd2d4-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.769 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.773 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.776 222021 INFO os_vif [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:42:72,bridge_name='br-int',has_traffic_filtering=True,id=66bbd2d4-1733-4a5d-a84b-8d41c36dd82d,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66bbd2d4-17')#033[00m
Jan 23 05:36:29 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292552]: [NOTICE]   (292556) : haproxy version is 2.8.14-c23fe91
Jan 23 05:36:29 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292552]: [NOTICE]   (292556) : path to executable is /usr/sbin/haproxy
Jan 23 05:36:29 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292552]: [WARNING]  (292556) : Exiting Master process...
Jan 23 05:36:29 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292552]: [ALERT]    (292556) : Current worker (292558) exited with code 143 (Terminated)
Jan 23 05:36:29 np0005593233 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[292552]: [WARNING]  (292556) : All workers exited. Exiting... (0)
Jan 23 05:36:29 np0005593233 systemd[1]: libpod-7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64.scope: Deactivated successfully.
Jan 23 05:36:29 np0005593233 podman[295432]: 2026-01-23 10:36:29.794205507 +0000 UTC m=+0.064485833 container died 7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:36:29 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64-userdata-shm.mount: Deactivated successfully.
Jan 23 05:36:29 np0005593233 systemd[1]: var-lib-containers-storage-overlay-1316b78a5ec41ea0fcd0e6fff88669f947612cec308b83abdb58f8292ed772c0-merged.mount: Deactivated successfully.
Jan 23 05:36:29 np0005593233 podman[295432]: 2026-01-23 10:36:29.845147325 +0000 UTC m=+0.115427651 container cleanup 7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:36:29 np0005593233 systemd[1]: libpod-conmon-7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64.scope: Deactivated successfully.
Jan 23 05:36:29 np0005593233 podman[295489]: 2026-01-23 10:36:29.945405254 +0000 UTC m=+0.072015578 container remove 7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:36:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:29.956 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3ddd6507-bb61-4724-a691-20e5215a5532]: (4, ('Fri Jan 23 10:36:29 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64)\n7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64\nFri Jan 23 10:36:29 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64)\n7bdb81eb9ec17a89569a2f33992acb07757ed01bd889303130d76e9724f99c64\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:29.958 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3474fe8a-86b8-4eea-b618-d0b5f19062a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:29.959 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:29 np0005593233 kernel: tapd7d5530f-50: left promiscuous mode
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.962 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:29 np0005593233 nova_compute[222017]: 2026-01-23 10:36:29.988 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:29.992 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bbde6861-1217-48f0-9d02-be0d5b7c6269]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:30.014 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf0b299-d866-4979-a51a-d2c1764954ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:30.016 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8820b162-2b3f-480c-a09a-a635b67f61f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:30.045 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[07708c53-93f9-4a44-b698-fd178290ffd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 827520, 'reachable_time': 16945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295504, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:30 np0005593233 systemd[1]: run-netns-ovnmeta\x2dd7d5530f\x2d5227\x2d4f75\x2dbac0\x2d2604bb3d68e2.mount: Deactivated successfully.
Jan 23 05:36:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:30.050 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:36:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:30.050 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[43caecf7-3bb3-4e3f-8def-48cfec242f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:30 np0005593233 nova_compute[222017]: 2026-01-23 10:36:30.094 222021 DEBUG nova.compute.manager [req-69bde503-98ce-4d90-988c-b351a202ae12 req-f1e758eb-3fc2-4d3b-91e6-5ff34d4f215b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-unplugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:30 np0005593233 nova_compute[222017]: 2026-01-23 10:36:30.094 222021 DEBUG oslo_concurrency.lockutils [req-69bde503-98ce-4d90-988c-b351a202ae12 req-f1e758eb-3fc2-4d3b-91e6-5ff34d4f215b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:30 np0005593233 nova_compute[222017]: 2026-01-23 10:36:30.094 222021 DEBUG oslo_concurrency.lockutils [req-69bde503-98ce-4d90-988c-b351a202ae12 req-f1e758eb-3fc2-4d3b-91e6-5ff34d4f215b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:30 np0005593233 nova_compute[222017]: 2026-01-23 10:36:30.095 222021 DEBUG oslo_concurrency.lockutils [req-69bde503-98ce-4d90-988c-b351a202ae12 req-f1e758eb-3fc2-4d3b-91e6-5ff34d4f215b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:30 np0005593233 nova_compute[222017]: 2026-01-23 10:36:30.095 222021 DEBUG nova.compute.manager [req-69bde503-98ce-4d90-988c-b351a202ae12 req-f1e758eb-3fc2-4d3b-91e6-5ff34d4f215b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-unplugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:30 np0005593233 nova_compute[222017]: 2026-01-23 10:36:30.095 222021 DEBUG nova.compute.manager [req-69bde503-98ce-4d90-988c-b351a202ae12 req-f1e758eb-3fc2-4d3b-91e6-5ff34d4f215b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-unplugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:36:30 np0005593233 nova_compute[222017]: 2026-01-23 10:36:30.182 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:30.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:31.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:31 np0005593233 nova_compute[222017]: 2026-01-23 10:36:31.587 222021 INFO nova.virt.libvirt.driver [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Deleting instance files /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282_del#033[00m
Jan 23 05:36:31 np0005593233 nova_compute[222017]: 2026-01-23 10:36:31.588 222021 INFO nova.virt.libvirt.driver [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Deletion of /var/lib/nova/instances/23f7c54d-ed5d-404f-8517-b5cd21d0c282_del complete#033[00m
Jan 23 05:36:31 np0005593233 nova_compute[222017]: 2026-01-23 10:36:31.883 222021 INFO nova.compute.manager [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Took 2.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:36:31 np0005593233 nova_compute[222017]: 2026-01-23 10:36:31.884 222021 DEBUG oslo.service.loopingcall [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:36:31 np0005593233 nova_compute[222017]: 2026-01-23 10:36:31.885 222021 DEBUG nova.compute.manager [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:36:31 np0005593233 nova_compute[222017]: 2026-01-23 10:36:31.885 222021 DEBUG nova.network.neutron [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:36:32 np0005593233 podman[295506]: 2026-01-23 10:36:32.099746761 +0000 UTC m=+0.094621430 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:36:32 np0005593233 nova_compute[222017]: 2026-01-23 10:36:32.243 222021 DEBUG nova.compute.manager [req-67d482f8-52d7-4e83-abbd-afe780bb1544 req-08bb6fb4-4112-4d50-a3e3-06e437f8b48c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:32 np0005593233 nova_compute[222017]: 2026-01-23 10:36:32.244 222021 DEBUG oslo_concurrency.lockutils [req-67d482f8-52d7-4e83-abbd-afe780bb1544 req-08bb6fb4-4112-4d50-a3e3-06e437f8b48c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:32 np0005593233 nova_compute[222017]: 2026-01-23 10:36:32.244 222021 DEBUG oslo_concurrency.lockutils [req-67d482f8-52d7-4e83-abbd-afe780bb1544 req-08bb6fb4-4112-4d50-a3e3-06e437f8b48c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:32 np0005593233 nova_compute[222017]: 2026-01-23 10:36:32.245 222021 DEBUG oslo_concurrency.lockutils [req-67d482f8-52d7-4e83-abbd-afe780bb1544 req-08bb6fb4-4112-4d50-a3e3-06e437f8b48c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:32 np0005593233 nova_compute[222017]: 2026-01-23 10:36:32.245 222021 DEBUG nova.compute.manager [req-67d482f8-52d7-4e83-abbd-afe780bb1544 req-08bb6fb4-4112-4d50-a3e3-06e437f8b48c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] No waiting events found dispatching network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:32 np0005593233 nova_compute[222017]: 2026-01-23 10:36:32.246 222021 WARNING nova.compute.manager [req-67d482f8-52d7-4e83-abbd-afe780bb1544 req-08bb6fb4-4112-4d50-a3e3-06e437f8b48c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received unexpected event network-vif-plugged-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:36:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:32.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:33.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:33 np0005593233 nova_compute[222017]: 2026-01-23 10:36:33.293 222021 DEBUG nova.network.neutron [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:33 np0005593233 nova_compute[222017]: 2026-01-23 10:36:33.316 222021 INFO nova.compute.manager [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Took 1.43 seconds to deallocate network for instance.#033[00m
Jan 23 05:36:33 np0005593233 nova_compute[222017]: 2026-01-23 10:36:33.417 222021 DEBUG oslo_concurrency.lockutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:33 np0005593233 nova_compute[222017]: 2026-01-23 10:36:33.418 222021 DEBUG oslo_concurrency.lockutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:33 np0005593233 nova_compute[222017]: 2026-01-23 10:36:33.482 222021 DEBUG nova.compute.manager [req-04279de7-de5a-4d78-bbc6-4dc999733f50 req-3a2a2137-b0fb-4d09-bafe-3caf2e93f7d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Received event network-vif-deleted-66bbd2d4-1733-4a5d-a84b-8d41c36dd82d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:33 np0005593233 nova_compute[222017]: 2026-01-23 10:36:33.529 222021 DEBUG oslo_concurrency.processutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e368 e368: 3 total, 3 up, 3 in
Jan 23 05:36:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1082182260' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:34 np0005593233 nova_compute[222017]: 2026-01-23 10:36:34.059 222021 DEBUG oslo_concurrency.processutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:34 np0005593233 nova_compute[222017]: 2026-01-23 10:36:34.069 222021 DEBUG nova.compute.provider_tree [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:34 np0005593233 nova_compute[222017]: 2026-01-23 10:36:34.089 222021 DEBUG nova.scheduler.client.report [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:34 np0005593233 nova_compute[222017]: 2026-01-23 10:36:34.120 222021 DEBUG oslo_concurrency.lockutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:34 np0005593233 nova_compute[222017]: 2026-01-23 10:36:34.148 222021 INFO nova.scheduler.client.report [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Deleted allocations for instance 23f7c54d-ed5d-404f-8517-b5cd21d0c282#033[00m
Jan 23 05:36:34 np0005593233 nova_compute[222017]: 2026-01-23 10:36:34.219 222021 DEBUG oslo_concurrency.lockutils [None req-8f97fe25-0ce2-4208-9ee1-587a0a8d2dd8 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "23f7c54d-ed5d-404f-8517-b5cd21d0c282" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e368 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:34 np0005593233 nova_compute[222017]: 2026-01-23 10:36:34.772 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:34.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:35 np0005593233 nova_compute[222017]: 2026-01-23 10:36:35.185 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:35.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:36.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e369 e369: 3 total, 3 up, 3 in
Jan 23 05:36:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:37.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:38.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:39.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e369 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:39 np0005593233 nova_compute[222017]: 2026-01-23 10:36:39.775 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:40 np0005593233 nova_compute[222017]: 2026-01-23 10:36:40.189 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:40.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:41.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:41 np0005593233 nova_compute[222017]: 2026-01-23 10:36:41.843 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:41 np0005593233 nova_compute[222017]: 2026-01-23 10:36:41.844 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:41 np0005593233 nova_compute[222017]: 2026-01-23 10:36:41.870 222021 DEBUG nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:36:41 np0005593233 nova_compute[222017]: 2026-01-23 10:36:41.970 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:41 np0005593233 nova_compute[222017]: 2026-01-23 10:36:41.970 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:41 np0005593233 nova_compute[222017]: 2026-01-23 10:36:41.979 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:36:41 np0005593233 nova_compute[222017]: 2026-01-23 10:36:41.979 222021 INFO nova.compute.claims [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.124 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1650443266' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.620 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.630 222021 DEBUG nova.compute.provider_tree [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.649 222021 DEBUG nova.scheduler.client.report [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:42.697 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:42.697 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:42.698 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.699 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.700 222021 DEBUG nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.758 222021 DEBUG nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.759 222021 DEBUG nova.network.neutron [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.784 222021 INFO nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.814 222021 DEBUG nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:36:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:42.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.912 222021 DEBUG nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.915 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.916 222021 INFO nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Creating image(s)#033[00m
Jan 23 05:36:42 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.957 222021 DEBUG nova.storage.rbd_utils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:42.999 222021 DEBUG nova.storage.rbd_utils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:43.032 222021 DEBUG nova.storage.rbd_utils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:43.035 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:43.075 222021 DEBUG nova.policy [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a3cd8c3758e14f9c8e4ad1a9a94a9995', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b27af793a8cc42259216fbeaa302ba03', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:43.114 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:43.116 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:43.116 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:43.117 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:43.153 222021 DEBUG nova.storage.rbd_utils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:36:43 np0005593233 nova_compute[222017]: 2026-01-23 10:36:43.158 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:43.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.312 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.421 222021 DEBUG nova.storage.rbd_utils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] resizing rbd image a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:36:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e369 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.568 222021 DEBUG nova.objects.instance [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'migration_context' on Instance uuid a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.597 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.598 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Ensure instance console log exists: /var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.598 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.598 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.599 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.730 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164589.7292368, 23f7c54d-ed5d-404f-8517-b5cd21d0c282 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.730 222021 INFO nova.compute.manager [-] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.760 222021 DEBUG nova.compute.manager [None req-72963fda-4166-48fa-9e08-39f75bbdd5d6 - - - - - -] [instance: 23f7c54d-ed5d-404f-8517-b5cd21d0c282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:44.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:44 np0005593233 nova_compute[222017]: 2026-01-23 10:36:44.868 222021 DEBUG nova.network.neutron [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Successfully created port: 12caf893-7757-4200-a8b9-afef9f1e9215 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:36:45 np0005593233 nova_compute[222017]: 2026-01-23 10:36:45.190 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:45.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:46.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:46 np0005593233 nova_compute[222017]: 2026-01-23 10:36:46.962 222021 DEBUG nova.network.neutron [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Successfully updated port: 12caf893-7757-4200-a8b9-afef9f1e9215 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:36:46 np0005593233 nova_compute[222017]: 2026-01-23 10:36:46.986 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:36:46 np0005593233 nova_compute[222017]: 2026-01-23 10:36:46.987 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquired lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:36:46 np0005593233 nova_compute[222017]: 2026-01-23 10:36:46.988 222021 DEBUG nova.network.neutron [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:36:47 np0005593233 nova_compute[222017]: 2026-01-23 10:36:47.110 222021 DEBUG nova.compute.manager [req-4f672ffb-b18a-4dc4-a573-4ca95e3405a2 req-52bf6ab0-7827-4d7e-a788-7f68dafaf4ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received event network-changed-12caf893-7757-4200-a8b9-afef9f1e9215 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:47 np0005593233 nova_compute[222017]: 2026-01-23 10:36:47.111 222021 DEBUG nova.compute.manager [req-4f672ffb-b18a-4dc4-a573-4ca95e3405a2 req-52bf6ab0-7827-4d7e-a788-7f68dafaf4ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Refreshing instance network info cache due to event network-changed-12caf893-7757-4200-a8b9-afef9f1e9215. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:36:47 np0005593233 nova_compute[222017]: 2026-01-23 10:36:47.111 222021 DEBUG oslo_concurrency.lockutils [req-4f672ffb-b18a-4dc4-a573-4ca95e3405a2 req-52bf6ab0-7827-4d7e-a788-7f68dafaf4ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:36:47 np0005593233 podman[295739]: 2026-01-23 10:36:47.149305789 +0000 UTC m=+0.147423500 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:36:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:47.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:47 np0005593233 nova_compute[222017]: 2026-01-23 10:36:47.641 222021 DEBUG nova.network.neutron [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:36:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:48.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.888 222021 DEBUG nova.network.neutron [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updating instance_info_cache with network_info: [{"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.895 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.916 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Releasing lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.917 222021 DEBUG nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Instance network_info: |[{"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.917 222021 DEBUG oslo_concurrency.lockutils [req-4f672ffb-b18a-4dc4-a573-4ca95e3405a2 req-52bf6ab0-7827-4d7e-a788-7f68dafaf4ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.917 222021 DEBUG nova.network.neutron [req-4f672ffb-b18a-4dc4-a573-4ca95e3405a2 req-52bf6ab0-7827-4d7e-a788-7f68dafaf4ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Refreshing network info cache for port 12caf893-7757-4200-a8b9-afef9f1e9215 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.920 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Start _get_guest_xml network_info=[{"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.925 222021 WARNING nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.930 222021 DEBUG nova.virt.libvirt.host [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.930 222021 DEBUG nova.virt.libvirt.host [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:36:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e370 e370: 3 total, 3 up, 3 in
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.940 222021 DEBUG nova.virt.libvirt.host [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.941 222021 DEBUG nova.virt.libvirt.host [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.944 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.945 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.946 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.946 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.947 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.947 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.948 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.948 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.949 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.949 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.950 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.950 222021 DEBUG nova.virt.hardware [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:36:48 np0005593233 nova_compute[222017]: 2026-01-23 10:36:48.955 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:49.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:36:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3255763841' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:36:49 np0005593233 nova_compute[222017]: 2026-01-23 10:36:49.426 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:49 np0005593233 nova_compute[222017]: 2026-01-23 10:36:49.476 222021 DEBUG nova.storage.rbd_utils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:36:49 np0005593233 nova_compute[222017]: 2026-01-23 10:36:49.484 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:49 np0005593233 nova_compute[222017]: 2026-01-23 10:36:49.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:36:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3760332452' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:36:49 np0005593233 nova_compute[222017]: 2026-01-23 10:36:49.974 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:49 np0005593233 nova_compute[222017]: 2026-01-23 10:36:49.977 222021 DEBUG nova.virt.libvirt.vif [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:36:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-396645154',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-396645154',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=189,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBQ7hbI9dSlZowLrM+55BW4vZefgM5C7AtbCmYzjlG7RhMLD86z6HKT1ky7da4FJ/rvc4D//2MBDEN2yr//ERuTxme2OPEqVyC6OVqkosE4nxK5JvPAi4Vemn/j2yc45jQ==',key_name='tempest-TestSecurityGroupsBasicOps-832716294',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-q9pz4vzt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:36:42Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:36:49 np0005593233 nova_compute[222017]: 2026-01-23 10:36:49.977 222021 DEBUG nova.network.os_vif_util [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:36:49 np0005593233 nova_compute[222017]: 2026-01-23 10:36:49.978 222021 DEBUG nova.network.os_vif_util [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:f3:56,bridge_name='br-int',has_traffic_filtering=True,id=12caf893-7757-4200-a8b9-afef9f1e9215,network=Network(868ec025-7796-402b-ba12-8a3a5dac7373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12caf893-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:36:49 np0005593233 nova_compute[222017]: 2026-01-23 10:36:49.980 222021 DEBUG nova.objects.instance [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'pci_devices' on Instance uuid a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.003 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <uuid>a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6</uuid>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <name>instance-000000bd</name>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-396645154</nova:name>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:36:48</nova:creationTime>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <nova:user uuid="a3cd8c3758e14f9c8e4ad1a9a94a9995">tempest-TestSecurityGroupsBasicOps-622349977-project-member</nova:user>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <nova:project uuid="b27af793a8cc42259216fbeaa302ba03">tempest-TestSecurityGroupsBasicOps-622349977</nova:project>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <nova:port uuid="12caf893-7757-4200-a8b9-afef9f1e9215">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <entry name="serial">a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6</entry>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <entry name="uuid">a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6</entry>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk.config">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:f9:f3:56"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <target dev="tap12caf893-77"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6/console.log" append="off"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:36:50 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:36:50 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:36:50 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:36:50 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.006 222021 DEBUG nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Preparing to wait for external event network-vif-plugged-12caf893-7757-4200-a8b9-afef9f1e9215 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.007 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.007 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.008 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.009 222021 DEBUG nova.virt.libvirt.vif [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:36:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-396645154',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-396645154',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=189,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBQ7hbI9dSlZowLrM+55BW4vZefgM5C7AtbCmYzjlG7RhMLD86z6HKT1ky7da4FJ/rvc4D//2MBDEN2yr//ERuTxme2OPEqVyC6OVqkosE4nxK5JvPAi4Vemn/j2yc45jQ==',key_name='tempest-TestSecurityGroupsBasicOps-832716294',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-q9pz4vzt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:36:42Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.010 222021 DEBUG nova.network.os_vif_util [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.011 222021 DEBUG nova.network.os_vif_util [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:f3:56,bridge_name='br-int',has_traffic_filtering=True,id=12caf893-7757-4200-a8b9-afef9f1e9215,network=Network(868ec025-7796-402b-ba12-8a3a5dac7373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12caf893-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.012 222021 DEBUG os_vif [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:f3:56,bridge_name='br-int',has_traffic_filtering=True,id=12caf893-7757-4200-a8b9-afef9f1e9215,network=Network(868ec025-7796-402b-ba12-8a3a5dac7373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12caf893-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.013 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.014 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.015 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.020 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.021 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12caf893-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.022 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap12caf893-77, col_values=(('external_ids', {'iface-id': '12caf893-7757-4200-a8b9-afef9f1e9215', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:f3:56', 'vm-uuid': 'a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.027 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:50 np0005593233 NetworkManager[48871]: <info>  [1769164610.0298] manager: (tap12caf893-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.031 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.036 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.037 222021 INFO os_vif [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:f3:56,bridge_name='br-int',has_traffic_filtering=True,id=12caf893-7757-4200-a8b9-afef9f1e9215,network=Network(868ec025-7796-402b-ba12-8a3a5dac7373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12caf893-77')#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.120 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.121 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.121 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No VIF found with MAC fa:16:3e:f9:f3:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.122 222021 INFO nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Using config drive#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.163 222021 DEBUG nova.storage.rbd_utils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.197 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.742 222021 INFO nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Creating config drive at /var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6/disk.config#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.753 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpge9j3gsf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:50.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.914 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpge9j3gsf" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.975 222021 DEBUG nova.storage.rbd_utils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:36:50 np0005593233 nova_compute[222017]: 2026-01-23 10:36:50.982 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6/disk.config a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.203 222021 DEBUG nova.network.neutron [req-4f672ffb-b18a-4dc4-a573-4ca95e3405a2 req-52bf6ab0-7827-4d7e-a788-7f68dafaf4ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updated VIF entry in instance network info cache for port 12caf893-7757-4200-a8b9-afef9f1e9215. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.204 222021 DEBUG nova.network.neutron [req-4f672ffb-b18a-4dc4-a573-4ca95e3405a2 req-52bf6ab0-7827-4d7e-a788-7f68dafaf4ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updating instance_info_cache with network_info: [{"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.230 222021 DEBUG oslo_concurrency.lockutils [req-4f672ffb-b18a-4dc4-a573-4ca95e3405a2 req-52bf6ab0-7827-4d7e-a788-7f68dafaf4ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.298 222021 DEBUG oslo_concurrency.processutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6/disk.config a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.316s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.299 222021 INFO nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Deleting local config drive /var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6/disk.config because it was imported into RBD.#033[00m
Jan 23 05:36:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:51.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:51 np0005593233 kernel: tap12caf893-77: entered promiscuous mode
Jan 23 05:36:51 np0005593233 NetworkManager[48871]: <info>  [1769164611.3894] manager: (tap12caf893-77): new Tun device (/org/freedesktop/NetworkManager/Devices/363)
Jan 23 05:36:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:51Z|00800|binding|INFO|Claiming lport 12caf893-7757-4200-a8b9-afef9f1e9215 for this chassis.
Jan 23 05:36:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:51Z|00801|binding|INFO|12caf893-7757-4200-a8b9-afef9f1e9215: Claiming fa:16:3e:f9:f3:56 10.100.0.11
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.390 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.413 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:f3:56 10.100.0.11'], port_security=['fa:16:3e:f9:f3:56 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-868ec025-7796-402b-ba12-8a3a5dac7373', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '2', 'neutron:security_group_ids': '05714b57-0768-46c9-8884-47c3caa76f59 92c18c07-ed83-4ff4-89c6-5a468e492558', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07547cf6-dd0a-47b0-85a9-11b383a1aaf0, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=12caf893-7757-4200-a8b9-afef9f1e9215) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.416 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 12caf893-7757-4200-a8b9-afef9f1e9215 in datapath 868ec025-7796-402b-ba12-8a3a5dac7373 bound to our chassis#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.419 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 868ec025-7796-402b-ba12-8a3a5dac7373#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.437 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b10e2531-deb2-476f-b5d8-665d575d5366]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.438 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap868ec025-71 in ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:36:51 np0005593233 systemd-machined[190954]: New machine qemu-87-instance-000000bd.
Jan 23 05:36:51 np0005593233 systemd-udevd[295903]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.441 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap868ec025-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.441 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6877c2f4-e1fc-474c-bc2e-ab0fbc040cd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.443 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fa82bb24-3277-4417-a123-91409c44c5a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 NetworkManager[48871]: <info>  [1769164611.4582] device (tap12caf893-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:36:51 np0005593233 NetworkManager[48871]: <info>  [1769164611.4597] device (tap12caf893-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.456 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[60ffadc7-a6f6-4d37-b36e-c00ecb7df9ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 systemd[1]: Started Virtual Machine qemu-87-instance-000000bd.
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.486 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:51Z|00802|binding|INFO|Setting lport 12caf893-7757-4200-a8b9-afef9f1e9215 ovn-installed in OVS
Jan 23 05:36:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:51Z|00803|binding|INFO|Setting lport 12caf893-7757-4200-a8b9-afef9f1e9215 up in Southbound
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.492 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.492 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a54cb5c7-9204-419d-a321-ed84b494218c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.540 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[47a27e42-a45a-4425-81a7-a9f36d08917d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 systemd-udevd[295907]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:36:51 np0005593233 NetworkManager[48871]: <info>  [1769164611.5494] manager: (tap868ec025-70): new Veth device (/org/freedesktop/NetworkManager/Devices/364)
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.551 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d6de8323-2126-4fbb-82ca-2981c376f1bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.593 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[d232364c-f93e-4a5f-8a1e-8733a1917be0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.596 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[260a60ab-652f-4068-bc55-670e5dfc3028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 NetworkManager[48871]: <info>  [1769164611.6367] device (tap868ec025-70): carrier: link connected
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.645 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[432253e5-7295-4182-aaba-b155a4e89d55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.670 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4c119c27-b298-4abd-898f-55113da88c88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap868ec025-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:85:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 852426, 'reachable_time': 35575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295936, 'error': None, 'target': 'ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.697 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[43a4f27f-0548-4a3b-97a7-a3199cb7872b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:858e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 852426, 'tstamp': 852426}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 295937, 'error': None, 'target': 'ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.722 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc2b4dc-4bf9-4b65-924e-f530d37e6930]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap868ec025-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:85:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 852426, 'reachable_time': 35575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 295938, 'error': None, 'target': 'ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.768 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[83c3d1be-1c17-46e6-b671-c5c49bf01581]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.863 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[42f98fbc-07de-4169-b534-bb932205df25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.865 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap868ec025-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.866 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.866 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap868ec025-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:51 np0005593233 kernel: tap868ec025-70: entered promiscuous mode
Jan 23 05:36:51 np0005593233 NetworkManager[48871]: <info>  [1769164611.8715] manager: (tap868ec025-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.869 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.875 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap868ec025-70, col_values=(('external_ids', {'iface-id': 'a8d6af13-33ee-459a-b9e0-d67de69b3614'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.874 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:51Z|00804|binding|INFO|Releasing lport a8d6af13-33ee-459a-b9e0-d67de69b3614 from this chassis (sb_readonly=0)
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.877 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.879 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.880 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/868ec025-7796-402b-ba12-8a3a5dac7373.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/868ec025-7796-402b-ba12-8a3a5dac7373.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.881 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dafb9e60-fb6a-4ba0-9d82-63863efef48e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.882 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-868ec025-7796-402b-ba12-8a3a5dac7373
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/868ec025-7796-402b-ba12-8a3a5dac7373.pid.haproxy
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 868ec025-7796-402b-ba12-8a3a5dac7373
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:36:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:36:51.883 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373', 'env', 'PROCESS_TAG=haproxy-868ec025-7796-402b-ba12-8a3a5dac7373', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/868ec025-7796-402b-ba12-8a3a5dac7373.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.896 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.908 222021 DEBUG nova.compute.manager [req-3a494c76-1121-44ef-a36d-9d049c1e45da req-c7d885bd-99c3-486b-ac1b-844195abd6a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received event network-vif-plugged-12caf893-7757-4200-a8b9-afef9f1e9215 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.909 222021 DEBUG oslo_concurrency.lockutils [req-3a494c76-1121-44ef-a36d-9d049c1e45da req-c7d885bd-99c3-486b-ac1b-844195abd6a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.909 222021 DEBUG oslo_concurrency.lockutils [req-3a494c76-1121-44ef-a36d-9d049c1e45da req-c7d885bd-99c3-486b-ac1b-844195abd6a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.910 222021 DEBUG oslo_concurrency.lockutils [req-3a494c76-1121-44ef-a36d-9d049c1e45da req-c7d885bd-99c3-486b-ac1b-844195abd6a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:51 np0005593233 nova_compute[222017]: 2026-01-23 10:36:51.910 222021 DEBUG nova.compute.manager [req-3a494c76-1121-44ef-a36d-9d049c1e45da req-c7d885bd-99c3-486b-ac1b-844195abd6a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Processing event network-vif-plugged-12caf893-7757-4200-a8b9-afef9f1e9215 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.147 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164612.1465454, a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.149 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] VM Started (Lifecycle Event)#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.153 222021 DEBUG nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.159 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.167 222021 INFO nova.virt.libvirt.driver [-] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Instance spawned successfully.#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.169 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.177 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.187 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.208 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.209 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.210 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.211 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.211 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.212 222021 DEBUG nova.virt.libvirt.driver [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.217 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.218 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164612.1468172, a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.218 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.267 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.273 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164612.1575365, a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.273 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.287 222021 INFO nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Took 9.37 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.287 222021 DEBUG nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.295 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.299 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.323 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.358 222021 INFO nova.compute.manager [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Took 10.42 seconds to build instance.#033[00m
Jan 23 05:36:52 np0005593233 nova_compute[222017]: 2026-01-23 10:36:52.375 222021 DEBUG oslo_concurrency.lockutils [None req-cb4eddd6-5f2a-4846-8dc5-b6c080c3c922 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:52 np0005593233 podman[296010]: 2026-01-23 10:36:52.453678914 +0000 UTC m=+0.087898898 container create 9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:36:52 np0005593233 podman[296010]: 2026-01-23 10:36:52.401281566 +0000 UTC m=+0.035501580 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:36:52 np0005593233 systemd[1]: Started libpod-conmon-9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5.scope.
Jan 23 05:36:52 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:36:52 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75d6061fa678ea565e6fb9a1616e452a49b2a4fb0009b190b0a9e4719de2d47f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:36:52 np0005593233 podman[296010]: 2026-01-23 10:36:52.57566421 +0000 UTC m=+0.209884234 container init 9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:36:52 np0005593233 podman[296010]: 2026-01-23 10:36:52.586833737 +0000 UTC m=+0.221053711 container start 9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:36:52 np0005593233 neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373[296025]: [NOTICE]   (296029) : New worker (296031) forked
Jan 23 05:36:52 np0005593233 neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373[296025]: [NOTICE]   (296029) : Loading success.
Jan 23 05:36:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:52.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:53.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:54 np0005593233 nova_compute[222017]: 2026-01-23 10:36:54.088 222021 DEBUG nova.compute.manager [req-40b537af-850b-4b01-9812-24091a1068cd req-e1714465-a5c9-4032-9816-56faa204fbb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received event network-vif-plugged-12caf893-7757-4200-a8b9-afef9f1e9215 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:54 np0005593233 nova_compute[222017]: 2026-01-23 10:36:54.089 222021 DEBUG oslo_concurrency.lockutils [req-40b537af-850b-4b01-9812-24091a1068cd req-e1714465-a5c9-4032-9816-56faa204fbb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:54 np0005593233 nova_compute[222017]: 2026-01-23 10:36:54.090 222021 DEBUG oslo_concurrency.lockutils [req-40b537af-850b-4b01-9812-24091a1068cd req-e1714465-a5c9-4032-9816-56faa204fbb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:54 np0005593233 nova_compute[222017]: 2026-01-23 10:36:54.090 222021 DEBUG oslo_concurrency.lockutils [req-40b537af-850b-4b01-9812-24091a1068cd req-e1714465-a5c9-4032-9816-56faa204fbb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:54 np0005593233 nova_compute[222017]: 2026-01-23 10:36:54.091 222021 DEBUG nova.compute.manager [req-40b537af-850b-4b01-9812-24091a1068cd req-e1714465-a5c9-4032-9816-56faa204fbb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] No waiting events found dispatching network-vif-plugged-12caf893-7757-4200-a8b9-afef9f1e9215 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:54 np0005593233 nova_compute[222017]: 2026-01-23 10:36:54.091 222021 WARNING nova.compute.manager [req-40b537af-850b-4b01-9812-24091a1068cd req-e1714465-a5c9-4032-9816-56faa204fbb0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received unexpected event network-vif-plugged-12caf893-7757-4200-a8b9-afef9f1e9215 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:36:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:36:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:54.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:36:55 np0005593233 nova_compute[222017]: 2026-01-23 10:36:55.027 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:55 np0005593233 nova_compute[222017]: 2026-01-23 10:36:55.195 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:55.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:56.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:57 np0005593233 nova_compute[222017]: 2026-01-23 10:36:57.140 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:57 np0005593233 NetworkManager[48871]: <info>  [1769164617.1441] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Jan 23 05:36:57 np0005593233 NetworkManager[48871]: <info>  [1769164617.1460] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Jan 23 05:36:57 np0005593233 nova_compute[222017]: 2026-01-23 10:36:57.292 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:57 np0005593233 ovn_controller[130653]: 2026-01-23T10:36:57Z|00805|binding|INFO|Releasing lport a8d6af13-33ee-459a-b9e0-d67de69b3614 from this chassis (sb_readonly=0)
Jan 23 05:36:57 np0005593233 nova_compute[222017]: 2026-01-23 10:36:57.314 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:57.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:57 np0005593233 nova_compute[222017]: 2026-01-23 10:36:57.541 222021 DEBUG nova.compute.manager [req-0d3bf78f-3577-459b-aaaf-0d4a3cddc3f0 req-14d371cc-af02-4337-8ed3-cf3943f1ceda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received event network-changed-12caf893-7757-4200-a8b9-afef9f1e9215 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:57 np0005593233 nova_compute[222017]: 2026-01-23 10:36:57.542 222021 DEBUG nova.compute.manager [req-0d3bf78f-3577-459b-aaaf-0d4a3cddc3f0 req-14d371cc-af02-4337-8ed3-cf3943f1ceda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Refreshing instance network info cache due to event network-changed-12caf893-7757-4200-a8b9-afef9f1e9215. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:36:57 np0005593233 nova_compute[222017]: 2026-01-23 10:36:57.542 222021 DEBUG oslo_concurrency.lockutils [req-0d3bf78f-3577-459b-aaaf-0d4a3cddc3f0 req-14d371cc-af02-4337-8ed3-cf3943f1ceda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:36:57 np0005593233 nova_compute[222017]: 2026-01-23 10:36:57.543 222021 DEBUG oslo_concurrency.lockutils [req-0d3bf78f-3577-459b-aaaf-0d4a3cddc3f0 req-14d371cc-af02-4337-8ed3-cf3943f1ceda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:36:57 np0005593233 nova_compute[222017]: 2026-01-23 10:36:57.543 222021 DEBUG nova.network.neutron [req-0d3bf78f-3577-459b-aaaf-0d4a3cddc3f0 req-14d371cc-af02-4337-8ed3-cf3943f1ceda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Refreshing network info cache for port 12caf893-7757-4200-a8b9-afef9f1e9215 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:36:58 np0005593233 nova_compute[222017]: 2026-01-23 10:36:58.820 222021 DEBUG nova.network.neutron [req-0d3bf78f-3577-459b-aaaf-0d4a3cddc3f0 req-14d371cc-af02-4337-8ed3-cf3943f1ceda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updated VIF entry in instance network info cache for port 12caf893-7757-4200-a8b9-afef9f1e9215. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:36:58 np0005593233 nova_compute[222017]: 2026-01-23 10:36:58.821 222021 DEBUG nova.network.neutron [req-0d3bf78f-3577-459b-aaaf-0d4a3cddc3f0 req-14d371cc-af02-4337-8ed3-cf3943f1ceda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updating instance_info_cache with network_info: [{"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:58 np0005593233 nova_compute[222017]: 2026-01-23 10:36:58.860 222021 DEBUG oslo_concurrency.lockutils [req-0d3bf78f-3577-459b-aaaf-0d4a3cddc3f0 req-14d371cc-af02-4337-8ed3-cf3943f1ceda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:36:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:36:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:58.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:36:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:36:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:59.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:00 np0005593233 nova_compute[222017]: 2026-01-23 10:37:00.031 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:00 np0005593233 nova_compute[222017]: 2026-01-23 10:37:00.197 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:00 np0005593233 nova_compute[222017]: 2026-01-23 10:37:00.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:00.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:37:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:01.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:37:02 np0005593233 nova_compute[222017]: 2026-01-23 10:37:02.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:02 np0005593233 nova_compute[222017]: 2026-01-23 10:37:02.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:02 np0005593233 nova_compute[222017]: 2026-01-23 10:37:02.411 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:02 np0005593233 nova_compute[222017]: 2026-01-23 10:37:02.411 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:02 np0005593233 nova_compute[222017]: 2026-01-23 10:37:02.412 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:02 np0005593233 nova_compute[222017]: 2026-01-23 10:37:02.412 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:37:02 np0005593233 nova_compute[222017]: 2026-01-23 10:37:02.412 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:37:02 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/915470303' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:02.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:02 np0005593233 nova_compute[222017]: 2026-01-23 10:37:02.909 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.027 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.031 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:37:03 np0005593233 podman[296066]: 2026-01-23 10:37:03.082413003 +0000 UTC m=+0.106090516 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.259 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.261 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4076MB free_disk=20.908065795898438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.262 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.262 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:03.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.368 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.369 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.370 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:37:03 np0005593233 nova_compute[222017]: 2026-01-23 10:37:03.597 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:37:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/129868948' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:04 np0005593233 nova_compute[222017]: 2026-01-23 10:37:04.179 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.582s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:04 np0005593233 nova_compute[222017]: 2026-01-23 10:37:04.190 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:37:04 np0005593233 nova_compute[222017]: 2026-01-23 10:37:04.208 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:37:04 np0005593233 nova_compute[222017]: 2026-01-23 10:37:04.233 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:37:04 np0005593233 nova_compute[222017]: 2026-01-23 10:37:04.234 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.972s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:04.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:05 np0005593233 nova_compute[222017]: 2026-01-23 10:37:05.068 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:05 np0005593233 nova_compute[222017]: 2026-01-23 10:37:05.219 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:05 np0005593233 nova_compute[222017]: 2026-01-23 10:37:05.235 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:05 np0005593233 nova_compute[222017]: 2026-01-23 10:37:05.236 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:05.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:05 np0005593233 nova_compute[222017]: 2026-01-23 10:37:05.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:37:06Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f9:f3:56 10.100.0.11
Jan 23 05:37:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:37:06Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:f3:56 10.100.0.11
Jan 23 05:37:06 np0005593233 nova_compute[222017]: 2026-01-23 10:37:06.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:06 np0005593233 nova_compute[222017]: 2026-01-23 10:37:06.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:37:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:06.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:07.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:08.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:09.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:09 np0005593233 nova_compute[222017]: 2026-01-23 10:37:09.965 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:37:09.967 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:37:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:37:09.969 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:37:10 np0005593233 nova_compute[222017]: 2026-01-23 10:37:10.070 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:10 np0005593233 nova_compute[222017]: 2026-01-23 10:37:10.221 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:37:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:37:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:37:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:37:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:10.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:37:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:11.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:37:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:12.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:13.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:14 np0005593233 nova_compute[222017]: 2026-01-23 10:37:14.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:14 np0005593233 nova_compute[222017]: 2026-01-23 10:37:14.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:14 np0005593233 nova_compute[222017]: 2026-01-23 10:37:14.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:37:14 np0005593233 nova_compute[222017]: 2026-01-23 10:37:14.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:37:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:14 np0005593233 nova_compute[222017]: 2026-01-23 10:37:14.649 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:37:14 np0005593233 nova_compute[222017]: 2026-01-23 10:37:14.650 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:37:14 np0005593233 nova_compute[222017]: 2026-01-23 10:37:14.650 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:37:14 np0005593233 nova_compute[222017]: 2026-01-23 10:37:14.650 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:37:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:14.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:15 np0005593233 nova_compute[222017]: 2026-01-23 10:37:15.075 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:15 np0005593233 nova_compute[222017]: 2026-01-23 10:37:15.264 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:15.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:15 np0005593233 nova_compute[222017]: 2026-01-23 10:37:15.852 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updating instance_info_cache with network_info: [{"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:37:15 np0005593233 nova_compute[222017]: 2026-01-23 10:37:15.870 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:37:15 np0005593233 nova_compute[222017]: 2026-01-23 10:37:15.871 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:37:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:16.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:37:16.972 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:37:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:17.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:17 np0005593233 podman[296268]: 2026-01-23 10:37:17.864064967 +0000 UTC m=+0.194604871 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 05:37:18 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:37:18 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:37:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:18.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:19.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e371 e371: 3 total, 3 up, 3 in
Jan 23 05:37:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e371 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:20 np0005593233 nova_compute[222017]: 2026-01-23 10:37:20.111 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:20 np0005593233 nova_compute[222017]: 2026-01-23 10:37:20.268 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e372 e372: 3 total, 3 up, 3 in
Jan 23 05:37:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:20.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:21.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:22.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:23.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e372 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:24.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:25 np0005593233 nova_compute[222017]: 2026-01-23 10:37:25.115 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:25 np0005593233 nova_compute[222017]: 2026-01-23 10:37:25.271 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:25.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:26.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:27.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:28.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 e373: 3 total, 3 up, 3 in
Jan 23 05:37:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:29.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:30 np0005593233 nova_compute[222017]: 2026-01-23 10:37:30.119 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:30 np0005593233 nova_compute[222017]: 2026-01-23 10:37:30.274 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:30.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:37:31Z|00806|binding|INFO|Releasing lport a8d6af13-33ee-459a-b9e0-d67de69b3614 from this chassis (sb_readonly=0)
Jan 23 05:37:31 np0005593233 nova_compute[222017]: 2026-01-23 10:37:31.314 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:31.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:32.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:33.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:34 np0005593233 podman[296324]: 2026-01-23 10:37:34.103653161 +0000 UTC m=+0.096949166 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 05:37:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:34.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:35 np0005593233 nova_compute[222017]: 2026-01-23 10:37:35.123 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:35 np0005593233 nova_compute[222017]: 2026-01-23 10:37:35.277 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:35.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:36.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:37:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:37.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:37:38 np0005593233 ovn_controller[130653]: 2026-01-23T10:37:38Z|00807|binding|INFO|Releasing lport a8d6af13-33ee-459a-b9e0-d67de69b3614 from this chassis (sb_readonly=0)
Jan 23 05:37:38 np0005593233 nova_compute[222017]: 2026-01-23 10:37:38.174 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:38.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:39.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:40 np0005593233 nova_compute[222017]: 2026-01-23 10:37:40.127 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:40 np0005593233 nova_compute[222017]: 2026-01-23 10:37:40.280 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:40.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:41.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:37:42.698 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:37:42.699 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:37:42.700 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:42.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:43.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:44 np0005593233 nova_compute[222017]: 2026-01-23 10:37:44.650 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:44.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:45 np0005593233 nova_compute[222017]: 2026-01-23 10:37:45.131 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:45 np0005593233 nova_compute[222017]: 2026-01-23 10:37:45.318 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:37:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:45.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:37:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:46.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:47.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:48 np0005593233 podman[296347]: 2026-01-23 10:37:48.152778828 +0000 UTC m=+0.154280885 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:37:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:37:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:48.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:37:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:49.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:50 np0005593233 nova_compute[222017]: 2026-01-23 10:37:50.133 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:50 np0005593233 nova_compute[222017]: 2026-01-23 10:37:50.344 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:37:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:50.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:37:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:37:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:51.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:37:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:52.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:53.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:54 np0005593233 nova_compute[222017]: 2026-01-23 10:37:54.921 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:54.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:55 np0005593233 nova_compute[222017]: 2026-01-23 10:37:55.137 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:55 np0005593233 nova_compute[222017]: 2026-01-23 10:37:55.374 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:55.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:55 np0005593233 nova_compute[222017]: 2026-01-23 10:37:55.851 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:57.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:57.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:59.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #163. Immutable memtables: 0.
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.218143) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 103] Flushing memtable with next log file: 163
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164679218266, "job": 103, "event": "flush_started", "num_memtables": 1, "num_entries": 1416, "num_deletes": 265, "total_data_size": 2963569, "memory_usage": 2992608, "flush_reason": "Manual Compaction"}
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 103] Level-0 flush table #164: started
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164679239316, "cf_name": "default", "job": 103, "event": "table_file_creation", "file_number": 164, "file_size": 1955379, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 78263, "largest_seqno": 79674, "table_properties": {"data_size": 1949267, "index_size": 3314, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13695, "raw_average_key_size": 20, "raw_value_size": 1936705, "raw_average_value_size": 2873, "num_data_blocks": 146, "num_entries": 674, "num_filter_entries": 674, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164574, "oldest_key_time": 1769164574, "file_creation_time": 1769164679, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 164, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 103] Flush lasted 21237 microseconds, and 10487 cpu microseconds.
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.239390) [db/flush_job.cc:967] [default] [JOB 103] Level-0 flush table #164: 1955379 bytes OK
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.239423) [db/memtable_list.cc:519] [default] Level-0 commit table #164 started
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.241398) [db/memtable_list.cc:722] [default] Level-0 commit table #164: memtable #1 done
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.241425) EVENT_LOG_v1 {"time_micros": 1769164679241415, "job": 103, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.241459) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 103] Try to delete WAL files size 2956728, prev total WAL file size 2956728, number of live WAL files 2.
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000160.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.242830) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033303135' seq:72057594037927935, type:22 .. '6C6F676D0033323638' seq:0, type:0; will stop at (end)
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 104] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 103 Base level 0, inputs: [164(1909KB)], [162(11MB)]
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164679242964, "job": 104, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [164], "files_L6": [162], "score": -1, "input_data_size": 13760226, "oldest_snapshot_seqno": -1}
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 104] Generated table #165: 9860 keys, 13626415 bytes, temperature: kUnknown
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164679425284, "cf_name": "default", "job": 104, "event": "table_file_creation", "file_number": 165, "file_size": 13626415, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13561893, "index_size": 38745, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24709, "raw_key_size": 260173, "raw_average_key_size": 26, "raw_value_size": 13388270, "raw_average_value_size": 1357, "num_data_blocks": 1484, "num_entries": 9860, "num_filter_entries": 9860, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769164679, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 165, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.425866) [db/compaction/compaction_job.cc:1663] [default] [JOB 104] Compacted 1@0 + 1@6 files to L6 => 13626415 bytes
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.427803) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 75.4 rd, 74.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.3 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(14.0) write-amplify(7.0) OK, records in: 10404, records dropped: 544 output_compression: NoCompression
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.427883) EVENT_LOG_v1 {"time_micros": 1769164679427841, "job": 104, "event": "compaction_finished", "compaction_time_micros": 182431, "compaction_time_cpu_micros": 62641, "output_level": 6, "num_output_files": 1, "total_output_size": 13626415, "num_input_records": 10404, "num_output_records": 9860, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000164.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164679429011, "job": 104, "event": "table_file_deletion", "file_number": 164}
Jan 23 05:37:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:37:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:37:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:59.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000162.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164679433949, "job": 104, "event": "table_file_deletion", "file_number": 162}
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.242603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.434050) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.434058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.434060) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.434062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:37:59.434064) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.141 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.378 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.688 222021 DEBUG nova.compute.manager [req-572a2952-63f6-455b-9085-725fad7cb6bd req-7c3b822e-6e08-4f1e-812e-2b817055ae25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received event network-changed-12caf893-7757-4200-a8b9-afef9f1e9215 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.689 222021 DEBUG nova.compute.manager [req-572a2952-63f6-455b-9085-725fad7cb6bd req-7c3b822e-6e08-4f1e-812e-2b817055ae25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Refreshing instance network info cache due to event network-changed-12caf893-7757-4200-a8b9-afef9f1e9215. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.689 222021 DEBUG oslo_concurrency.lockutils [req-572a2952-63f6-455b-9085-725fad7cb6bd req-7c3b822e-6e08-4f1e-812e-2b817055ae25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.689 222021 DEBUG oslo_concurrency.lockutils [req-572a2952-63f6-455b-9085-725fad7cb6bd req-7c3b822e-6e08-4f1e-812e-2b817055ae25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.690 222021 DEBUG nova.network.neutron [req-572a2952-63f6-455b-9085-725fad7cb6bd req-7c3b822e-6e08-4f1e-812e-2b817055ae25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Refreshing network info cache for port 12caf893-7757-4200-a8b9-afef9f1e9215 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.759 222021 DEBUG oslo_concurrency.lockutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.759 222021 DEBUG oslo_concurrency.lockutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.760 222021 DEBUG oslo_concurrency.lockutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.760 222021 DEBUG oslo_concurrency.lockutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.761 222021 DEBUG oslo_concurrency.lockutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.762 222021 INFO nova.compute.manager [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Terminating instance#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.764 222021 DEBUG nova.compute.manager [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:38:00 np0005593233 kernel: tap12caf893-77 (unregistering): left promiscuous mode
Jan 23 05:38:00 np0005593233 NetworkManager[48871]: <info>  [1769164680.8468] device (tap12caf893-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.859 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:00 np0005593233 ovn_controller[130653]: 2026-01-23T10:38:00Z|00808|binding|INFO|Releasing lport 12caf893-7757-4200-a8b9-afef9f1e9215 from this chassis (sb_readonly=0)
Jan 23 05:38:00 np0005593233 ovn_controller[130653]: 2026-01-23T10:38:00Z|00809|binding|INFO|Setting lport 12caf893-7757-4200-a8b9-afef9f1e9215 down in Southbound
Jan 23 05:38:00 np0005593233 ovn_controller[130653]: 2026-01-23T10:38:00Z|00810|binding|INFO|Removing iface tap12caf893-77 ovn-installed in OVS
Jan 23 05:38:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:00.882 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:f3:56 10.100.0.11'], port_security=['fa:16:3e:f9:f3:56 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-868ec025-7796-402b-ba12-8a3a5dac7373', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '4', 'neutron:security_group_ids': '05714b57-0768-46c9-8884-47c3caa76f59 92c18c07-ed83-4ff4-89c6-5a468e492558', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07547cf6-dd0a-47b0-85a9-11b383a1aaf0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=12caf893-7757-4200-a8b9-afef9f1e9215) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:38:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:00.885 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 12caf893-7757-4200-a8b9-afef9f1e9215 in datapath 868ec025-7796-402b-ba12-8a3a5dac7373 unbound from our chassis#033[00m
Jan 23 05:38:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:00.887 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 868ec025-7796-402b-ba12-8a3a5dac7373, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:38:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:00.890 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[68645fc6-a2cd-4d95-b19f-f965e958267c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:00.891 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373 namespace which is not needed anymore#033[00m
Jan 23 05:38:00 np0005593233 nova_compute[222017]: 2026-01-23 10:38:00.917 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:00 np0005593233 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000bd.scope: Deactivated successfully.
Jan 23 05:38:00 np0005593233 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000bd.scope: Consumed 17.693s CPU time.
Jan 23 05:38:00 np0005593233 systemd-machined[190954]: Machine qemu-87-instance-000000bd terminated.
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.021 222021 INFO nova.virt.libvirt.driver [-] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Instance destroyed successfully.#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.021 222021 DEBUG nova.objects.instance [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'resources' on Instance uuid a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:38:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:01.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:01 np0005593233 neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373[296025]: [NOTICE]   (296029) : haproxy version is 2.8.14-c23fe91
Jan 23 05:38:01 np0005593233 neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373[296025]: [NOTICE]   (296029) : path to executable is /usr/sbin/haproxy
Jan 23 05:38:01 np0005593233 neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373[296025]: [WARNING]  (296029) : Exiting Master process...
Jan 23 05:38:01 np0005593233 neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373[296025]: [ALERT]    (296029) : Current worker (296031) exited with code 143 (Terminated)
Jan 23 05:38:01 np0005593233 neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373[296025]: [WARNING]  (296029) : All workers exited. Exiting... (0)
Jan 23 05:38:01 np0005593233 systemd[1]: libpod-9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5.scope: Deactivated successfully.
Jan 23 05:38:01 np0005593233 podman[296410]: 2026-01-23 10:38:01.107118001 +0000 UTC m=+0.053603244 container died 9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:38:01 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5-userdata-shm.mount: Deactivated successfully.
Jan 23 05:38:01 np0005593233 systemd[1]: var-lib-containers-storage-overlay-75d6061fa678ea565e6fb9a1616e452a49b2a4fb0009b190b0a9e4719de2d47f-merged.mount: Deactivated successfully.
Jan 23 05:38:01 np0005593233 podman[296410]: 2026-01-23 10:38:01.149460194 +0000 UTC m=+0.095945437 container cleanup 9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:38:01 np0005593233 systemd[1]: libpod-conmon-9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5.scope: Deactivated successfully.
Jan 23 05:38:01 np0005593233 podman[296440]: 2026-01-23 10:38:01.237138886 +0000 UTC m=+0.060416828 container remove 9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:38:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:01.244 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f562cf83-473d-4d0c-b7c9-5b1a10d36691]: (4, ('Fri Jan 23 10:38:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373 (9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5)\n9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5\nFri Jan 23 10:38:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373 (9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5)\n9e76c025d4e86c5f418fd754666ffa9fb14fb3b1ab777104b45b165085e80bb5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:01.246 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5368cdd6-5709-4912-89a9-35f53f8e60ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:01.248 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap868ec025-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:01 np0005593233 kernel: tap868ec025-70: left promiscuous mode
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.251 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.271 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.272 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:01.275 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[899dc366-8457-4773-a2c3-122f69e0a9cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:01.306 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4955d8d3-da2e-495e-a381-db82857126aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:01.308 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[83ee3302-d1c4-459d-a08e-c63285a68eb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:01.327 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8108690f-19d0-41e6-8dee-536141fbf4f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 852415, 'reachable_time': 19474, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296459, 'error': None, 'target': 'ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:01 np0005593233 systemd[1]: run-netns-ovnmeta\x2d868ec025\x2d7796\x2d402b\x2dba12\x2d8a3a5dac7373.mount: Deactivated successfully.
Jan 23 05:38:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:01.332 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-868ec025-7796-402b-ba12-8a3a5dac7373 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:38:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:01.332 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[37bce455-e785-49ee-b50c-1f812a0b121c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.404 222021 DEBUG nova.compute.manager [req-6c1e3785-0325-4efb-8ee4-18ce64f5784b req-c1bd775f-cdb0-471a-bd88-f6156c7d1684 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received event network-vif-unplugged-12caf893-7757-4200-a8b9-afef9f1e9215 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.405 222021 DEBUG oslo_concurrency.lockutils [req-6c1e3785-0325-4efb-8ee4-18ce64f5784b req-c1bd775f-cdb0-471a-bd88-f6156c7d1684 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.405 222021 DEBUG oslo_concurrency.lockutils [req-6c1e3785-0325-4efb-8ee4-18ce64f5784b req-c1bd775f-cdb0-471a-bd88-f6156c7d1684 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.405 222021 DEBUG oslo_concurrency.lockutils [req-6c1e3785-0325-4efb-8ee4-18ce64f5784b req-c1bd775f-cdb0-471a-bd88-f6156c7d1684 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.405 222021 DEBUG nova.compute.manager [req-6c1e3785-0325-4efb-8ee4-18ce64f5784b req-c1bd775f-cdb0-471a-bd88-f6156c7d1684 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] No waiting events found dispatching network-vif-unplugged-12caf893-7757-4200-a8b9-afef9f1e9215 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.406 222021 DEBUG nova.compute.manager [req-6c1e3785-0325-4efb-8ee4-18ce64f5784b req-c1bd775f-cdb0-471a-bd88-f6156c7d1684 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received event network-vif-unplugged-12caf893-7757-4200-a8b9-afef9f1e9215 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.408 222021 DEBUG nova.virt.libvirt.vif [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:36:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-396645154',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-396645154',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=189,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBQ7hbI9dSlZowLrM+55BW4vZefgM5C7AtbCmYzjlG7RhMLD86z6HKT1ky7da4FJ/rvc4D//2MBDEN2yr//ERuTxme2OPEqVyC6OVqkosE4nxK5JvPAi4Vemn/j2yc45jQ==',key_name='tempest-TestSecurityGroupsBasicOps-832716294',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:36:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-q9pz4vzt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:36:52Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.408 222021 DEBUG nova.network.os_vif_util [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.409 222021 DEBUG nova.network.os_vif_util [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:f3:56,bridge_name='br-int',has_traffic_filtering=True,id=12caf893-7757-4200-a8b9-afef9f1e9215,network=Network(868ec025-7796-402b-ba12-8a3a5dac7373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12caf893-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.409 222021 DEBUG os_vif [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:f3:56,bridge_name='br-int',has_traffic_filtering=True,id=12caf893-7757-4200-a8b9-afef9f1e9215,network=Network(868ec025-7796-402b-ba12-8a3a5dac7373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12caf893-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.412 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.413 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12caf893-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.415 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:38:01 np0005593233 nova_compute[222017]: 2026-01-23 10:38:01.420 222021 INFO os_vif [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:f3:56,bridge_name='br-int',has_traffic_filtering=True,id=12caf893-7757-4200-a8b9-afef9f1e9215,network=Network(868ec025-7796-402b-ba12-8a3a5dac7373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12caf893-77')#033[00m
Jan 23 05:38:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:01.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.408 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.409 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.409 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.410 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.410 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.497 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.896 222021 DEBUG nova.network.neutron [req-572a2952-63f6-455b-9085-725fad7cb6bd req-7c3b822e-6e08-4f1e-812e-2b817055ae25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updated VIF entry in instance network info cache for port 12caf893-7757-4200-a8b9-afef9f1e9215. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.897 222021 DEBUG nova.network.neutron [req-572a2952-63f6-455b-9085-725fad7cb6bd req-7c3b822e-6e08-4f1e-812e-2b817055ae25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updating instance_info_cache with network_info: [{"id": "12caf893-7757-4200-a8b9-afef9f1e9215", "address": "fa:16:3e:f9:f3:56", "network": {"id": "868ec025-7796-402b-ba12-8a3a5dac7373", "bridge": "br-int", "label": "tempest-network-smoke--2132727647", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12caf893-77", "ovs_interfaceid": "12caf893-7757-4200-a8b9-afef9f1e9215", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:38:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:38:02 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1538184054' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.926 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:02 np0005593233 nova_compute[222017]: 2026-01-23 10:38:02.942 222021 DEBUG oslo_concurrency.lockutils [req-572a2952-63f6-455b-9085-725fad7cb6bd req-7c3b822e-6e08-4f1e-812e-2b817055ae25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.002 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.003 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000bd as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:38:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:03.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.148 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.150 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4337MB free_disk=20.956493377685547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.151 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.151 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.250 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.251 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.251 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.344 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:03.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.697 222021 DEBUG nova.compute.manager [req-448cede4-aae3-4ac8-b4ea-2e222f585380 req-5d6a6a89-9d19-4b59-9a4d-202eda964d20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received event network-vif-plugged-12caf893-7757-4200-a8b9-afef9f1e9215 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.698 222021 DEBUG oslo_concurrency.lockutils [req-448cede4-aae3-4ac8-b4ea-2e222f585380 req-5d6a6a89-9d19-4b59-9a4d-202eda964d20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.699 222021 DEBUG oslo_concurrency.lockutils [req-448cede4-aae3-4ac8-b4ea-2e222f585380 req-5d6a6a89-9d19-4b59-9a4d-202eda964d20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.699 222021 DEBUG oslo_concurrency.lockutils [req-448cede4-aae3-4ac8-b4ea-2e222f585380 req-5d6a6a89-9d19-4b59-9a4d-202eda964d20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.700 222021 DEBUG nova.compute.manager [req-448cede4-aae3-4ac8-b4ea-2e222f585380 req-5d6a6a89-9d19-4b59-9a4d-202eda964d20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] No waiting events found dispatching network-vif-plugged-12caf893-7757-4200-a8b9-afef9f1e9215 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.700 222021 WARNING nova.compute.manager [req-448cede4-aae3-4ac8-b4ea-2e222f585380 req-5d6a6a89-9d19-4b59-9a4d-202eda964d20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received unexpected event network-vif-plugged-12caf893-7757-4200-a8b9-afef9f1e9215 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:38:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:38:03 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/945799165' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.790 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.799 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.821 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.876 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:38:03 np0005593233 nova_compute[222017]: 2026-01-23 10:38:03.876 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:05.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:05 np0005593233 podman[296524]: 2026-01-23 10:38:05.095658346 +0000 UTC m=+0.099303202 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:38:05 np0005593233 nova_compute[222017]: 2026-01-23 10:38:05.428 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:05.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:05 np0005593233 nova_compute[222017]: 2026-01-23 10:38:05.960 222021 INFO nova.virt.libvirt.driver [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Deleting instance files /var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_del#033[00m
Jan 23 05:38:05 np0005593233 nova_compute[222017]: 2026-01-23 10:38:05.961 222021 INFO nova.virt.libvirt.driver [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Deletion of /var/lib/nova/instances/a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6_del complete#033[00m
Jan 23 05:38:06 np0005593233 nova_compute[222017]: 2026-01-23 10:38:06.078 222021 INFO nova.compute.manager [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Took 5.31 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:38:06 np0005593233 nova_compute[222017]: 2026-01-23 10:38:06.079 222021 DEBUG oslo.service.loopingcall [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:38:06 np0005593233 nova_compute[222017]: 2026-01-23 10:38:06.080 222021 DEBUG nova.compute.manager [-] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:38:06 np0005593233 nova_compute[222017]: 2026-01-23 10:38:06.080 222021 DEBUG nova.network.neutron [-] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:38:06 np0005593233 nova_compute[222017]: 2026-01-23 10:38:06.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:06 np0005593233 nova_compute[222017]: 2026-01-23 10:38:06.878 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:06 np0005593233 nova_compute[222017]: 2026-01-23 10:38:06.880 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:06 np0005593233 nova_compute[222017]: 2026-01-23 10:38:06.880 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:06 np0005593233 nova_compute[222017]: 2026-01-23 10:38:06.881 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:38:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:07.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.118 222021 DEBUG nova.network.neutron [-] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.155 222021 INFO nova.compute.manager [-] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Took 1.08 seconds to deallocate network for instance.#033[00m
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.212 222021 DEBUG nova.compute.manager [req-fd860e16-17a1-47c6-9e99-13165757cc59 req-a4ebcdbd-f6bc-494e-84e7-5d1b9338f926 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Received event network-vif-deleted-12caf893-7757-4200-a8b9-afef9f1e9215 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.215 222021 DEBUG oslo_concurrency.lockutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.215 222021 DEBUG oslo_concurrency.lockutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.274 222021 DEBUG oslo_concurrency.processutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:07.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:38:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/611923231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.769 222021 DEBUG oslo_concurrency.processutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.779 222021 DEBUG nova.compute.provider_tree [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.797 222021 DEBUG nova.scheduler.client.report [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.819 222021 DEBUG oslo_concurrency.lockutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.846 222021 INFO nova.scheduler.client.report [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Deleted allocations for instance a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6
Jan 23 05:38:07 np0005593233 nova_compute[222017]: 2026-01-23 10:38:07.916 222021 DEBUG oslo_concurrency.lockutils [None req-1a311c57-d28f-41b8-a38c-3416ee7167b6 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:38:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:09.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:09.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:10 np0005593233 nova_compute[222017]: 2026-01-23 10:38:10.168 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:10 np0005593233 nova_compute[222017]: 2026-01-23 10:38:10.432 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:10.486 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:38:10 np0005593233 nova_compute[222017]: 2026-01-23 10:38:10.486 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:10 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:10.487 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:38:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:11.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:11.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:11 np0005593233 nova_compute[222017]: 2026-01-23 10:38:11.458 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:11.490 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:38:12 np0005593233 nova_compute[222017]: 2026-01-23 10:38:12.883 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:12 np0005593233 ceph-osd[78880]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 23 05:38:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:13.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:13 np0005593233 nova_compute[222017]: 2026-01-23 10:38:13.104 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:13.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:15.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:15 np0005593233 nova_compute[222017]: 2026-01-23 10:38:15.436 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:15.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:16 np0005593233 nova_compute[222017]: 2026-01-23 10:38:16.019 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164681.0179412, a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:38:16 np0005593233 nova_compute[222017]: 2026-01-23 10:38:16.020 222021 INFO nova.compute.manager [-] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] VM Stopped (Lifecycle Event)
Jan 23 05:38:16 np0005593233 nova_compute[222017]: 2026-01-23 10:38:16.052 222021 DEBUG nova.compute.manager [None req-03cde61e-2148-4b7c-bb75-f5f82e9fbc34 - - - - - -] [instance: a22ebe7d-4b60-4d78-8c0a-89655a7d4bc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:38:16 np0005593233 nova_compute[222017]: 2026-01-23 10:38:16.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:38:16 np0005593233 nova_compute[222017]: 2026-01-23 10:38:16.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:38:16 np0005593233 nova_compute[222017]: 2026-01-23 10:38:16.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:38:16 np0005593233 nova_compute[222017]: 2026-01-23 10:38:16.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:38:16 np0005593233 nova_compute[222017]: 2026-01-23 10:38:16.402 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 05:38:16 np0005593233 nova_compute[222017]: 2026-01-23 10:38:16.460 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:17.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:17.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:18 np0005593233 podman[296642]: 2026-01-23 10:38:18.336326992 +0000 UTC m=+0.121559244 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:38:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:19.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:19.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:19 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:38:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:20 np0005593233 nova_compute[222017]: 2026-01-23 10:38:20.470 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:38:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:38:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:21.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:38:21 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2622278632' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:38:21 np0005593233 nova_compute[222017]: 2026-01-23 10:38:21.463 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:21.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:23.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:23.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:25.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:25.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:25 np0005593233 nova_compute[222017]: 2026-01-23 10:38:25.517 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:26 np0005593233 nova_compute[222017]: 2026-01-23 10:38:26.465 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:27.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:27.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:38:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:38:28 np0005593233 nova_compute[222017]: 2026-01-23 10:38:28.397 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:38:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:29.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:29.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e374 e374: 3 total, 3 up, 3 in
Jan 23 05:38:30 np0005593233 nova_compute[222017]: 2026-01-23 10:38:30.521 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e375 e375: 3 total, 3 up, 3 in
Jan 23 05:38:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:31.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:31 np0005593233 nova_compute[222017]: 2026-01-23 10:38:31.467 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:31.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e376 e376: 3 total, 3 up, 3 in
Jan 23 05:38:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:33.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:33.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e377 e377: 3 total, 3 up, 3 in
Jan 23 05:38:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:35.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:35 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #54. Immutable memtables: 10.
Jan 23 05:38:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:35.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:35 np0005593233 nova_compute[222017]: 2026-01-23 10:38:35.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:36 np0005593233 podman[296782]: 2026-01-23 10:38:36.08643348 +0000 UTC m=+0.088207927 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:38:36 np0005593233 nova_compute[222017]: 2026-01-23 10:38:36.469 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:38:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:37.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:37.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:38 np0005593233 nova_compute[222017]: 2026-01-23 10:38:38.467 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:38:38 np0005593233 nova_compute[222017]: 2026-01-23 10:38:38.468 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:38:38 np0005593233 nova_compute[222017]: 2026-01-23 10:38:38.563 222021 DEBUG nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 05:38:38 np0005593233 nova_compute[222017]: 2026-01-23 10:38:38.675 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:38:38 np0005593233 nova_compute[222017]: 2026-01-23 10:38:38.676 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:38:38 np0005593233 nova_compute[222017]: 2026-01-23 10:38:38.689 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:38:38 np0005593233 nova_compute[222017]: 2026-01-23 10:38:38.690 222021 INFO nova.compute.claims [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Claim successful on node compute-1.ctlplane.example.com
Jan 23 05:38:38 np0005593233 nova_compute[222017]: 2026-01-23 10:38:38.833 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:38:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e378 e378: 3 total, 3 up, 3 in
Jan 23 05:38:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:39.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:38:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2871507891' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.354 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.366 222021 DEBUG nova.compute.provider_tree [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.390 222021 DEBUG nova.scheduler.client.report [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.428 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.429 222021 DEBUG nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.500 222021 DEBUG nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.501 222021 DEBUG nova.network.neutron [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:38:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.529 222021 INFO nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.548 222021 DEBUG nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:38:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:39.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.660 222021 DEBUG nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.662 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.663 222021 INFO nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Creating image(s)#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.704 222021 DEBUG nova.storage.rbd_utils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 0a916952-341a-4caf-bf6f-6abe504830f9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.739 222021 DEBUG nova.storage.rbd_utils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 0a916952-341a-4caf-bf6f-6abe504830f9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.788 222021 DEBUG nova.storage.rbd_utils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 0a916952-341a-4caf-bf6f-6abe504830f9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.794 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.832 222021 DEBUG nova.policy [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93cd560e84264023877c47122b5919de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e762fca3b634c7aa1d994314c059c54', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.865 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.866 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.867 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.868 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.914 222021 DEBUG nova.storage.rbd_utils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 0a916952-341a-4caf-bf6f-6abe504830f9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:39 np0005593233 nova_compute[222017]: 2026-01-23 10:38:39.921 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0a916952-341a-4caf-bf6f-6abe504830f9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 e379: 3 total, 3 up, 3 in
Jan 23 05:38:40 np0005593233 nova_compute[222017]: 2026-01-23 10:38:40.558 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:41.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:41 np0005593233 nova_compute[222017]: 2026-01-23 10:38:41.137 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0a916952-341a-4caf-bf6f-6abe504830f9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.216s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:41 np0005593233 nova_compute[222017]: 2026-01-23 10:38:41.253 222021 DEBUG nova.storage.rbd_utils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] resizing rbd image 0a916952-341a-4caf-bf6f-6abe504830f9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:38:41 np0005593233 nova_compute[222017]: 2026-01-23 10:38:41.556 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:41 np0005593233 nova_compute[222017]: 2026-01-23 10:38:41.559 222021 DEBUG nova.network.neutron [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Successfully created port: f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:38:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:41.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:41 np0005593233 nova_compute[222017]: 2026-01-23 10:38:41.704 222021 DEBUG nova.objects.instance [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'migration_context' on Instance uuid 0a916952-341a-4caf-bf6f-6abe504830f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:38:42 np0005593233 nova_compute[222017]: 2026-01-23 10:38:42.066 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:38:42 np0005593233 nova_compute[222017]: 2026-01-23 10:38:42.067 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Ensure instance console log exists: /var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:38:42 np0005593233 nova_compute[222017]: 2026-01-23 10:38:42.068 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:42 np0005593233 nova_compute[222017]: 2026-01-23 10:38:42.068 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:42 np0005593233 nova_compute[222017]: 2026-01-23 10:38:42.069 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:42.699 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:42.701 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:42.701 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:43.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:43.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:44 np0005593233 nova_compute[222017]: 2026-01-23 10:38:44.882 222021 DEBUG nova.network.neutron [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Successfully updated port: f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:38:44 np0005593233 nova_compute[222017]: 2026-01-23 10:38:44.901 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:38:44 np0005593233 nova_compute[222017]: 2026-01-23 10:38:44.902 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquired lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:38:44 np0005593233 nova_compute[222017]: 2026-01-23 10:38:44.902 222021 DEBUG nova.network.neutron [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:38:45 np0005593233 nova_compute[222017]: 2026-01-23 10:38:45.028 222021 DEBUG nova.compute.manager [req-a03c6bf2-d417-46f0-a8f9-86eedf5863e1 req-d5c5fcff-e4bb-4ec0-b788-179d87b7155a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received event network-changed-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:45 np0005593233 nova_compute[222017]: 2026-01-23 10:38:45.029 222021 DEBUG nova.compute.manager [req-a03c6bf2-d417-46f0-a8f9-86eedf5863e1 req-d5c5fcff-e4bb-4ec0-b788-179d87b7155a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Refreshing instance network info cache due to event network-changed-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:38:45 np0005593233 nova_compute[222017]: 2026-01-23 10:38:45.030 222021 DEBUG oslo_concurrency.lockutils [req-a03c6bf2-d417-46f0-a8f9-86eedf5863e1 req-d5c5fcff-e4bb-4ec0-b788-179d87b7155a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:38:45 np0005593233 nova_compute[222017]: 2026-01-23 10:38:45.091 222021 DEBUG nova.network.neutron [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:38:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:45.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:45 np0005593233 nova_compute[222017]: 2026-01-23 10:38:45.559 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:45.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:46 np0005593233 nova_compute[222017]: 2026-01-23 10:38:46.600 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.032 222021 DEBUG nova.network.neutron [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating instance_info_cache with network_info: [{"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.080 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Releasing lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.081 222021 DEBUG nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Instance network_info: |[{"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.082 222021 DEBUG oslo_concurrency.lockutils [req-a03c6bf2-d417-46f0-a8f9-86eedf5863e1 req-d5c5fcff-e4bb-4ec0-b788-179d87b7155a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.082 222021 DEBUG nova.network.neutron [req-a03c6bf2-d417-46f0-a8f9-86eedf5863e1 req-d5c5fcff-e4bb-4ec0-b788-179d87b7155a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Refreshing network info cache for port f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.092 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Start _get_guest_xml network_info=[{"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.102 222021 WARNING nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.118 222021 DEBUG nova.virt.libvirt.host [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.119 222021 DEBUG nova.virt.libvirt.host [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.126 222021 DEBUG nova.virt.libvirt.host [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.128 222021 DEBUG nova.virt.libvirt.host [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.130 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.130 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.131 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.132 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.132 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.133 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.133 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.134 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.135 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.135 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.136 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.136 222021 DEBUG nova.virt.hardware [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.142 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:47.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:47.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:38:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3640057829' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.643 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.677 222021 DEBUG nova.storage.rbd_utils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 0a916952-341a-4caf-bf6f-6abe504830f9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:47 np0005593233 nova_compute[222017]: 2026-01-23 10:38:47.685 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:38:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2131487104' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.174 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.177 222021 DEBUG nova.virt.libvirt.vif [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:38:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-0',id=193,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-ubbwika9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',netw
ork_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:38:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=0a916952-341a-4caf-bf6f-6abe504830f9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.178 222021 DEBUG nova.network.os_vif_util [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.179 222021 DEBUG nova.network.os_vif_util [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:1c:0c,bridge_name='br-int',has_traffic_filtering=True,id=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b87cda-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.181 222021 DEBUG nova.objects.instance [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0a916952-341a-4caf-bf6f-6abe504830f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.214 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <uuid>0a916952-341a-4caf-bf6f-6abe504830f9</uuid>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <name>instance-000000c1</name>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <nova:name>multiattach-server-0</nova:name>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:38:47</nova:creationTime>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <nova:user uuid="93cd560e84264023877c47122b5919de">tempest-AttachVolumeMultiAttachTest-63035580-project-member</nova:user>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <nova:project uuid="6e762fca3b634c7aa1d994314c059c54">tempest-AttachVolumeMultiAttachTest-63035580</nova:project>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <nova:port uuid="f6b87cda-0bd8-4fbb-a92e-b86c0a65df79">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <entry name="serial">0a916952-341a-4caf-bf6f-6abe504830f9</entry>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <entry name="uuid">0a916952-341a-4caf-bf6f-6abe504830f9</entry>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0a916952-341a-4caf-bf6f-6abe504830f9_disk">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/0a916952-341a-4caf-bf6f-6abe504830f9_disk.config">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:78:1c:0c"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <target dev="tapf6b87cda-0b"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9/console.log" append="off"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:38:48 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:38:48 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:38:48 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:38:48 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.217 222021 DEBUG nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Preparing to wait for external event network-vif-plugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.218 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.218 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.219 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.220 222021 DEBUG nova.virt.libvirt.vif [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:38:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-0',id=193,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-ubbwika9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:38:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=0a916952-341a-4caf-bf6f-6abe504830f9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.221 222021 DEBUG nova.network.os_vif_util [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.222 222021 DEBUG nova.network.os_vif_util [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:1c:0c,bridge_name='br-int',has_traffic_filtering=True,id=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b87cda-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.223 222021 DEBUG os_vif [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:1c:0c,bridge_name='br-int',has_traffic_filtering=True,id=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b87cda-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.224 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.225 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.226 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.230 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.230 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6b87cda-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.231 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf6b87cda-0b, col_values=(('external_ids', {'iface-id': 'f6b87cda-0bd8-4fbb-a92e-b86c0a65df79', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:1c:0c', 'vm-uuid': '0a916952-341a-4caf-bf6f-6abe504830f9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.234 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:48 np0005593233 NetworkManager[48871]: <info>  [1769164728.2353] manager: (tapf6b87cda-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/368)
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.236 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.246 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.247 222021 INFO os_vif [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:1c:0c,bridge_name='br-int',has_traffic_filtering=True,id=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b87cda-0b')#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.362 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.363 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.364 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No VIF found with MAC fa:16:3e:78:1c:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.364 222021 INFO nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Using config drive#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.407 222021 DEBUG nova.storage.rbd_utils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 0a916952-341a-4caf-bf6f-6abe504830f9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.853 222021 INFO nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Creating config drive at /var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9/disk.config#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.865 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4bgtns8i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.932 222021 DEBUG nova.network.neutron [req-a03c6bf2-d417-46f0-a8f9-86eedf5863e1 req-d5c5fcff-e4bb-4ec0-b788-179d87b7155a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updated VIF entry in instance network info cache for port f6b87cda-0bd8-4fbb-a92e-b86c0a65df79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.933 222021 DEBUG nova.network.neutron [req-a03c6bf2-d417-46f0-a8f9-86eedf5863e1 req-d5c5fcff-e4bb-4ec0-b788-179d87b7155a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating instance_info_cache with network_info: [{"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:38:48 np0005593233 nova_compute[222017]: 2026-01-23 10:38:48.955 222021 DEBUG oslo_concurrency.lockutils [req-a03c6bf2-d417-46f0-a8f9-86eedf5863e1 req-d5c5fcff-e4bb-4ec0-b788-179d87b7155a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.037 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4bgtns8i" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.089 222021 DEBUG nova.storage.rbd_utils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 0a916952-341a-4caf-bf6f-6abe504830f9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.094 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9/disk.config 0a916952-341a-4caf-bf6f-6abe504830f9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:49.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:49 np0005593233 podman[297078]: 2026-01-23 10:38:49.149630815 +0000 UTC m=+0.149079578 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.300 222021 DEBUG oslo_concurrency.processutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9/disk.config 0a916952-341a-4caf-bf6f-6abe504830f9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.303 222021 INFO nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Deleting local config drive /var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9/disk.config because it was imported into RBD.#033[00m
Jan 23 05:38:49 np0005593233 kernel: tapf6b87cda-0b: entered promiscuous mode
Jan 23 05:38:49 np0005593233 NetworkManager[48871]: <info>  [1769164729.3757] manager: (tapf6b87cda-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/369)
Jan 23 05:38:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:38:49Z|00811|binding|INFO|Claiming lport f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 for this chassis.
Jan 23 05:38:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:38:49Z|00812|binding|INFO|f6b87cda-0bd8-4fbb-a92e-b86c0a65df79: Claiming fa:16:3e:78:1c:0c 10.100.0.6
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.377 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.384 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.396 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:1c:0c 10.100.0.6'], port_security=['fa:16:3e:78:1c:0c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0a916952-341a-4caf-bf6f-6abe504830f9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e762fca3b634c7aa1d994314c059c54', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed138636-f650-4a09-b808-0b05f9067a5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0936335-b706-4400-8411-bdd084c8cdf7, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.397 140224 INFO neutron.agent.ovn.metadata.agent [-] Port f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 in datapath fba2ba4a-d82c-4f8b-9754-c13fbec41a04 bound to our chassis#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.399 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fba2ba4a-d82c-4f8b-9754-c13fbec41a04#033[00m
Jan 23 05:38:49 np0005593233 systemd-udevd[297153]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.421 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[abf4687d-4ec3-46d8-ac66-5c0000c1d9fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.424 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfba2ba4a-d1 in ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.428 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfba2ba4a-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.428 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5df0ecea-263d-4c29-ae7a-d1930ab1e45f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.430 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f9f8937c-43ce-4b4e-a18d-7bfa16e87f49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 systemd-machined[190954]: New machine qemu-88-instance-000000c1.
Jan 23 05:38:49 np0005593233 NetworkManager[48871]: <info>  [1769164729.4412] device (tapf6b87cda-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:38:49 np0005593233 NetworkManager[48871]: <info>  [1769164729.4424] device (tapf6b87cda-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.450 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[d502545a-ae4a-4284-9d93-4c77f43030af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.457 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:38:49Z|00813|binding|INFO|Setting lport f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 ovn-installed in OVS
Jan 23 05:38:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:38:49Z|00814|binding|INFO|Setting lport f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 up in Southbound
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.466 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:49 np0005593233 systemd[1]: Started Virtual Machine qemu-88-instance-000000c1.
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.487 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cc23627c-bbad-4655-b588-c1daab1c47ce]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.535 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4329da-7e4f-4e62-b26e-e42671ca5fb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.544 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4bbd855f-43bc-4b57-bb4b-b586dcbc996d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 NetworkManager[48871]: <info>  [1769164729.5469] manager: (tapfba2ba4a-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/370)
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.602 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2f866df8-daec-4b00-bd4d-7fb33714609e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.609 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[171c8a8f-be33-4cb0-b551-d47c2eef61f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:49.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:49 np0005593233 NetworkManager[48871]: <info>  [1769164729.6485] device (tapfba2ba4a-d0): carrier: link connected
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.658 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[2bdb1667-8dcb-48ab-bbf9-24564d30015d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.691 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[11defe08-d73a-442d-a7cc-615294ae44ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfba2ba4a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:db:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 244], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 864227, 'reachable_time': 25848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297186, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.721 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[01303ba1-ebf1-4b87-9f52-a8bb4ff50ba7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:db55'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 864227, 'tstamp': 864227}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297187, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.759 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2d7f2ba3-d757-4280-a53a-ef2204054179]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfba2ba4a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:db:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 244], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 864227, 'reachable_time': 25848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 297188, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.812 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8c594c65-5fa2-4f99-b9da-aad30aed63ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.924 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[232a2cfc-889c-400d-9a16-d3f5e1937ccc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.929 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfba2ba4a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.929 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.930 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfba2ba4a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:49 np0005593233 NetworkManager[48871]: <info>  [1769164729.9330] manager: (tapfba2ba4a-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/371)
Jan 23 05:38:49 np0005593233 kernel: tapfba2ba4a-d0: entered promiscuous mode
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.933 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.938 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfba2ba4a-d0, col_values=(('external_ids', {'iface-id': '2348ddba-3dc3-4456-a637-f3065ba0d8f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:49 np0005593233 ovn_controller[130653]: 2026-01-23T10:38:49Z|00815|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.939 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:49 np0005593233 nova_compute[222017]: 2026-01-23 10:38:49.966 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.967 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fba2ba4a-d82c-4f8b-9754-c13fbec41a04.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fba2ba4a-d82c-4f8b-9754-c13fbec41a04.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.969 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2309d514-3520-4c2b-a344-6109f403fc8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.970 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-fba2ba4a-d82c-4f8b-9754-c13fbec41a04
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/fba2ba4a-d82c-4f8b-9754-c13fbec41a04.pid.haproxy
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID fba2ba4a-d82c-4f8b-9754-c13fbec41a04
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:38:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:38:49.971 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'env', 'PROCESS_TAG=haproxy-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fba2ba4a-d82c-4f8b-9754-c13fbec41a04.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.117 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164730.1165595, 0a916952-341a-4caf-bf6f-6abe504830f9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.119 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] VM Started (Lifecycle Event)#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.355 222021 DEBUG nova.compute.manager [req-a03126eb-0e08-4dfa-aea6-58dbe67c76ac req-bd4c937d-3679-4d7b-9749-df7ff0c7f20f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received event network-vif-plugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.356 222021 DEBUG oslo_concurrency.lockutils [req-a03126eb-0e08-4dfa-aea6-58dbe67c76ac req-bd4c937d-3679-4d7b-9749-df7ff0c7f20f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.356 222021 DEBUG oslo_concurrency.lockutils [req-a03126eb-0e08-4dfa-aea6-58dbe67c76ac req-bd4c937d-3679-4d7b-9749-df7ff0c7f20f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.357 222021 DEBUG oslo_concurrency.lockutils [req-a03126eb-0e08-4dfa-aea6-58dbe67c76ac req-bd4c937d-3679-4d7b-9749-df7ff0c7f20f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.357 222021 DEBUG nova.compute.manager [req-a03126eb-0e08-4dfa-aea6-58dbe67c76ac req-bd4c937d-3679-4d7b-9749-df7ff0c7f20f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Processing event network-vif-plugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.358 222021 DEBUG nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.363 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.368 222021 INFO nova.virt.libvirt.driver [-] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Instance spawned successfully.#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.369 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.377 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.383 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.420 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.421 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164730.1187022, 0a916952-341a-4caf-bf6f-6abe504830f9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.421 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.429 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.430 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.430 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.431 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.431 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.432 222021 DEBUG nova.virt.libvirt.driver [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:50 np0005593233 podman[297262]: 2026-01-23 10:38:50.44976986 +0000 UTC m=+0.072501941 container create f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.459 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.465 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164730.3617754, 0a916952-341a-4caf-bf6f-6abe504830f9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.465 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.488 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.496 222021 INFO nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Took 10.84 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.496 222021 DEBUG nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.498 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:38:50 np0005593233 systemd[1]: Started libpod-conmon-f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd.scope.
Jan 23 05:38:50 np0005593233 podman[297262]: 2026-01-23 10:38:50.422648949 +0000 UTC m=+0.045381030 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.544 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:38:50 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:38:50 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b379705ba850a21a308304eb46ee60d17a454a13ff22fd8029d3867119a50f92/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.592 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.602 222021 INFO nova.compute.manager [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Took 11.98 seconds to build instance.#033[00m
Jan 23 05:38:50 np0005593233 podman[297262]: 2026-01-23 10:38:50.606570275 +0000 UTC m=+0.229302376 container init f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:38:50 np0005593233 podman[297262]: 2026-01-23 10:38:50.622361983 +0000 UTC m=+0.245094064 container start f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 05:38:50 np0005593233 nova_compute[222017]: 2026-01-23 10:38:50.621 222021 DEBUG oslo_concurrency.lockutils [None req-9b7816f5-58bc-4133-8dc6-ad9e2b4c8cc0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:50 np0005593233 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[297276]: [NOTICE]   (297280) : New worker (297282) forked
Jan 23 05:38:50 np0005593233 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[297276]: [NOTICE]   (297280) : Loading success.
Jan 23 05:38:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:51.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:51.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:52 np0005593233 nova_compute[222017]: 2026-01-23 10:38:52.452 222021 DEBUG nova.compute.manager [req-dd2fece6-e4a8-4e28-b9ef-19777360b72e req-2cebb436-a59e-4930-9d0d-643594dda5c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received event network-vif-plugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:52 np0005593233 nova_compute[222017]: 2026-01-23 10:38:52.453 222021 DEBUG oslo_concurrency.lockutils [req-dd2fece6-e4a8-4e28-b9ef-19777360b72e req-2cebb436-a59e-4930-9d0d-643594dda5c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:52 np0005593233 nova_compute[222017]: 2026-01-23 10:38:52.453 222021 DEBUG oslo_concurrency.lockutils [req-dd2fece6-e4a8-4e28-b9ef-19777360b72e req-2cebb436-a59e-4930-9d0d-643594dda5c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:52 np0005593233 nova_compute[222017]: 2026-01-23 10:38:52.454 222021 DEBUG oslo_concurrency.lockutils [req-dd2fece6-e4a8-4e28-b9ef-19777360b72e req-2cebb436-a59e-4930-9d0d-643594dda5c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:52 np0005593233 nova_compute[222017]: 2026-01-23 10:38:52.455 222021 DEBUG nova.compute.manager [req-dd2fece6-e4a8-4e28-b9ef-19777360b72e req-2cebb436-a59e-4930-9d0d-643594dda5c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] No waiting events found dispatching network-vif-plugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:38:52 np0005593233 nova_compute[222017]: 2026-01-23 10:38:52.455 222021 WARNING nova.compute.manager [req-dd2fece6-e4a8-4e28-b9ef-19777360b72e req-2cebb436-a59e-4930-9d0d-643594dda5c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received unexpected event network-vif-plugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:38:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:53.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:53 np0005593233 nova_compute[222017]: 2026-01-23 10:38:53.236 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:53 np0005593233 nova_compute[222017]: 2026-01-23 10:38:53.431 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:53 np0005593233 NetworkManager[48871]: <info>  [1769164733.4326] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Jan 23 05:38:53 np0005593233 NetworkManager[48871]: <info>  [1769164733.4363] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/373)
Jan 23 05:38:53 np0005593233 nova_compute[222017]: 2026-01-23 10:38:53.622 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:53.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:38:53Z|00816|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:38:53 np0005593233 nova_compute[222017]: 2026-01-23 10:38:53.645 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:54 np0005593233 nova_compute[222017]: 2026-01-23 10:38:54.857 222021 DEBUG nova.compute.manager [req-d3c4cb83-b30a-4473-b86f-6e1157adf1c0 req-b19b7ab5-2c25-4497-aef4-12766d0deddd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received event network-changed-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:54 np0005593233 nova_compute[222017]: 2026-01-23 10:38:54.858 222021 DEBUG nova.compute.manager [req-d3c4cb83-b30a-4473-b86f-6e1157adf1c0 req-b19b7ab5-2c25-4497-aef4-12766d0deddd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Refreshing instance network info cache due to event network-changed-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:38:54 np0005593233 nova_compute[222017]: 2026-01-23 10:38:54.859 222021 DEBUG oslo_concurrency.lockutils [req-d3c4cb83-b30a-4473-b86f-6e1157adf1c0 req-b19b7ab5-2c25-4497-aef4-12766d0deddd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:38:54 np0005593233 nova_compute[222017]: 2026-01-23 10:38:54.859 222021 DEBUG oslo_concurrency.lockutils [req-d3c4cb83-b30a-4473-b86f-6e1157adf1c0 req-b19b7ab5-2c25-4497-aef4-12766d0deddd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:38:54 np0005593233 nova_compute[222017]: 2026-01-23 10:38:54.860 222021 DEBUG nova.network.neutron [req-d3c4cb83-b30a-4473-b86f-6e1157adf1c0 req-b19b7ab5-2c25-4497-aef4-12766d0deddd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Refreshing network info cache for port f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:38:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:55.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:55 np0005593233 nova_compute[222017]: 2026-01-23 10:38:55.594 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:55.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:57.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:57 np0005593233 nova_compute[222017]: 2026-01-23 10:38:57.315 222021 DEBUG nova.network.neutron [req-d3c4cb83-b30a-4473-b86f-6e1157adf1c0 req-b19b7ab5-2c25-4497-aef4-12766d0deddd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updated VIF entry in instance network info cache for port f6b87cda-0bd8-4fbb-a92e-b86c0a65df79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:38:57 np0005593233 nova_compute[222017]: 2026-01-23 10:38:57.315 222021 DEBUG nova.network.neutron [req-d3c4cb83-b30a-4473-b86f-6e1157adf1c0 req-b19b7ab5-2c25-4497-aef4-12766d0deddd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating instance_info_cache with network_info: [{"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:38:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:38:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:57.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:38:58 np0005593233 nova_compute[222017]: 2026-01-23 10:38:58.238 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:58 np0005593233 nova_compute[222017]: 2026-01-23 10:38:58.990 222021 DEBUG oslo_concurrency.lockutils [req-d3c4cb83-b30a-4473-b86f-6e1157adf1c0 req-b19b7ab5-2c25-4497-aef4-12766d0deddd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:38:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:59.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:38:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:38:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:59.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:38:59 np0005593233 nova_compute[222017]: 2026-01-23 10:38:59.741 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquiring lock "5b48f07a-f160-4459-8e47-98a5500c02b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:59 np0005593233 nova_compute[222017]: 2026-01-23 10:38:59.742 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:59 np0005593233 nova_compute[222017]: 2026-01-23 10:38:59.873 222021 DEBUG nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:39:00 np0005593233 nova_compute[222017]: 2026-01-23 10:39:00.384 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:00 np0005593233 nova_compute[222017]: 2026-01-23 10:39:00.385 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:00 np0005593233 nova_compute[222017]: 2026-01-23 10:39:00.395 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:39:00 np0005593233 nova_compute[222017]: 2026-01-23 10:39:00.396 222021 INFO nova.compute.claims [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:39:00 np0005593233 nova_compute[222017]: 2026-01-23 10:39:00.554 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:00 np0005593233 nova_compute[222017]: 2026-01-23 10:39:00.597 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:39:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3248546176' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.039 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.047 222021 DEBUG nova.compute.provider_tree [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.089 222021 DEBUG nova.scheduler.client.report [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.134 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.749s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.136 222021 DEBUG nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:39:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:01.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.209 222021 DEBUG nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.210 222021 DEBUG nova.network.neutron [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.233 222021 INFO nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.306 222021 DEBUG nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.438 222021 DEBUG nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.441 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.441 222021 INFO nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Creating image(s)#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.486 222021 DEBUG nova.storage.rbd_utils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] rbd image 5b48f07a-f160-4459-8e47-98a5500c02b7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.536 222021 DEBUG nova.storage.rbd_utils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] rbd image 5b48f07a-f160-4459-8e47-98a5500c02b7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.579 222021 DEBUG nova.storage.rbd_utils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] rbd image 5b48f07a-f160-4459-8e47-98a5500c02b7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.588 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:01.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.676 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.677 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.678 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.679 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.711 222021 DEBUG nova.storage.rbd_utils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] rbd image 5b48f07a-f160-4459-8e47-98a5500c02b7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.716 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 5b48f07a-f160-4459-8e47-98a5500c02b7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:01 np0005593233 nova_compute[222017]: 2026-01-23 10:39:01.856 222021 DEBUG nova.policy [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a9d677e04372453aaea353af3361fe80', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ecd453e6632f42749f93ba49369d62a6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:39:02 np0005593233 nova_compute[222017]: 2026-01-23 10:39:02.110 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 5b48f07a-f160-4459-8e47-98a5500c02b7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.394s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:02 np0005593233 nova_compute[222017]: 2026-01-23 10:39:02.201 222021 DEBUG nova.storage.rbd_utils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] resizing rbd image 5b48f07a-f160-4459-8e47-98a5500c02b7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:39:02 np0005593233 nova_compute[222017]: 2026-01-23 10:39:02.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:02 np0005593233 nova_compute[222017]: 2026-01-23 10:39:02.430 222021 DEBUG nova.objects.instance [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lazy-loading 'migration_context' on Instance uuid 5b48f07a-f160-4459-8e47-98a5500c02b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:02 np0005593233 nova_compute[222017]: 2026-01-23 10:39:02.442 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:39:02 np0005593233 nova_compute[222017]: 2026-01-23 10:39:02.442 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Ensure instance console log exists: /var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:39:02 np0005593233 nova_compute[222017]: 2026-01-23 10:39:02.443 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:02 np0005593233 nova_compute[222017]: 2026-01-23 10:39:02.443 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:02 np0005593233 nova_compute[222017]: 2026-01-23 10:39:02.443 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:03.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:03 np0005593233 nova_compute[222017]: 2026-01-23 10:39:03.243 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:03 np0005593233 nova_compute[222017]: 2026-01-23 10:39:03.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:03.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.045 222021 DEBUG nova.network.neutron [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Successfully created port: ccccfc7d-ea6e-4765-8263-544b4b63a7c4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:39:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:04Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:78:1c:0c 10.100.0.6
Jan 23 05:39:04 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:04Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:78:1c:0c 10.100.0.6
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.415 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.416 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:39:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1633603625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.897 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.997 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:39:04 np0005593233 nova_compute[222017]: 2026-01-23 10:39:04.998 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:39:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:05.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.241 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.244 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4110MB free_disk=20.925708770751953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.245 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.245 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.339 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0a916952-341a-4caf-bf6f-6abe504830f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.340 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 5b48f07a-f160-4459-8e47-98a5500c02b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.341 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.342 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.427 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.599 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:05.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.903 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.912 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.935 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.974 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:39:05 np0005593233 nova_compute[222017]: 2026-01-23 10:39:05.975 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:06 np0005593233 nova_compute[222017]: 2026-01-23 10:39:06.899 222021 DEBUG nova.network.neutron [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Successfully updated port: ccccfc7d-ea6e-4765-8263-544b4b63a7c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:39:06 np0005593233 nova_compute[222017]: 2026-01-23 10:39:06.924 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquiring lock "refresh_cache-5b48f07a-f160-4459-8e47-98a5500c02b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:06 np0005593233 nova_compute[222017]: 2026-01-23 10:39:06.925 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquired lock "refresh_cache-5b48f07a-f160-4459-8e47-98a5500c02b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:06 np0005593233 nova_compute[222017]: 2026-01-23 10:39:06.925 222021 DEBUG nova.network.neutron [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:39:06 np0005593233 nova_compute[222017]: 2026-01-23 10:39:06.975 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:07 np0005593233 podman[297527]: 2026-01-23 10:39:07.070949486 +0000 UTC m=+0.074343633 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 23 05:39:07 np0005593233 nova_compute[222017]: 2026-01-23 10:39:07.149 222021 DEBUG nova.network.neutron [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:39:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:07.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:07 np0005593233 nova_compute[222017]: 2026-01-23 10:39:07.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:07 np0005593233 nova_compute[222017]: 2026-01-23 10:39:07.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:07 np0005593233 nova_compute[222017]: 2026-01-23 10:39:07.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:07 np0005593233 nova_compute[222017]: 2026-01-23 10:39:07.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:39:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:07.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:08 np0005593233 nova_compute[222017]: 2026-01-23 10:39:08.248 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:09.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.245 222021 DEBUG nova.network.neutron [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Updating instance_info_cache with network_info: [{"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.289 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Releasing lock "refresh_cache-5b48f07a-f160-4459-8e47-98a5500c02b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.290 222021 DEBUG nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Instance network_info: |[{"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.295 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Start _get_guest_xml network_info=[{"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.300 222021 WARNING nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.306 222021 DEBUG nova.virt.libvirt.host [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.307 222021 DEBUG nova.virt.libvirt.host [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.311 222021 DEBUG nova.virt.libvirt.host [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.312 222021 DEBUG nova.virt.libvirt.host [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.315 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.315 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.316 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.316 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.317 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.317 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.318 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.318 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.319 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.319 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.320 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.320 222021 DEBUG nova.virt.hardware [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.326 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.378 222021 DEBUG nova.compute.manager [req-3d22cc4d-9396-4a7b-b55f-68149ee83327 req-9157e2fd-bbb5-49ad-a3f7-4a7561908402 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received event network-changed-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.379 222021 DEBUG nova.compute.manager [req-3d22cc4d-9396-4a7b-b55f-68149ee83327 req-9157e2fd-bbb5-49ad-a3f7-4a7561908402 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Refreshing instance network info cache due to event network-changed-ccccfc7d-ea6e-4765-8263-544b4b63a7c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.381 222021 DEBUG oslo_concurrency.lockutils [req-3d22cc4d-9396-4a7b-b55f-68149ee83327 req-9157e2fd-bbb5-49ad-a3f7-4a7561908402 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-5b48f07a-f160-4459-8e47-98a5500c02b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.383 222021 DEBUG oslo_concurrency.lockutils [req-3d22cc4d-9396-4a7b-b55f-68149ee83327 req-9157e2fd-bbb5-49ad-a3f7-4a7561908402 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-5b48f07a-f160-4459-8e47-98a5500c02b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:09 np0005593233 nova_compute[222017]: 2026-01-23 10:39:09.383 222021 DEBUG nova.network.neutron [req-3d22cc4d-9396-4a7b-b55f-68149ee83327 req-9157e2fd-bbb5-49ad-a3f7-4a7561908402 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Refreshing network info cache for port ccccfc7d-ea6e-4765-8263-544b4b63a7c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:39:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:09.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:39:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3277152640' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.080 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.755s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.126 222021 DEBUG nova.storage.rbd_utils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] rbd image 5b48f07a-f160-4459-8e47-98a5500c02b7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.134 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:39:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1993598349' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.632 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.650 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.652 222021 DEBUG nova.virt.libvirt.vif [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:38:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-138432656',display_name='tempest-TestServerBasicOps-server-138432656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-138432656',id=195,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKPrLh7oBeMxAadp8CKmLCwe+9bKooMOUtXxlccg8xtxX2slj0m59Qu9EFiU7fv32bze9eg44+9LaJHHEk3nUdeKcsQwELY9zyX2Ae62OFD2H2qMZLHXivujUjdiHX+M/g==',key_name='tempest-TestServerBasicOps-1901429679',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ecd453e6632f42749f93ba49369d62a6',ramdisk_id='',reservation_id='r-vbghndh0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1506790344',owner_user_name='tempest-TestServerBasicOps-1506790344-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:39:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a9d677e04372453aaea353af3361fe80',uuid=5b48f07a-f160-4459-8e47-98a5500c02b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.652 222021 DEBUG nova.network.os_vif_util [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Converting VIF {"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.653 222021 DEBUG nova.network.os_vif_util [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:72:36,bridge_name='br-int',has_traffic_filtering=True,id=ccccfc7d-ea6e-4765-8263-544b4b63a7c4,network=Network(35bf9e25-1db9-42a8-a951-69ce10610117),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccccfc7d-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.655 222021 DEBUG nova.objects.instance [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b48f07a-f160-4459-8e47-98a5500c02b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.671 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <uuid>5b48f07a-f160-4459-8e47-98a5500c02b7</uuid>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <name>instance-000000c3</name>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestServerBasicOps-server-138432656</nova:name>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:39:09</nova:creationTime>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <nova:user uuid="a9d677e04372453aaea353af3361fe80">tempest-TestServerBasicOps-1506790344-project-member</nova:user>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <nova:project uuid="ecd453e6632f42749f93ba49369d62a6">tempest-TestServerBasicOps-1506790344</nova:project>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <nova:port uuid="ccccfc7d-ea6e-4765-8263-544b4b63a7c4">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <entry name="serial">5b48f07a-f160-4459-8e47-98a5500c02b7</entry>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <entry name="uuid">5b48f07a-f160-4459-8e47-98a5500c02b7</entry>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/5b48f07a-f160-4459-8e47-98a5500c02b7_disk">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/5b48f07a-f160-4459-8e47-98a5500c02b7_disk.config">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:af:72:36"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <target dev="tapccccfc7d-ea"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7/console.log" append="off"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:39:10 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:39:10 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:39:10 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:39:10 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.672 222021 DEBUG nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Preparing to wait for external event network-vif-plugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.673 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquiring lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.673 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.673 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.674 222021 DEBUG nova.virt.libvirt.vif [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:38:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-138432656',display_name='tempest-TestServerBasicOps-server-138432656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-138432656',id=195,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKPrLh7oBeMxAadp8CKmLCwe+9bKooMOUtXxlccg8xtxX2slj0m59Qu9EFiU7fv32bze9eg44+9LaJHHEk3nUdeKcsQwELY9zyX2Ae62OFD2H2qMZLHXivujUjdiHX+M/g==',key_name='tempest-TestServerBasicOps-1901429679',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ecd453e6632f42749f93ba49369d62a6',ramdisk_id='',reservation_id='r-vbghndh0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1506790344',owner_user_name='tempest-TestServerBasicOps-1506790344-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:39:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a9d677e04372453aaea353af3361fe80',uuid=5b48f07a-f160-4459-8e47-98a5500c02b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.674 222021 DEBUG nova.network.os_vif_util [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Converting VIF {"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.675 222021 DEBUG nova.network.os_vif_util [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:72:36,bridge_name='br-int',has_traffic_filtering=True,id=ccccfc7d-ea6e-4765-8263-544b4b63a7c4,network=Network(35bf9e25-1db9-42a8-a951-69ce10610117),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccccfc7d-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.676 222021 DEBUG os_vif [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:72:36,bridge_name='br-int',has_traffic_filtering=True,id=ccccfc7d-ea6e-4765-8263-544b4b63a7c4,network=Network(35bf9e25-1db9-42a8-a951-69ce10610117),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccccfc7d-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.677 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.677 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.681 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.682 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapccccfc7d-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.682 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapccccfc7d-ea, col_values=(('external_ids', {'iface-id': 'ccccfc7d-ea6e-4765-8263-544b4b63a7c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:72:36', 'vm-uuid': '5b48f07a-f160-4459-8e47-98a5500c02b7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.684 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:10 np0005593233 NetworkManager[48871]: <info>  [1769164750.6854] manager: (tapccccfc7d-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.687 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.693 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.695 222021 INFO os_vif [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:72:36,bridge_name='br-int',has_traffic_filtering=True,id=ccccfc7d-ea6e-4765-8263-544b4b63a7c4,network=Network(35bf9e25-1db9-42a8-a951-69ce10610117),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccccfc7d-ea')#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.784 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.785 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.785 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] No VIF found with MAC fa:16:3e:af:72:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.786 222021 INFO nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Using config drive#033[00m
Jan 23 05:39:10 np0005593233 nova_compute[222017]: 2026-01-23 10:39:10.824 222021 DEBUG nova.storage.rbd_utils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] rbd image 5b48f07a-f160-4459-8e47-98a5500c02b7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:11.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.368 222021 INFO nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Creating config drive at /var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7/disk.config#033[00m
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.381 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbqxulg61 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.531 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbqxulg61" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.570 222021 DEBUG nova.storage.rbd_utils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] rbd image 5b48f07a-f160-4459-8e47-98a5500c02b7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.576 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7/disk.config 5b48f07a-f160-4459-8e47-98a5500c02b7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:11.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.819 222021 DEBUG oslo_concurrency.processutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7/disk.config 5b48f07a-f160-4459-8e47-98a5500c02b7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.243s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.820 222021 INFO nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Deleting local config drive /var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7/disk.config because it was imported into RBD.#033[00m
Jan 23 05:39:11 np0005593233 kernel: tapccccfc7d-ea: entered promiscuous mode
Jan 23 05:39:11 np0005593233 NetworkManager[48871]: <info>  [1769164751.9118] manager: (tapccccfc7d-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/375)
Jan 23 05:39:11 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:11Z|00817|binding|INFO|Claiming lport ccccfc7d-ea6e-4765-8263-544b4b63a7c4 for this chassis.
Jan 23 05:39:11 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:11Z|00818|binding|INFO|ccccfc7d-ea6e-4765-8263-544b4b63a7c4: Claiming fa:16:3e:af:72:36 10.100.0.9
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.912 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:11.949 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:72:36 10.100.0.9'], port_security=['fa:16:3e:af:72:36 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5b48f07a-f160-4459-8e47-98a5500c02b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-35bf9e25-1db9-42a8-a951-69ce10610117', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ecd453e6632f42749f93ba49369d62a6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b0cc229-aa1b-4f69-84aa-a66ef2dce720 dd02754c-e40f-4dac-bc64-17c63b440f48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f1d062f1-f47b-4783-a5da-4f9b3731ad66, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=ccccfc7d-ea6e-4765-8263-544b4b63a7c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:39:11 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:11Z|00819|binding|INFO|Setting lport ccccfc7d-ea6e-4765-8263-544b4b63a7c4 up in Southbound
Jan 23 05:39:11 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:11Z|00820|binding|INFO|Setting lport ccccfc7d-ea6e-4765-8263-544b4b63a7c4 ovn-installed in OVS
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.951 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:11.951 140224 INFO neutron.agent.ovn.metadata.agent [-] Port ccccfc7d-ea6e-4765-8263-544b4b63a7c4 in datapath 35bf9e25-1db9-42a8-a951-69ce10610117 bound to our chassis#033[00m
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.952 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:11.953 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 35bf9e25-1db9-42a8-a951-69ce10610117#033[00m
Jan 23 05:39:11 np0005593233 nova_compute[222017]: 2026-01-23 10:39:11.957 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:11.971 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[54567ef0-703f-44f2-94e8-c59c61248bd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:11.973 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap35bf9e25-11 in ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:39:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:11.976 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap35bf9e25-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:39:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:11.976 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a95d35b4-83d5-45e4-a9db-086600aee18c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:11.978 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f39bb161-c04b-41f1-9357-9b012a396eca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:11 np0005593233 systemd-machined[190954]: New machine qemu-89-instance-000000c3.
Jan 23 05:39:11 np0005593233 systemd-udevd[297683]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:39:11 np0005593233 systemd[1]: Started Virtual Machine qemu-89-instance-000000c3.
Jan 23 05:39:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:11.997 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[91d971ce-106a-4d34-a672-2acb73c5f450]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 NetworkManager[48871]: <info>  [1769164752.0044] device (tapccccfc7d-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:39:12 np0005593233 NetworkManager[48871]: <info>  [1769164752.0073] device (tapccccfc7d-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.017 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[21c0b7c3-24c3-4465-be92-d1eebe405171]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.063 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e994a03e-55c4-4b9a-b94e-7f61fcd0d237]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.072 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[edcf1d00-7716-49f0-9219-70b4fa75da1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 systemd-udevd[297687]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:39:12 np0005593233 NetworkManager[48871]: <info>  [1769164752.0749] manager: (tap35bf9e25-10): new Veth device (/org/freedesktop/NetworkManager/Devices/376)
Jan 23 05:39:12 np0005593233 nova_compute[222017]: 2026-01-23 10:39:12.093 222021 DEBUG nova.network.neutron [req-3d22cc4d-9396-4a7b-b55f-68149ee83327 req-9157e2fd-bbb5-49ad-a3f7-4a7561908402 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Updated VIF entry in instance network info cache for port ccccfc7d-ea6e-4765-8263-544b4b63a7c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:39:12 np0005593233 nova_compute[222017]: 2026-01-23 10:39:12.093 222021 DEBUG nova.network.neutron [req-3d22cc4d-9396-4a7b-b55f-68149ee83327 req-9157e2fd-bbb5-49ad-a3f7-4a7561908402 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Updating instance_info_cache with network_info: [{"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.121 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d13b9a-1047-4adc-b6e9-fd89b88b61b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 nova_compute[222017]: 2026-01-23 10:39:12.125 222021 DEBUG oslo_concurrency.lockutils [req-3d22cc4d-9396-4a7b-b55f-68149ee83327 req-9157e2fd-bbb5-49ad-a3f7-4a7561908402 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-5b48f07a-f160-4459-8e47-98a5500c02b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.127 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e95910-cbee-4509-b4bb-0abeb615de88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 NetworkManager[48871]: <info>  [1769164752.1590] device (tap35bf9e25-10): carrier: link connected
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.170 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[7465f5cd-ce44-400c-ac30-ccd5ee0c53b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.197 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b26aec-a650-4935-8595-a0cbcea13a15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap35bf9e25-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:26:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 246], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 866479, 'reachable_time': 23971, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297715, 'error': None, 'target': 'ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.223 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1a250bc6-71a4-4750-9904-0c73256fdfe7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed2:264f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 866479, 'tstamp': 866479}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297716, 'error': None, 'target': 'ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.248 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[041a12f9-5933-47d2-a213-8d9b93537909]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap35bf9e25-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:26:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 246], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 866479, 'reachable_time': 23971, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 297717, 'error': None, 'target': 'ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.296 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[50bc308a-162b-432a-b45c-d9529c91cce3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.390 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[00e3f9b6-5468-406f-9efe-fd4993cb2fcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.392 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35bf9e25-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.393 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.393 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap35bf9e25-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:12 np0005593233 NetworkManager[48871]: <info>  [1769164752.3965] manager: (tap35bf9e25-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/377)
Jan 23 05:39:12 np0005593233 nova_compute[222017]: 2026-01-23 10:39:12.395 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:12 np0005593233 kernel: tap35bf9e25-10: entered promiscuous mode
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.400 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap35bf9e25-10, col_values=(('external_ids', {'iface-id': '04af8666-509d-4616-83c2-b2541188cd56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:12 np0005593233 nova_compute[222017]: 2026-01-23 10:39:12.403 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:12 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:12Z|00821|binding|INFO|Releasing lport 04af8666-509d-4616-83c2-b2541188cd56 from this chassis (sb_readonly=0)
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.404 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/35bf9e25-1db9-42a8-a951-69ce10610117.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/35bf9e25-1db9-42a8-a951-69ce10610117.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.405 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[396b9057-e175-4b5e-9a7f-0b49169031c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.407 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-35bf9e25-1db9-42a8-a951-69ce10610117
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/35bf9e25-1db9-42a8-a951-69ce10610117.pid.haproxy
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 35bf9e25-1db9-42a8-a951-69ce10610117
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:39:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:12.408 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117', 'env', 'PROCESS_TAG=haproxy-35bf9e25-1db9-42a8-a951-69ce10610117', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/35bf9e25-1db9-42a8-a951-69ce10610117.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:39:12 np0005593233 nova_compute[222017]: 2026-01-23 10:39:12.429 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:12 np0005593233 podman[297751]: 2026-01-23 10:39:12.96228759 +0000 UTC m=+0.090753160 container create c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:39:13 np0005593233 podman[297751]: 2026-01-23 10:39:12.920044079 +0000 UTC m=+0.048509669 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:39:13 np0005593233 systemd[1]: Started libpod-conmon-c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7.scope.
Jan 23 05:39:13 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:39:13 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc944c365de26ba193d1798e4de1495dbedd32384f39def80e39fa1ba688c0b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:39:13 np0005593233 podman[297751]: 2026-01-23 10:39:13.075568139 +0000 UTC m=+0.204033749 container init c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 05:39:13 np0005593233 podman[297751]: 2026-01-23 10:39:13.091007768 +0000 UTC m=+0.219473328 container start c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 05:39:13 np0005593233 neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117[297781]: [NOTICE]   (297804) : New worker (297808) forked
Jan 23 05:39:13 np0005593233 neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117[297781]: [NOTICE]   (297804) : Loading success.
Jan 23 05:39:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:13.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.492 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164753.4910996, 5b48f07a-f160-4459-8e47-98a5500c02b7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.492 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] VM Started (Lifecycle Event)#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.554 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.560 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164753.4916599, 5b48f07a-f160-4459-8e47-98a5500c02b7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.561 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.607 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.612 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.641 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:39:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:13.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.689 222021 DEBUG nova.compute.manager [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received event network-vif-plugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.690 222021 DEBUG oslo_concurrency.lockutils [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.690 222021 DEBUG oslo_concurrency.lockutils [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.690 222021 DEBUG oslo_concurrency.lockutils [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.691 222021 DEBUG nova.compute.manager [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Processing event network-vif-plugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.691 222021 DEBUG nova.compute.manager [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received event network-vif-plugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.692 222021 DEBUG oslo_concurrency.lockutils [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.692 222021 DEBUG oslo_concurrency.lockutils [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.692 222021 DEBUG oslo_concurrency.lockutils [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.693 222021 DEBUG nova.compute.manager [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] No waiting events found dispatching network-vif-plugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.693 222021 WARNING nova.compute.manager [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received unexpected event network-vif-plugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.694 222021 DEBUG nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.698 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164753.6982148, 5b48f07a-f160-4459-8e47-98a5500c02b7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.698 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.701 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.706 222021 INFO nova.virt.libvirt.driver [-] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Instance spawned successfully.#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.708 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.737 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.744 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.745 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.745 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.746 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.747 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.748 222021 DEBUG nova.virt.libvirt.driver [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.760 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.882 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.926 222021 INFO nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Took 12.49 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:39:13 np0005593233 nova_compute[222017]: 2026-01-23 10:39:13.927 222021 DEBUG nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:39:14 np0005593233 nova_compute[222017]: 2026-01-23 10:39:14.027 222021 INFO nova.compute.manager [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Took 13.68 seconds to build instance.#033[00m
Jan 23 05:39:14 np0005593233 nova_compute[222017]: 2026-01-23 10:39:14.063 222021 DEBUG oslo_concurrency.lockutils [None req-2778bfe0-8683-484e-b704-b7cdd7f2fc74 a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:15.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:15.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:15 np0005593233 nova_compute[222017]: 2026-01-23 10:39:15.690 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:15 np0005593233 nova_compute[222017]: 2026-01-23 10:39:15.860 222021 DEBUG nova.compute.manager [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received event network-changed-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:15 np0005593233 nova_compute[222017]: 2026-01-23 10:39:15.861 222021 DEBUG nova.compute.manager [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Refreshing instance network info cache due to event network-changed-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:39:15 np0005593233 nova_compute[222017]: 2026-01-23 10:39:15.861 222021 DEBUG oslo_concurrency.lockutils [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:15 np0005593233 nova_compute[222017]: 2026-01-23 10:39:15.862 222021 DEBUG oslo_concurrency.lockutils [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:15 np0005593233 nova_compute[222017]: 2026-01-23 10:39:15.862 222021 DEBUG nova.network.neutron [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Refreshing network info cache for port f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:39:16 np0005593233 nova_compute[222017]: 2026-01-23 10:39:16.381 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:16 np0005593233 nova_compute[222017]: 2026-01-23 10:39:16.976 222021 DEBUG nova.compute.manager [req-192a311f-841e-4593-9fa5-63891167a498 req-90ab8fa6-a621-4419-9bec-a22310249227 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received event network-changed-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:16 np0005593233 nova_compute[222017]: 2026-01-23 10:39:16.976 222021 DEBUG nova.compute.manager [req-192a311f-841e-4593-9fa5-63891167a498 req-90ab8fa6-a621-4419-9bec-a22310249227 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Refreshing instance network info cache due to event network-changed-ccccfc7d-ea6e-4765-8263-544b4b63a7c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:39:16 np0005593233 nova_compute[222017]: 2026-01-23 10:39:16.977 222021 DEBUG oslo_concurrency.lockutils [req-192a311f-841e-4593-9fa5-63891167a498 req-90ab8fa6-a621-4419-9bec-a22310249227 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-5b48f07a-f160-4459-8e47-98a5500c02b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:16 np0005593233 nova_compute[222017]: 2026-01-23 10:39:16.977 222021 DEBUG oslo_concurrency.lockutils [req-192a311f-841e-4593-9fa5-63891167a498 req-90ab8fa6-a621-4419-9bec-a22310249227 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-5b48f07a-f160-4459-8e47-98a5500c02b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:16 np0005593233 nova_compute[222017]: 2026-01-23 10:39:16.978 222021 DEBUG nova.network.neutron [req-192a311f-841e-4593-9fa5-63891167a498 req-90ab8fa6-a621-4419-9bec-a22310249227 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Refreshing network info cache for port ccccfc7d-ea6e-4765-8263-544b4b63a7c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:39:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:17.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:17 np0005593233 nova_compute[222017]: 2026-01-23 10:39:17.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:17 np0005593233 nova_compute[222017]: 2026-01-23 10:39:17.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:39:17 np0005593233 nova_compute[222017]: 2026-01-23 10:39:17.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:39:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:17.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:17 np0005593233 nova_compute[222017]: 2026-01-23 10:39:17.791 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:18.040 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:39:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:18.041 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:39:18 np0005593233 nova_compute[222017]: 2026-01-23 10:39:18.084 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:19.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:19.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:20 np0005593233 podman[297823]: 2026-01-23 10:39:20.143412213 +0000 UTC m=+0.139131384 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 05:39:20 np0005593233 nova_compute[222017]: 2026-01-23 10:39:20.554 222021 DEBUG nova.network.neutron [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updated VIF entry in instance network info cache for port f6b87cda-0bd8-4fbb-a92e-b86c0a65df79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:39:20 np0005593233 nova_compute[222017]: 2026-01-23 10:39:20.556 222021 DEBUG nova.network.neutron [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating instance_info_cache with network_info: [{"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:20 np0005593233 nova_compute[222017]: 2026-01-23 10:39:20.595 222021 DEBUG oslo_concurrency.lockutils [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:39:20 np0005593233 nova_compute[222017]: 2026-01-23 10:39:20.597 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:20 np0005593233 nova_compute[222017]: 2026-01-23 10:39:20.598 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:39:20 np0005593233 nova_compute[222017]: 2026-01-23 10:39:20.598 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0a916952-341a-4caf-bf6f-6abe504830f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:20 np0005593233 ceph-osd[78880]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 23 05:39:20 np0005593233 nova_compute[222017]: 2026-01-23 10:39:20.736 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:20 np0005593233 nova_compute[222017]: 2026-01-23 10:39:20.991 222021 DEBUG nova.network.neutron [req-192a311f-841e-4593-9fa5-63891167a498 req-90ab8fa6-a621-4419-9bec-a22310249227 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Updated VIF entry in instance network info cache for port ccccfc7d-ea6e-4765-8263-544b4b63a7c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:39:20 np0005593233 nova_compute[222017]: 2026-01-23 10:39:20.992 222021 DEBUG nova.network.neutron [req-192a311f-841e-4593-9fa5-63891167a498 req-90ab8fa6-a621-4419-9bec-a22310249227 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Updating instance_info_cache with network_info: [{"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:21 np0005593233 nova_compute[222017]: 2026-01-23 10:39:21.031 222021 DEBUG oslo_concurrency.lockutils [req-192a311f-841e-4593-9fa5-63891167a498 req-90ab8fa6-a621-4419-9bec-a22310249227 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-5b48f07a-f160-4459-8e47-98a5500c02b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:39:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:21.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:21.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:22 np0005593233 nova_compute[222017]: 2026-01-23 10:39:22.409 222021 DEBUG oslo_concurrency.lockutils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:22 np0005593233 nova_compute[222017]: 2026-01-23 10:39:22.410 222021 DEBUG oslo_concurrency.lockutils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:22 np0005593233 nova_compute[222017]: 2026-01-23 10:39:22.445 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating instance_info_cache with network_info: [{"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:22 np0005593233 nova_compute[222017]: 2026-01-23 10:39:22.452 222021 DEBUG nova.objects.instance [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'flavor' on Instance uuid 0a916952-341a-4caf-bf6f-6abe504830f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:22 np0005593233 nova_compute[222017]: 2026-01-23 10:39:22.463 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:39:22 np0005593233 nova_compute[222017]: 2026-01-23 10:39:22.463 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:39:22 np0005593233 nova_compute[222017]: 2026-01-23 10:39:22.498 222021 DEBUG oslo_concurrency.lockutils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:23.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:23.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:24.043 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.394 222021 DEBUG oslo_concurrency.lockutils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.395 222021 DEBUG oslo_concurrency.lockutils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.395 222021 INFO nova.compute.manager [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Attaching volume 6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5 to /dev/vdb#033[00m
Jan 23 05:39:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.631 222021 DEBUG os_brick.utils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.633 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.656 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.657 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[12856d96-3f49-4b2d-9d74-9cd075bef30a]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.659 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.674 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.674 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[6a11f706-bc42-4fd8-92b2-c0bbc49ce83d]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.676 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.691 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.691 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[90f0be98-bad1-4bc3-b473-2707f67f765a]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.693 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f97c7e-5a33-4b2a-9a70-d9ea5dfc5b3b]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.694 222021 DEBUG oslo_concurrency.processutils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.746 222021 DEBUG oslo_concurrency.processutils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "nvme version" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.751 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.752 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.752 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.753 222021 DEBUG os_brick.utils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] <== get_connector_properties: return (121ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:39:24 np0005593233 nova_compute[222017]: 2026-01-23 10:39:24.754 222021 DEBUG nova.virt.block_device [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating existing volume attachment record: 86b46c56-c46e-4e27-91fe-49fe600d4821 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:39:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:25.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:25.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:25 np0005593233 nova_compute[222017]: 2026-01-23 10:39:25.738 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:27.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:27 np0005593233 nova_compute[222017]: 2026-01-23 10:39:27.248 222021 DEBUG nova.objects.instance [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'flavor' on Instance uuid 0a916952-341a-4caf-bf6f-6abe504830f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:27 np0005593233 nova_compute[222017]: 2026-01-23 10:39:27.280 222021 DEBUG nova.virt.libvirt.driver [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Attempting to attach volume 6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:39:27 np0005593233 nova_compute[222017]: 2026-01-23 10:39:27.283 222021 DEBUG nova.virt.libvirt.guest [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:39:27 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:39:27 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5">
Jan 23 05:39:27 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:27 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:27 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:27 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:39:27 np0005593233 nova_compute[222017]:  <auth username="openstack">
Jan 23 05:39:27 np0005593233 nova_compute[222017]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:39:27 np0005593233 nova_compute[222017]:  </auth>
Jan 23 05:39:27 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:39:27 np0005593233 nova_compute[222017]:  <serial>6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5</serial>
Jan 23 05:39:27 np0005593233 nova_compute[222017]:  <shareable/>
Jan 23 05:39:27 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:39:27 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 23 05:39:27 np0005593233 nova_compute[222017]: 2026-01-23 10:39:27.463 222021 DEBUG nova.virt.libvirt.driver [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:39:27 np0005593233 nova_compute[222017]: 2026-01-23 10:39:27.464 222021 DEBUG nova.virt.libvirt.driver [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:39:27 np0005593233 nova_compute[222017]: 2026-01-23 10:39:27.464 222021 DEBUG nova.virt.libvirt.driver [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:39:27 np0005593233 nova_compute[222017]: 2026-01-23 10:39:27.465 222021 DEBUG nova.virt.libvirt.driver [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No VIF found with MAC fa:16:3e:78:1c:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 05:39:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:27.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:27 np0005593233 nova_compute[222017]: 2026-01-23 10:39:27.743 222021 DEBUG oslo_concurrency.lockutils [None req-c03e3315-39d4-4c91-bcd5-2dd80fa62b2f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:39:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:28Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:af:72:36 10.100.0.9
Jan 23 05:39:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:28Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:72:36 10.100.0.9
Jan 23 05:39:28 np0005593233 podman[298051]: 2026-01-23 10:39:28.293304256 +0000 UTC m=+0.062082745 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 05:39:28 np0005593233 podman[298051]: 2026-01-23 10:39:28.499895896 +0000 UTC m=+0.268674395 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Jan 23 05:39:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:29.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:29.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:30 np0005593233 nova_compute[222017]: 2026-01-23 10:39:30.741 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:39:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:39:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:39:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:31.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:31.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e380 e380: 3 total, 3 up, 3 in
Jan 23 05:39:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:33.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:33.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.107 222021 DEBUG oslo_concurrency.lockutils [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.108 222021 DEBUG oslo_concurrency.lockutils [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.131 222021 INFO nova.compute.manager [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Detaching volume 6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.260 222021 INFO nova.virt.block_device [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Attempting to driver detach volume 6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5 from mountpoint /dev/vdb
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.276 222021 DEBUG nova.virt.libvirt.driver [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Attempting to detach device vdb from instance 0a916952-341a-4caf-bf6f-6abe504830f9 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.278 222021 DEBUG nova.virt.libvirt.guest [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5">
Jan 23 05:39:34 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <serial>6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5</serial>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <shareable/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:39:34 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.330 222021 INFO nova.virt.libvirt.driver [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully detached device vdb from instance 0a916952-341a-4caf-bf6f-6abe504830f9 from the persistent domain config.
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.331 222021 DEBUG nova.virt.libvirt.driver [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 0a916952-341a-4caf-bf6f-6abe504830f9 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.331 222021 DEBUG nova.virt.libvirt.guest [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5">
Jan 23 05:39:34 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <serial>6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5</serial>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <shareable/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:39:34 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:39:34 np0005593233 nova_compute[222017]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 05:39:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.673 222021 DEBUG nova.virt.libvirt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Received event <DeviceRemovedEvent: 1769164774.6725404, 0a916952-341a-4caf-bf6f-6abe504830f9 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.677 222021 DEBUG nova.virt.libvirt.driver [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 0a916952-341a-4caf-bf6f-6abe504830f9 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.682 222021 INFO nova.virt.libvirt.driver [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully detached device vdb from instance 0a916952-341a-4caf-bf6f-6abe504830f9 from the live domain config.
Jan 23 05:39:34 np0005593233 nova_compute[222017]: 2026-01-23 10:39:34.970 222021 DEBUG nova.objects.instance [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'flavor' on Instance uuid 0a916952-341a-4caf-bf6f-6abe504830f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:39:35 np0005593233 nova_compute[222017]: 2026-01-23 10:39:35.016 222021 DEBUG oslo_concurrency.lockutils [None req-fbed3cf5-9325-4550-b1ef-4550d513ef05 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.907s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:39:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:35.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:35.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:35 np0005593233 nova_compute[222017]: 2026-01-23 10:39:35.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:39:35 np0005593233 nova_compute[222017]: 2026-01-23 10:39:35.746 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:39:35 np0005593233 nova_compute[222017]: 2026-01-23 10:39:35.747 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 23 05:39:35 np0005593233 nova_compute[222017]: 2026-01-23 10:39:35.747 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 05:39:35 np0005593233 nova_compute[222017]: 2026-01-23 10:39:35.748 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 05:39:35 np0005593233 nova_compute[222017]: 2026-01-23 10:39:35.753 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #166. Immutable memtables: 0.
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.716261) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 105] Flushing memtable with next log file: 166
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776716380, "job": 105, "event": "flush_started", "num_memtables": 1, "num_entries": 1297, "num_deletes": 251, "total_data_size": 2740272, "memory_usage": 2775136, "flush_reason": "Manual Compaction"}
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 105] Level-0 flush table #167: started
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776732519, "cf_name": "default", "job": 105, "event": "table_file_creation", "file_number": 167, "file_size": 1809017, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 79679, "largest_seqno": 80971, "table_properties": {"data_size": 1803231, "index_size": 3116, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13032, "raw_average_key_size": 20, "raw_value_size": 1791371, "raw_average_value_size": 2821, "num_data_blocks": 136, "num_entries": 635, "num_filter_entries": 635, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164679, "oldest_key_time": 1769164679, "file_creation_time": 1769164776, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 167, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 105] Flush lasted 16296 microseconds, and 6091 cpu microseconds.
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.732579) [db/flush_job.cc:967] [default] [JOB 105] Level-0 flush table #167: 1809017 bytes OK
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.732605) [db/memtable_list.cc:519] [default] Level-0 commit table #167 started
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.734104) [db/memtable_list.cc:722] [default] Level-0 commit table #167: memtable #1 done
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.734122) EVENT_LOG_v1 {"time_micros": 1769164776734117, "job": 105, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.734145) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 105] Try to delete WAL files size 2733942, prev total WAL file size 2733942, number of live WAL files 2.
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000163.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.735353) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037303238' seq:72057594037927935, type:22 .. '7061786F730037323830' seq:0, type:0; will stop at (end)
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 106] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 105 Base level 0, inputs: [167(1766KB)], [165(12MB)]
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776735495, "job": 106, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [167], "files_L6": [165], "score": -1, "input_data_size": 15435432, "oldest_snapshot_seqno": -1}
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 106] Generated table #168: 9974 keys, 13459948 bytes, temperature: kUnknown
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776873705, "cf_name": "default", "job": 106, "event": "table_file_creation", "file_number": 168, "file_size": 13459948, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13394794, "index_size": 39121, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24965, "raw_key_size": 263527, "raw_average_key_size": 26, "raw_value_size": 13219359, "raw_average_value_size": 1325, "num_data_blocks": 1493, "num_entries": 9974, "num_filter_entries": 9974, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769164776, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 168, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.874566) [db/compaction/compaction_job.cc:1663] [default] [JOB 106] Compacted 1@0 + 1@6 files to L6 => 13459948 bytes
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.876463) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 111.5 rd, 97.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.0 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(16.0) write-amplify(7.4) OK, records in: 10495, records dropped: 521 output_compression: NoCompression
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.876498) EVENT_LOG_v1 {"time_micros": 1769164776876480, "job": 106, "event": "compaction_finished", "compaction_time_micros": 138418, "compaction_time_cpu_micros": 64202, "output_level": 6, "num_output_files": 1, "total_output_size": 13459948, "num_input_records": 10495, "num_output_records": 9974, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000167.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776877426, "job": 106, "event": "table_file_deletion", "file_number": 167}
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000165.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776882863, "job": 106, "event": "table_file_deletion", "file_number": 165}
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.735136) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.882967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.882976) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.882980) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.882984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:39:36.882988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:37.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:37.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:37 np0005593233 podman[298336]: 2026-01-23 10:39:37.738858125 +0000 UTC m=+0.080416306 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true)
Jan 23 05:39:38 np0005593233 nova_compute[222017]: 2026-01-23 10:39:38.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 e381: 3 total, 3 up, 3 in
Jan 23 05:39:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:39.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:39.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:40 np0005593233 nova_compute[222017]: 2026-01-23 10:39:40.751 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:41 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:41Z|00822|binding|INFO|Releasing lport 04af8666-509d-4616-83c2-b2541188cd56 from this chassis (sb_readonly=0)
Jan 23 05:39:41 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:41Z|00823|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:39:41 np0005593233 nova_compute[222017]: 2026-01-23 10:39:41.182 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:41.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:41.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:42.700 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:42.701 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:42.702 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:43.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:43.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:44 np0005593233 nova_compute[222017]: 2026-01-23 10:39:44.931 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:44 np0005593233 nova_compute[222017]: 2026-01-23 10:39:44.934 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:39:44 np0005593233 nova_compute[222017]: 2026-01-23 10:39:44.951 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:39:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:45.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:45.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:45 np0005593233 nova_compute[222017]: 2026-01-23 10:39:45.787 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:47.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:47 np0005593233 nova_compute[222017]: 2026-01-23 10:39:47.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:47 np0005593233 nova_compute[222017]: 2026-01-23 10:39:47.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:39:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:47Z|00824|binding|INFO|Releasing lport 04af8666-509d-4616-83c2-b2541188cd56 from this chassis (sb_readonly=0)
Jan 23 05:39:47 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:47Z|00825|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:39:47 np0005593233 nova_compute[222017]: 2026-01-23 10:39:47.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:47.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:49.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:49.674 140481 DEBUG eventlet.wsgi.server [-] (140481) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 23 05:39:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:49.675 140481 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0#015
Jan 23 05:39:49 np0005593233 ovn_metadata_agent[140219]: Accept: */*#015
Jan 23 05:39:49 np0005593233 ovn_metadata_agent[140219]: Connection: close#015
Jan 23 05:39:49 np0005593233 ovn_metadata_agent[140219]: Content-Type: text/plain#015
Jan 23 05:39:49 np0005593233 ovn_metadata_agent[140219]: Host: 169.254.169.254#015
Jan 23 05:39:49 np0005593233 ovn_metadata_agent[140219]: User-Agent: curl/7.84.0#015
Jan 23 05:39:49 np0005593233 ovn_metadata_agent[140219]: X-Forwarded-For: 10.100.0.9#015
Jan 23 05:39:49 np0005593233 ovn_metadata_agent[140219]: X-Ovn-Network-Id: 35bf9e25-1db9-42a8-a951-69ce10610117 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 23 05:39:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:49.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:50.745 140481 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:50.746 140481 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 1.0708134#033[00m
Jan 23 05:39:50 np0005593233 haproxy-metadata-proxy-35bf9e25-1db9-42a8-a951-69ce10610117[297808]: 10.100.0.9:55448 [23/Jan/2026:10:39:49.673] listener listener/metadata 0/0/0/1073/1073 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Jan 23 05:39:50 np0005593233 nova_compute[222017]: 2026-01-23 10:39:50.790 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:50.884 140481 DEBUG eventlet.wsgi.server [-] (140481) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:50.885 140481 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0#015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: Accept: */*#015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: Connection: close#015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: Content-Length: 100#015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: Content-Type: application/x-www-form-urlencoded#015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: Host: 169.254.169.254#015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: User-Agent: curl/7.84.0#015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: X-Forwarded-For: 10.100.0.9#015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: X-Ovn-Network-Id: 35bf9e25-1db9-42a8-a951-69ce10610117#015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: #015
Jan 23 05:39:50 np0005593233 ovn_metadata_agent[140219]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 23 05:39:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:51.011 140481 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 23 05:39:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:51.012 140481 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.1271667#033[00m
Jan 23 05:39:51 np0005593233 haproxy-metadata-proxy-35bf9e25-1db9-42a8-a951-69ce10610117[297808]: 10.100.0.9:55454 [23/Jan/2026:10:39:50.883] listener listener/metadata 0/0/0/129/129 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Jan 23 05:39:51 np0005593233 podman[298385]: 2026-01-23 10:39:51.124420499 +0000 UTC m=+0.133614128 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:39:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:51.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:51.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:53.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.252 222021 DEBUG oslo_concurrency.lockutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquiring lock "5b48f07a-f160-4459-8e47-98a5500c02b7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.253 222021 DEBUG oslo_concurrency.lockutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.254 222021 DEBUG oslo_concurrency.lockutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquiring lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.254 222021 DEBUG oslo_concurrency.lockutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.255 222021 DEBUG oslo_concurrency.lockutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.257 222021 INFO nova.compute.manager [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Terminating instance#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.258 222021 DEBUG nova.compute.manager [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:39:53 np0005593233 kernel: tapccccfc7d-ea (unregistering): left promiscuous mode
Jan 23 05:39:53 np0005593233 NetworkManager[48871]: <info>  [1769164793.3827] device (tapccccfc7d-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:39:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:53Z|00826|binding|INFO|Releasing lport ccccfc7d-ea6e-4765-8263-544b4b63a7c4 from this chassis (sb_readonly=0)
Jan 23 05:39:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:53Z|00827|binding|INFO|Setting lport ccccfc7d-ea6e-4765-8263-544b4b63a7c4 down in Southbound
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.398 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:39:53Z|00828|binding|INFO|Removing iface tapccccfc7d-ea ovn-installed in OVS
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.401 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.418 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:72:36 10.100.0.9'], port_security=['fa:16:3e:af:72:36 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5b48f07a-f160-4459-8e47-98a5500c02b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-35bf9e25-1db9-42a8-a951-69ce10610117', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ecd453e6632f42749f93ba49369d62a6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b0cc229-aa1b-4f69-84aa-a66ef2dce720 dd02754c-e40f-4dac-bc64-17c63b440f48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f1d062f1-f47b-4783-a5da-4f9b3731ad66, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=ccccfc7d-ea6e-4765-8263-544b4b63a7c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.418 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.421 140224 INFO neutron.agent.ovn.metadata.agent [-] Port ccccfc7d-ea6e-4765-8263-544b4b63a7c4 in datapath 35bf9e25-1db9-42a8-a951-69ce10610117 unbound from our chassis#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.423 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 35bf9e25-1db9-42a8-a951-69ce10610117, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.425 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[98855b5f-6a18-43e9-843e-700dc3c204fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.426 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117 namespace which is not needed anymore#033[00m
Jan 23 05:39:53 np0005593233 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000c3.scope: Deactivated successfully.
Jan 23 05:39:53 np0005593233 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000c3.scope: Consumed 17.000s CPU time.
Jan 23 05:39:53 np0005593233 systemd-machined[190954]: Machine qemu-89-instance-000000c3 terminated.
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.512 222021 INFO nova.virt.libvirt.driver [-] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Instance destroyed successfully.#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.515 222021 DEBUG nova.objects.instance [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lazy-loading 'resources' on Instance uuid 5b48f07a-f160-4459-8e47-98a5500c02b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.560 222021 DEBUG nova.virt.libvirt.vif [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:38:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-138432656',display_name='tempest-TestServerBasicOps-server-138432656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-138432656',id=195,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKPrLh7oBeMxAadp8CKmLCwe+9bKooMOUtXxlccg8xtxX2slj0m59Qu9EFiU7fv32bze9eg44+9LaJHHEk3nUdeKcsQwELY9zyX2Ae62OFD2H2qMZLHXivujUjdiHX+M/g==',key_name='tempest-TestServerBasicOps-1901429679',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:39:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ecd453e6632f42749f93ba49369d62a6',ramdisk_id='',reservation_id='r-vbghndh0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-1506790344',owner_user_name='tempest-TestServerBasicOps-1506790344-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:39:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a9d677e04372453aaea353af3361fe80',uuid=5b48f07a-f160-4459-8e47-98a5500c02b7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": 
"fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.561 222021 DEBUG nova.network.os_vif_util [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Converting VIF {"id": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "address": "fa:16:3e:af:72:36", "network": {"id": "35bf9e25-1db9-42a8-a951-69ce10610117", "bridge": "br-int", "label": "tempest-TestServerBasicOps-803056254-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ecd453e6632f42749f93ba49369d62a6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccccfc7d-ea", "ovs_interfaceid": "ccccfc7d-ea6e-4765-8263-544b4b63a7c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.562 222021 DEBUG nova.network.os_vif_util [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:72:36,bridge_name='br-int',has_traffic_filtering=True,id=ccccfc7d-ea6e-4765-8263-544b4b63a7c4,network=Network(35bf9e25-1db9-42a8-a951-69ce10610117),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccccfc7d-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.563 222021 DEBUG os_vif [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:72:36,bridge_name='br-int',has_traffic_filtering=True,id=ccccfc7d-ea6e-4765-8263-544b4b63a7c4,network=Network(35bf9e25-1db9-42a8-a951-69ce10610117),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccccfc7d-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.565 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.565 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapccccfc7d-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.567 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.569 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.573 222021 INFO os_vif [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:72:36,bridge_name='br-int',has_traffic_filtering=True,id=ccccfc7d-ea6e-4765-8263-544b4b63a7c4,network=Network(35bf9e25-1db9-42a8-a951-69ce10610117),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccccfc7d-ea')#033[00m
Jan 23 05:39:53 np0005593233 neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117[297781]: [NOTICE]   (297804) : haproxy version is 2.8.14-c23fe91
Jan 23 05:39:53 np0005593233 neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117[297781]: [NOTICE]   (297804) : path to executable is /usr/sbin/haproxy
Jan 23 05:39:53 np0005593233 neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117[297781]: [WARNING]  (297804) : Exiting Master process...
Jan 23 05:39:53 np0005593233 neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117[297781]: [WARNING]  (297804) : Exiting Master process...
Jan 23 05:39:53 np0005593233 neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117[297781]: [ALERT]    (297804) : Current worker (297808) exited with code 143 (Terminated)
Jan 23 05:39:53 np0005593233 neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117[297781]: [WARNING]  (297804) : All workers exited. Exiting... (0)
Jan 23 05:39:53 np0005593233 systemd[1]: libpod-c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7.scope: Deactivated successfully.
Jan 23 05:39:53 np0005593233 podman[298445]: 2026-01-23 10:39:53.627068141 +0000 UTC m=+0.057930247 container died c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:39:53 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7-userdata-shm.mount: Deactivated successfully.
Jan 23 05:39:53 np0005593233 systemd[1]: var-lib-containers-storage-overlay-cc944c365de26ba193d1798e4de1495dbedd32384f39def80e39fa1ba688c0b5-merged.mount: Deactivated successfully.
Jan 23 05:39:53 np0005593233 podman[298445]: 2026-01-23 10:39:53.680428917 +0000 UTC m=+0.111291023 container cleanup c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 05:39:53 np0005593233 systemd[1]: libpod-conmon-c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7.scope: Deactivated successfully.
Jan 23 05:39:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:53.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:53 np0005593233 podman[298494]: 2026-01-23 10:39:53.76250526 +0000 UTC m=+0.055718165 container remove c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.769 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc3b734-b28d-4e46-a1ce-700ec7041618]: (4, ('Fri Jan 23 10:39:53 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117 (c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7)\nc1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7\nFri Jan 23 10:39:53 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117 (c1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7)\nc1a0fb26b38fd99a5f73deb6e6db83dad31cb4683d89fd86f0d105ffa9d936b7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.773 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5bbb57ec-2587-4f0a-b7f5-ddb0d6529cfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.775 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35bf9e25-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593233 kernel: tap35bf9e25-10: left promiscuous mode
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.792 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.792 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.796 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5079dd-8e30-4e64-a67d-1ffa70548531]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.805 222021 DEBUG nova.compute.manager [req-3399c0ca-af99-48ec-a02f-1c8d4573a0c8 req-b0d88238-397c-4c80-b136-ed6fe4af5b12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received event network-vif-unplugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.805 222021 DEBUG oslo_concurrency.lockutils [req-3399c0ca-af99-48ec-a02f-1c8d4573a0c8 req-b0d88238-397c-4c80-b136-ed6fe4af5b12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.805 222021 DEBUG oslo_concurrency.lockutils [req-3399c0ca-af99-48ec-a02f-1c8d4573a0c8 req-b0d88238-397c-4c80-b136-ed6fe4af5b12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.806 222021 DEBUG oslo_concurrency.lockutils [req-3399c0ca-af99-48ec-a02f-1c8d4573a0c8 req-b0d88238-397c-4c80-b136-ed6fe4af5b12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.806 222021 DEBUG nova.compute.manager [req-3399c0ca-af99-48ec-a02f-1c8d4573a0c8 req-b0d88238-397c-4c80-b136-ed6fe4af5b12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] No waiting events found dispatching network-vif-unplugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:39:53 np0005593233 nova_compute[222017]: 2026-01-23 10:39:53.806 222021 DEBUG nova.compute.manager [req-3399c0ca-af99-48ec-a02f-1c8d4573a0c8 req-b0d88238-397c-4c80-b136-ed6fe4af5b12 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received event network-vif-unplugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.812 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[113c6940-3b4e-47c7-b6d0-f4313996f6d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.814 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[dc848f43-37a4-4628-93d4-6a112163cbee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.839 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f49cdc15-aafe-4384-b192-7ecaaf34e3e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 866468, 'reachable_time': 25041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298510, 'error': None, 'target': 'ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.843 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-35bf9e25-1db9-42a8-a951-69ce10610117 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:39:53 np0005593233 systemd[1]: run-netns-ovnmeta\x2d35bf9e25\x2d1db9\x2d42a8\x2da951\x2d69ce10610117.mount: Deactivated successfully.
Jan 23 05:39:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:39:53.844 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[b674d1a0-876b-482b-a410-a64d025b3fd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:54 np0005593233 nova_compute[222017]: 2026-01-23 10:39:54.658 222021 INFO nova.virt.libvirt.driver [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Deleting instance files /var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7_del#033[00m
Jan 23 05:39:54 np0005593233 nova_compute[222017]: 2026-01-23 10:39:54.659 222021 INFO nova.virt.libvirt.driver [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Deletion of /var/lib/nova/instances/5b48f07a-f160-4459-8e47-98a5500c02b7_del complete#033[00m
Jan 23 05:39:54 np0005593233 nova_compute[222017]: 2026-01-23 10:39:54.796 222021 INFO nova.compute.manager [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Took 1.54 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:39:54 np0005593233 nova_compute[222017]: 2026-01-23 10:39:54.797 222021 DEBUG oslo.service.loopingcall [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:39:54 np0005593233 nova_compute[222017]: 2026-01-23 10:39:54.797 222021 DEBUG nova.compute.manager [-] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:39:54 np0005593233 nova_compute[222017]: 2026-01-23 10:39:54.798 222021 DEBUG nova.network.neutron [-] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:39:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:55.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:55.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:55 np0005593233 nova_compute[222017]: 2026-01-23 10:39:55.793 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:55 np0005593233 nova_compute[222017]: 2026-01-23 10:39:55.887 222021 DEBUG nova.compute.manager [req-da70de9e-e11f-4b66-afc8-e8a90fc3fd13 req-7a67b971-974b-47d0-ac6f-11d568fa5c4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received event network-vif-plugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:55 np0005593233 nova_compute[222017]: 2026-01-23 10:39:55.888 222021 DEBUG oslo_concurrency.lockutils [req-da70de9e-e11f-4b66-afc8-e8a90fc3fd13 req-7a67b971-974b-47d0-ac6f-11d568fa5c4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:55 np0005593233 nova_compute[222017]: 2026-01-23 10:39:55.888 222021 DEBUG oslo_concurrency.lockutils [req-da70de9e-e11f-4b66-afc8-e8a90fc3fd13 req-7a67b971-974b-47d0-ac6f-11d568fa5c4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:55 np0005593233 nova_compute[222017]: 2026-01-23 10:39:55.888 222021 DEBUG oslo_concurrency.lockutils [req-da70de9e-e11f-4b66-afc8-e8a90fc3fd13 req-7a67b971-974b-47d0-ac6f-11d568fa5c4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:55 np0005593233 nova_compute[222017]: 2026-01-23 10:39:55.889 222021 DEBUG nova.compute.manager [req-da70de9e-e11f-4b66-afc8-e8a90fc3fd13 req-7a67b971-974b-47d0-ac6f-11d568fa5c4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] No waiting events found dispatching network-vif-plugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:39:55 np0005593233 nova_compute[222017]: 2026-01-23 10:39:55.889 222021 WARNING nova.compute.manager [req-da70de9e-e11f-4b66-afc8-e8a90fc3fd13 req-7a67b971-974b-47d0-ac6f-11d568fa5c4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received unexpected event network-vif-plugged-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:39:56 np0005593233 nova_compute[222017]: 2026-01-23 10:39:56.533 222021 DEBUG nova.compute.manager [req-c670ba2b-6319-45ce-a1c3-db2bdc369f8e req-cca853d1-7ec2-4728-afe0-27c7699605db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Received event network-vif-deleted-ccccfc7d-ea6e-4765-8263-544b4b63a7c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:56 np0005593233 nova_compute[222017]: 2026-01-23 10:39:56.533 222021 INFO nova.compute.manager [req-c670ba2b-6319-45ce-a1c3-db2bdc369f8e req-cca853d1-7ec2-4728-afe0-27c7699605db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Neutron deleted interface ccccfc7d-ea6e-4765-8263-544b4b63a7c4; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:39:56 np0005593233 nova_compute[222017]: 2026-01-23 10:39:56.534 222021 DEBUG nova.network.neutron [req-c670ba2b-6319-45ce-a1c3-db2bdc369f8e req-cca853d1-7ec2-4728-afe0-27c7699605db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:56 np0005593233 nova_compute[222017]: 2026-01-23 10:39:56.556 222021 DEBUG nova.network.neutron [-] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:56 np0005593233 nova_compute[222017]: 2026-01-23 10:39:56.561 222021 DEBUG nova.compute.manager [req-c670ba2b-6319-45ce-a1c3-db2bdc369f8e req-cca853d1-7ec2-4728-afe0-27c7699605db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Detach interface failed, port_id=ccccfc7d-ea6e-4765-8263-544b4b63a7c4, reason: Instance 5b48f07a-f160-4459-8e47-98a5500c02b7 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:39:56 np0005593233 nova_compute[222017]: 2026-01-23 10:39:56.569 222021 INFO nova.compute.manager [-] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Took 1.77 seconds to deallocate network for instance.#033[00m
Jan 23 05:39:56 np0005593233 nova_compute[222017]: 2026-01-23 10:39:56.623 222021 DEBUG oslo_concurrency.lockutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:56 np0005593233 nova_compute[222017]: 2026-01-23 10:39:56.624 222021 DEBUG oslo_concurrency.lockutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:56 np0005593233 nova_compute[222017]: 2026-01-23 10:39:56.780 222021 DEBUG oslo_concurrency.processutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:39:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:57.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:39:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:39:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3458724352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:57 np0005593233 nova_compute[222017]: 2026-01-23 10:39:57.291 222021 DEBUG oslo_concurrency.processutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:57 np0005593233 nova_compute[222017]: 2026-01-23 10:39:57.298 222021 DEBUG nova.compute.provider_tree [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:39:57 np0005593233 nova_compute[222017]: 2026-01-23 10:39:57.342 222021 DEBUG nova.scheduler.client.report [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:39:57 np0005593233 nova_compute[222017]: 2026-01-23 10:39:57.363 222021 DEBUG oslo_concurrency.lockutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:57 np0005593233 nova_compute[222017]: 2026-01-23 10:39:57.394 222021 INFO nova.scheduler.client.report [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Deleted allocations for instance 5b48f07a-f160-4459-8e47-98a5500c02b7#033[00m
Jan 23 05:39:57 np0005593233 nova_compute[222017]: 2026-01-23 10:39:57.461 222021 DEBUG oslo_concurrency.lockutils [None req-516d75c4-1f7f-47b7-9604-85d7902ddd4e a9d677e04372453aaea353af3361fe80 ecd453e6632f42749f93ba49369d62a6 - - default default] Lock "5b48f07a-f160-4459-8e47-98a5500c02b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:39:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:57.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:39:58 np0005593233 nova_compute[222017]: 2026-01-23 10:39:58.568 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:59 np0005593233 nova_compute[222017]: 2026-01-23 10:39:59.156 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:59.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:39:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:59.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:00 np0005593233 nova_compute[222017]: 2026-01-23 10:40:00.796 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 05:40:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:01.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:01.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:03.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:03 np0005593233 nova_compute[222017]: 2026-01-23 10:40:03.572 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:03.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.006 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.007 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.040 222021 DEBUG nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.136 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.136 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.143 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.144 222021 INFO nova.compute.claims [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.278 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.405 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:40:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2492435872' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.835 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.557s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.844 222021 DEBUG nova.compute.provider_tree [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.870 222021 DEBUG nova.scheduler.client.report [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.899 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.900 222021 DEBUG nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.951 222021 DEBUG nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.951 222021 DEBUG nova.network.neutron [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.972 222021 INFO nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:40:04 np0005593233 nova_compute[222017]: 2026-01-23 10:40:04.993 222021 DEBUG nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.094 222021 DEBUG nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.096 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.096 222021 INFO nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Creating image(s)#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.131 222021 DEBUG nova.storage.rbd_utils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.168 222021 DEBUG nova.storage.rbd_utils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.200 222021 DEBUG nova.storage.rbd_utils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.206 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:05.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.290 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.292 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.293 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.293 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.325 222021 DEBUG nova.storage.rbd_utils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.331 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.388 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.412 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.413 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.414 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:05.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.914 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:05 np0005593233 nova_compute[222017]: 2026-01-23 10:40:05.935 222021 DEBUG nova.policy [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93cd560e84264023877c47122b5919de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e762fca3b634c7aa1d994314c059c54', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.025 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.026 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.039 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.708s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.138 222021 DEBUG nova.storage.rbd_utils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] resizing rbd image f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.314 222021 DEBUG nova.objects.instance [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'migration_context' on Instance uuid f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.334 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.335 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Ensure instance console log exists: /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.335 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.335 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.336 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.429 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.430 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4003MB free_disk=20.87606430053711GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.430 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.430 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.518 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0a916952-341a-4caf-bf6f-6abe504830f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.518 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.518 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.518 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:40:06 np0005593233 nova_compute[222017]: 2026-01-23 10:40:06.592 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:07 np0005593233 nova_compute[222017]: 2026-01-23 10:40:07.074 222021 DEBUG nova.network.neutron [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Successfully created port: 3b1ac782-1188-42b9-a89f-eb26c7876140 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:40:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:40:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2365154193' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:40:07 np0005593233 nova_compute[222017]: 2026-01-23 10:40:07.117 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:07 np0005593233 nova_compute[222017]: 2026-01-23 10:40:07.130 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:40:07 np0005593233 nova_compute[222017]: 2026-01-23 10:40:07.158 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:40:07 np0005593233 nova_compute[222017]: 2026-01-23 10:40:07.192 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:40:07 np0005593233 nova_compute[222017]: 2026-01-23 10:40:07.193 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:07.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:07 np0005593233 nova_compute[222017]: 2026-01-23 10:40:07.272 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:07.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:40:07Z|00829|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:40:08 np0005593233 podman[298768]: 2026-01-23 10:40:08.057354922 +0000 UTC m=+0.060296364 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.063 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.509 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164793.508285, 5b48f07a-f160-4459-8e47-98a5500c02b7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.510 222021 INFO nova.compute.manager [-] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.540 222021 DEBUG nova.compute.manager [None req-cd258f73-c596-4a07-8fad-df8e9b3f7ca7 - - - - - -] [instance: 5b48f07a-f160-4459-8e47-98a5500c02b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.575 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.672 222021 DEBUG nova.network.neutron [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Successfully updated port: 3b1ac782-1188-42b9-a89f-eb26c7876140 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.735 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.736 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquired lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.736 222021 DEBUG nova.network.neutron [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.956 222021 DEBUG nova.compute.manager [req-143dc4e3-a29b-480f-a23a-c4c6b93361a1 req-2627fb05-25b6-4cf5-8476-264fb76e86c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-changed-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.956 222021 DEBUG nova.compute.manager [req-143dc4e3-a29b-480f-a23a-c4c6b93361a1 req-2627fb05-25b6-4cf5-8476-264fb76e86c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Refreshing instance network info cache due to event network-changed-3b1ac782-1188-42b9-a89f-eb26c7876140. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:40:08 np0005593233 nova_compute[222017]: 2026-01-23 10:40:08.957 222021 DEBUG oslo_concurrency.lockutils [req-143dc4e3-a29b-480f-a23a-c4c6b93361a1 req-2627fb05-25b6-4cf5-8476-264fb76e86c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:40:09 np0005593233 nova_compute[222017]: 2026-01-23 10:40:09.036 222021 DEBUG nova.network.neutron [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:40:09 np0005593233 nova_compute[222017]: 2026-01-23 10:40:09.190 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:09 np0005593233 nova_compute[222017]: 2026-01-23 10:40:09.191 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:09 np0005593233 nova_compute[222017]: 2026-01-23 10:40:09.191 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:09 np0005593233 nova_compute[222017]: 2026-01-23 10:40:09.192 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:40:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:09.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:09 np0005593233 nova_compute[222017]: 2026-01-23 10:40:09.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:09.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.601 222021 DEBUG nova.network.neutron [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updating instance_info_cache with network_info: [{"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.626 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Releasing lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.626 222021 DEBUG nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Instance network_info: |[{"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.627 222021 DEBUG oslo_concurrency.lockutils [req-143dc4e3-a29b-480f-a23a-c4c6b93361a1 req-2627fb05-25b6-4cf5-8476-264fb76e86c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.627 222021 DEBUG nova.network.neutron [req-143dc4e3-a29b-480f-a23a-c4c6b93361a1 req-2627fb05-25b6-4cf5-8476-264fb76e86c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Refreshing network info cache for port 3b1ac782-1188-42b9-a89f-eb26c7876140 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.631 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Start _get_guest_xml network_info=[{"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.639 222021 WARNING nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.648 222021 DEBUG nova.virt.libvirt.host [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.649 222021 DEBUG nova.virt.libvirt.host [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.654 222021 DEBUG nova.virt.libvirt.host [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.654 222021 DEBUG nova.virt.libvirt.host [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.655 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.656 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.656 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.656 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.656 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.656 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.657 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.657 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.657 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.657 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.657 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.657 222021 DEBUG nova.virt.hardware [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.661 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:10 np0005593233 nova_compute[222017]: 2026-01-23 10:40:10.801 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:11.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:40:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3395810892' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.367 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.706s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.394 222021 DEBUG nova.storage.rbd_utils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.399 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:11.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:40:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2916064983' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.895 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.898 222021 DEBUG nova.virt.libvirt.vif [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:40:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-1',id=198,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-1k48k1p2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',netw
ork_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:40:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=f34f1af9-6c51-42ec-97f8-fb5bb146aeb6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.899 222021 DEBUG nova.network.os_vif_util [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.900 222021 DEBUG nova.network.os_vif_util [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.902 222021 DEBUG nova.objects.instance [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'pci_devices' on Instance uuid f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.928 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <uuid>f34f1af9-6c51-42ec-97f8-fb5bb146aeb6</uuid>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <name>instance-000000c6</name>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <nova:name>multiattach-server-1</nova:name>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:40:10</nova:creationTime>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <nova:user uuid="93cd560e84264023877c47122b5919de">tempest-AttachVolumeMultiAttachTest-63035580-project-member</nova:user>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <nova:project uuid="6e762fca3b634c7aa1d994314c059c54">tempest-AttachVolumeMultiAttachTest-63035580</nova:project>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <nova:port uuid="3b1ac782-1188-42b9-a89f-eb26c7876140">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <entry name="serial">f34f1af9-6c51-42ec-97f8-fb5bb146aeb6</entry>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <entry name="uuid">f34f1af9-6c51-42ec-97f8-fb5bb146aeb6</entry>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk.config">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:ee:36:e0"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <target dev="tap3b1ac782-11"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/console.log" append="off"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:40:11 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:40:11 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:40:11 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:40:11 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.929 222021 DEBUG nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Preparing to wait for external event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.929 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.930 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.930 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.932 222021 DEBUG nova.virt.libvirt.vif [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:40:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-1',id=198,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-1k48k1p2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ra
m='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:40:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=f34f1af9-6c51-42ec-97f8-fb5bb146aeb6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.932 222021 DEBUG nova.network.os_vif_util [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.933 222021 DEBUG nova.network.os_vif_util [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.934 222021 DEBUG os_vif [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.935 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.935 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.936 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.941 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.942 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b1ac782-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.943 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b1ac782-11, col_values=(('external_ids', {'iface-id': '3b1ac782-1188-42b9-a89f-eb26c7876140', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:36:e0', 'vm-uuid': 'f34f1af9-6c51-42ec-97f8-fb5bb146aeb6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.945 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:11 np0005593233 NetworkManager[48871]: <info>  [1769164811.9466] manager: (tap3b1ac782-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/378)
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.948 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.956 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:11 np0005593233 nova_compute[222017]: 2026-01-23 10:40:11.959 222021 INFO os_vif [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11')#033[00m
Jan 23 05:40:12 np0005593233 nova_compute[222017]: 2026-01-23 10:40:12.038 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:40:12 np0005593233 nova_compute[222017]: 2026-01-23 10:40:12.038 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:40:12 np0005593233 nova_compute[222017]: 2026-01-23 10:40:12.038 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No VIF found with MAC fa:16:3e:ee:36:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:40:12 np0005593233 nova_compute[222017]: 2026-01-23 10:40:12.039 222021 INFO nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Using config drive#033[00m
Jan 23 05:40:12 np0005593233 nova_compute[222017]: 2026-01-23 10:40:12.073 222021 DEBUG nova.storage.rbd_utils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:12 np0005593233 nova_compute[222017]: 2026-01-23 10:40:12.940 222021 INFO nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Creating config drive at /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/disk.config#033[00m
Jan 23 05:40:12 np0005593233 nova_compute[222017]: 2026-01-23 10:40:12.952 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvgttcp8a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:13 np0005593233 nova_compute[222017]: 2026-01-23 10:40:13.115 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvgttcp8a" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:13 np0005593233 nova_compute[222017]: 2026-01-23 10:40:13.157 222021 DEBUG nova.storage.rbd_utils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:13 np0005593233 nova_compute[222017]: 2026-01-23 10:40:13.161 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/disk.config f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:13.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:13 np0005593233 nova_compute[222017]: 2026-01-23 10:40:13.662 222021 DEBUG nova.network.neutron [req-143dc4e3-a29b-480f-a23a-c4c6b93361a1 req-2627fb05-25b6-4cf5-8476-264fb76e86c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updated VIF entry in instance network info cache for port 3b1ac782-1188-42b9-a89f-eb26c7876140. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:40:13 np0005593233 nova_compute[222017]: 2026-01-23 10:40:13.663 222021 DEBUG nova.network.neutron [req-143dc4e3-a29b-480f-a23a-c4c6b93361a1 req-2627fb05-25b6-4cf5-8476-264fb76e86c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updating instance_info_cache with network_info: [{"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:40:13 np0005593233 nova_compute[222017]: 2026-01-23 10:40:13.695 222021 DEBUG oslo_concurrency.lockutils [req-143dc4e3-a29b-480f-a23a-c4c6b93361a1 req-2627fb05-25b6-4cf5-8476-264fb76e86c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:40:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:13.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:13 np0005593233 ovn_controller[130653]: 2026-01-23T10:40:13Z|00830|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:40:13 np0005593233 nova_compute[222017]: 2026-01-23 10:40:13.942 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:14 np0005593233 nova_compute[222017]: 2026-01-23 10:40:14.562 222021 DEBUG oslo_concurrency.processutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/disk.config f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:14 np0005593233 nova_compute[222017]: 2026-01-23 10:40:14.563 222021 INFO nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Deleting local config drive /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/disk.config because it was imported into RBD.#033[00m
Jan 23 05:40:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:14 np0005593233 kernel: tap3b1ac782-11: entered promiscuous mode
Jan 23 05:40:14 np0005593233 NetworkManager[48871]: <info>  [1769164814.6623] manager: (tap3b1ac782-11): new Tun device (/org/freedesktop/NetworkManager/Devices/379)
Jan 23 05:40:14 np0005593233 ovn_controller[130653]: 2026-01-23T10:40:14Z|00831|binding|INFO|Claiming lport 3b1ac782-1188-42b9-a89f-eb26c7876140 for this chassis.
Jan 23 05:40:14 np0005593233 ovn_controller[130653]: 2026-01-23T10:40:14Z|00832|binding|INFO|3b1ac782-1188-42b9-a89f-eb26c7876140: Claiming fa:16:3e:ee:36:e0 10.100.0.4
Jan 23 05:40:14 np0005593233 nova_compute[222017]: 2026-01-23 10:40:14.662 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.677 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:36:e0 10.100.0.4'], port_security=['fa:16:3e:ee:36:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f34f1af9-6c51-42ec-97f8-fb5bb146aeb6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e762fca3b634c7aa1d994314c059c54', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed138636-f650-4a09-b808-0b05f9067a5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0936335-b706-4400-8411-bdd084c8cdf7, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=3b1ac782-1188-42b9-a89f-eb26c7876140) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.679 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 3b1ac782-1188-42b9-a89f-eb26c7876140 in datapath fba2ba4a-d82c-4f8b-9754-c13fbec41a04 bound to our chassis#033[00m
Jan 23 05:40:14 np0005593233 ovn_controller[130653]: 2026-01-23T10:40:14Z|00833|binding|INFO|Setting lport 3b1ac782-1188-42b9-a89f-eb26c7876140 ovn-installed in OVS
Jan 23 05:40:14 np0005593233 ovn_controller[130653]: 2026-01-23T10:40:14Z|00834|binding|INFO|Setting lport 3b1ac782-1188-42b9-a89f-eb26c7876140 up in Southbound
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.682 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fba2ba4a-d82c-4f8b-9754-c13fbec41a04#033[00m
Jan 23 05:40:14 np0005593233 nova_compute[222017]: 2026-01-23 10:40:14.683 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.714 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[91fef562-9111-47cc-897b-e0df56d3e703]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:14 np0005593233 systemd-udevd[298924]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:40:14 np0005593233 systemd-machined[190954]: New machine qemu-90-instance-000000c6.
Jan 23 05:40:14 np0005593233 NetworkManager[48871]: <info>  [1769164814.7422] device (tap3b1ac782-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:40:14 np0005593233 NetworkManager[48871]: <info>  [1769164814.7431] device (tap3b1ac782-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:40:14 np0005593233 systemd[1]: Started Virtual Machine qemu-90-instance-000000c6.
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.767 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9ddff310-709a-43df-b220-9b8592c13c75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.772 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[c2de9552-d2e3-415b-9d08-55c6494ea127]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.828 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[675888e1-8a58-4517-ad8d-b0d0a6d77e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.863 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f4ad3f-7c3b-4b73-9f80-1fad2b24ff2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfba2ba4a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:db:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 244], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 864227, 'reachable_time': 40884, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298937, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.891 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7a1338a9-d236-40d9-b539-25e02b9e0dcb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfba2ba4a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 864249, 'tstamp': 864249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 298939, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfba2ba4a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 864254, 'tstamp': 864254}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 298939, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.894 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfba2ba4a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:14 np0005593233 nova_compute[222017]: 2026-01-23 10:40:14.896 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:14 np0005593233 nova_compute[222017]: 2026-01-23 10:40:14.898 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.899 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfba2ba4a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.900 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.907 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfba2ba4a-d0, col_values=(('external_ids', {'iface-id': '2348ddba-3dc3-4456-a637-f3065ba0d8f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:14.908 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.094 222021 DEBUG nova.compute.manager [req-b7cb7e3e-2584-4674-98e7-3f5df5e75199 req-190a09a1-dbc5-4234-a41e-37e9a28b3ba6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.095 222021 DEBUG oslo_concurrency.lockutils [req-b7cb7e3e-2584-4674-98e7-3f5df5e75199 req-190a09a1-dbc5-4234-a41e-37e9a28b3ba6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.095 222021 DEBUG oslo_concurrency.lockutils [req-b7cb7e3e-2584-4674-98e7-3f5df5e75199 req-190a09a1-dbc5-4234-a41e-37e9a28b3ba6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.096 222021 DEBUG oslo_concurrency.lockutils [req-b7cb7e3e-2584-4674-98e7-3f5df5e75199 req-190a09a1-dbc5-4234-a41e-37e9a28b3ba6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.096 222021 DEBUG nova.compute.manager [req-b7cb7e3e-2584-4674-98e7-3f5df5e75199 req-190a09a1-dbc5-4234-a41e-37e9a28b3ba6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Processing event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:40:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:15.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.313 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.707 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164815.7072914, f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.708 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] VM Started (Lifecycle Event)#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.712 222021 DEBUG nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.717 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.722 222021 INFO nova.virt.libvirt.driver [-] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Instance spawned successfully.#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.722 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.739 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.744 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.760 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.761 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.762 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.762 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.763 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.764 222021 DEBUG nova.virt.libvirt.driver [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:40:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:15.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.777 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.778 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164815.7111225, f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.778 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.803 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.982 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.985 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164815.7167659, f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:40:15 np0005593233 nova_compute[222017]: 2026-01-23 10:40:15.986 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:40:16 np0005593233 nova_compute[222017]: 2026-01-23 10:40:16.100 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:40:16 np0005593233 nova_compute[222017]: 2026-01-23 10:40:16.105 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:40:16 np0005593233 nova_compute[222017]: 2026-01-23 10:40:16.127 222021 INFO nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Took 11.03 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:40:16 np0005593233 nova_compute[222017]: 2026-01-23 10:40:16.128 222021 DEBUG nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:40:16 np0005593233 nova_compute[222017]: 2026-01-23 10:40:16.157 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:40:16 np0005593233 nova_compute[222017]: 2026-01-23 10:40:16.207 222021 INFO nova.compute.manager [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Took 12.11 seconds to build instance.#033[00m
Jan 23 05:40:16 np0005593233 nova_compute[222017]: 2026-01-23 10:40:16.231 222021 DEBUG oslo_concurrency.lockutils [None req-f7a1d6dc-633a-4e1b-b0c2-bf22c8f40f94 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:16 np0005593233 nova_compute[222017]: 2026-01-23 10:40:16.947 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:17.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:17.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:18 np0005593233 nova_compute[222017]: 2026-01-23 10:40:18.453 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:18 np0005593233 nova_compute[222017]: 2026-01-23 10:40:18.455 222021 DEBUG nova.compute.manager [req-407485eb-dd48-4ee1-8265-fb2d56c6d2ac req-91b38115-d9b9-475a-9fbc-132e9c521b8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:40:18 np0005593233 nova_compute[222017]: 2026-01-23 10:40:18.455 222021 DEBUG oslo_concurrency.lockutils [req-407485eb-dd48-4ee1-8265-fb2d56c6d2ac req-91b38115-d9b9-475a-9fbc-132e9c521b8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:18 np0005593233 nova_compute[222017]: 2026-01-23 10:40:18.455 222021 DEBUG oslo_concurrency.lockutils [req-407485eb-dd48-4ee1-8265-fb2d56c6d2ac req-91b38115-d9b9-475a-9fbc-132e9c521b8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:18 np0005593233 nova_compute[222017]: 2026-01-23 10:40:18.455 222021 DEBUG oslo_concurrency.lockutils [req-407485eb-dd48-4ee1-8265-fb2d56c6d2ac req-91b38115-d9b9-475a-9fbc-132e9c521b8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:18 np0005593233 nova_compute[222017]: 2026-01-23 10:40:18.455 222021 DEBUG nova.compute.manager [req-407485eb-dd48-4ee1-8265-fb2d56c6d2ac req-91b38115-d9b9-475a-9fbc-132e9c521b8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] No waiting events found dispatching network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:40:18 np0005593233 nova_compute[222017]: 2026-01-23 10:40:18.455 222021 WARNING nova.compute.manager [req-407485eb-dd48-4ee1-8265-fb2d56c6d2ac req-91b38115-d9b9-475a-9fbc-132e9c521b8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received unexpected event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:40:18 np0005593233 nova_compute[222017]: 2026-01-23 10:40:18.512 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:18.514 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:40:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:18.515 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:40:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:19.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:19 np0005593233 nova_compute[222017]: 2026-01-23 10:40:19.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:19 np0005593233 nova_compute[222017]: 2026-01-23 10:40:19.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:40:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:19.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:20 np0005593233 nova_compute[222017]: 2026-01-23 10:40:20.236 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:40:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:20.519 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:20 np0005593233 nova_compute[222017]: 2026-01-23 10:40:20.809 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:21.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:21.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:21 np0005593233 nova_compute[222017]: 2026-01-23 10:40:21.950 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:22 np0005593233 podman[298983]: 2026-01-23 10:40:22.140562031 +0000 UTC m=+0.141493792 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:40:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:23.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:23.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:23 np0005593233 nova_compute[222017]: 2026-01-23 10:40:23.922 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:24 np0005593233 nova_compute[222017]: 2026-01-23 10:40:24.194 222021 DEBUG nova.compute.manager [req-16f7a851-ffa4-4578-87bf-f5aa34920493 req-61f4e580-35ee-40fa-bd4f-274d68e205e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-changed-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:40:24 np0005593233 nova_compute[222017]: 2026-01-23 10:40:24.194 222021 DEBUG nova.compute.manager [req-16f7a851-ffa4-4578-87bf-f5aa34920493 req-61f4e580-35ee-40fa-bd4f-274d68e205e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Refreshing instance network info cache due to event network-changed-3b1ac782-1188-42b9-a89f-eb26c7876140. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:40:24 np0005593233 nova_compute[222017]: 2026-01-23 10:40:24.195 222021 DEBUG oslo_concurrency.lockutils [req-16f7a851-ffa4-4578-87bf-f5aa34920493 req-61f4e580-35ee-40fa-bd4f-274d68e205e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:40:24 np0005593233 nova_compute[222017]: 2026-01-23 10:40:24.195 222021 DEBUG oslo_concurrency.lockutils [req-16f7a851-ffa4-4578-87bf-f5aa34920493 req-61f4e580-35ee-40fa-bd4f-274d68e205e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:40:24 np0005593233 nova_compute[222017]: 2026-01-23 10:40:24.195 222021 DEBUG nova.network.neutron [req-16f7a851-ffa4-4578-87bf-f5aa34920493 req-61f4e580-35ee-40fa-bd4f-274d68e205e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Refreshing network info cache for port 3b1ac782-1188-42b9-a89f-eb26c7876140 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:40:24 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:40:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:25.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:25 np0005593233 ceph-osd[78880]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 23 05:40:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:25.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:25 np0005593233 nova_compute[222017]: 2026-01-23 10:40:25.816 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:26 np0005593233 nova_compute[222017]: 2026-01-23 10:40:26.060 222021 DEBUG nova.network.neutron [req-16f7a851-ffa4-4578-87bf-f5aa34920493 req-61f4e580-35ee-40fa-bd4f-274d68e205e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updated VIF entry in instance network info cache for port 3b1ac782-1188-42b9-a89f-eb26c7876140. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:40:26 np0005593233 nova_compute[222017]: 2026-01-23 10:40:26.062 222021 DEBUG nova.network.neutron [req-16f7a851-ffa4-4578-87bf-f5aa34920493 req-61f4e580-35ee-40fa-bd4f-274d68e205e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updating instance_info_cache with network_info: [{"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:40:26 np0005593233 nova_compute[222017]: 2026-01-23 10:40:26.848 222021 DEBUG oslo_concurrency.lockutils [req-16f7a851-ffa4-4578-87bf-f5aa34920493 req-61f4e580-35ee-40fa-bd4f-274d68e205e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:40:26 np0005593233 nova_compute[222017]: 2026-01-23 10:40:26.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:27.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:27.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:40:28 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4034719437' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:40:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:29.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:29.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:30 np0005593233 nova_compute[222017]: 2026-01-23 10:40:30.861 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:31.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:31 np0005593233 nova_compute[222017]: 2026-01-23 10:40:31.327 222021 DEBUG oslo_concurrency.lockutils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:31 np0005593233 nova_compute[222017]: 2026-01-23 10:40:31.327 222021 DEBUG oslo_concurrency.lockutils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:31 np0005593233 nova_compute[222017]: 2026-01-23 10:40:31.354 222021 DEBUG nova.objects.instance [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'flavor' on Instance uuid f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:40:31 np0005593233 nova_compute[222017]: 2026-01-23 10:40:31.603 222021 DEBUG oslo_concurrency.lockutils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:31.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:31 np0005593233 nova_compute[222017]: 2026-01-23 10:40:31.858 222021 DEBUG oslo_concurrency.lockutils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:31 np0005593233 nova_compute[222017]: 2026-01-23 10:40:31.859 222021 DEBUG oslo_concurrency.lockutils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:31 np0005593233 nova_compute[222017]: 2026-01-23 10:40:31.859 222021 INFO nova.compute.manager [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Attaching volume 4393c992-1666-40a0-ab11-4cc66bdcd721 to /dev/vdb#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.002 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.023 222021 DEBUG os_brick.utils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.025 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.048 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.048 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c28aa9-e30e-4717-ab10-042b1dc166b6]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.050 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.070 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.071 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[baa2560a-7640-4f7f-87ae-dbeb17eb4c9b]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.073 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.093 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.093 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[8698ba5b-bd5b-42eb-a007-8db4dc3a9ff0]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.097 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c3334a64-fe5b-40d2-85bd-4a55c547f21b]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.099 222021 DEBUG oslo_concurrency.processutils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.155 222021 DEBUG oslo_concurrency.processutils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "nvme version" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.160 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.161 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.161 222021 DEBUG os_brick.initiator.connectors.lightos [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.161 222021 DEBUG os_brick.utils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] <== get_connector_properties: return (138ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.162 222021 DEBUG nova.virt.block_device [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updating existing volume attachment record: bb54dc99-5583-479d-bf3a-45ee4e2d4c5a _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:40:32 np0005593233 ovn_controller[130653]: 2026-01-23T10:40:32Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:36:e0 10.100.0.4
Jan 23 05:40:32 np0005593233 ovn_controller[130653]: 2026-01-23T10:40:32Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:36:e0 10.100.0.4
Jan 23 05:40:32 np0005593233 nova_compute[222017]: 2026-01-23 10:40:32.230 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:40:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1609784403' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:40:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:33.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:33.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:33 np0005593233 nova_compute[222017]: 2026-01-23 10:40:33.853 222021 DEBUG nova.objects.instance [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'flavor' on Instance uuid f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:40:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:34 np0005593233 nova_compute[222017]: 2026-01-23 10:40:34.905 222021 DEBUG nova.virt.libvirt.driver [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Attempting to attach volume 4393c992-1666-40a0-ab11-4cc66bdcd721 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:40:34 np0005593233 nova_compute[222017]: 2026-01-23 10:40:34.911 222021 DEBUG nova.virt.libvirt.guest [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:40:34 np0005593233 nova_compute[222017]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:40:34 np0005593233 nova_compute[222017]:  <source protocol="rbd" name="volumes/volume-4393c992-1666-40a0-ab11-4cc66bdcd721">
Jan 23 05:40:34 np0005593233 nova_compute[222017]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:40:34 np0005593233 nova_compute[222017]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:40:34 np0005593233 nova_compute[222017]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:40:34 np0005593233 nova_compute[222017]:  </source>
Jan 23 05:40:34 np0005593233 nova_compute[222017]:  <auth username="openstack">
Jan 23 05:40:34 np0005593233 nova_compute[222017]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:40:34 np0005593233 nova_compute[222017]:  </auth>
Jan 23 05:40:34 np0005593233 nova_compute[222017]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:40:34 np0005593233 nova_compute[222017]:  <serial>4393c992-1666-40a0-ab11-4cc66bdcd721</serial>
Jan 23 05:40:34 np0005593233 nova_compute[222017]:  <shareable/>
Jan 23 05:40:34 np0005593233 nova_compute[222017]: </disk>
Jan 23 05:40:34 np0005593233 nova_compute[222017]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:40:35 np0005593233 nova_compute[222017]: 2026-01-23 10:40:35.175 222021 DEBUG nova.virt.libvirt.driver [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:40:35 np0005593233 nova_compute[222017]: 2026-01-23 10:40:35.177 222021 DEBUG nova.virt.libvirt.driver [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:40:35 np0005593233 nova_compute[222017]: 2026-01-23 10:40:35.178 222021 DEBUG nova.virt.libvirt.driver [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:40:35 np0005593233 nova_compute[222017]: 2026-01-23 10:40:35.178 222021 DEBUG nova.virt.libvirt.driver [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No VIF found with MAC fa:16:3e:ee:36:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:40:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:35.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:35 np0005593233 nova_compute[222017]: 2026-01-23 10:40:35.539 222021 DEBUG oslo_concurrency.lockutils [None req-c1f813b1-fd3a-46f5-99e3-86f75c39cae4 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:35.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:35 np0005593233 nova_compute[222017]: 2026-01-23 10:40:35.895 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:37 np0005593233 nova_compute[222017]: 2026-01-23 10:40:37.053 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:37.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:37.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:38 np0005593233 podman[299092]: 2026-01-23 10:40:38.268405517 +0000 UTC m=+0.128545034 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:40:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:39.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:39.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:40:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:40:40 np0005593233 nova_compute[222017]: 2026-01-23 10:40:40.899 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:41.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:41.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:42 np0005593233 nova_compute[222017]: 2026-01-23 10:40:42.055 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:42.701 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:42.702 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:40:42.703 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:43.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:43.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:45.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:45.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:45 np0005593233 nova_compute[222017]: 2026-01-23 10:40:45.943 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:47 np0005593233 nova_compute[222017]: 2026-01-23 10:40:47.100 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:47.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:47.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:49.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #169. Immutable memtables: 0.
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.337478) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 107] Flushing memtable with next log file: 169
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849337561, "job": 107, "event": "flush_started", "num_memtables": 1, "num_entries": 989, "num_deletes": 253, "total_data_size": 2028847, "memory_usage": 2051592, "flush_reason": "Manual Compaction"}
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 107] Level-0 flush table #170: started
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849350340, "cf_name": "default", "job": 107, "event": "table_file_creation", "file_number": 170, "file_size": 853521, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 80976, "largest_seqno": 81960, "table_properties": {"data_size": 849839, "index_size": 1397, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10230, "raw_average_key_size": 21, "raw_value_size": 841737, "raw_average_value_size": 1735, "num_data_blocks": 62, "num_entries": 485, "num_filter_entries": 485, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164777, "oldest_key_time": 1769164777, "file_creation_time": 1769164849, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 170, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 107] Flush lasted 12940 microseconds, and 7569 cpu microseconds.
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.350422) [db/flush_job.cc:967] [default] [JOB 107] Level-0 flush table #170: 853521 bytes OK
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.350455) [db/memtable_list.cc:519] [default] Level-0 commit table #170 started
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.353173) [db/memtable_list.cc:722] [default] Level-0 commit table #170: memtable #1 done
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.353203) EVENT_LOG_v1 {"time_micros": 1769164849353192, "job": 107, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.353235) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 107] Try to delete WAL files size 2023831, prev total WAL file size 2044824, number of live WAL files 2.
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000166.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.355069) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032373538' seq:72057594037927935, type:22 .. '6D6772737461740033303130' seq:0, type:0; will stop at (end)
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 108] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 107 Base level 0, inputs: [170(833KB)], [168(12MB)]
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849355165, "job": 108, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [170], "files_L6": [168], "score": -1, "input_data_size": 14313469, "oldest_snapshot_seqno": -1}
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 108] Generated table #171: 9966 keys, 10927141 bytes, temperature: kUnknown
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849511224, "cf_name": "default", "job": 108, "event": "table_file_creation", "file_number": 171, "file_size": 10927141, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10865866, "index_size": 35242, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24965, "raw_key_size": 263543, "raw_average_key_size": 26, "raw_value_size": 10694377, "raw_average_value_size": 1073, "num_data_blocks": 1333, "num_entries": 9966, "num_filter_entries": 9966, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769164849, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 171, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.511698) [db/compaction/compaction_job.cc:1663] [default] [JOB 108] Compacted 1@0 + 1@6 files to L6 => 10927141 bytes
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.513425) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 91.6 rd, 70.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 12.8 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(29.6) write-amplify(12.8) OK, records in: 10459, records dropped: 493 output_compression: NoCompression
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.513459) EVENT_LOG_v1 {"time_micros": 1769164849513444, "job": 108, "event": "compaction_finished", "compaction_time_micros": 156185, "compaction_time_cpu_micros": 54779, "output_level": 6, "num_output_files": 1, "total_output_size": 10927141, "num_input_records": 10459, "num_output_records": 9966, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000170.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849514039, "job": 108, "event": "table_file_deletion", "file_number": 170}
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000168.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849518866, "job": 108, "event": "table_file_deletion", "file_number": 168}
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.354891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.519021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.519031) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.519035) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.519038) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:40:49.519041) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:49.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:50 np0005593233 nova_compute[222017]: 2026-01-23 10:40:50.946 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:51.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:51.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:52 np0005593233 nova_compute[222017]: 2026-01-23 10:40:52.103 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:53 np0005593233 podman[299246]: 2026-01-23 10:40:53.164869445 +0000 UTC m=+0.165526484 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:40:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:53.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:53.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:55.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:55.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:55 np0005593233 nova_compute[222017]: 2026-01-23 10:40:55.949 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:57 np0005593233 nova_compute[222017]: 2026-01-23 10:40:57.106 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:40:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:57.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:40:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:57.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:40:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:59.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:40:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e382 e382: 3 total, 3 up, 3 in
Jan 23 05:40:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:40:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:59.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:00 np0005593233 nova_compute[222017]: 2026-01-23 10:41:00.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:01.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:01.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:02 np0005593233 nova_compute[222017]: 2026-01-23 10:41:02.109 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:03.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:03.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:04 np0005593233 nova_compute[222017]: 2026-01-23 10:41:04.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:41:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:05.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:41:05 np0005593233 nova_compute[222017]: 2026-01-23 10:41:05.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:41:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:05.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:41:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:05.944 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:41:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:05.946 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:41:05 np0005593233 nova_compute[222017]: 2026-01-23 10:41:05.948 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:05 np0005593233 nova_compute[222017]: 2026-01-23 10:41:05.954 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:05 np0005593233 nova_compute[222017]: 2026-01-23 10:41:05.969 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:05 np0005593233 nova_compute[222017]: 2026-01-23 10:41:05.970 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:05 np0005593233 nova_compute[222017]: 2026-01-23 10:41:05.971 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:05 np0005593233 nova_compute[222017]: 2026-01-23 10:41:05.971 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:41:05 np0005593233 nova_compute[222017]: 2026-01-23 10:41:05.972 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:41:06 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2770732387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.518 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.546s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.704 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.705 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.709 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.709 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.709 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.891 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.893 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3858MB free_disk=20.805782318115234GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.894 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:06 np0005593233 nova_compute[222017]: 2026-01-23 10:41:06.894 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.112 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:07.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.407 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0a916952-341a-4caf-bf6f-6abe504830f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.407 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.408 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.408 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.430 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.451 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.452 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.468 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.493 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:41:07 np0005593233 nova_compute[222017]: 2026-01-23 10:41:07.561 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:07.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:41:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2626647116' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:41:08 np0005593233 nova_compute[222017]: 2026-01-23 10:41:08.056 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:08 np0005593233 nova_compute[222017]: 2026-01-23 10:41:08.064 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:41:08 np0005593233 nova_compute[222017]: 2026-01-23 10:41:08.114 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:41:08 np0005593233 nova_compute[222017]: 2026-01-23 10:41:08.290 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:41:08 np0005593233 nova_compute[222017]: 2026-01-23 10:41:08.291 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.397s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:08.947 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:09 np0005593233 podman[299323]: 2026-01-23 10:41:09.06600106 +0000 UTC m=+0.069443414 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:41:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:09.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:09.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:10 np0005593233 nova_compute[222017]: 2026-01-23 10:41:10.290 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:10 np0005593233 nova_compute[222017]: 2026-01-23 10:41:10.291 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:10 np0005593233 nova_compute[222017]: 2026-01-23 10:41:10.292 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:10 np0005593233 nova_compute[222017]: 2026-01-23 10:41:10.292 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:10 np0005593233 nova_compute[222017]: 2026-01-23 10:41:10.292 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:41:10 np0005593233 nova_compute[222017]: 2026-01-23 10:41:10.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:10 np0005593233 nova_compute[222017]: 2026-01-23 10:41:10.957 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:11.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:41:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1176690165' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:41:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:41:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1176690165' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:41:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:11.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e383 e383: 3 total, 3 up, 3 in
Jan 23 05:41:12 np0005593233 nova_compute[222017]: 2026-01-23 10:41:12.114 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:13.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:13.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e383 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:15.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:15.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:15 np0005593233 nova_compute[222017]: 2026-01-23 10:41:15.960 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:17 np0005593233 nova_compute[222017]: 2026-01-23 10:41:17.148 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:17.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:17.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:41:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.8 total, 600.0 interval#012Cumulative writes: 60K writes, 234K keys, 60K commit groups, 1.0 writes per commit group, ingest: 0.22 GB, 0.04 MB/s#012Cumulative WAL: 60K writes, 22K syncs, 2.65 writes per sync, written: 0.22 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8121 writes, 30K keys, 8121 commit groups, 1.0 writes per commit group, ingest: 30.43 MB, 0.05 MB/s#012Interval WAL: 8121 writes, 3259 syncs, 2.49 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.67              0.00         1    0.665       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.8 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.7 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f132801610#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.8 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f132801610#2 capacity: 1.11 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 
collections: 11 last_copies: 8 last_secs: 4.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000122231%) FilterBlock(3,0.33 KB,2.82073e-05%) IndexBlock(3,0.34 KB,2.95505e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.8 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s 
read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 
Jan 23 05:41:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:19.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e383 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:19.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:20 np0005593233 nova_compute[222017]: 2026-01-23 10:41:20.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:20 np0005593233 nova_compute[222017]: 2026-01-23 10:41:20.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:20 np0005593233 nova_compute[222017]: 2026-01-23 10:41:20.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:41:20 np0005593233 nova_compute[222017]: 2026-01-23 10:41:20.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:41:20 np0005593233 nova_compute[222017]: 2026-01-23 10:41:20.790 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:41:20 np0005593233 nova_compute[222017]: 2026-01-23 10:41:20.791 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:41:20 np0005593233 nova_compute[222017]: 2026-01-23 10:41:20.791 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:41:20 np0005593233 nova_compute[222017]: 2026-01-23 10:41:20.791 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0a916952-341a-4caf-bf6f-6abe504830f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:41:20 np0005593233 nova_compute[222017]: 2026-01-23 10:41:20.964 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:21.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:21.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:22 np0005593233 nova_compute[222017]: 2026-01-23 10:41:22.152 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:23.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:23.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:24 np0005593233 podman[299345]: 2026-01-23 10:41:24.088047222 +0000 UTC m=+0.103674967 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:41:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e384 e384: 3 total, 3 up, 3 in
Jan 23 05:41:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:41:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:25.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:41:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:25.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:25 np0005593233 nova_compute[222017]: 2026-01-23 10:41:25.967 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:27 np0005593233 nova_compute[222017]: 2026-01-23 10:41:27.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:27.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:27.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:29.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:29.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:30 np0005593233 nova_compute[222017]: 2026-01-23 10:41:30.705 222021 DEBUG nova.compute.manager [req-9bce3a56-fc39-49d4-8bdd-ad1d4a88a53d req-0b9b572d-0732-4936-88a9-231652af709d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-changed-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:30 np0005593233 nova_compute[222017]: 2026-01-23 10:41:30.706 222021 DEBUG nova.compute.manager [req-9bce3a56-fc39-49d4-8bdd-ad1d4a88a53d req-0b9b572d-0732-4936-88a9-231652af709d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Refreshing instance network info cache due to event network-changed-3b1ac782-1188-42b9-a89f-eb26c7876140. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:41:30 np0005593233 nova_compute[222017]: 2026-01-23 10:41:30.707 222021 DEBUG oslo_concurrency.lockutils [req-9bce3a56-fc39-49d4-8bdd-ad1d4a88a53d req-0b9b572d-0732-4936-88a9-231652af709d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:41:30 np0005593233 nova_compute[222017]: 2026-01-23 10:41:30.707 222021 DEBUG oslo_concurrency.lockutils [req-9bce3a56-fc39-49d4-8bdd-ad1d4a88a53d req-0b9b572d-0732-4936-88a9-231652af709d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:41:30 np0005593233 nova_compute[222017]: 2026-01-23 10:41:30.707 222021 DEBUG nova.network.neutron [req-9bce3a56-fc39-49d4-8bdd-ad1d4a88a53d req-0b9b572d-0732-4936-88a9-231652af709d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Refreshing network info cache for port 3b1ac782-1188-42b9-a89f-eb26c7876140 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:41:30 np0005593233 nova_compute[222017]: 2026-01-23 10:41:30.998 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:31.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:31.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:32 np0005593233 nova_compute[222017]: 2026-01-23 10:41:32.158 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:33.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:33 np0005593233 nova_compute[222017]: 2026-01-23 10:41:33.696 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating instance_info_cache with network_info: [{"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:41:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:33.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:34 np0005593233 nova_compute[222017]: 2026-01-23 10:41:34.554 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:41:34 np0005593233 nova_compute[222017]: 2026-01-23 10:41:34.555 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:41:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:35.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:35.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:36 np0005593233 nova_compute[222017]: 2026-01-23 10:41:36.045 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:37 np0005593233 nova_compute[222017]: 2026-01-23 10:41:37.204 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:41:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:37.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:41:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:37.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:39.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e385 e385: 3 total, 3 up, 3 in
Jan 23 05:41:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:39.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:39 np0005593233 nova_compute[222017]: 2026-01-23 10:41:39.981 222021 DEBUG nova.network.neutron [req-9bce3a56-fc39-49d4-8bdd-ad1d4a88a53d req-0b9b572d-0732-4936-88a9-231652af709d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updated VIF entry in instance network info cache for port 3b1ac782-1188-42b9-a89f-eb26c7876140. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:41:39 np0005593233 nova_compute[222017]: 2026-01-23 10:41:39.982 222021 DEBUG nova.network.neutron [req-9bce3a56-fc39-49d4-8bdd-ad1d4a88a53d req-0b9b572d-0732-4936-88a9-231652af709d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updating instance_info_cache with network_info: [{"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:41:40 np0005593233 podman[299376]: 2026-01-23 10:41:40.090661535 +0000 UTC m=+0.095433973 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 05:41:40 np0005593233 nova_compute[222017]: 2026-01-23 10:41:40.504 222021 DEBUG oslo_concurrency.lockutils [req-9bce3a56-fc39-49d4-8bdd-ad1d4a88a53d req-0b9b572d-0732-4936-88a9-231652af709d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:41:41 np0005593233 nova_compute[222017]: 2026-01-23 10:41:41.049 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:41.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:41:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:41.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:41:42 np0005593233 nova_compute[222017]: 2026-01-23 10:41:42.207 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:42.702 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:42.703 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:42.703 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:42 np0005593233 nova_compute[222017]: 2026-01-23 10:41:42.710 222021 DEBUG oslo_concurrency.lockutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:41:42 np0005593233 nova_compute[222017]: 2026-01-23 10:41:42.711 222021 DEBUG oslo_concurrency.lockutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquired lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:41:42 np0005593233 nova_compute[222017]: 2026-01-23 10:41:42.711 222021 DEBUG nova.network.neutron [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:41:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:41:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:43.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:41:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:41:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:43.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:41:44 np0005593233 nova_compute[222017]: 2026-01-23 10:41:44.424 222021 DEBUG nova.network.neutron [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updating instance_info_cache with network_info: [{"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:41:44 np0005593233 nova_compute[222017]: 2026-01-23 10:41:44.459 222021 DEBUG oslo_concurrency.lockutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Releasing lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:41:44 np0005593233 nova_compute[222017]: 2026-01-23 10:41:44.594 222021 DEBUG nova.virt.libvirt.driver [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 05:41:44 np0005593233 nova_compute[222017]: 2026-01-23 10:41:44.595 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Creating file /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/03518c5d82cc49aaa57a1935afdd043d.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 05:41:44 np0005593233 nova_compute[222017]: 2026-01-23 10:41:44.596 222021 DEBUG oslo_concurrency.processutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/03518c5d82cc49aaa57a1935afdd043d.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:45 np0005593233 nova_compute[222017]: 2026-01-23 10:41:45.147 222021 DEBUG oslo_concurrency.processutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/03518c5d82cc49aaa57a1935afdd043d.tmp" returned: 1 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:45 np0005593233 nova_compute[222017]: 2026-01-23 10:41:45.148 222021 DEBUG oslo_concurrency.processutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6/03518c5d82cc49aaa57a1935afdd043d.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 05:41:45 np0005593233 nova_compute[222017]: 2026-01-23 10:41:45.149 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Creating directory /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 05:41:45 np0005593233 nova_compute[222017]: 2026-01-23 10:41:45.149 222021 DEBUG oslo_concurrency.processutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:45 np0005593233 nova_compute[222017]: 2026-01-23 10:41:45.419 222021 DEBUG oslo_concurrency.processutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" returned: 0 in 0.270s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:41:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:45.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:41:45 np0005593233 nova_compute[222017]: 2026-01-23 10:41:45.428 222021 DEBUG nova.virt.libvirt.driver [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:41:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:45.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:46 np0005593233 nova_compute[222017]: 2026-01-23 10:41:46.060 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:47 np0005593233 nova_compute[222017]: 2026-01-23 10:41:47.251 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:47.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:47.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:48 np0005593233 kernel: tap3b1ac782-11 (unregistering): left promiscuous mode
Jan 23 05:41:48 np0005593233 NetworkManager[48871]: <info>  [1769164908.3613] device (tap3b1ac782-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:41:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:41:48Z|00835|binding|INFO|Releasing lport 3b1ac782-1188-42b9-a89f-eb26c7876140 from this chassis (sb_readonly=0)
Jan 23 05:41:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:41:48Z|00836|binding|INFO|Setting lport 3b1ac782-1188-42b9-a89f-eb26c7876140 down in Southbound
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:41:48Z|00837|binding|INFO|Removing iface tap3b1ac782-11 ovn-installed in OVS
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.427 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.431 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:36:e0 10.100.0.4'], port_security=['fa:16:3e:ee:36:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'f34f1af9-6c51-42ec-97f8-fb5bb146aeb6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e762fca3b634c7aa1d994314c059c54', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed138636-f650-4a09-b808-0b05f9067a5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0936335-b706-4400-8411-bdd084c8cdf7, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=3b1ac782-1188-42b9-a89f-eb26c7876140) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.432 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 3b1ac782-1188-42b9-a89f-eb26c7876140 in datapath fba2ba4a-d82c-4f8b-9754-c13fbec41a04 unbound from our chassis#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.433 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fba2ba4a-d82c-4f8b-9754-c13fbec41a04#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.443 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.458 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0e72e4-53e0-4c52-8f51-d248d7fad0e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:48 np0005593233 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000c6.scope: Deactivated successfully.
Jan 23 05:41:48 np0005593233 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000c6.scope: Consumed 19.581s CPU time.
Jan 23 05:41:48 np0005593233 systemd-machined[190954]: Machine qemu-90-instance-000000c6 terminated.
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.518 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fb891684-7023-4b01-8bb6-a7706d6e9718]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.523 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a503ffb7-4417-42e2-b9f9-8582dca1e09b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.565 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a46e5d6f-7a12-433c-ae49-bc5c7c2ff349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.594 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[07564cd4-9772-4a5c-be6d-7ed21a0a88ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfba2ba4a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:db:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 244], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 864227, 'reachable_time': 40884, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299410, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.623 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f52d26d7-b943-4933-890e-9f0c2917cac1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfba2ba4a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 864249, 'tstamp': 864249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299411, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfba2ba4a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 864254, 'tstamp': 864254}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299411, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.625 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfba2ba4a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.627 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.635 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.636 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfba2ba4a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.636 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.637 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfba2ba4a-d0, col_values=(('external_ids', {'iface-id': '2348ddba-3dc3-4456-a637-f3065ba0d8f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.637 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.664 222021 DEBUG nova.compute.manager [req-d73228b2-a896-490f-a4d2-d02433f02f23 req-28091c24-bedf-4910-afd0-1dff1eaf85d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-vif-unplugged-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.664 222021 DEBUG oslo_concurrency.lockutils [req-d73228b2-a896-490f-a4d2-d02433f02f23 req-28091c24-bedf-4910-afd0-1dff1eaf85d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.665 222021 DEBUG oslo_concurrency.lockutils [req-d73228b2-a896-490f-a4d2-d02433f02f23 req-28091c24-bedf-4910-afd0-1dff1eaf85d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.665 222021 DEBUG oslo_concurrency.lockutils [req-d73228b2-a896-490f-a4d2-d02433f02f23 req-28091c24-bedf-4910-afd0-1dff1eaf85d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.665 222021 DEBUG nova.compute.manager [req-d73228b2-a896-490f-a4d2-d02433f02f23 req-28091c24-bedf-4910-afd0-1dff1eaf85d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] No waiting events found dispatching network-vif-unplugged-3b1ac782-1188-42b9-a89f-eb26c7876140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.665 222021 WARNING nova.compute.manager [req-d73228b2-a896-490f-a4d2-d02433f02f23 req-28091c24-bedf-4910-afd0-1dff1eaf85d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received unexpected event network-vif-unplugged-3b1ac782-1188-42b9-a89f-eb26c7876140 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.668 222021 INFO nova.virt.libvirt.driver [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.675 222021 INFO nova.virt.libvirt.driver [-] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Instance destroyed successfully.#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.676 222021 DEBUG nova.virt.libvirt.vif [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:40:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-1',id=198,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:40:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-1k48k1p2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',i
mage_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:41:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=f34f1af9-6c51-42ec-97f8-fb5bb146aeb6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "vif_mac": "fa:16:3e:ee:36:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.677 222021 DEBUG nova.network.os_vif_util [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "vif_mac": "fa:16:3e:ee:36:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.678 222021 DEBUG nova.network.os_vif_util [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.678 222021 DEBUG os_vif [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.680 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.681 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b1ac782-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.682 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.684 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.686 222021 INFO os_vif [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11')#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.849 222021 DEBUG nova.virt.libvirt.driver [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.849 222021 DEBUG nova.virt.libvirt.driver [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.850 222021 DEBUG nova.virt.libvirt.driver [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.972 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:41:48 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:48.972 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:41:48 np0005593233 nova_compute[222017]: 2026-01-23 10:41:48.973 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:49 np0005593233 nova_compute[222017]: 2026-01-23 10:41:49.421 222021 DEBUG neutronclient.v2_0.client [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3b1ac782-1188-42b9-a89f-eb26c7876140 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 05:41:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:49.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:49 np0005593233 nova_compute[222017]: 2026-01-23 10:41:49.537 222021 DEBUG oslo_concurrency.lockutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:49 np0005593233 nova_compute[222017]: 2026-01-23 10:41:49.538 222021 DEBUG oslo_concurrency.lockutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:49 np0005593233 nova_compute[222017]: 2026-01-23 10:41:49.538 222021 DEBUG oslo_concurrency.lockutils [None req-0e0edcb4-dfa1-45aa-97de-307869ae07c0 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:49.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.805 222021 DEBUG nova.compute.manager [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.806 222021 DEBUG oslo_concurrency.lockutils [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.807 222021 DEBUG oslo_concurrency.lockutils [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.807 222021 DEBUG oslo_concurrency.lockutils [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.808 222021 DEBUG nova.compute.manager [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] No waiting events found dispatching network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.808 222021 WARNING nova.compute.manager [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received unexpected event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.808 222021 DEBUG nova.compute.manager [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-changed-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.809 222021 DEBUG nova.compute.manager [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Refreshing instance network info cache due to event network-changed-3b1ac782-1188-42b9-a89f-eb26c7876140. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.810 222021 DEBUG oslo_concurrency.lockutils [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.810 222021 DEBUG oslo_concurrency.lockutils [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:41:50 np0005593233 nova_compute[222017]: 2026-01-23 10:41:50.810 222021 DEBUG nova.network.neutron [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Refreshing network info cache for port 3b1ac782-1188-42b9-a89f-eb26c7876140 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:41:51 np0005593233 nova_compute[222017]: 2026-01-23 10:41:51.086 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:51.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:51.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:41:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:41:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:41:52 np0005593233 nova_compute[222017]: 2026-01-23 10:41:52.924 222021 DEBUG nova.network.neutron [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updated VIF entry in instance network info cache for port 3b1ac782-1188-42b9-a89f-eb26c7876140. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:41:52 np0005593233 nova_compute[222017]: 2026-01-23 10:41:52.924 222021 DEBUG nova.network.neutron [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updating instance_info_cache with network_info: [{"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:41:52 np0005593233 nova_compute[222017]: 2026-01-23 10:41:52.946 222021 DEBUG oslo_concurrency.lockutils [req-9b563e15-ac82-4079-83f9-a0260e79e7aa req-bfa8addd-cdd4-45fc-945e-4f86b59fc37c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:41:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:53.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:53 np0005593233 nova_compute[222017]: 2026-01-23 10:41:53.683 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:53.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e386 e386: 3 total, 3 up, 3 in
Jan 23 05:41:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:55 np0005593233 podman[299553]: 2026-01-23 10:41:55.138146879 +0000 UTC m=+0.121921866 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:41:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:55.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:41:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:55.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:41:56 np0005593233 nova_compute[222017]: 2026-01-23 10:41:56.089 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:41:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.0 total, 600.0 interval#012Cumulative writes: 16K writes, 82K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.16 GB, 0.03 MB/s#012Cumulative WAL: 16K writes, 16K syncs, 1.00 writes per sync, written: 0.16 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1584 writes, 8071 keys, 1584 commit groups, 1.0 writes per commit group, ingest: 16.52 MB, 0.03 MB/s#012Interval WAL: 1584 writes, 1584 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     53.2      1.93              0.47        54    0.036       0      0       0.0       0.0#012  L6      1/0   10.42 MB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   5.2     78.0     66.5      7.95              1.86        53    0.150    391K    28K       0.0       0.0#012 Sum      1/0   10.42 MB   0.0      0.6     0.1      0.5       0.6      0.1       0.0   6.2     62.8     63.9      9.88              2.33       107    0.092    391K    28K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.1     82.6     79.8      0.97              0.36        12    0.081     61K   3099       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   0.0     78.0     66.5      7.95              1.86        53    0.150    391K    28K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     53.3      1.92              0.47        53    0.036       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.0 total, 600.0 interval#012Flush(GB): cumulative 0.100, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.62 GB write, 0.11 MB/s write, 0.61 GB read, 0.10 MB/s read, 9.9 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 67.77 MB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 0 last_secs: 0.000474 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3859,64.98 MB,21.3737%) FilterBlock(107,1.06 MB,0.350325%) IndexBlock(107,1.73 MB,0.569449%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:41:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:41:56.974 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:41:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:57.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:41:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:57.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:58 np0005593233 nova_compute[222017]: 2026-01-23 10:41:58.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:59 np0005593233 nova_compute[222017]: 2026-01-23 10:41:59.094 222021 DEBUG nova.compute.manager [req-c986ae52-e071-4c92-b953-cbd564e39362 req-f2a225af-dc66-46bf-9e58-5df1e2e32335 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:59 np0005593233 nova_compute[222017]: 2026-01-23 10:41:59.094 222021 DEBUG oslo_concurrency.lockutils [req-c986ae52-e071-4c92-b953-cbd564e39362 req-f2a225af-dc66-46bf-9e58-5df1e2e32335 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:59 np0005593233 nova_compute[222017]: 2026-01-23 10:41:59.095 222021 DEBUG oslo_concurrency.lockutils [req-c986ae52-e071-4c92-b953-cbd564e39362 req-f2a225af-dc66-46bf-9e58-5df1e2e32335 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:59 np0005593233 nova_compute[222017]: 2026-01-23 10:41:59.095 222021 DEBUG oslo_concurrency.lockutils [req-c986ae52-e071-4c92-b953-cbd564e39362 req-f2a225af-dc66-46bf-9e58-5df1e2e32335 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:59 np0005593233 nova_compute[222017]: 2026-01-23 10:41:59.095 222021 DEBUG nova.compute.manager [req-c986ae52-e071-4c92-b953-cbd564e39362 req-f2a225af-dc66-46bf-9e58-5df1e2e32335 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] No waiting events found dispatching network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:41:59 np0005593233 nova_compute[222017]: 2026-01-23 10:41:59.096 222021 WARNING nova.compute.manager [req-c986ae52-e071-4c92-b953-cbd564e39362 req-f2a225af-dc66-46bf-9e58-5df1e2e32335 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received unexpected event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 23 05:41:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:41:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:41:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:41:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:59.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:42:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:42:01 np0005593233 nova_compute[222017]: 2026-01-23 10:42:01.092 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:01 np0005593233 nova_compute[222017]: 2026-01-23 10:42:01.179 222021 DEBUG nova.compute.manager [req-e83bee79-cc1c-4238-bcdd-1525aca5bb09 req-ecbf1275-c2e0-4dc0-90df-5a510f0df801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:42:01 np0005593233 nova_compute[222017]: 2026-01-23 10:42:01.179 222021 DEBUG oslo_concurrency.lockutils [req-e83bee79-cc1c-4238-bcdd-1525aca5bb09 req-ecbf1275-c2e0-4dc0-90df-5a510f0df801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:01 np0005593233 nova_compute[222017]: 2026-01-23 10:42:01.179 222021 DEBUG oslo_concurrency.lockutils [req-e83bee79-cc1c-4238-bcdd-1525aca5bb09 req-ecbf1275-c2e0-4dc0-90df-5a510f0df801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:01 np0005593233 nova_compute[222017]: 2026-01-23 10:42:01.179 222021 DEBUG oslo_concurrency.lockutils [req-e83bee79-cc1c-4238-bcdd-1525aca5bb09 req-ecbf1275-c2e0-4dc0-90df-5a510f0df801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:01 np0005593233 nova_compute[222017]: 2026-01-23 10:42:01.180 222021 DEBUG nova.compute.manager [req-e83bee79-cc1c-4238-bcdd-1525aca5bb09 req-ecbf1275-c2e0-4dc0-90df-5a510f0df801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] No waiting events found dispatching network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:42:01 np0005593233 nova_compute[222017]: 2026-01-23 10:42:01.180 222021 WARNING nova.compute.manager [req-e83bee79-cc1c-4238-bcdd-1525aca5bb09 req-ecbf1275-c2e0-4dc0-90df-5a510f0df801 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Received unexpected event network-vif-plugged-3b1ac782-1188-42b9-a89f-eb26c7876140 for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:42:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:01.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:01.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:03.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:03 np0005593233 nova_compute[222017]: 2026-01-23 10:42:03.642 222021 DEBUG oslo_concurrency.lockutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:03 np0005593233 nova_compute[222017]: 2026-01-23 10:42:03.642 222021 DEBUG oslo_concurrency.lockutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:03 np0005593233 nova_compute[222017]: 2026-01-23 10:42:03.642 222021 DEBUG nova.compute.manager [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Going to confirm migration 20 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 23 05:42:03 np0005593233 nova_compute[222017]: 2026-01-23 10:42:03.668 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164908.667077, f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:42:03 np0005593233 nova_compute[222017]: 2026-01-23 10:42:03.668 222021 INFO nova.compute.manager [-] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:42:03 np0005593233 nova_compute[222017]: 2026-01-23 10:42:03.688 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:03 np0005593233 nova_compute[222017]: 2026-01-23 10:42:03.710 222021 DEBUG nova.compute.manager [None req-f16f3cd9-8b3d-4bd8-a041-246aa1f3eaba - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:42:03 np0005593233 nova_compute[222017]: 2026-01-23 10:42:03.714 222021 DEBUG nova.compute.manager [None req-f16f3cd9-8b3d-4bd8-a041-246aa1f3eaba - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:42:03 np0005593233 nova_compute[222017]: 2026-01-23 10:42:03.755 222021 INFO nova.compute.manager [None req-f16f3cd9-8b3d-4bd8-a041-246aa1f3eaba - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 23 05:42:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:42:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:03.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:42:04 np0005593233 nova_compute[222017]: 2026-01-23 10:42:04.093 222021 DEBUG neutronclient.v2_0.client [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3b1ac782-1188-42b9-a89f-eb26c7876140 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 05:42:04 np0005593233 nova_compute[222017]: 2026-01-23 10:42:04.095 222021 DEBUG oslo_concurrency.lockutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:42:04 np0005593233 nova_compute[222017]: 2026-01-23 10:42:04.095 222021 DEBUG oslo_concurrency.lockutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquired lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:42:04 np0005593233 nova_compute[222017]: 2026-01-23 10:42:04.096 222021 DEBUG nova.network.neutron [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:42:04 np0005593233 nova_compute[222017]: 2026-01-23 10:42:04.096 222021 DEBUG nova.objects.instance [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'info_cache' on Instance uuid f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:42:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:05 np0005593233 nova_compute[222017]: 2026-01-23 10:42:05.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:05 np0005593233 nova_compute[222017]: 2026-01-23 10:42:05.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:05 np0005593233 nova_compute[222017]: 2026-01-23 10:42:05.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:05 np0005593233 nova_compute[222017]: 2026-01-23 10:42:05.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:05 np0005593233 nova_compute[222017]: 2026-01-23 10:42:05.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:05 np0005593233 nova_compute[222017]: 2026-01-23 10:42:05.414 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:42:05 np0005593233 nova_compute[222017]: 2026-01-23 10:42:05.415 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:42:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:05.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:42:05 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4013191920' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:42:05 np0005593233 nova_compute[222017]: 2026-01-23 10:42:05.902 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:42:05 np0005593233 nova_compute[222017]: 2026-01-23 10:42:05.943 222021 DEBUG nova.network.neutron [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updating instance_info_cache with network_info: [{"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:42:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:05.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.096 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.194 222021 DEBUG oslo_concurrency.lockutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Releasing lock "refresh_cache-f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.195 222021 DEBUG nova.objects.instance [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'migration_context' on Instance uuid f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.205 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.206 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.211 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.212 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.213 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.315 222021 DEBUG nova.storage.rbd_utils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] removing snapshot(nova-resize) on rbd image(f34f1af9-6c51-42ec-97f8-fb5bb146aeb6_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.528 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.529 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4045MB free_disk=20.805679321289062GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.529 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.530 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.582 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Migration for instance f34f1af9-6c51-42ec-97f8-fb5bb146aeb6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.617 222021 INFO nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Updating resource usage from migration adbf924a-ffa1-465a-bc4b-f2bc3ae2e761#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.618 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Starting to track outgoing migration adbf924a-ffa1-465a-bc4b-f2bc3ae2e761 with flavor 68d42077-c749-4366-ba3e-07758debb02d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.749 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0a916952-341a-4caf-bf6f-6abe504830f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.750 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Migration adbf924a-ffa1-465a-bc4b-f2bc3ae2e761 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.750 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.750 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:42:06 np0005593233 nova_compute[222017]: 2026-01-23 10:42:06.897 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:42:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:42:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1185299055' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.361 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.368 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.386 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.410 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:07.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e387 e387: 3 total, 3 up, 3 in
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.826 222021 DEBUG nova.virt.libvirt.vif [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:40:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='multiattach-server-1',id=198,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:41:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-1k48k1p2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type
='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:41:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=f34f1af9-6c51-42ec-97f8-fb5bb146aeb6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.827 222021 DEBUG nova.network.os_vif_util [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "3b1ac782-1188-42b9-a89f-eb26c7876140", "address": "fa:16:3e:ee:36:e0", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1ac782-11", "ovs_interfaceid": "3b1ac782-1188-42b9-a89f-eb26c7876140", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.828 222021 DEBUG nova.network.os_vif_util [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.829 222021 DEBUG os_vif [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.831 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.831 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b1ac782-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.832 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.834 222021 INFO os_vif [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:36:e0,bridge_name='br-int',has_traffic_filtering=True,id=3b1ac782-1188-42b9-a89f-eb26c7876140,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1ac782-11')#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.835 222021 DEBUG oslo_concurrency.lockutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:07 np0005593233 nova_compute[222017]: 2026-01-23 10:42:07.835 222021 DEBUG oslo_concurrency.lockutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:07.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:08 np0005593233 nova_compute[222017]: 2026-01-23 10:42:08.539 222021 DEBUG oslo_concurrency.processutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:42:08 np0005593233 nova_compute[222017]: 2026-01-23 10:42:08.690 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:42:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3955969414' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.038 222021 DEBUG oslo_concurrency.processutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.045 222021 DEBUG nova.compute.provider_tree [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.071 222021 DEBUG nova.scheduler.client.report [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.132 222021 DEBUG oslo_concurrency.lockutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 1.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.272 222021 INFO nova.scheduler.client.report [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Deleted allocation for migration adbf924a-ffa1-465a-bc4b-f2bc3ae2e761#033[00m
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.356 222021 DEBUG oslo_concurrency.lockutils [None req-0d7f493c-fd54-4747-b2a9-8e27d11f73cd 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "f34f1af9-6c51-42ec-97f8-fb5bb146aeb6" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.409 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.410 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.410 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:09 np0005593233 nova_compute[222017]: 2026-01-23 10:42:09.410 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:42:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:09.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e387 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:09.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:11 np0005593233 podman[299738]: 2026-01-23 10:42:11.086973081 +0000 UTC m=+0.088485365 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:42:11 np0005593233 nova_compute[222017]: 2026-01-23 10:42:11.101 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:11 np0005593233 nova_compute[222017]: 2026-01-23 10:42:11.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:11.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:11.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:12 np0005593233 nova_compute[222017]: 2026-01-23 10:42:12.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:13.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:13 np0005593233 nova_compute[222017]: 2026-01-23 10:42:13.692 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:14.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e387 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:15.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:16.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:16 np0005593233 nova_compute[222017]: 2026-01-23 10:42:16.153 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:42:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:17.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:42:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:18.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:42:18 np0005593233 nova_compute[222017]: 2026-01-23 10:42:18.694 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:42:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 e388: 3 total, 3 up, 3 in
Jan 23 05:42:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:19.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:20.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:20 np0005593233 nova_compute[222017]: 2026-01-23 10:42:20.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:42:20 np0005593233 nova_compute[222017]: 2026-01-23 10:42:20.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:42:20 np0005593233 nova_compute[222017]: 2026-01-23 10:42:20.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:42:20 np0005593233 nova_compute[222017]: 2026-01-23 10:42:20.422 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: f34f1af9-6c51-42ec-97f8-fb5bb146aeb6] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902
Jan 23 05:42:20 np0005593233 nova_compute[222017]: 2026-01-23 10:42:20.422 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 05:42:21 np0005593233 nova_compute[222017]: 2026-01-23 10:42:21.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:42:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:21.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:22.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:23.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:23 np0005593233 nova_compute[222017]: 2026-01-23 10:42:23.696 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:42:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:24.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:25.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:26.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:26 np0005593233 podman[299760]: 2026-01-23 10:42:26.092904487 +0000 UTC m=+0.101583488 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 05:42:26 np0005593233 nova_compute[222017]: 2026-01-23 10:42:26.159 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:42:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:27.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:28.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:28 np0005593233 nova_compute[222017]: 2026-01-23 10:42:28.698 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:42:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:29.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:30.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:31 np0005593233 nova_compute[222017]: 2026-01-23 10:42:31.162 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:42:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:31.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:32.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:33.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:33 np0005593233 nova_compute[222017]: 2026-01-23 10:42:33.700 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:42:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:34.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:34 np0005593233 nova_compute[222017]: 2026-01-23 10:42:34.288 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "b23e0eb3-82ec-4c33-aedd-b815e9513866" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:42:34 np0005593233 nova_compute[222017]: 2026-01-23 10:42:34.289 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:42:34 np0005593233 nova_compute[222017]: 2026-01-23 10:42:34.313 222021 DEBUG nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 05:42:34 np0005593233 nova_compute[222017]: 2026-01-23 10:42:34.425 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:42:34 np0005593233 nova_compute[222017]: 2026-01-23 10:42:34.426 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:42:34 np0005593233 nova_compute[222017]: 2026-01-23 10:42:34.437 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:42:34 np0005593233 nova_compute[222017]: 2026-01-23 10:42:34.437 222021 INFO nova.compute.claims [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Claim successful on node compute-1.ctlplane.example.com
Jan 23 05:42:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:34 np0005593233 nova_compute[222017]: 2026-01-23 10:42:34.874 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:42:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:42:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/316037187' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.431 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.557s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.441 222021 DEBUG nova.compute.provider_tree [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.458 222021 DEBUG nova.scheduler.client.report [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.480 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.481 222021 DEBUG nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:42:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:35.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.532 222021 DEBUG nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.533 222021 DEBUG nova.network.neutron [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.557 222021 INFO nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.580 222021 DEBUG nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:42:35 np0005593233 nova_compute[222017]: 2026-01-23 10:42:35.628 222021 INFO nova.virt.block_device [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Booting with volume d77d9325-542a-4716-94b2-e66e8ceab532 at /dev/vda
Jan 23 05:42:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:36.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.220 222021 DEBUG os_brick.utils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.221 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.222 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.240 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.241 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[0a19c0f4-3bea-4c16-a64a-371d7e5b5784]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.242 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.253 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.254 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c2390c93-f272-49b2-850d-61fa099355ed]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.255 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.271 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.271 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[c00f6262-cf29-4909-9b8a-f17909d0609a]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.273 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[b37a1e8b-18b2-4365-a9dc-b469b3b2b4ab]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.274 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.313 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "nvme version" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.316 222021 DEBUG os_brick.initiator.connectors.lightos [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.317 222021 DEBUG os_brick.initiator.connectors.lightos [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.317 222021 DEBUG os_brick.initiator.connectors.lightos [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.317 222021 DEBUG os_brick.utils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] <== get_connector_properties: return (97ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.318 222021 DEBUG nova.virt.block_device [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updating existing volume attachment record: dc799e24-6978-49e0-b450-6019c423c769 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.418 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:42:36 np0005593233 nova_compute[222017]: 2026-01-23 10:42:36.970 222021 DEBUG nova.policy [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb70c3aee8b64273a1930c0c2c231aff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd27c5465284b48a5818ef931d6251c43', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:42:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:42:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:37.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:42:37 np0005593233 nova_compute[222017]: 2026-01-23 10:42:37.579 222021 DEBUG nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:42:37 np0005593233 nova_compute[222017]: 2026-01-23 10:42:37.582 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:42:37 np0005593233 nova_compute[222017]: 2026-01-23 10:42:37.583 222021 INFO nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Creating image(s)#033[00m
Jan 23 05:42:37 np0005593233 nova_compute[222017]: 2026-01-23 10:42:37.584 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:42:37 np0005593233 nova_compute[222017]: 2026-01-23 10:42:37.585 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Ensure instance console log exists: /var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:42:37 np0005593233 nova_compute[222017]: 2026-01-23 10:42:37.585 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:37 np0005593233 nova_compute[222017]: 2026-01-23 10:42:37.586 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:37 np0005593233 nova_compute[222017]: 2026-01-23 10:42:37.587 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:42:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:38.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:42:38 np0005593233 nova_compute[222017]: 2026-01-23 10:42:38.170 222021 DEBUG nova.network.neutron [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Successfully created port: ec48fcb1-8f75-412d-a33c-7e0896158f9a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:42:38 np0005593233 nova_compute[222017]: 2026-01-23 10:42:38.701 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:39 np0005593233 nova_compute[222017]: 2026-01-23 10:42:39.084 222021 DEBUG nova.network.neutron [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Successfully updated port: ec48fcb1-8f75-412d-a33c-7e0896158f9a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:42:39 np0005593233 nova_compute[222017]: 2026-01-23 10:42:39.106 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:42:39 np0005593233 nova_compute[222017]: 2026-01-23 10:42:39.107 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquired lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:42:39 np0005593233 nova_compute[222017]: 2026-01-23 10:42:39.107 222021 DEBUG nova.network.neutron [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:42:39 np0005593233 nova_compute[222017]: 2026-01-23 10:42:39.239 222021 DEBUG nova.compute.manager [req-fee246f7-9d6f-43fe-8418-25b108b1d79a req-5122220c-d447-4c69-8fdf-d0608fa90b30 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received event network-changed-ec48fcb1-8f75-412d-a33c-7e0896158f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:42:39 np0005593233 nova_compute[222017]: 2026-01-23 10:42:39.240 222021 DEBUG nova.compute.manager [req-fee246f7-9d6f-43fe-8418-25b108b1d79a req-5122220c-d447-4c69-8fdf-d0608fa90b30 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Refreshing instance network info cache due to event network-changed-ec48fcb1-8f75-412d-a33c-7e0896158f9a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:42:39 np0005593233 nova_compute[222017]: 2026-01-23 10:42:39.240 222021 DEBUG oslo_concurrency.lockutils [req-fee246f7-9d6f-43fe-8418-25b108b1d79a req-5122220c-d447-4c69-8fdf-d0608fa90b30 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:42:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:39.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:39 np0005593233 nova_compute[222017]: 2026-01-23 10:42:39.984 222021 DEBUG nova.network.neutron [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:42:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:40.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.882 222021 DEBUG nova.network.neutron [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updating instance_info_cache with network_info: [{"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.904 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Releasing lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.905 222021 DEBUG nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Instance network_info: |[{"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.906 222021 DEBUG oslo_concurrency.lockutils [req-fee246f7-9d6f-43fe-8418-25b108b1d79a req-5122220c-d447-4c69-8fdf-d0608fa90b30 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.906 222021 DEBUG nova.network.neutron [req-fee246f7-9d6f-43fe-8418-25b108b1d79a req-5122220c-d447-4c69-8fdf-d0608fa90b30 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Refreshing network info cache for port ec48fcb1-8f75-412d-a33c-7e0896158f9a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.912 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Start _get_guest_xml network_info=[{"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-d77d9325-542a-4716-94b2-e66e8ceab532', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'd77d9325-542a-4716-94b2-e66e8ceab532', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'b23e0eb3-82ec-4c33-aedd-b815e9513866', 'attached_at': '', 'detached_at': '', 'volume_id': 'd77d9325-542a-4716-94b2-e66e8ceab532', 'serial': 'd77d9325-542a-4716-94b2-e66e8ceab532'}, 'delete_on_termination': True, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'dc799e24-6978-49e0-b450-6019c423c769', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.919 222021 WARNING nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.925 222021 DEBUG nova.virt.libvirt.host [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.926 222021 DEBUG nova.virt.libvirt.host [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.938 222021 DEBUG nova.virt.libvirt.host [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.939 222021 DEBUG nova.virt.libvirt.host [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.941 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.942 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.943 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.944 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.944 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.945 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.945 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.946 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.947 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.948 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.948 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.949 222021 DEBUG nova.virt.hardware [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:42:40 np0005593233 nova_compute[222017]: 2026-01-23 10:42:40.994 222021 DEBUG nova.storage.rbd_utils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image b23e0eb3-82ec-4c33-aedd-b815e9513866_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.000 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.222 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:42:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3167533987' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:42:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:41.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.770 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.771s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.801 222021 DEBUG nova.virt.libvirt.vif [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:42:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1926199215',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1926199215',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1926199215',id=201,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGZxNVxO0IvqUKejsONm/M7JuNET1Clz+bIx75ZwPGtswiNNpJd3BcCEmXn9C+CF23N06TGmkRLx9ZMUWkiPaF8xgmBrvIR54FA+yZMRLTRlCEQbyzx6123Om6WuFPlz2w==',key_name='tempest-keypair-962541773',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-73n2hjou',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:42:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=b23e0eb3-82ec-4c33-aedd-b815e9513866,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.802 222021 DEBUG nova.network.os_vif_util [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.803 222021 DEBUG nova.network.os_vif_util [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:d5:ea,bridge_name='br-int',has_traffic_filtering=True,id=ec48fcb1-8f75-412d-a33c-7e0896158f9a,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec48fcb1-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.805 222021 DEBUG nova.objects.instance [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'pci_devices' on Instance uuid b23e0eb3-82ec-4c33-aedd-b815e9513866 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.825 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <uuid>b23e0eb3-82ec-4c33-aedd-b815e9513866</uuid>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <name>instance-000000c9</name>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestVolumeBootPattern-volume-backed-server-1926199215</nova:name>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:42:40</nova:creationTime>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <nova:user uuid="eb70c3aee8b64273a1930c0c2c231aff">tempest-TestVolumeBootPattern-2139361132-project-member</nova:user>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <nova:project uuid="d27c5465284b48a5818ef931d6251c43">tempest-TestVolumeBootPattern-2139361132</nova:project>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <nova:port uuid="ec48fcb1-8f75-412d-a33c-7e0896158f9a">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <entry name="serial">b23e0eb3-82ec-4c33-aedd-b815e9513866</entry>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <entry name="uuid">b23e0eb3-82ec-4c33-aedd-b815e9513866</entry>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/b23e0eb3-82ec-4c33-aedd-b815e9513866_disk.config">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-d77d9325-542a-4716-94b2-e66e8ceab532">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <serial>d77d9325-542a-4716-94b2-e66e8ceab532</serial>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:17:d5:ea"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <target dev="tapec48fcb1-8f"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866/console.log" append="off"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:42:41 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:42:41 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:42:41 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:42:41 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.827 222021 DEBUG nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Preparing to wait for external event network-vif-plugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.828 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.829 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.830 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.831 222021 DEBUG nova.virt.libvirt.vif [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:42:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1926199215',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1926199215',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1926199215',id=201,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGZxNVxO0IvqUKejsONm/M7JuNET1Clz+bIx75ZwPGtswiNNpJd3BcCEmXn9C+CF23N06TGmkRLx9ZMUWkiPaF8xgmBrvIR54FA+yZMRLTRlCEQbyzx6123Om6WuFPlz2w==',key_name='tempest-keypair-962541773',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-73n2hjou',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:42:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=b23e0eb3-82ec-4c33-aedd-b815e9513866,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.831 222021 DEBUG nova.network.os_vif_util [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.833 222021 DEBUG nova.network.os_vif_util [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:d5:ea,bridge_name='br-int',has_traffic_filtering=True,id=ec48fcb1-8f75-412d-a33c-7e0896158f9a,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec48fcb1-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.833 222021 DEBUG os_vif [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:d5:ea,bridge_name='br-int',has_traffic_filtering=True,id=ec48fcb1-8f75-412d-a33c-7e0896158f9a,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec48fcb1-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.834 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.835 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.836 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.841 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.841 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec48fcb1-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.842 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec48fcb1-8f, col_values=(('external_ids', {'iface-id': 'ec48fcb1-8f75-412d-a33c-7e0896158f9a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:d5:ea', 'vm-uuid': 'b23e0eb3-82ec-4c33-aedd-b815e9513866'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.869 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:41 np0005593233 NetworkManager[48871]: <info>  [1769164961.8704] manager: (tapec48fcb1-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/380)
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.873 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.878 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:41 np0005593233 nova_compute[222017]: 2026-01-23 10:42:41.879 222021 INFO os_vif [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:d5:ea,bridge_name='br-int',has_traffic_filtering=True,id=ec48fcb1-8f75-412d-a33c-7e0896158f9a,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec48fcb1-8f')#033[00m
Jan 23 05:42:42 np0005593233 podman[299862]: 2026-01-23 10:42:42.055558242 +0000 UTC m=+0.057899506 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:42:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:42.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:42 np0005593233 nova_compute[222017]: 2026-01-23 10:42:42.080 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:42:42 np0005593233 nova_compute[222017]: 2026-01-23 10:42:42.080 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:42:42 np0005593233 nova_compute[222017]: 2026-01-23 10:42:42.081 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No VIF found with MAC fa:16:3e:17:d5:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:42:42 np0005593233 nova_compute[222017]: 2026-01-23 10:42:42.081 222021 INFO nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Using config drive#033[00m
Jan 23 05:42:42 np0005593233 nova_compute[222017]: 2026-01-23 10:42:42.280 222021 DEBUG nova.storage.rbd_utils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image b23e0eb3-82ec-4c33-aedd-b815e9513866_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:42:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:42.704 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:42.705 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:42.706 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:43 np0005593233 nova_compute[222017]: 2026-01-23 10:42:43.324 222021 INFO nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Creating config drive at /var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866/disk.config#033[00m
Jan 23 05:42:43 np0005593233 nova_compute[222017]: 2026-01-23 10:42:43.330 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppvm1dpd4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:42:43 np0005593233 nova_compute[222017]: 2026-01-23 10:42:43.379 222021 DEBUG nova.network.neutron [req-fee246f7-9d6f-43fe-8418-25b108b1d79a req-5122220c-d447-4c69-8fdf-d0608fa90b30 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updated VIF entry in instance network info cache for port ec48fcb1-8f75-412d-a33c-7e0896158f9a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:42:43 np0005593233 nova_compute[222017]: 2026-01-23 10:42:43.380 222021 DEBUG nova.network.neutron [req-fee246f7-9d6f-43fe-8418-25b108b1d79a req-5122220c-d447-4c69-8fdf-d0608fa90b30 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updating instance_info_cache with network_info: [{"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:42:43 np0005593233 nova_compute[222017]: 2026-01-23 10:42:43.422 222021 DEBUG oslo_concurrency.lockutils [req-fee246f7-9d6f-43fe-8418-25b108b1d79a req-5122220c-d447-4c69-8fdf-d0608fa90b30 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:42:43 np0005593233 nova_compute[222017]: 2026-01-23 10:42:43.488 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppvm1dpd4" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:42:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:43.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:43 np0005593233 nova_compute[222017]: 2026-01-23 10:42:43.938 222021 DEBUG nova.storage.rbd_utils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image b23e0eb3-82ec-4c33-aedd-b815e9513866_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:42:43 np0005593233 nova_compute[222017]: 2026-01-23 10:42:43.944 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866/disk.config b23e0eb3-82ec-4c33-aedd-b815e9513866_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:42:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:42:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:44.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:42:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:42:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:45.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:42:45 np0005593233 nova_compute[222017]: 2026-01-23 10:42:45.835 222021 DEBUG oslo_concurrency.processutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866/disk.config b23e0eb3-82ec-4c33-aedd-b815e9513866_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.891s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:42:45 np0005593233 nova_compute[222017]: 2026-01-23 10:42:45.836 222021 INFO nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Deleting local config drive /var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866/disk.config because it was imported into RBD.#033[00m
Jan 23 05:42:45 np0005593233 virtqemud[221325]: End of file while reading data: Input/output error
Jan 23 05:42:45 np0005593233 kernel: tapec48fcb1-8f: entered promiscuous mode
Jan 23 05:42:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:42:45Z|00838|binding|INFO|Claiming lport ec48fcb1-8f75-412d-a33c-7e0896158f9a for this chassis.
Jan 23 05:42:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:42:45Z|00839|binding|INFO|ec48fcb1-8f75-412d-a33c-7e0896158f9a: Claiming fa:16:3e:17:d5:ea 10.100.0.4
Jan 23 05:42:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:42:45Z|00840|binding|INFO|Setting lport ec48fcb1-8f75-412d-a33c-7e0896158f9a ovn-installed in OVS
Jan 23 05:42:45 np0005593233 nova_compute[222017]: 2026-01-23 10:42:45.942 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:45 np0005593233 NetworkManager[48871]: <info>  [1769164965.9442] manager: (tapec48fcb1-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/381)
Jan 23 05:42:45 np0005593233 nova_compute[222017]: 2026-01-23 10:42:45.946 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:45 np0005593233 systemd-udevd[299951]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:42:45 np0005593233 systemd-machined[190954]: New machine qemu-91-instance-000000c9.
Jan 23 05:42:45 np0005593233 NetworkManager[48871]: <info>  [1769164965.9862] device (tapec48fcb1-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:42:45 np0005593233 NetworkManager[48871]: <info>  [1769164965.9871] device (tapec48fcb1-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:42:45 np0005593233 systemd[1]: Started Virtual Machine qemu-91-instance-000000c9.
Jan 23 05:42:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:42:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:46.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:42:46 np0005593233 ovn_controller[130653]: 2026-01-23T10:42:46Z|00841|binding|INFO|Setting lport ec48fcb1-8f75-412d-a33c-7e0896158f9a up in Southbound
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.153 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:d5:ea 10.100.0.4'], port_security=['fa:16:3e:17:d5:ea 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b23e0eb3-82ec-4c33-aedd-b815e9513866', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '2', 'neutron:security_group_ids': '491382a0-febf-49cb-a75d-e59f1bfedc5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=ec48fcb1-8f75-412d-a33c-7e0896158f9a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.155 140224 INFO neutron.agent.ovn.metadata.agent [-] Port ec48fcb1-8f75-412d-a33c-7e0896158f9a in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 bound to our chassis#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.156 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 72854481-c2f9-4651-8ba1-fe321a8a5546#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.174 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[70bd5477-338f-4f50-9ad8-cd9b5f23f348]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.175 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap72854481-c1 in ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.178 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap72854481-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.178 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[97305c16-16cd-41b2-9304-1ca9910c5082]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.179 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[478a1a05-27ad-42b5-9af6-f1bc916f5544]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.197 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[75519953-636d-4de7-a5f7-ad99a3753949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 nova_compute[222017]: 2026-01-23 10:42:46.223 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.220 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[65a5453c-91aa-4b71-9563-41d79e215c69]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.262 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[3afe9e8f-da59-428b-9ea3-e2ce4cefe91e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.268 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[defee647-11c5-460d-af66-c1981002aca6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 systemd-udevd[299954]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:42:46 np0005593233 NetworkManager[48871]: <info>  [1769164966.2707] manager: (tap72854481-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/382)
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.317 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[64040616-ce90-4c8b-8f9b-0716ed55585e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.323 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a439afe4-e2cf-426c-afb5-07e2908438ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 NetworkManager[48871]: <info>  [1769164966.3567] device (tap72854481-c0): carrier: link connected
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.364 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e70d7e14-9fb1-4eda-9ae3-08fb75cd6603]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.539 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d0486abd-0649-4fd9-874c-cc8b7e4507d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 251], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 887898, 'reachable_time': 37361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299985, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.564 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e81cbc-7048-45e9-805a-89dc91770b43]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe81:b660'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 887898, 'tstamp': 887898}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299986, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.588 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f37d79e9-9d9b-45e8-843e-663e0bf5993c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 251], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 887898, 'reachable_time': 37361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 299987, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.647 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ceaaab9c-3790-4486-9d30-12b84a05dd9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.752 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c5289d3a-635a-4e10-9d1d-dea4898af607]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.754 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.755 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.756 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72854481-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:46 np0005593233 nova_compute[222017]: 2026-01-23 10:42:46.759 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:46 np0005593233 kernel: tap72854481-c0: entered promiscuous mode
Jan 23 05:42:46 np0005593233 NetworkManager[48871]: <info>  [1769164966.7606] manager: (tap72854481-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/383)
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.765 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap72854481-c0, col_values=(('external_ids', {'iface-id': '6b08537e-a263-4eec-b987-1e42878f483a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:46 np0005593233 ovn_controller[130653]: 2026-01-23T10:42:46Z|00842|binding|INFO|Releasing lport 6b08537e-a263-4eec-b987-1e42878f483a from this chassis (sb_readonly=0)
Jan 23 05:42:46 np0005593233 nova_compute[222017]: 2026-01-23 10:42:46.797 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.798 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.800 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6d631b27-0e22-4023-a2f2-f1041419a8bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.802 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-72854481-c2f9-4651-8ba1-fe321a8a5546
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 72854481-c2f9-4651-8ba1-fe321a8a5546
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:42:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:46.803 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'env', 'PROCESS_TAG=haproxy-72854481-c2f9-4651-8ba1-fe321a8a5546', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/72854481-c2f9-4651-8ba1-fe321a8a5546.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:42:46 np0005593233 nova_compute[222017]: 2026-01-23 10:42:46.848 222021 DEBUG nova.compute.manager [req-65b9ca7f-edc9-49dc-bf73-c8d5ab795cf1 req-82957407-58be-4a53-8566-297c51f9abe9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received event network-vif-plugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:42:46 np0005593233 nova_compute[222017]: 2026-01-23 10:42:46.850 222021 DEBUG oslo_concurrency.lockutils [req-65b9ca7f-edc9-49dc-bf73-c8d5ab795cf1 req-82957407-58be-4a53-8566-297c51f9abe9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:46 np0005593233 nova_compute[222017]: 2026-01-23 10:42:46.851 222021 DEBUG oslo_concurrency.lockutils [req-65b9ca7f-edc9-49dc-bf73-c8d5ab795cf1 req-82957407-58be-4a53-8566-297c51f9abe9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:46 np0005593233 nova_compute[222017]: 2026-01-23 10:42:46.851 222021 DEBUG oslo_concurrency.lockutils [req-65b9ca7f-edc9-49dc-bf73-c8d5ab795cf1 req-82957407-58be-4a53-8566-297c51f9abe9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:46 np0005593233 nova_compute[222017]: 2026-01-23 10:42:46.852 222021 DEBUG nova.compute.manager [req-65b9ca7f-edc9-49dc-bf73-c8d5ab795cf1 req-82957407-58be-4a53-8566-297c51f9abe9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Processing event network-vif-plugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:42:46 np0005593233 nova_compute[222017]: 2026-01-23 10:42:46.892 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:47 np0005593233 nova_compute[222017]: 2026-01-23 10:42:47.075 222021 DEBUG nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:42:47 np0005593233 nova_compute[222017]: 2026-01-23 10:42:47.078 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164967.0772603, b23e0eb3-82ec-4c33-aedd-b815e9513866 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:42:47 np0005593233 nova_compute[222017]: 2026-01-23 10:42:47.078 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] VM Started (Lifecycle Event)#033[00m
Jan 23 05:42:47 np0005593233 nova_compute[222017]: 2026-01-23 10:42:47.088 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:42:47 np0005593233 nova_compute[222017]: 2026-01-23 10:42:47.093 222021 INFO nova.virt.libvirt.driver [-] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Instance spawned successfully.#033[00m
Jan 23 05:42:47 np0005593233 nova_compute[222017]: 2026-01-23 10:42:47.093 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:42:47 np0005593233 podman[300061]: 2026-01-23 10:42:47.400425458 +0000 UTC m=+0.042813508 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:42:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:47.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:47 np0005593233 podman[300061]: 2026-01-23 10:42:47.626280605 +0000 UTC m=+0.268668605 container create 90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:42:47 np0005593233 systemd[1]: Started libpod-conmon-90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a.scope.
Jan 23 05:42:47 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:42:47 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e3937f6583eefbafeef3cb98d49809559af97f43d9cddb55f9a719caa4a7594/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:42:47 np0005593233 podman[300061]: 2026-01-23 10:42:47.741032926 +0000 UTC m=+0.383420926 container init 90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:42:47 np0005593233 podman[300061]: 2026-01-23 10:42:47.748989462 +0000 UTC m=+0.391377462 container start 90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:42:47 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[300076]: [NOTICE]   (300080) : New worker (300082) forked
Jan 23 05:42:47 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[300076]: [NOTICE]   (300080) : Loading success.
Jan 23 05:42:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:48.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:49.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:50.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:51 np0005593233 nova_compute[222017]: 2026-01-23 10:42:51.226 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:51.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:51 np0005593233 nova_compute[222017]: 2026-01-23 10:42:51.894 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:52.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:53.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:54.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:42:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:55.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:42:55 np0005593233 nova_compute[222017]: 2026-01-23 10:42:55.889 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:42:55 np0005593233 nova_compute[222017]: 2026-01-23 10:42:55.894 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:42:55 np0005593233 nova_compute[222017]: 2026-01-23 10:42:55.895 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:42:55 np0005593233 nova_compute[222017]: 2026-01-23 10:42:55.895 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:42:55 np0005593233 nova_compute[222017]: 2026-01-23 10:42:55.896 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:42:55 np0005593233 nova_compute[222017]: 2026-01-23 10:42:55.896 222021 DEBUG nova.virt.libvirt.driver [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:42:55 np0005593233 nova_compute[222017]: 2026-01-23 10:42:55.901 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:42:55 np0005593233 nova_compute[222017]: 2026-01-23 10:42:55.907 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:42:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:42:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:56.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:42:56 np0005593233 nova_compute[222017]: 2026-01-23 10:42:56.234 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:56 np0005593233 nova_compute[222017]: 2026-01-23 10:42:56.896 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:42:56 np0005593233 nova_compute[222017]: 2026-01-23 10:42:56.898 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164967.077422, b23e0eb3-82ec-4c33-aedd-b815e9513866 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:42:56 np0005593233 nova_compute[222017]: 2026-01-23 10:42:56.899 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:42:56 np0005593233 nova_compute[222017]: 2026-01-23 10:42:56.901 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:56 np0005593233 nova_compute[222017]: 2026-01-23 10:42:56.947 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:42:56 np0005593233 nova_compute[222017]: 2026-01-23 10:42:56.953 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769164967.088232, b23e0eb3-82ec-4c33-aedd-b815e9513866 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:42:56 np0005593233 nova_compute[222017]: 2026-01-23 10:42:56.954 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.003 222021 INFO nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Took 19.42 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.005 222021 DEBUG nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.014 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.019 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.066 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.092 222021 INFO nova.compute.manager [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Took 22.70 seconds to build instance.#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.102 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:57.104 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:42:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:42:57.105 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.115 222021 DEBUG oslo_concurrency.lockutils [None req-4850a6b0-3ddb-40af-b742-8d5b023c288b eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:57 np0005593233 podman[300093]: 2026-01-23 10:42:57.150278264 +0000 UTC m=+0.153969266 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.417 222021 DEBUG nova.compute.manager [req-6ce9295b-325b-4c49-bdef-f5ef3c971b9f req-2a26a73e-1406-4f78-8fef-eea4fb8821f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received event network-vif-plugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.418 222021 DEBUG oslo_concurrency.lockutils [req-6ce9295b-325b-4c49-bdef-f5ef3c971b9f req-2a26a73e-1406-4f78-8fef-eea4fb8821f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.419 222021 DEBUG oslo_concurrency.lockutils [req-6ce9295b-325b-4c49-bdef-f5ef3c971b9f req-2a26a73e-1406-4f78-8fef-eea4fb8821f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.419 222021 DEBUG oslo_concurrency.lockutils [req-6ce9295b-325b-4c49-bdef-f5ef3c971b9f req-2a26a73e-1406-4f78-8fef-eea4fb8821f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.419 222021 DEBUG nova.compute.manager [req-6ce9295b-325b-4c49-bdef-f5ef3c971b9f req-2a26a73e-1406-4f78-8fef-eea4fb8821f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] No waiting events found dispatching network-vif-plugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:42:57 np0005593233 nova_compute[222017]: 2026-01-23 10:42:57.420 222021 WARNING nova.compute.manager [req-6ce9295b-325b-4c49-bdef-f5ef3c971b9f req-2a26a73e-1406-4f78-8fef-eea4fb8821f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received unexpected event network-vif-plugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a for instance with vm_state active and task_state None.#033[00m
Jan 23 05:42:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:42:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:57.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:42:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:42:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:58.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:42:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:42:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:59.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:00.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:01 np0005593233 nova_compute[222017]: 2026-01-23 10:43:01.237 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:43:01Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:d5:ea 10.100.0.4
Jan 23 05:43:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:43:01Z|00115|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:d5:ea 10.100.0.4
Jan 23 05:43:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:01.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:01 np0005593233 nova_compute[222017]: 2026-01-23 10:43:01.926 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:02.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #172. Immutable memtables: 0.
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:02.853385) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 109] Flushing memtable with next log file: 172
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164982853488, "job": 109, "event": "flush_started", "num_memtables": 1, "num_entries": 1639, "num_deletes": 254, "total_data_size": 3661508, "memory_usage": 3715744, "flush_reason": "Manual Compaction"}
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 109] Level-0 flush table #173: started
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164982879686, "cf_name": "default", "job": 109, "event": "table_file_creation", "file_number": 173, "file_size": 2406749, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 81965, "largest_seqno": 83599, "table_properties": {"data_size": 2399853, "index_size": 3966, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 15021, "raw_average_key_size": 20, "raw_value_size": 2385849, "raw_average_value_size": 3246, "num_data_blocks": 174, "num_entries": 735, "num_filter_entries": 735, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164849, "oldest_key_time": 1769164849, "file_creation_time": 1769164982, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 173, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 109] Flush lasted 26357 microseconds, and 7792 cpu microseconds.
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:02.879757) [db/flush_job.cc:967] [default] [JOB 109] Level-0 flush table #173: 2406749 bytes OK
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:02.879794) [db/memtable_list.cc:519] [default] Level-0 commit table #173 started
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:02.885166) [db/memtable_list.cc:722] [default] Level-0 commit table #173: memtable #1 done
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:02.885185) EVENT_LOG_v1 {"time_micros": 1769164982885178, "job": 109, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:02.885211) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 109] Try to delete WAL files size 3653861, prev total WAL file size 3674916, number of live WAL files 2.
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000169.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:02.887545) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037323739' seq:72057594037927935, type:22 .. '7061786F730037353331' seq:0, type:0; will stop at (end)
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 110] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 109 Base level 0, inputs: [173(2350KB)], [171(10MB)]
Jan 23 05:43:02 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164982887608, "job": 110, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [173], "files_L6": [171], "score": -1, "input_data_size": 13333890, "oldest_snapshot_seqno": -1}
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 110] Generated table #174: 10176 keys, 11379681 bytes, temperature: kUnknown
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164983110162, "cf_name": "default", "job": 110, "event": "table_file_creation", "file_number": 174, "file_size": 11379681, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11316580, "index_size": 36556, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25477, "raw_key_size": 268700, "raw_average_key_size": 26, "raw_value_size": 11141149, "raw_average_value_size": 1094, "num_data_blocks": 1383, "num_entries": 10176, "num_filter_entries": 10176, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769164982, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 174, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:43:03 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:03.111 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:03.110687) [db/compaction/compaction_job.cc:1663] [default] [JOB 110] Compacted 1@0 + 1@6 files to L6 => 11379681 bytes
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:03.112337) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 59.9 rd, 51.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 10.4 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(10.3) write-amplify(4.7) OK, records in: 10701, records dropped: 525 output_compression: NoCompression
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:03.112360) EVENT_LOG_v1 {"time_micros": 1769164983112347, "job": 110, "event": "compaction_finished", "compaction_time_micros": 222730, "compaction_time_cpu_micros": 32543, "output_level": 6, "num_output_files": 1, "total_output_size": 11379681, "num_input_records": 10701, "num_output_records": 10176, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000173.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164983113130, "job": 110, "event": "table_file_deletion", "file_number": 173}
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000171.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164983115764, "job": 110, "event": "table_file_deletion", "file_number": 171}
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:02.887457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:03.115929) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:03.115938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:03.115940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:03.115942) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:43:03.115950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:03.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:43:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:04.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:05 np0005593233 nova_compute[222017]: 2026-01-23 10:43:05.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:05 np0005593233 nova_compute[222017]: 2026-01-23 10:43:05.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:05.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:43:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:06.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:43:06 np0005593233 nova_compute[222017]: 2026-01-23 10:43:06.239 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:06 np0005593233 nova_compute[222017]: 2026-01-23 10:43:06.749 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:06 np0005593233 nova_compute[222017]: 2026-01-23 10:43:06.750 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:06 np0005593233 nova_compute[222017]: 2026-01-23 10:43:06.751 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:06 np0005593233 nova_compute[222017]: 2026-01-23 10:43:06.751 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:43:06 np0005593233 nova_compute[222017]: 2026-01-23 10:43:06.752 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:06 np0005593233 nova_compute[222017]: 2026-01-23 10:43:06.928 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.096 222021 DEBUG nova.compute.manager [req-77029224-9eb5-47c2-aa31-788cf1dfa0f2 req-9d118b1c-d7de-46ac-8089-8aa0a25d55c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received event network-changed-ec48fcb1-8f75-412d-a33c-7e0896158f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.096 222021 DEBUG nova.compute.manager [req-77029224-9eb5-47c2-aa31-788cf1dfa0f2 req-9d118b1c-d7de-46ac-8089-8aa0a25d55c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Refreshing instance network info cache due to event network-changed-ec48fcb1-8f75-412d-a33c-7e0896158f9a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.097 222021 DEBUG oslo_concurrency.lockutils [req-77029224-9eb5-47c2-aa31-788cf1dfa0f2 req-9d118b1c-d7de-46ac-8089-8aa0a25d55c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.097 222021 DEBUG oslo_concurrency.lockutils [req-77029224-9eb5-47c2-aa31-788cf1dfa0f2 req-9d118b1c-d7de-46ac-8089-8aa0a25d55c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.097 222021 DEBUG nova.network.neutron [req-77029224-9eb5-47c2-aa31-788cf1dfa0f2 req-9d118b1c-d7de-46ac-8089-8aa0a25d55c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Refreshing network info cache for port ec48fcb1-8f75-412d-a33c-7e0896158f9a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:43:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:43:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2917644732' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.308 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.557s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.438 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.439 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.444 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.444 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:43:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:43:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:07.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.721 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.722 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3821MB free_disk=20.805469512939453GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.723 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.723 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.845 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0a916952-341a-4caf-bf6f-6abe504830f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.845 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance b23e0eb3-82ec-4c33-aedd-b815e9513866 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.846 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.847 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:43:07 np0005593233 nova_compute[222017]: 2026-01-23 10:43:07.945 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:43:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:08.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:43:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:43:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4123812551' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:43:08 np0005593233 nova_compute[222017]: 2026-01-23 10:43:08.458 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:08 np0005593233 nova_compute[222017]: 2026-01-23 10:43:08.464 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:43:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:43:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:09.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:43:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:10.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:11 np0005593233 nova_compute[222017]: 2026-01-23 10:43:11.242 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:11 np0005593233 nova_compute[222017]: 2026-01-23 10:43:11.412 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:43:11 np0005593233 nova_compute[222017]: 2026-01-23 10:43:11.494 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:43:11 np0005593233 nova_compute[222017]: 2026-01-23 10:43:11.494 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:11.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:11 np0005593233 nova_compute[222017]: 2026-01-23 10:43:11.930 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:12.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:13 np0005593233 podman[300348]: 2026-01-23 10:43:13.068480993 +0000 UTC m=+0.067481858 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:43:13 np0005593233 nova_compute[222017]: 2026-01-23 10:43:13.495 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:13 np0005593233 nova_compute[222017]: 2026-01-23 10:43:13.496 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:13 np0005593233 nova_compute[222017]: 2026-01-23 10:43:13.497 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:13 np0005593233 nova_compute[222017]: 2026-01-23 10:43:13.497 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:13 np0005593233 nova_compute[222017]: 2026-01-23 10:43:13.497 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:43:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:13.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:14.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 23 05:43:14 np0005593233 nova_compute[222017]: 2026-01-23 10:43:14.391 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 05:43:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:15.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:16.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e389 e389: 3 total, 3 up, 3 in
Jan 23 05:43:16 np0005593233 nova_compute[222017]: 2026-01-23 10:43:16.245 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:16 np0005593233 nova_compute[222017]: 2026-01-23 10:43:16.523 222021 DEBUG nova.network.neutron [req-77029224-9eb5-47c2-aa31-788cf1dfa0f2 req-9d118b1c-d7de-46ac-8089-8aa0a25d55c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updated VIF entry in instance network info cache for port ec48fcb1-8f75-412d-a33c-7e0896158f9a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:43:16 np0005593233 nova_compute[222017]: 2026-01-23 10:43:16.523 222021 DEBUG nova.network.neutron [req-77029224-9eb5-47c2-aa31-788cf1dfa0f2 req-9d118b1c-d7de-46ac-8089-8aa0a25d55c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updating instance_info_cache with network_info: [{"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:43:16 np0005593233 nova_compute[222017]: 2026-01-23 10:43:16.954 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:17 np0005593233 nova_compute[222017]: 2026-01-23 10:43:17.085 222021 DEBUG oslo_concurrency.lockutils [req-77029224-9eb5-47c2-aa31-788cf1dfa0f2 req-9d118b1c-d7de-46ac-8089-8aa0a25d55c5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:43:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:17.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:43:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1909575877' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:43:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:18.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:19.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:20.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e389 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:20 np0005593233 nova_compute[222017]: 2026-01-23 10:43:20.383 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e390 e390: 3 total, 3 up, 3 in
Jan 23 05:43:21 np0005593233 nova_compute[222017]: 2026-01-23 10:43:21.287 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:21 np0005593233 nova_compute[222017]: 2026-01-23 10:43:21.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:21 np0005593233 nova_compute[222017]: 2026-01-23 10:43:21.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:43:21 np0005593233 nova_compute[222017]: 2026-01-23 10:43:21.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:43:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:21.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:21 np0005593233 nova_compute[222017]: 2026-01-23 10:43:21.956 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:22.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:23.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:24.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:24 np0005593233 nova_compute[222017]: 2026-01-23 10:43:24.619 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:43:24 np0005593233 nova_compute[222017]: 2026-01-23 10:43:24.619 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:43:24 np0005593233 nova_compute[222017]: 2026-01-23 10:43:24.620 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:43:24 np0005593233 nova_compute[222017]: 2026-01-23 10:43:24.620 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0a916952-341a-4caf-bf6f-6abe504830f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:43:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:25.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:26.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:26 np0005593233 nova_compute[222017]: 2026-01-23 10:43:26.289 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:26 np0005593233 nova_compute[222017]: 2026-01-23 10:43:26.958 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:27 np0005593233 nova_compute[222017]: 2026-01-23 10:43:27.563 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating instance_info_cache with network_info: [{"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:43:27 np0005593233 nova_compute[222017]: 2026-01-23 10:43:27.587 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-0a916952-341a-4caf-bf6f-6abe504830f9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:43:27 np0005593233 nova_compute[222017]: 2026-01-23 10:43:27.587 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:43:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:27.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:28 np0005593233 podman[300371]: 2026-01-23 10:43:28.089134293 +0000 UTC m=+0.098552113 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:43:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:28.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:29.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:30.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:31 np0005593233 nova_compute[222017]: 2026-01-23 10:43:31.292 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:31.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:31 np0005593233 nova_compute[222017]: 2026-01-23 10:43:31.960 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:32.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:33.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:34.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:34 np0005593233 nova_compute[222017]: 2026-01-23 10:43:34.334 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:34 np0005593233 nova_compute[222017]: 2026-01-23 10:43:34.334 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:34 np0005593233 nova_compute[222017]: 2026-01-23 10:43:34.360 222021 DEBUG nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:43:34 np0005593233 nova_compute[222017]: 2026-01-23 10:43:34.493 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:34 np0005593233 nova_compute[222017]: 2026-01-23 10:43:34.493 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:34 np0005593233 nova_compute[222017]: 2026-01-23 10:43:34.520 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:43:34 np0005593233 nova_compute[222017]: 2026-01-23 10:43:34.520 222021 INFO nova.compute.claims [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:43:34 np0005593233 nova_compute[222017]: 2026-01-23 10:43:34.782 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:43:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2288853674' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.276 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.286 222021 DEBUG nova.compute.provider_tree [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.340 222021 DEBUG nova.scheduler.client.report [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.403 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.404 222021 DEBUG nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.512 222021 INFO nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.517 222021 DEBUG nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.519 222021 DEBUG nova.network.neutron [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:43:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:35.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.830 222021 DEBUG nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:43:35 np0005593233 nova_compute[222017]: 2026-01-23 10:43:35.991 222021 INFO nova.virt.block_device [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Booting with volume snapshot aad073c6-3850-4a37-969d-b73d0ec7219a at /dev/vda#033[00m
Jan 23 05:43:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:43:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:36.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:43:36 np0005593233 nova_compute[222017]: 2026-01-23 10:43:36.294 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:36 np0005593233 nova_compute[222017]: 2026-01-23 10:43:36.963 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:37 np0005593233 nova_compute[222017]: 2026-01-23 10:43:37.039 222021 DEBUG nova.policy [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb70c3aee8b64273a1930c0c2c231aff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd27c5465284b48a5818ef931d6251c43', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:43:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e391 e391: 3 total, 3 up, 3 in
Jan 23 05:43:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:37.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:38 np0005593233 nova_compute[222017]: 2026-01-23 10:43:38.043 222021 DEBUG nova.network.neutron [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Successfully created port: c51d2544-11d1-4cec-9c25-5547325dcf9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:43:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:38.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e392 e392: 3 total, 3 up, 3 in
Jan 23 05:43:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:39.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:40.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:40 np0005593233 nova_compute[222017]: 2026-01-23 10:43:40.362 222021 DEBUG nova.network.neutron [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Successfully updated port: c51d2544-11d1-4cec-9c25-5547325dcf9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:43:40 np0005593233 nova_compute[222017]: 2026-01-23 10:43:40.398 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "refresh_cache-c23f23ba-4d94-44c7-9460-bcce38d2bd70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:43:40 np0005593233 nova_compute[222017]: 2026-01-23 10:43:40.399 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquired lock "refresh_cache-c23f23ba-4d94-44c7-9460-bcce38d2bd70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:43:40 np0005593233 nova_compute[222017]: 2026-01-23 10:43:40.399 222021 DEBUG nova.network.neutron [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:43:40 np0005593233 nova_compute[222017]: 2026-01-23 10:43:40.542 222021 DEBUG nova.compute.manager [req-04bd2f6d-c832-49a6-9938-f2ba443c969d req-d5d58c00-6fac-48fc-bf71-69fa03978017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received event network-changed-c51d2544-11d1-4cec-9c25-5547325dcf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:43:40 np0005593233 nova_compute[222017]: 2026-01-23 10:43:40.543 222021 DEBUG nova.compute.manager [req-04bd2f6d-c832-49a6-9938-f2ba443c969d req-d5d58c00-6fac-48fc-bf71-69fa03978017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Refreshing instance network info cache due to event network-changed-c51d2544-11d1-4cec-9c25-5547325dcf9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:43:40 np0005593233 nova_compute[222017]: 2026-01-23 10:43:40.543 222021 DEBUG oslo_concurrency.lockutils [req-04bd2f6d-c832-49a6-9938-f2ba443c969d req-d5d58c00-6fac-48fc-bf71-69fa03978017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c23f23ba-4d94-44c7-9460-bcce38d2bd70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:43:40 np0005593233 nova_compute[222017]: 2026-01-23 10:43:40.618 222021 DEBUG nova.network.neutron [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.295 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.322 222021 DEBUG os_brick.utils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.323 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.345 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.346 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[ba18f766-953e-47d0-92f1-bf9cf4ff6eef]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.348 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.368 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.369 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[0838011b-e275-4609-962d-524b5a98be06]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.371 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.382 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.382 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[406f8485-044c-4711-8eba-846659f8029d]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.384 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3cd2c0-4784-4db9-943f-22302d380221]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.385 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.427 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "nvme version" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.430 222021 DEBUG os_brick.initiator.connectors.lightos [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.430 222021 DEBUG os_brick.initiator.connectors.lightos [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.431 222021 DEBUG os_brick.initiator.connectors.lightos [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.431 222021 DEBUG os_brick.utils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] <== get_connector_properties: return (108ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:43:41 np0005593233 nova_compute[222017]: 2026-01-23 10:43:41.431 222021 DEBUG nova.virt.block_device [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Updating existing volume attachment record: efe8287a-fffc-4b9f-aac5-bcbdcffec9ef _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:43:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:43:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:41.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.010 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.105 222021 DEBUG nova.network.neutron [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Updating instance_info_cache with network_info: [{"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.145 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Releasing lock "refresh_cache-c23f23ba-4d94-44c7-9460-bcce38d2bd70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.146 222021 DEBUG nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Instance network_info: |[{"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.147 222021 DEBUG oslo_concurrency.lockutils [req-04bd2f6d-c832-49a6-9938-f2ba443c969d req-d5d58c00-6fac-48fc-bf71-69fa03978017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c23f23ba-4d94-44c7-9460-bcce38d2bd70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.148 222021 DEBUG nova.network.neutron [req-04bd2f6d-c832-49a6-9938-f2ba443c969d req-d5d58c00-6fac-48fc-bf71-69fa03978017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Refreshing network info cache for port c51d2544-11d1-4cec-9c25-5547325dcf9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:43:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:42.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:42.705 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:42.706 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:42.707 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.931 222021 DEBUG nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.934 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.935 222021 INFO nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Creating image(s)#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.936 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.936 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Ensure instance console log exists: /var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.937 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.938 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.938 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.943 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Start _get_guest_xml network_info=[{"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='d41d8cd98f00b204e9800998ecf8427e',container_format='bare',created_at=2026-01-23T10:43:16Z,direct_url=<?>,disk_format='qcow2',id=a757e68d-9be5-4d20-8679-fbaf68ed4300,min_disk=1,min_ram=0,name='tempest-TestVolumeBootPatternsnapshot-349323527',owner='d27c5465284b48a5818ef931d6251c43',properties=ImageMetaProps,protected=<?>,size=0,status='active',tags=<?>,updated_at=2026-01-23T10:43:23Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-837a8189-567b-4aae-b83b-e7aa7a4ff9a7', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '837a8189-567b-4aae-b83b-e7aa7a4ff9a7', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c23f23ba-4d94-44c7-9460-bcce38d2bd70', 'attached_at': '', 'detached_at': '', 'volume_id': '837a8189-567b-4aae-b83b-e7aa7a4ff9a7', 'serial': '837a8189-567b-4aae-b83b-e7aa7a4ff9a7'}, 'delete_on_termination': True, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'efe8287a-fffc-4b9f-aac5-bcbdcffec9ef', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.951 222021 WARNING nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.959 222021 DEBUG nova.virt.libvirt.host [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.960 222021 DEBUG nova.virt.libvirt.host [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.965 222021 DEBUG nova.virt.libvirt.host [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.965 222021 DEBUG nova.virt.libvirt.host [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.967 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.968 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='d41d8cd98f00b204e9800998ecf8427e',container_format='bare',created_at=2026-01-23T10:43:16Z,direct_url=<?>,disk_format='qcow2',id=a757e68d-9be5-4d20-8679-fbaf68ed4300,min_disk=1,min_ram=0,name='tempest-TestVolumeBootPatternsnapshot-349323527',owner='d27c5465284b48a5818ef931d6251c43',properties=ImageMetaProps,protected=<?>,size=0,status='active',tags=<?>,updated_at=2026-01-23T10:43:23Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.969 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.969 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.969 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.970 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.970 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.971 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.971 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.972 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.972 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:43:42 np0005593233 nova_compute[222017]: 2026-01-23 10:43:42.972 222021 DEBUG nova.virt.hardware [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.020 222021 DEBUG nova.storage.rbd_utils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image c23f23ba-4d94-44c7-9460-bcce38d2bd70_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.027 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.554 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.604 222021 DEBUG nova.virt.libvirt.vif [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-image-snapshot-server-2116043864',display_name='tempest-TestVolumeBootPattern-image-snapshot-server-2116043864',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-image-snapshot-server-2116043864',id=203,image_ref='a757e68d-9be5-4d20-8679-fbaf68ed4300',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgAQo8u/CV+3N+87bvlKWa4Fqpcg5EB+o6nRcd+hVPajdpptYt80WMyt7heF2UpvoIIljsKeEMpkK+rRq1634ep/Zodvw8Rhw3wIItJTsgqQ1SlMbzcmLca40P0C+VTIg==',key_name='tempest-keypair-244048739',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-xua44np4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_bdm_v2='True',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_project_name='tempest-TestVolumeBootPattern-2139361132',image_owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member',image_root_device_name='/dev/vda',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:43:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=c23f23ba-4d94-44c7-9460-bcce38d2bd70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building')
 vif={"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.605 222021 DEBUG nova.network.os_vif_util [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.607 222021 DEBUG nova.network.os_vif_util [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:9c:82,bridge_name='br-int',has_traffic_filtering=True,id=c51d2544-11d1-4cec-9c25-5547325dcf9b,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51d2544-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.609 222021 DEBUG nova.objects.instance [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'pci_devices' on Instance uuid c23f23ba-4d94-44c7-9460-bcce38d2bd70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.636 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <uuid>c23f23ba-4d94-44c7-9460-bcce38d2bd70</uuid>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <name>instance-000000cb</name>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestVolumeBootPattern-image-snapshot-server-2116043864</nova:name>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:43:42</nova:creationTime>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <nova:user uuid="eb70c3aee8b64273a1930c0c2c231aff">tempest-TestVolumeBootPattern-2139361132-project-member</nova:user>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <nova:project uuid="d27c5465284b48a5818ef931d6251c43">tempest-TestVolumeBootPattern-2139361132</nova:project>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="a757e68d-9be5-4d20-8679-fbaf68ed4300"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <nova:port uuid="c51d2544-11d1-4cec-9c25-5547325dcf9b">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <entry name="serial">c23f23ba-4d94-44c7-9460-bcce38d2bd70</entry>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <entry name="uuid">c23f23ba-4d94-44c7-9460-bcce38d2bd70</entry>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/c23f23ba-4d94-44c7-9460-bcce38d2bd70_disk.config">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-837a8189-567b-4aae-b83b-e7aa7a4ff9a7">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <serial>837a8189-567b-4aae-b83b-e7aa7a4ff9a7</serial>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:cb:9c:82"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <target dev="tapc51d2544-11"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70/console.log" append="off"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <input type="keyboard" bus="usb"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:43:43 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:43:43 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:43:43 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:43:43 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.638 222021 DEBUG nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Preparing to wait for external event network-vif-plugged-c51d2544-11d1-4cec-9c25-5547325dcf9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.640 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.640 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.641 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.643 222021 DEBUG nova.virt.libvirt.vif [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-image-snapshot-server-2116043864',display_name='tempest-TestVolumeBootPattern-image-snapshot-server-2116043864',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-image-snapshot-server-2116043864',id=203,image_ref='a757e68d-9be5-4d20-8679-fbaf68ed4300',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgAQo8u/CV+3N+87bvlKWa4Fqpcg5EB+o6nRcd+hVPajdpptYt80WMyt7heF2UpvoIIljsKeEMpkK+rRq1634ep/Zodvw8Rhw3wIItJTsgqQ1SlMbzcmLca40P0C+VTIg==',key_name='tempest-keypair-244048739',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-xua44np4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_bdm_v2='True',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_project_name='tempest-TestVolumeBootPattern-2139361132',image_owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member',image_root_device_name='/dev/vda',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:43:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=c23f23ba-4d94-44c7-9460-bcce38d2bd70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='
building') vif={"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.644 222021 DEBUG nova.network.os_vif_util [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.645 222021 DEBUG nova.network.os_vif_util [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:9c:82,bridge_name='br-int',has_traffic_filtering=True,id=c51d2544-11d1-4cec-9c25-5547325dcf9b,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51d2544-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.646 222021 DEBUG os_vif [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:9c:82,bridge_name='br-int',has_traffic_filtering=True,id=c51d2544-11d1-4cec-9c25-5547325dcf9b,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51d2544-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.653 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.654 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.655 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:43:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:43.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.662 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.662 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc51d2544-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.663 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc51d2544-11, col_values=(('external_ids', {'iface-id': 'c51d2544-11d1-4cec-9c25-5547325dcf9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:9c:82', 'vm-uuid': 'c23f23ba-4d94-44c7-9460-bcce38d2bd70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.665 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:43 np0005593233 NetworkManager[48871]: <info>  [1769165023.6671] manager: (tapc51d2544-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/384)
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.675 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.676 222021 INFO os_vif [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:9c:82,bridge_name='br-int',has_traffic_filtering=True,id=c51d2544-11d1-4cec-9c25-5547325dcf9b,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51d2544-11')#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.779 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.780 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.780 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No VIF found with MAC fa:16:3e:cb:9c:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.781 222021 INFO nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Using config drive#033[00m
Jan 23 05:43:43 np0005593233 nova_compute[222017]: 2026-01-23 10:43:43.826 222021 DEBUG nova.storage.rbd_utils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image c23f23ba-4d94-44c7-9460-bcce38d2bd70_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:43:44 np0005593233 podman[300490]: 2026-01-23 10:43:44.071544668 +0000 UTC m=+0.070885361 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:43:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:43:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:44.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:43:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:43:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1361605914' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:43:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:43:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1361605914' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.051 222021 INFO nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Creating config drive at /var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70/disk.config#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.065 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_m_mqtvs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.113 222021 DEBUG nova.network.neutron [req-04bd2f6d-c832-49a6-9938-f2ba443c969d req-d5d58c00-6fac-48fc-bf71-69fa03978017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Updated VIF entry in instance network info cache for port c51d2544-11d1-4cec-9c25-5547325dcf9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.114 222021 DEBUG nova.network.neutron [req-04bd2f6d-c832-49a6-9938-f2ba443c969d req-d5d58c00-6fac-48fc-bf71-69fa03978017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Updating instance_info_cache with network_info: [{"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.229 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_m_mqtvs" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.263 222021 DEBUG nova.storage.rbd_utils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image c23f23ba-4d94-44c7-9460-bcce38d2bd70_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.268 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70/disk.config c23f23ba-4d94-44c7-9460-bcce38d2bd70_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.421 222021 DEBUG oslo_concurrency.lockutils [req-04bd2f6d-c832-49a6-9938-f2ba443c969d req-d5d58c00-6fac-48fc-bf71-69fa03978017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c23f23ba-4d94-44c7-9460-bcce38d2bd70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.476 222021 DEBUG oslo_concurrency.processutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70/disk.config c23f23ba-4d94-44c7-9460-bcce38d2bd70_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.477 222021 INFO nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Deleting local config drive /var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70/disk.config because it was imported into RBD.#033[00m
Jan 23 05:43:45 np0005593233 kernel: tapc51d2544-11: entered promiscuous mode
Jan 23 05:43:45 np0005593233 NetworkManager[48871]: <info>  [1769165025.5611] manager: (tapc51d2544-11): new Tun device (/org/freedesktop/NetworkManager/Devices/385)
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.564 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:43:45Z|00843|binding|INFO|Claiming lport c51d2544-11d1-4cec-9c25-5547325dcf9b for this chassis.
Jan 23 05:43:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:43:45Z|00844|binding|INFO|c51d2544-11d1-4cec-9c25-5547325dcf9b: Claiming fa:16:3e:cb:9c:82 10.100.0.14
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.582 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:9c:82 10.100.0.14'], port_security=['fa:16:3e:cb:9c:82 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c23f23ba-4d94-44c7-9460-bcce38d2bd70', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8141fa29-65a4-4cf1-8cb8-2be33bb9e1f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=c51d2544-11d1-4cec-9c25-5547325dcf9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.585 140224 INFO neutron.agent.ovn.metadata.agent [-] Port c51d2544-11d1-4cec-9c25-5547325dcf9b in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 bound to our chassis#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.588 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 72854481-c2f9-4651-8ba1-fe321a8a5546#033[00m
Jan 23 05:43:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:43:45Z|00845|binding|INFO|Setting lport c51d2544-11d1-4cec-9c25-5547325dcf9b ovn-installed in OVS
Jan 23 05:43:45 np0005593233 ovn_controller[130653]: 2026-01-23T10:43:45Z|00846|binding|INFO|Setting lport c51d2544-11d1-4cec-9c25-5547325dcf9b up in Southbound
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.600 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.604 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.621 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec48bc4-5978-4e70-8558-f613031bebe7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:45 np0005593233 systemd-udevd[300566]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:43:45 np0005593233 systemd-machined[190954]: New machine qemu-92-instance-000000cb.
Jan 23 05:43:45 np0005593233 systemd[1]: Started Virtual Machine qemu-92-instance-000000cb.
Jan 23 05:43:45 np0005593233 NetworkManager[48871]: <info>  [1769165025.6527] device (tapc51d2544-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:43:45 np0005593233 NetworkManager[48871]: <info>  [1769165025.6551] device (tapc51d2544-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.657 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[88a44081-b7cd-4704-b5e7-08c006172b0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.662 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b08b35ca-af7f-4cbb-9b61-59d6674f3256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:45.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.707 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d28304-90af-4a6c-8beb-7073945b9a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.733 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d742eb8f-4a75-4a4d-af0d-87e04185bb96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 251], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 887898, 'reachable_time': 37361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300577, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.758 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9604b25d-521e-4d5c-b365-d3c8046c729c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap72854481-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 887932, 'tstamp': 887932}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 300579, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap72854481-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 887937, 'tstamp': 887937}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 300579, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.760 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:43:45 np0005593233 nova_compute[222017]: 2026-01-23 10:43:45.761 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.764 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72854481-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.765 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.765 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap72854481-c0, col_values=(('external_ids', {'iface-id': '6b08537e-a263-4eec-b987-1e42878f483a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:43:45 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:43:45.766 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:43:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:46.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.265 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165026.2643564, c23f23ba-4d94-44c7-9460-bcce38d2bd70 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.266 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] VM Started (Lifecycle Event)
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.298 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.315 222021 DEBUG nova.compute.manager [req-88d73a1c-5e30-4502-9436-b854b8454cd3 req-29467eec-f0db-4b4a-b1ab-37290049990b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received event network-vif-plugged-c51d2544-11d1-4cec-9c25-5547325dcf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.315 222021 DEBUG oslo_concurrency.lockutils [req-88d73a1c-5e30-4502-9436-b854b8454cd3 req-29467eec-f0db-4b4a-b1ab-37290049990b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.315 222021 DEBUG oslo_concurrency.lockutils [req-88d73a1c-5e30-4502-9436-b854b8454cd3 req-29467eec-f0db-4b4a-b1ab-37290049990b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.316 222021 DEBUG oslo_concurrency.lockutils [req-88d73a1c-5e30-4502-9436-b854b8454cd3 req-29467eec-f0db-4b4a-b1ab-37290049990b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.316 222021 DEBUG nova.compute.manager [req-88d73a1c-5e30-4502-9436-b854b8454cd3 req-29467eec-f0db-4b4a-b1ab-37290049990b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Processing event network-vif-plugged-c51d2544-11d1-4cec-9c25-5547325dcf9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.317 222021 DEBUG nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.322 222021 DEBUG nova.virt.libvirt.driver [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.329 222021 INFO nova.virt.libvirt.driver [-] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Instance spawned successfully.
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.331 222021 INFO nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Took 3.40 seconds to spawn the instance on the hypervisor.
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.332 222021 DEBUG nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.334 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.350 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.415 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.416 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165026.2646558, c23f23ba-4d94-44c7-9460-bcce38d2bd70 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.416 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] VM Paused (Lifecycle Event)
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.476 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.480 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165026.3213487, c23f23ba-4d94-44c7-9460-bcce38d2bd70 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.481 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] VM Resumed (Lifecycle Event)
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.495 222021 INFO nova.compute.manager [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Took 12.08 seconds to build instance.
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.527 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.532 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:43:46 np0005593233 nova_compute[222017]: 2026-01-23 10:43:46.536 222021 DEBUG oslo_concurrency.lockutils [None req-64f91d1b-c736-41d8-a209-06b74c9b52a4 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:43:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e393 e393: 3 total, 3 up, 3 in
Jan 23 05:43:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:43:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:47.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:43:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:48.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:48 np0005593233 nova_compute[222017]: 2026-01-23 10:43:48.459 222021 DEBUG nova.compute.manager [req-b5d62bfd-a720-470b-858e-be1466ec4df3 req-ebb87ed5-33a3-4ce2-a6fb-a7e984a2847e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received event network-vif-plugged-c51d2544-11d1-4cec-9c25-5547325dcf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:43:48 np0005593233 nova_compute[222017]: 2026-01-23 10:43:48.461 222021 DEBUG oslo_concurrency.lockutils [req-b5d62bfd-a720-470b-858e-be1466ec4df3 req-ebb87ed5-33a3-4ce2-a6fb-a7e984a2847e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:43:48 np0005593233 nova_compute[222017]: 2026-01-23 10:43:48.461 222021 DEBUG oslo_concurrency.lockutils [req-b5d62bfd-a720-470b-858e-be1466ec4df3 req-ebb87ed5-33a3-4ce2-a6fb-a7e984a2847e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:43:48 np0005593233 nova_compute[222017]: 2026-01-23 10:43:48.462 222021 DEBUG oslo_concurrency.lockutils [req-b5d62bfd-a720-470b-858e-be1466ec4df3 req-ebb87ed5-33a3-4ce2-a6fb-a7e984a2847e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:43:48 np0005593233 nova_compute[222017]: 2026-01-23 10:43:48.463 222021 DEBUG nova.compute.manager [req-b5d62bfd-a720-470b-858e-be1466ec4df3 req-ebb87ed5-33a3-4ce2-a6fb-a7e984a2847e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] No waiting events found dispatching network-vif-plugged-c51d2544-11d1-4cec-9c25-5547325dcf9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:43:48 np0005593233 nova_compute[222017]: 2026-01-23 10:43:48.463 222021 WARNING nova.compute.manager [req-b5d62bfd-a720-470b-858e-be1466ec4df3 req-ebb87ed5-33a3-4ce2-a6fb-a7e984a2847e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received unexpected event network-vif-plugged-c51d2544-11d1-4cec-9c25-5547325dcf9b for instance with vm_state active and task_state None.
Jan 23 05:43:48 np0005593233 nova_compute[222017]: 2026-01-23 10:43:48.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:43:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e394 e394: 3 total, 3 up, 3 in
Jan 23 05:43:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:49.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:50.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.389 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.425 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 0a916952-341a-4caf-bf6f-6abe504830f9 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.427 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid b23e0eb3-82ec-4c33-aedd-b815e9513866 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.427 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid c23f23ba-4d94-44c7-9460-bcce38d2bd70 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.429 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.432 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "b23e0eb3-82ec-4c33-aedd-b815e9513866" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.433 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.435 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.436 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.490 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.491 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.492 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.519 222021 DEBUG nova.compute.manager [req-e0757c34-b57a-4367-8ba1-045bca4ba256 req-51a9bbcd-8e54-4b55-899c-1b16a251551e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received event network-changed-c51d2544-11d1-4cec-9c25-5547325dcf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.520 222021 DEBUG nova.compute.manager [req-e0757c34-b57a-4367-8ba1-045bca4ba256 req-51a9bbcd-8e54-4b55-899c-1b16a251551e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Refreshing instance network info cache due to event network-changed-c51d2544-11d1-4cec-9c25-5547325dcf9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.520 222021 DEBUG oslo_concurrency.lockutils [req-e0757c34-b57a-4367-8ba1-045bca4ba256 req-51a9bbcd-8e54-4b55-899c-1b16a251551e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c23f23ba-4d94-44c7-9460-bcce38d2bd70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.521 222021 DEBUG oslo_concurrency.lockutils [req-e0757c34-b57a-4367-8ba1-045bca4ba256 req-51a9bbcd-8e54-4b55-899c-1b16a251551e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c23f23ba-4d94-44c7-9460-bcce38d2bd70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:43:50 np0005593233 nova_compute[222017]: 2026-01-23 10:43:50.521 222021 DEBUG nova.network.neutron [req-e0757c34-b57a-4367-8ba1-045bca4ba256 req-51a9bbcd-8e54-4b55-899c-1b16a251551e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Refreshing network info cache for port c51d2544-11d1-4cec-9c25-5547325dcf9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:43:51 np0005593233 nova_compute[222017]: 2026-01-23 10:43:51.301 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:43:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:43:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/397253804' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:43:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:43:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/397253804' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:43:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:51.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:52.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:52 np0005593233 nova_compute[222017]: 2026-01-23 10:43:52.692 222021 DEBUG nova.network.neutron [req-e0757c34-b57a-4367-8ba1-045bca4ba256 req-51a9bbcd-8e54-4b55-899c-1b16a251551e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Updated VIF entry in instance network info cache for port c51d2544-11d1-4cec-9c25-5547325dcf9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:43:52 np0005593233 nova_compute[222017]: 2026-01-23 10:43:52.693 222021 DEBUG nova.network.neutron [req-e0757c34-b57a-4367-8ba1-045bca4ba256 req-51a9bbcd-8e54-4b55-899c-1b16a251551e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Updating instance_info_cache with network_info: [{"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:43:52 np0005593233 nova_compute[222017]: 2026-01-23 10:43:52.716 222021 DEBUG oslo_concurrency.lockutils [req-e0757c34-b57a-4367-8ba1-045bca4ba256 req-51a9bbcd-8e54-4b55-899c-1b16a251551e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c23f23ba-4d94-44c7-9460-bcce38d2bd70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:43:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:43:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:53.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:43:53 np0005593233 nova_compute[222017]: 2026-01-23 10:43:53.715 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:43:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:54.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e395 e395: 3 total, 3 up, 3 in
Jan 23 05:43:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e395 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:55.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:56.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:56 np0005593233 nova_compute[222017]: 2026-01-23 10:43:56.307 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:43:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:57.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:43:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:58.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:43:58 np0005593233 nova_compute[222017]: 2026-01-23 10:43:58.718 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:43:59 np0005593233 podman[300626]: 2026-01-23 10:43:59.142248043 +0000 UTC m=+0.143938624 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:43:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e396 e396: 3 total, 3 up, 3 in
Jan 23 05:43:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:43:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:59.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:44:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:00.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:44:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:01 np0005593233 nova_compute[222017]: 2026-01-23 10:44:01.308 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:44:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:01.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:44:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:02.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:44:02 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:02Z|00116|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.4 does not match offer 10.100.0.14
Jan 23 05:44:02 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:02Z|00117|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:cb:9c:82 10.100.0.14
Jan 23 05:44:03 np0005593233 nova_compute[222017]: 2026-01-23 10:44:03.724 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:03.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:04.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:05.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:44:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:06.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:44:06 np0005593233 nova_compute[222017]: 2026-01-23 10:44:06.310 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:06Z|00118|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.4 does not match offer 10.100.0.14
Jan 23 05:44:06 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:06Z|00119|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:cb:9c:82 10.100.0.14
Jan 23 05:44:07 np0005593233 nova_compute[222017]: 2026-01-23 10:44:07.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:07 np0005593233 nova_compute[222017]: 2026-01-23 10:44:07.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:07 np0005593233 nova_compute[222017]: 2026-01-23 10:44:07.525 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:07 np0005593233 nova_compute[222017]: 2026-01-23 10:44:07.526 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:07 np0005593233 nova_compute[222017]: 2026-01-23 10:44:07.527 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:07 np0005593233 nova_compute[222017]: 2026-01-23 10:44:07.527 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:44:07 np0005593233 nova_compute[222017]: 2026-01-23 10:44:07.528 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:44:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:07.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:07Z|00120|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cb:9c:82 10.100.0.14
Jan 23 05:44:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:07Z|00121|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cb:9c:82 10.100.0.14
Jan 23 05:44:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:44:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/517481607' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.057 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.169 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.170 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.176 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.176 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000c9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.182 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000cb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.183 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000cb as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:44:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:08.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.536 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.537 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3659MB free_disk=20.896648406982422GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.538 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.538 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:08.609 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.610 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:08.611 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.635 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 0a916952-341a-4caf-bf6f-6abe504830f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.636 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance b23e0eb3-82ec-4c33-aedd-b815e9513866 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.636 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance c23f23ba-4d94-44c7-9460-bcce38d2bd70 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.636 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.637 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.743 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:44:08 np0005593233 nova_compute[222017]: 2026-01-23 10:44:08.797 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:44:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2526847382' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:44:09 np0005593233 nova_compute[222017]: 2026-01-23 10:44:09.260 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:44:09 np0005593233 nova_compute[222017]: 2026-01-23 10:44:09.269 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:44:09 np0005593233 nova_compute[222017]: 2026-01-23 10:44:09.443 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:44:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:09.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:10.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:10 np0005593233 nova_compute[222017]: 2026-01-23 10:44:10.600 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:44:10 np0005593233 nova_compute[222017]: 2026-01-23 10:44:10.601 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:11 np0005593233 nova_compute[222017]: 2026-01-23 10:44:11.313 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:11 np0005593233 nova_compute[222017]: 2026-01-23 10:44:11.602 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:11 np0005593233 nova_compute[222017]: 2026-01-23 10:44:11.603 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:11 np0005593233 nova_compute[222017]: 2026-01-23 10:44:11.603 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:44:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:11.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:44:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:44:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:12.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:13 np0005593233 nova_compute[222017]: 2026-01-23 10:44:13.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:13.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:13 np0005593233 nova_compute[222017]: 2026-01-23 10:44:13.802 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:14.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:15 np0005593233 podman[300834]: 2026-01-23 10:44:15.072519789 +0000 UTC m=+0.072923520 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:44:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:15 np0005593233 nova_compute[222017]: 2026-01-23 10:44:15.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:15 np0005593233 nova_compute[222017]: 2026-01-23 10:44:15.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:15 np0005593233 nova_compute[222017]: 2026-01-23 10:44:15.385 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:15 np0005593233 nova_compute[222017]: 2026-01-23 10:44:15.387 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:15 np0005593233 nova_compute[222017]: 2026-01-23 10:44:15.388 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:15 np0005593233 nova_compute[222017]: 2026-01-23 10:44:15.388 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:15 np0005593233 nova_compute[222017]: 2026-01-23 10:44:15.388 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:15 np0005593233 nova_compute[222017]: 2026-01-23 10:44:15.388 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:15.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.002 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.003 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Image id a757e68d-9be5-4d20-8679-fbaf68ed4300 yields fingerprint 3bf43c8a55a13a5f5388c6e8f7ee7575b917762d _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.005 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.005 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Image id 84c0ef19-7f67-4bd3-95d8-507c3e0942ed yields fingerprint a6f655456a04e1d13ef2e44ed4544c38917863a2 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.006 222021 INFO nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] image 84c0ef19-7f67-4bd3-95d8-507c3e0942ed at (/var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2): checking#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.006 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] image 84c0ef19-7f67-4bd3-95d8-507c3e0942ed at (/var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.008 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] 0a916952-341a-4caf-bf6f-6abe504830f9 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.009 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] b23e0eb3-82ec-4c33-aedd-b815e9513866 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.009 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] c23f23ba-4d94-44c7-9460-bcce38d2bd70 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.009 222021 WARNING nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.009 222021 WARNING nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.010 222021 INFO nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Active base files: /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.010 222021 INFO nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Removable base files: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.010 222021 INFO nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.010 222021 INFO nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/c4be56f0f0c1fc933935bae72309434102ff9887#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.011 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.011 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.011 222021 DEBUG nova.virt.libvirt.imagecache [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 23 05:44:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:16.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.315 222021 DEBUG oslo_concurrency.lockutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.316 222021 DEBUG oslo_concurrency.lockutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.316 222021 DEBUG oslo_concurrency.lockutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.317 222021 DEBUG oslo_concurrency.lockutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.317 222021 DEBUG oslo_concurrency.lockutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.319 222021 INFO nova.compute.manager [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Terminating instance#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.320 222021 DEBUG nova.compute.manager [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.320 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:16 np0005593233 kernel: tapf6b87cda-0b (unregistering): left promiscuous mode
Jan 23 05:44:16 np0005593233 NetworkManager[48871]: <info>  [1769165056.5167] device (tapf6b87cda-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.526 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:16 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:16Z|00847|binding|INFO|Releasing lport f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 from this chassis (sb_readonly=0)
Jan 23 05:44:16 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:16Z|00848|binding|INFO|Setting lport f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 down in Southbound
Jan 23 05:44:16 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:16Z|00849|binding|INFO|Removing iface tapf6b87cda-0b ovn-installed in OVS
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.530 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.536 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:1c:0c 10.100.0.6'], port_security=['fa:16:3e:78:1c:0c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0a916952-341a-4caf-bf6f-6abe504830f9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e762fca3b634c7aa1d994314c059c54', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed138636-f650-4a09-b808-0b05f9067a5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0936335-b706-4400-8411-bdd084c8cdf7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.537 140224 INFO neutron.agent.ovn.metadata.agent [-] Port f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 in datapath fba2ba4a-d82c-4f8b-9754-c13fbec41a04 unbound from our chassis#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.539 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fba2ba4a-d82c-4f8b-9754-c13fbec41a04, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.541 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[84d2e2cb-1126-4085-8bd6-b4ea3d18a0d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.543 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 namespace which is not needed anymore#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.551 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:16 np0005593233 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000c1.scope: Deactivated successfully.
Jan 23 05:44:16 np0005593233 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000c1.scope: Consumed 31.082s CPU time.
Jan 23 05:44:16 np0005593233 systemd-machined[190954]: Machine qemu-88-instance-000000c1 terminated.
Jan 23 05:44:16 np0005593233 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[297276]: [NOTICE]   (297280) : haproxy version is 2.8.14-c23fe91
Jan 23 05:44:16 np0005593233 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[297276]: [NOTICE]   (297280) : path to executable is /usr/sbin/haproxy
Jan 23 05:44:16 np0005593233 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[297276]: [WARNING]  (297280) : Exiting Master process...
Jan 23 05:44:16 np0005593233 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[297276]: [ALERT]    (297280) : Current worker (297282) exited with code 143 (Terminated)
Jan 23 05:44:16 np0005593233 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[297276]: [WARNING]  (297280) : All workers exited. Exiting... (0)
Jan 23 05:44:16 np0005593233 systemd[1]: libpod-f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd.scope: Deactivated successfully.
Jan 23 05:44:16 np0005593233 podman[300879]: 2026-01-23 10:44:16.737637832 +0000 UTC m=+0.055279482 container died f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.776 222021 INFO nova.virt.libvirt.driver [-] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Instance destroyed successfully.#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.777 222021 DEBUG nova.objects.instance [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'resources' on Instance uuid 0a916952-341a-4caf-bf6f-6abe504830f9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:44:16 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd-userdata-shm.mount: Deactivated successfully.
Jan 23 05:44:16 np0005593233 systemd[1]: var-lib-containers-storage-overlay-b379705ba850a21a308304eb46ee60d17a454a13ff22fd8029d3867119a50f92-merged.mount: Deactivated successfully.
Jan 23 05:44:16 np0005593233 podman[300879]: 2026-01-23 10:44:16.794898688 +0000 UTC m=+0.112540348 container cleanup f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:44:16 np0005593233 systemd[1]: libpod-conmon-f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd.scope: Deactivated successfully.
Jan 23 05:44:16 np0005593233 podman[300920]: 2026-01-23 10:44:16.877045667 +0000 UTC m=+0.055740745 container remove f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.884 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0c81d41d-9e2f-4cae-a5c9-8b93442c7ec9]: (4, ('Fri Jan 23 10:44:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 (f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd)\nf6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd\nFri Jan 23 10:44:16 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 (f6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd)\nf6f4951dfad06dfceb6b61b76f82bfe8102cd0e7b16dda5fd32570839815f5cd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.886 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[58add8fc-d0ee-4e49-849a-20e4611cfdeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.888 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfba2ba4a-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:16 np0005593233 kernel: tapfba2ba4a-d0: left promiscuous mode
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.891 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.908 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:16 np0005593233 nova_compute[222017]: 2026-01-23 10:44:16.910 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.913 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6aa368-47d0-4848-afe4-a53628a5226a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.929 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[00801d8e-1746-4705-bb81-06af31672407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.931 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bab9e38d-4ffe-48d5-8e9f-a4a7e7d36d5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.951 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c1b5aa-a0ba-41ff-807d-f8797c79d720]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 864215, 'reachable_time': 40762, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300938, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:16 np0005593233 systemd[1]: run-netns-ovnmeta\x2dfba2ba4a\x2dd82c\x2d4f8b\x2d9754\x2dc13fbec41a04.mount: Deactivated successfully.
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.957 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:44:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:16.958 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[3bbb2de8-1c5c-4fca-9691-69fc7371cac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.012 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.299 222021 DEBUG nova.virt.libvirt.vif [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:38:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='multiattach-server-0',id=193,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:38:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-ubbwika9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',
image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:38:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=0a916952-341a-4caf-bf6f-6abe504830f9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.300 222021 DEBUG nova.network.os_vif_util [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "address": "fa:16:3e:78:1c:0c", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6b87cda-0b", "ovs_interfaceid": "f6b87cda-0bd8-4fbb-a92e-b86c0a65df79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.301 222021 DEBUG nova.network.os_vif_util [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:78:1c:0c,bridge_name='br-int',has_traffic_filtering=True,id=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b87cda-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.302 222021 DEBUG os_vif [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:1c:0c,bridge_name='br-int',has_traffic_filtering=True,id=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b87cda-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.305 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.306 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6b87cda-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.308 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.310 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.315 222021 INFO os_vif [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:1c:0c,bridge_name='br-int',has_traffic_filtering=True,id=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6b87cda-0b')#033[00m
Jan 23 05:44:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:17.614 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.743 222021 INFO nova.virt.libvirt.driver [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Deleting instance files /var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9_del#033[00m
Jan 23 05:44:17 np0005593233 nova_compute[222017]: 2026-01-23 10:44:17.744 222021 INFO nova.virt.libvirt.driver [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Deletion of /var/lib/nova/instances/0a916952-341a-4caf-bf6f-6abe504830f9_del complete#033[00m
Jan 23 05:44:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:17.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:18.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.242 222021 INFO nova.compute.manager [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Took 1.92 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.243 222021 DEBUG oslo.service.loopingcall [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.243 222021 DEBUG nova.compute.manager [-] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.243 222021 DEBUG nova.network.neutron [-] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.467 222021 DEBUG nova.compute.manager [req-1e1b32f3-6ba0-4214-93f2-1de28a45bc99 req-01338ee5-71e0-4b11-81ea-2f63c0224858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received event network-vif-unplugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.467 222021 DEBUG oslo_concurrency.lockutils [req-1e1b32f3-6ba0-4214-93f2-1de28a45bc99 req-01338ee5-71e0-4b11-81ea-2f63c0224858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.467 222021 DEBUG oslo_concurrency.lockutils [req-1e1b32f3-6ba0-4214-93f2-1de28a45bc99 req-01338ee5-71e0-4b11-81ea-2f63c0224858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.468 222021 DEBUG oslo_concurrency.lockutils [req-1e1b32f3-6ba0-4214-93f2-1de28a45bc99 req-01338ee5-71e0-4b11-81ea-2f63c0224858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.468 222021 DEBUG nova.compute.manager [req-1e1b32f3-6ba0-4214-93f2-1de28a45bc99 req-01338ee5-71e0-4b11-81ea-2f63c0224858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] No waiting events found dispatching network-vif-unplugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:44:18 np0005593233 nova_compute[222017]: 2026-01-23 10:44:18.468 222021 DEBUG nova.compute.manager [req-1e1b32f3-6ba0-4214-93f2-1de28a45bc99 req-01338ee5-71e0-4b11-81ea-2f63c0224858 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received event network-vif-unplugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:44:19 np0005593233 nova_compute[222017]: 2026-01-23 10:44:19.387 222021 DEBUG nova.compute.manager [req-18737469-de6f-439b-b5d4-2ef89050dd04 req-a94e2710-5cc3-427b-afbc-d33f3519b2cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received event network-vif-deleted-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:19 np0005593233 nova_compute[222017]: 2026-01-23 10:44:19.387 222021 INFO nova.compute.manager [req-18737469-de6f-439b-b5d4-2ef89050dd04 req-a94e2710-5cc3-427b-afbc-d33f3519b2cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Neutron deleted interface f6b87cda-0bd8-4fbb-a92e-b86c0a65df79; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:44:19 np0005593233 nova_compute[222017]: 2026-01-23 10:44:19.388 222021 DEBUG nova.network.neutron [req-18737469-de6f-439b-b5d4-2ef89050dd04 req-a94e2710-5cc3-427b-afbc-d33f3519b2cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:44:19 np0005593233 nova_compute[222017]: 2026-01-23 10:44:19.395 222021 DEBUG nova.network.neutron [-] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:44:19 np0005593233 nova_compute[222017]: 2026-01-23 10:44:19.410 222021 INFO nova.compute.manager [-] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Took 1.17 seconds to deallocate network for instance.#033[00m
Jan 23 05:44:19 np0005593233 nova_compute[222017]: 2026-01-23 10:44:19.418 222021 DEBUG nova.compute.manager [req-18737469-de6f-439b-b5d4-2ef89050dd04 req-a94e2710-5cc3-427b-afbc-d33f3519b2cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Detach interface failed, port_id=f6b87cda-0bd8-4fbb-a92e-b86c0a65df79, reason: Instance 0a916952-341a-4caf-bf6f-6abe504830f9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:44:19 np0005593233 nova_compute[222017]: 2026-01-23 10:44:19.449 222021 DEBUG oslo_concurrency.lockutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:19 np0005593233 nova_compute[222017]: 2026-01-23 10:44:19.450 222021 DEBUG oslo_concurrency.lockutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:19 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:19 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:19 np0005593233 nova_compute[222017]: 2026-01-23 10:44:19.541 222021 DEBUG oslo_concurrency.processutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:44:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:19.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:44:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3054399090' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.050 222021 DEBUG oslo_concurrency.processutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.059 222021 DEBUG nova.compute.provider_tree [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:44:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:44:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:20.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:44:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.536 222021 DEBUG nova.scheduler.client.report [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.548 222021 DEBUG nova.compute.manager [req-f4377395-929f-4213-8a28-2b73df7370c6 req-f1af5401-9c61-4081-a478-2eb89159bcbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received event network-vif-plugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.549 222021 DEBUG oslo_concurrency.lockutils [req-f4377395-929f-4213-8a28-2b73df7370c6 req-f1af5401-9c61-4081-a478-2eb89159bcbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.549 222021 DEBUG oslo_concurrency.lockutils [req-f4377395-929f-4213-8a28-2b73df7370c6 req-f1af5401-9c61-4081-a478-2eb89159bcbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.550 222021 DEBUG oslo_concurrency.lockutils [req-f4377395-929f-4213-8a28-2b73df7370c6 req-f1af5401-9c61-4081-a478-2eb89159bcbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.550 222021 DEBUG nova.compute.manager [req-f4377395-929f-4213-8a28-2b73df7370c6 req-f1af5401-9c61-4081-a478-2eb89159bcbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] No waiting events found dispatching network-vif-plugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.550 222021 WARNING nova.compute.manager [req-f4377395-929f-4213-8a28-2b73df7370c6 req-f1af5401-9c61-4081-a478-2eb89159bcbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Received unexpected event network-vif-plugged-f6b87cda-0bd8-4fbb-a92e-b86c0a65df79 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.567 222021 DEBUG oslo_concurrency.lockutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.596 222021 INFO nova.scheduler.client.report [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Deleted allocations for instance 0a916952-341a-4caf-bf6f-6abe504830f9#033[00m
Jan 23 05:44:20 np0005593233 nova_compute[222017]: 2026-01-23 10:44:20.900 222021 DEBUG oslo_concurrency.lockutils [None req-c170fafb-36db-473a-bc52-7471f744a245 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "0a916952-341a-4caf-bf6f-6abe504830f9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:21 np0005593233 nova_compute[222017]: 2026-01-23 10:44:21.319 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:21 np0005593233 nova_compute[222017]: 2026-01-23 10:44:21.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:21 np0005593233 nova_compute[222017]: 2026-01-23 10:44:21.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:44:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:21.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:22.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:22 np0005593233 nova_compute[222017]: 2026-01-23 10:44:22.273 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:44:22 np0005593233 nova_compute[222017]: 2026-01-23 10:44:22.273 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:44:22 np0005593233 nova_compute[222017]: 2026-01-23 10:44:22.273 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:44:22 np0005593233 nova_compute[222017]: 2026-01-23 10:44:22.308 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.702 222021 DEBUG oslo_concurrency.lockutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.703 222021 DEBUG oslo_concurrency.lockutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.703 222021 DEBUG oslo_concurrency.lockutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.704 222021 DEBUG oslo_concurrency.lockutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.704 222021 DEBUG oslo_concurrency.lockutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.706 222021 INFO nova.compute.manager [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Terminating instance#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.708 222021 DEBUG nova.compute.manager [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:44:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:23.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:23 np0005593233 kernel: tapc51d2544-11 (unregistering): left promiscuous mode
Jan 23 05:44:23 np0005593233 NetworkManager[48871]: <info>  [1769165063.7975] device (tapc51d2544-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:44:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:23Z|00850|binding|INFO|Releasing lport c51d2544-11d1-4cec-9c25-5547325dcf9b from this chassis (sb_readonly=0)
Jan 23 05:44:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:23Z|00851|binding|INFO|Setting lport c51d2544-11d1-4cec-9c25-5547325dcf9b down in Southbound
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.810 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:23 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:23Z|00852|binding|INFO|Removing iface tapc51d2544-11 ovn-installed in OVS
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.813 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.819 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:9c:82 10.100.0.14'], port_security=['fa:16:3e:cb:9c:82 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c23f23ba-4d94-44c7-9460-bcce38d2bd70', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8141fa29-65a4-4cf1-8cb8-2be33bb9e1f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.224'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=c51d2544-11d1-4cec-9c25-5547325dcf9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:44:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.820 140224 INFO neutron.agent.ovn.metadata.agent [-] Port c51d2544-11d1-4cec-9c25-5547325dcf9b in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 unbound from our chassis#033[00m
Jan 23 05:44:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.822 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 72854481-c2f9-4651-8ba1-fe321a8a5546#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.845 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.851 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[07e96246-82d5-41ae-980f-0c745d40ade5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:23 np0005593233 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000cb.scope: Deactivated successfully.
Jan 23 05:44:23 np0005593233 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000cb.scope: Consumed 18.564s CPU time.
Jan 23 05:44:23 np0005593233 systemd-machined[190954]: Machine qemu-92-instance-000000cb terminated.
Jan 23 05:44:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.892 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[b98ca4c8-cfaa-46dc-a4ca-5d688b6b1ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.899 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[edd6784c-3284-4e5c-9ce9-a7c31a1b6c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.944 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[cb5f4433-1629-4dec-abb8-b5885a1ea6c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.957 222021 INFO nova.virt.libvirt.driver [-] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Instance destroyed successfully.#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.958 222021 DEBUG nova.objects.instance [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'resources' on Instance uuid c23f23ba-4d94-44c7-9460-bcce38d2bd70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:44:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.970 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fc573ab2-8983-4d7f-96a3-9c54a3145f92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 251], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 887898, 'reachable_time': 37361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301051, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.981 222021 DEBUG nova.virt.libvirt.vif [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-image-snapshot-server-2116043864',display_name='tempest-TestVolumeBootPattern-image-snapshot-server-2116043864',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-image-snapshot-server-2116043864',id=203,image_ref='a757e68d-9be5-4d20-8679-fbaf68ed4300',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgAQo8u/CV+3N+87bvlKWa4Fqpcg5EB+o6nRcd+hVPajdpptYt80WMyt7heF2UpvoIIljsKeEMpkK+rRq1634ep/Zodvw8Rhw3wIItJTsgqQ1SlMbzcmLca40P0C+VTIg==',key_name='tempest-keypair-244048739',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:43:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-xua44np4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_bdm_v2='True',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_project_name='tempest-TestVolumeBootPattern-2139361132',image_owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member',image_root_device_name='/dev/vda',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:43:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=c23f23ba-4d94-44c7-9460-bcce38d2bd70,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": 
"c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.982 222021 DEBUG nova.network.os_vif_util [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "address": "fa:16:3e:cb:9c:82", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc51d2544-11", "ovs_interfaceid": "c51d2544-11d1-4cec-9c25-5547325dcf9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.984 222021 DEBUG nova.network.os_vif_util [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:9c:82,bridge_name='br-int',has_traffic_filtering=True,id=c51d2544-11d1-4cec-9c25-5547325dcf9b,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51d2544-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.985 222021 DEBUG os_vif [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:9c:82,bridge_name='br-int',has_traffic_filtering=True,id=c51d2544-11d1-4cec-9c25-5547325dcf9b,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51d2544-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.986 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:23 np0005593233 nova_compute[222017]: 2026-01-23 10:44:23.987 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc51d2544-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.993 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[479b12c0-1f9f-4c0a-835b-24dec3344a9a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap72854481-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 887932, 'tstamp': 887932}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 301054, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap72854481-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 887937, 'tstamp': 887937}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 301054, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:23.995 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:24.044 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72854481-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:24.045 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:44:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:24.045 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap72854481-c0, col_values=(('external_ids', {'iface-id': '6b08537e-a263-4eec-b987-1e42878f483a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:24.045 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.046 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.048 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.051 222021 INFO os_vif [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:9c:82,bridge_name='br-int',has_traffic_filtering=True,id=c51d2544-11d1-4cec-9c25-5547325dcf9b,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc51d2544-11')#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.183 222021 DEBUG nova.compute.manager [req-a9083509-ad17-480b-8c7f-4f02c9a1c051 req-9bbd7e8b-569e-437b-886d-d2f7b8396b57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received event network-vif-unplugged-c51d2544-11d1-4cec-9c25-5547325dcf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.184 222021 DEBUG oslo_concurrency.lockutils [req-a9083509-ad17-480b-8c7f-4f02c9a1c051 req-9bbd7e8b-569e-437b-886d-d2f7b8396b57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.184 222021 DEBUG oslo_concurrency.lockutils [req-a9083509-ad17-480b-8c7f-4f02c9a1c051 req-9bbd7e8b-569e-437b-886d-d2f7b8396b57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.185 222021 DEBUG oslo_concurrency.lockutils [req-a9083509-ad17-480b-8c7f-4f02c9a1c051 req-9bbd7e8b-569e-437b-886d-d2f7b8396b57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.185 222021 DEBUG nova.compute.manager [req-a9083509-ad17-480b-8c7f-4f02c9a1c051 req-9bbd7e8b-569e-437b-886d-d2f7b8396b57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] No waiting events found dispatching network-vif-unplugged-c51d2544-11d1-4cec-9c25-5547325dcf9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.185 222021 DEBUG nova.compute.manager [req-a9083509-ad17-480b-8c7f-4f02c9a1c051 req-9bbd7e8b-569e-437b-886d-d2f7b8396b57 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received event network-vif-unplugged-c51d2544-11d1-4cec-9c25-5547325dcf9b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:44:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:24.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.740 222021 INFO nova.virt.libvirt.driver [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Deleting instance files /var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70_del#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.741 222021 INFO nova.virt.libvirt.driver [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Deletion of /var/lib/nova/instances/c23f23ba-4d94-44c7-9460-bcce38d2bd70_del complete#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.803 222021 INFO nova.compute.manager [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Took 1.09 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.804 222021 DEBUG oslo.service.loopingcall [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.805 222021 DEBUG nova.compute.manager [-] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:44:24 np0005593233 nova_compute[222017]: 2026-01-23 10:44:24.805 222021 DEBUG nova.network.neutron [-] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:44:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:25 np0005593233 nova_compute[222017]: 2026-01-23 10:44:25.348 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updating instance_info_cache with network_info: [{"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:44:25 np0005593233 nova_compute[222017]: 2026-01-23 10:44:25.568 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-b23e0eb3-82ec-4c33-aedd-b815e9513866" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:44:25 np0005593233 nova_compute[222017]: 2026-01-23 10:44:25.569 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:44:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:25.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:44:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:26.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:44:26 np0005593233 nova_compute[222017]: 2026-01-23 10:44:26.254 222021 DEBUG nova.compute.manager [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received event network-vif-plugged-c51d2544-11d1-4cec-9c25-5547325dcf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:26 np0005593233 nova_compute[222017]: 2026-01-23 10:44:26.255 222021 DEBUG oslo_concurrency.lockutils [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:26 np0005593233 nova_compute[222017]: 2026-01-23 10:44:26.256 222021 DEBUG oslo_concurrency.lockutils [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:26 np0005593233 nova_compute[222017]: 2026-01-23 10:44:26.256 222021 DEBUG oslo_concurrency.lockutils [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:26 np0005593233 nova_compute[222017]: 2026-01-23 10:44:26.257 222021 DEBUG nova.compute.manager [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] No waiting events found dispatching network-vif-plugged-c51d2544-11d1-4cec-9c25-5547325dcf9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:44:26 np0005593233 nova_compute[222017]: 2026-01-23 10:44:26.257 222021 WARNING nova.compute.manager [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received unexpected event network-vif-plugged-c51d2544-11d1-4cec-9c25-5547325dcf9b for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:44:26 np0005593233 nova_compute[222017]: 2026-01-23 10:44:26.322 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:27 np0005593233 nova_compute[222017]: 2026-01-23 10:44:27.003 222021 DEBUG nova.network.neutron [-] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:44:27 np0005593233 nova_compute[222017]: 2026-01-23 10:44:27.025 222021 INFO nova.compute.manager [-] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Took 2.22 seconds to deallocate network for instance.#033[00m
Jan 23 05:44:27 np0005593233 nova_compute[222017]: 2026-01-23 10:44:27.223 222021 INFO nova.compute.manager [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Took 0.20 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:44:27 np0005593233 nova_compute[222017]: 2026-01-23 10:44:27.227 222021 DEBUG nova.compute.manager [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Deleting volume: 837a8189-567b-4aae-b83b-e7aa7a4ff9a7 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 23 05:44:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:27.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:27 np0005593233 nova_compute[222017]: 2026-01-23 10:44:27.816 222021 DEBUG oslo_concurrency.lockutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:27 np0005593233 nova_compute[222017]: 2026-01-23 10:44:27.817 222021 DEBUG oslo_concurrency.lockutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:27 np0005593233 nova_compute[222017]: 2026-01-23 10:44:27.885 222021 DEBUG oslo_concurrency.processutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:44:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:28.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:44:28 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4245779937' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:44:28 np0005593233 nova_compute[222017]: 2026-01-23 10:44:28.376 222021 DEBUG oslo_concurrency.processutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:44:28 np0005593233 nova_compute[222017]: 2026-01-23 10:44:28.384 222021 DEBUG nova.compute.provider_tree [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:44:28 np0005593233 nova_compute[222017]: 2026-01-23 10:44:28.413 222021 DEBUG nova.compute.manager [req-3e9ee5d8-f792-48b4-9a44-a2da7d112876 req-3fcb42fc-4c1b-46b2-b90f-4856510f0b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Received event network-vif-deleted-c51d2544-11d1-4cec-9c25-5547325dcf9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:28 np0005593233 nova_compute[222017]: 2026-01-23 10:44:28.422 222021 DEBUG nova.scheduler.client.report [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:44:28 np0005593233 nova_compute[222017]: 2026-01-23 10:44:28.450 222021 DEBUG oslo_concurrency.lockutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:28 np0005593233 nova_compute[222017]: 2026-01-23 10:44:28.488 222021 INFO nova.scheduler.client.report [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Deleted allocations for instance c23f23ba-4d94-44c7-9460-bcce38d2bd70#033[00m
Jan 23 05:44:28 np0005593233 nova_compute[222017]: 2026-01-23 10:44:28.557 222021 DEBUG oslo_concurrency.lockutils [None req-4a3e08e6-68c1-40fe-8f01-b85b94af5e67 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "c23f23ba-4d94-44c7-9460-bcce38d2bd70" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.854s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:29 np0005593233 nova_compute[222017]: 2026-01-23 10:44:29.041 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:29.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:30 np0005593233 podman[301097]: 2026-01-23 10:44:30.128281441 +0000 UTC m=+0.127280394 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:44:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:30.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e397 e397: 3 total, 3 up, 3 in
Jan 23 05:44:31 np0005593233 nova_compute[222017]: 2026-01-23 10:44:31.324 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:31.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:31 np0005593233 nova_compute[222017]: 2026-01-23 10:44:31.774 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165056.7721298, 0a916952-341a-4caf-bf6f-6abe504830f9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:44:31 np0005593233 nova_compute[222017]: 2026-01-23 10:44:31.775 222021 INFO nova.compute.manager [-] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:44:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:32.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:33 np0005593233 nova_compute[222017]: 2026-01-23 10:44:33.366 222021 DEBUG nova.compute.manager [None req-fb3d901e-d902-4eb7-bdea-0b2ace8b1000 - - - - - -] [instance: 0a916952-341a-4caf-bf6f-6abe504830f9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:44:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:33.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.043 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:34.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.336 222021 DEBUG oslo_concurrency.lockutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "b23e0eb3-82ec-4c33-aedd-b815e9513866" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.338 222021 DEBUG oslo_concurrency.lockutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.338 222021 DEBUG oslo_concurrency.lockutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.339 222021 DEBUG oslo_concurrency.lockutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.339 222021 DEBUG oslo_concurrency.lockutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.341 222021 INFO nova.compute.manager [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Terminating instance#033[00m
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.344 222021 DEBUG nova.compute.manager [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:44:34 np0005593233 kernel: tapec48fcb1-8f (unregistering): left promiscuous mode
Jan 23 05:44:34 np0005593233 NetworkManager[48871]: <info>  [1769165074.7386] device (tapec48fcb1-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:44:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:34Z|00853|binding|INFO|Releasing lport ec48fcb1-8f75-412d-a33c-7e0896158f9a from this chassis (sb_readonly=0)
Jan 23 05:44:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:34Z|00854|binding|INFO|Setting lport ec48fcb1-8f75-412d-a33c-7e0896158f9a down in Southbound
Jan 23 05:44:34 np0005593233 ovn_controller[130653]: 2026-01-23T10:44:34Z|00855|binding|INFO|Removing iface tapec48fcb1-8f ovn-installed in OVS
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.756 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:34 np0005593233 nova_compute[222017]: 2026-01-23 10:44:34.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:34 np0005593233 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000c9.scope: Deactivated successfully.
Jan 23 05:44:34 np0005593233 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000c9.scope: Consumed 19.904s CPU time.
Jan 23 05:44:34 np0005593233 systemd-machined[190954]: Machine qemu-91-instance-000000c9 terminated.
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.009 222021 INFO nova.virt.libvirt.driver [-] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Instance destroyed successfully.#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.010 222021 DEBUG nova.objects.instance [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'resources' on Instance uuid b23e0eb3-82ec-4c33-aedd-b815e9513866 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:44:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.355 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:d5:ea 10.100.0.4'], port_security=['fa:16:3e:17:d5:ea 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b23e0eb3-82ec-4c33-aedd-b815e9513866', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '4', 'neutron:security_group_ids': '491382a0-febf-49cb-a75d-e59f1bfedc5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=ec48fcb1-8f75-412d-a33c-7e0896158f9a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.358 140224 INFO neutron.agent.ovn.metadata.agent [-] Port ec48fcb1-8f75-412d-a33c-7e0896158f9a in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 unbound from our chassis#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.360 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 72854481-c2f9-4651-8ba1-fe321a8a5546, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.362 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8f33e2af-ff89-4a04-bc82-9f337941b700]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.363 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 namespace which is not needed anymore#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.390 222021 DEBUG nova.virt.libvirt.vif [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:42:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-volume-backed-server-1926199215',display_name='tempest-TestVolumeBootPattern-volume-backed-server-1926199215',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-volume-backed-server-1926199215',id=201,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGZxNVxO0IvqUKejsONm/M7JuNET1Clz+bIx75ZwPGtswiNNpJd3BcCEmXn9C+CF23N06TGmkRLx9ZMUWkiPaF8xgmBrvIR54FA+yZMRLTRlCEQbyzx6123Om6WuFPlz2w==',key_name='tempest-keypair-962541773',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:42:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-73n2hjou',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:42:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=b23e0eb3-82ec-4c33-aedd-b815e9513866,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.391 222021 DEBUG nova.network.os_vif_util [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "address": "fa:16:3e:17:d5:ea", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec48fcb1-8f", "ovs_interfaceid": "ec48fcb1-8f75-412d-a33c-7e0896158f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.392 222021 DEBUG nova.network.os_vif_util [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:d5:ea,bridge_name='br-int',has_traffic_filtering=True,id=ec48fcb1-8f75-412d-a33c-7e0896158f9a,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec48fcb1-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.393 222021 DEBUG os_vif [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:d5:ea,bridge_name='br-int',has_traffic_filtering=True,id=ec48fcb1-8f75-412d-a33c-7e0896158f9a,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec48fcb1-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.397 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.398 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec48fcb1-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.411 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.417 222021 INFO os_vif [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:d5:ea,bridge_name='br-int',has_traffic_filtering=True,id=ec48fcb1-8f75-412d-a33c-7e0896158f9a,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec48fcb1-8f')#033[00m
Jan 23 05:44:35 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[300076]: [NOTICE]   (300080) : haproxy version is 2.8.14-c23fe91
Jan 23 05:44:35 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[300076]: [NOTICE]   (300080) : path to executable is /usr/sbin/haproxy
Jan 23 05:44:35 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[300076]: [WARNING]  (300080) : Exiting Master process...
Jan 23 05:44:35 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[300076]: [WARNING]  (300080) : Exiting Master process...
Jan 23 05:44:35 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[300076]: [ALERT]    (300080) : Current worker (300082) exited with code 143 (Terminated)
Jan 23 05:44:35 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[300076]: [WARNING]  (300080) : All workers exited. Exiting... (0)
Jan 23 05:44:35 np0005593233 systemd[1]: libpod-90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a.scope: Deactivated successfully.
Jan 23 05:44:35 np0005593233 podman[301177]: 2026-01-23 10:44:35.641207468 +0000 UTC m=+0.145698164 container died 90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:44:35 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a-userdata-shm.mount: Deactivated successfully.
Jan 23 05:44:35 np0005593233 systemd[1]: var-lib-containers-storage-overlay-7e3937f6583eefbafeef3cb98d49809559af97f43d9cddb55f9a719caa4a7594-merged.mount: Deactivated successfully.
Jan 23 05:44:35 np0005593233 podman[301177]: 2026-01-23 10:44:35.719167229 +0000 UTC m=+0.223657955 container cleanup 90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:44:35 np0005593233 systemd[1]: libpod-conmon-90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a.scope: Deactivated successfully.
Jan 23 05:44:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:35.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:35 np0005593233 podman[301205]: 2026-01-23 10:44:35.871208201 +0000 UTC m=+0.124484856 container remove 90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.879 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0c6fe1ee-bdf9-4c8e-a9c8-7f6e8ad175a1]: (4, ('Fri Jan 23 10:44:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 (90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a)\n90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a\nFri Jan 23 10:44:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 (90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a)\n90c64c006d2e04c186f50ddcf176e552baab0da2a39acb902be481eb178f5a1a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.881 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6c35fdaf-1862-48b1-93d9-c93caa365576]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.883 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.885 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:35 np0005593233 kernel: tap72854481-c0: left promiscuous mode
Jan 23 05:44:35 np0005593233 nova_compute[222017]: 2026-01-23 10:44:35.898 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.904 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[be51a869-2e5c-470a-b6e1-26447311f486]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.924 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5926a0-f168-4e37-9b2c-8a319adfa3d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.926 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a5b62932-7592-427e-acd5-54283f0c29b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.947 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c29ce64b-5958-4dda-9f75-691977bc16f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 887888, 'reachable_time': 31844, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301222, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:35 np0005593233 systemd[1]: run-netns-ovnmeta\x2d72854481\x2dc2f9\x2d4651\x2d8ba1\x2dfe321a8a5546.mount: Deactivated successfully.
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.950 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:44:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:35.950 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[ff012129-780a-4a09-ad97-a91bd50e5091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:44:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:36.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:44:36 np0005593233 nova_compute[222017]: 2026-01-23 10:44:36.326 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:36 np0005593233 nova_compute[222017]: 2026-01-23 10:44:36.977 222021 INFO nova.virt.libvirt.driver [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Deleting instance files /var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866_del#033[00m
Jan 23 05:44:36 np0005593233 nova_compute[222017]: 2026-01-23 10:44:36.979 222021 INFO nova.virt.libvirt.driver [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Deletion of /var/lib/nova/instances/b23e0eb3-82ec-4c33-aedd-b815e9513866_del complete#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.358 222021 DEBUG nova.compute.manager [req-8801c15f-ec2b-4187-b295-ff21994a453f req-d3d25bd8-b423-4f8f-84a3-cbe5c276a224 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received event network-vif-unplugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.360 222021 DEBUG oslo_concurrency.lockutils [req-8801c15f-ec2b-4187-b295-ff21994a453f req-d3d25bd8-b423-4f8f-84a3-cbe5c276a224 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.361 222021 DEBUG oslo_concurrency.lockutils [req-8801c15f-ec2b-4187-b295-ff21994a453f req-d3d25bd8-b423-4f8f-84a3-cbe5c276a224 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.361 222021 DEBUG oslo_concurrency.lockutils [req-8801c15f-ec2b-4187-b295-ff21994a453f req-d3d25bd8-b423-4f8f-84a3-cbe5c276a224 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.362 222021 DEBUG nova.compute.manager [req-8801c15f-ec2b-4187-b295-ff21994a453f req-d3d25bd8-b423-4f8f-84a3-cbe5c276a224 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] No waiting events found dispatching network-vif-unplugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.362 222021 DEBUG nova.compute.manager [req-8801c15f-ec2b-4187-b295-ff21994a453f req-d3d25bd8-b423-4f8f-84a3-cbe5c276a224 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received event network-vif-unplugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.616 222021 INFO nova.compute.manager [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Took 3.27 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.618 222021 DEBUG oslo.service.loopingcall [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.619 222021 DEBUG nova.compute.manager [-] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:44:37 np0005593233 nova_compute[222017]: 2026-01-23 10:44:37.619 222021 DEBUG nova.network.neutron [-] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:44:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:37.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:38.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:38 np0005593233 nova_compute[222017]: 2026-01-23 10:44:38.955 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165063.9532776, c23f23ba-4d94-44c7-9460-bcce38d2bd70 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:44:38 np0005593233 nova_compute[222017]: 2026-01-23 10:44:38.956 222021 INFO nova.compute.manager [-] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:44:39 np0005593233 nova_compute[222017]: 2026-01-23 10:44:39.256 222021 DEBUG nova.compute.manager [None req-1f4b167b-8851-4bf6-9e56-cd61242183c1 - - - - - -] [instance: c23f23ba-4d94-44c7-9460-bcce38d2bd70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:44:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e398 e398: 3 total, 3 up, 3 in
Jan 23 05:44:39 np0005593233 nova_compute[222017]: 2026-01-23 10:44:39.661 222021 DEBUG nova.network.neutron [-] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:44:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:39.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:40.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:40 np0005593233 nova_compute[222017]: 2026-01-23 10:44:40.412 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:40 np0005593233 nova_compute[222017]: 2026-01-23 10:44:40.741 222021 DEBUG nova.compute.manager [req-7836700d-2d66-4455-9909-2a1232b790b6 req-e1b7884d-91b2-42ab-be76-dacb990c040e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received event network-vif-deleted-ec48fcb1-8f75-412d-a33c-7e0896158f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:40 np0005593233 nova_compute[222017]: 2026-01-23 10:44:40.742 222021 INFO nova.compute.manager [req-7836700d-2d66-4455-9909-2a1232b790b6 req-e1b7884d-91b2-42ab-be76-dacb990c040e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Neutron deleted interface ec48fcb1-8f75-412d-a33c-7e0896158f9a; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:44:40 np0005593233 nova_compute[222017]: 2026-01-23 10:44:40.742 222021 DEBUG nova.network.neutron [req-7836700d-2d66-4455-9909-2a1232b790b6 req-e1b7884d-91b2-42ab-be76-dacb990c040e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:44:40 np0005593233 nova_compute[222017]: 2026-01-23 10:44:40.780 222021 INFO nova.compute.manager [-] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Took 3.16 seconds to deallocate network for instance.#033[00m
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.002 222021 DEBUG nova.compute.manager [req-d6272283-3986-4796-bed2-479df75590d0 req-8bd37444-4d4d-443b-aa51-ad3183f96490 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received event network-vif-plugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.003 222021 DEBUG oslo_concurrency.lockutils [req-d6272283-3986-4796-bed2-479df75590d0 req-8bd37444-4d4d-443b-aa51-ad3183f96490 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.005 222021 DEBUG oslo_concurrency.lockutils [req-d6272283-3986-4796-bed2-479df75590d0 req-8bd37444-4d4d-443b-aa51-ad3183f96490 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.006 222021 DEBUG oslo_concurrency.lockutils [req-d6272283-3986-4796-bed2-479df75590d0 req-8bd37444-4d4d-443b-aa51-ad3183f96490 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.006 222021 DEBUG nova.compute.manager [req-d6272283-3986-4796-bed2-479df75590d0 req-8bd37444-4d4d-443b-aa51-ad3183f96490 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] No waiting events found dispatching network-vif-plugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.006 222021 WARNING nova.compute.manager [req-d6272283-3986-4796-bed2-479df75590d0 req-8bd37444-4d4d-443b-aa51-ad3183f96490 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Received unexpected event network-vif-plugged-ec48fcb1-8f75-412d-a33c-7e0896158f9a for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.330 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.765 222021 DEBUG nova.compute.manager [req-7836700d-2d66-4455-9909-2a1232b790b6 req-e1b7884d-91b2-42ab-be76-dacb990c040e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Detach interface failed, port_id=ec48fcb1-8f75-412d-a33c-7e0896158f9a, reason: Instance b23e0eb3-82ec-4c33-aedd-b815e9513866 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:44:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:41.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.970 222021 INFO nova.compute.manager [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Took 1.19 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:44:41 np0005593233 nova_compute[222017]: 2026-01-23 10:44:41.972 222021 DEBUG nova.compute.manager [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Deleting volume: d77d9325-542a-4716-94b2-e66e8ceab532 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 23 05:44:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:42.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:42.707 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:42.707 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:44:42.708 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:43 np0005593233 nova_compute[222017]: 2026-01-23 10:44:43.568 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:43.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:44.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #175. Immutable memtables: 0.
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.586218) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 111] Flushing memtable with next log file: 175
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084586393, "job": 111, "event": "flush_started", "num_memtables": 1, "num_entries": 1426, "num_deletes": 259, "total_data_size": 3053109, "memory_usage": 3090816, "flush_reason": "Manual Compaction"}
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 111] Level-0 flush table #176: started
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084648451, "cf_name": "default", "job": 111, "event": "table_file_creation", "file_number": 176, "file_size": 1992865, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 83604, "largest_seqno": 85025, "table_properties": {"data_size": 1986614, "index_size": 3453, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13643, "raw_average_key_size": 20, "raw_value_size": 1973875, "raw_average_value_size": 2911, "num_data_blocks": 152, "num_entries": 678, "num_filter_entries": 678, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164982, "oldest_key_time": 1769164982, "file_creation_time": 1769165084, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 176, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 111] Flush lasted 62265 microseconds, and 7823 cpu microseconds.
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.648511) [db/flush_job.cc:967] [default] [JOB 111] Level-0 flush table #176: 1992865 bytes OK
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.648541) [db/memtable_list.cc:519] [default] Level-0 commit table #176 started
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.693070) [db/memtable_list.cc:722] [default] Level-0 commit table #176: memtable #1 done
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.693171) EVENT_LOG_v1 {"time_micros": 1769165084693151, "job": 111, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.693215) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 111] Try to delete WAL files size 3046240, prev total WAL file size 3046240, number of live WAL files 2.
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000172.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.694787) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033323637' seq:72057594037927935, type:22 .. '6C6F676D0033353230' seq:0, type:0; will stop at (end)
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 112] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 111 Base level 0, inputs: [176(1946KB)], [174(10MB)]
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084694889, "job": 112, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [176], "files_L6": [174], "score": -1, "input_data_size": 13372546, "oldest_snapshot_seqno": -1}
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 112] Generated table #177: 10319 keys, 13231324 bytes, temperature: kUnknown
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084843302, "cf_name": "default", "job": 112, "event": "table_file_creation", "file_number": 177, "file_size": 13231324, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13164924, "index_size": 39471, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25861, "raw_key_size": 272832, "raw_average_key_size": 26, "raw_value_size": 12984715, "raw_average_value_size": 1258, "num_data_blocks": 1504, "num_entries": 10319, "num_filter_entries": 10319, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769165084, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 177, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.843811) [db/compaction/compaction_job.cc:1663] [default] [JOB 112] Compacted 1@0 + 1@6 files to L6 => 13231324 bytes
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.845695) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.0 rd, 89.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 10.9 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(13.3) write-amplify(6.6) OK, records in: 10854, records dropped: 535 output_compression: NoCompression
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.845734) EVENT_LOG_v1 {"time_micros": 1769165084845719, "job": 112, "event": "compaction_finished", "compaction_time_micros": 148518, "compaction_time_cpu_micros": 82724, "output_level": 6, "num_output_files": 1, "total_output_size": 13231324, "num_input_records": 10854, "num_output_records": 10319, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000176.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084846623, "job": 112, "event": "table_file_deletion", "file_number": 176}
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000174.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084848974, "job": 112, "event": "table_file_deletion", "file_number": 174}
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.694570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.849130) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.849141) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.849144) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.849148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:44:44.849151) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:45 np0005593233 nova_compute[222017]: 2026-01-23 10:44:45.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:45.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:46 np0005593233 podman[301227]: 2026-01-23 10:44:46.093639679 +0000 UTC m=+0.089856067 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:44:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:46.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:46 np0005593233 nova_compute[222017]: 2026-01-23 10:44:46.333 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:47 np0005593233 nova_compute[222017]: 2026-01-23 10:44:47.417 222021 DEBUG oslo_concurrency.lockutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:47 np0005593233 nova_compute[222017]: 2026-01-23 10:44:47.417 222021 DEBUG oslo_concurrency.lockutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:47 np0005593233 nova_compute[222017]: 2026-01-23 10:44:47.494 222021 DEBUG oslo_concurrency.processutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:44:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:47.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:44:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3985152859' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:44:48 np0005593233 nova_compute[222017]: 2026-01-23 10:44:48.079 222021 DEBUG oslo_concurrency.processutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.585s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:44:48 np0005593233 nova_compute[222017]: 2026-01-23 10:44:48.090 222021 DEBUG nova.compute.provider_tree [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:44:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:48.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:49 np0005593233 nova_compute[222017]: 2026-01-23 10:44:49.070 222021 DEBUG nova.scheduler.client.report [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:44:49 np0005593233 nova_compute[222017]: 2026-01-23 10:44:49.111 222021 DEBUG oslo_concurrency.lockutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:49 np0005593233 nova_compute[222017]: 2026-01-23 10:44:49.287 222021 INFO nova.scheduler.client.report [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Deleted allocations for instance b23e0eb3-82ec-4c33-aedd-b815e9513866#033[00m
Jan 23 05:44:49 np0005593233 nova_compute[222017]: 2026-01-23 10:44:49.381 222021 DEBUG oslo_concurrency.lockutils [None req-bfabbf49-cf95-407b-a6f1-579c5f0a6642 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "b23e0eb3-82ec-4c33-aedd-b815e9513866" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 15.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:49 np0005593233 nova_compute[222017]: 2026-01-23 10:44:49.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:44:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:49.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:44:50 np0005593233 nova_compute[222017]: 2026-01-23 10:44:50.006 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165075.0040498, b23e0eb3-82ec-4c33-aedd-b815e9513866 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:44:50 np0005593233 nova_compute[222017]: 2026-01-23 10:44:50.006 222021 INFO nova.compute.manager [-] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:44:50 np0005593233 nova_compute[222017]: 2026-01-23 10:44:50.034 222021 DEBUG nova.compute.manager [None req-6f6c9759-502c-416f-9e16-c8647c1686ed - - - - - -] [instance: b23e0eb3-82ec-4c33-aedd-b815e9513866] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:44:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:50.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:50 np0005593233 nova_compute[222017]: 2026-01-23 10:44:50.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:50 np0005593233 nova_compute[222017]: 2026-01-23 10:44:50.505 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:50 np0005593233 nova_compute[222017]: 2026-01-23 10:44:50.702 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:51 np0005593233 nova_compute[222017]: 2026-01-23 10:44:51.336 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:51.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:52.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e399 e399: 3 total, 3 up, 3 in
Jan 23 05:44:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:53.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:54.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:55 np0005593233 nova_compute[222017]: 2026-01-23 10:44:55.408 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:55 np0005593233 nova_compute[222017]: 2026-01-23 10:44:55.409 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:44:55 np0005593233 nova_compute[222017]: 2026-01-23 10:44:55.426 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 05:44:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:55.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 05:44:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:56.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:56 np0005593233 nova_compute[222017]: 2026-01-23 10:44:56.339 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:57.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:44:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:58.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:58 np0005593233 nova_compute[222017]: 2026-01-23 10:44:58.414 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:58 np0005593233 nova_compute[222017]: 2026-01-23 10:44:58.415 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:44:58 np0005593233 nova_compute[222017]: 2026-01-23 10:44:58.594 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:44:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 e400: 3 total, 3 up, 3 in
Jan 23 05:44:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:44:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:44:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:59.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:00.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:00 np0005593233 nova_compute[222017]: 2026-01-23 10:45:00.429 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:01 np0005593233 podman[301271]: 2026-01-23 10:45:01.115322878 +0000 UTC m=+0.124493925 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:45:01 np0005593233 nova_compute[222017]: 2026-01-23 10:45:01.343 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:01.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:02.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:03.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:04.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:05 np0005593233 nova_compute[222017]: 2026-01-23 10:45:05.432 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:45:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:05.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:45:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:06.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:06 np0005593233 nova_compute[222017]: 2026-01-23 10:45:06.346 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:07 np0005593233 nova_compute[222017]: 2026-01-23 10:45:07.566 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:07 np0005593233 nova_compute[222017]: 2026-01-23 10:45:07.604 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:07 np0005593233 nova_compute[222017]: 2026-01-23 10:45:07.604 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:07 np0005593233 nova_compute[222017]: 2026-01-23 10:45:07.605 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:07 np0005593233 nova_compute[222017]: 2026-01-23 10:45:07.605 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:45:07 np0005593233 nova_compute[222017]: 2026-01-23 10:45:07.606 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:07.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:45:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1830729616' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.079 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:08.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.396 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.397 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4267MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.397 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.398 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.485 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.485 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.502 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:45:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/702559663' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.971 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:08 np0005593233 nova_compute[222017]: 2026-01-23 10:45:08.978 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:45:09 np0005593233 nova_compute[222017]: 2026-01-23 10:45:09.049 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:45:09 np0005593233 nova_compute[222017]: 2026-01-23 10:45:09.393 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:45:09 np0005593233 nova_compute[222017]: 2026-01-23 10:45:09.394 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.996s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:10.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:10 np0005593233 nova_compute[222017]: 2026-01-23 10:45:10.435 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:11 np0005593233 nova_compute[222017]: 2026-01-23 10:45:11.214 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:11 np0005593233 nova_compute[222017]: 2026-01-23 10:45:11.215 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:11 np0005593233 nova_compute[222017]: 2026-01-23 10:45:11.215 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:11 np0005593233 nova_compute[222017]: 2026-01-23 10:45:11.215 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:45:11 np0005593233 nova_compute[222017]: 2026-01-23 10:45:11.349 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:11.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:12.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:13.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:45:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:14.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:45:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:15 np0005593233 nova_compute[222017]: 2026-01-23 10:45:15.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:15 np0005593233 nova_compute[222017]: 2026-01-23 10:45:15.445 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:15.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:16.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:16 np0005593233 nova_compute[222017]: 2026-01-23 10:45:16.364 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:16 np0005593233 nova_compute[222017]: 2026-01-23 10:45:16.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:17 np0005593233 podman[301347]: 2026-01-23 10:45:17.056872094 +0000 UTC m=+0.069667727 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 23 05:45:17 np0005593233 nova_compute[222017]: 2026-01-23 10:45:17.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:17.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:18.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:19 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:19 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:19.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:20.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:20 np0005593233 nova_compute[222017]: 2026-01-23 10:45:20.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:20 np0005593233 nova_compute[222017]: 2026-01-23 10:45:20.448 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:45:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:45:21 np0005593233 nova_compute[222017]: 2026-01-23 10:45:21.411 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:21.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:22.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:23 np0005593233 nova_compute[222017]: 2026-01-23 10:45:23.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:23 np0005593233 nova_compute[222017]: 2026-01-23 10:45:23.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:45:23 np0005593233 nova_compute[222017]: 2026-01-23 10:45:23.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:45:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:23.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:24 np0005593233 nova_compute[222017]: 2026-01-23 10:45:24.238 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:45:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:24.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:24 np0005593233 nova_compute[222017]: 2026-01-23 10:45:24.584 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:45:24.584 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:45:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:45:24.587 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:45:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:25 np0005593233 nova_compute[222017]: 2026-01-23 10:45:25.452 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:45:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:25.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:45:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:26.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:26 np0005593233 nova_compute[222017]: 2026-01-23 10:45:26.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:27.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:28.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:45:28.590 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:45:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:29.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:30.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:30 np0005593233 nova_compute[222017]: 2026-01-23 10:45:30.455 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:31 np0005593233 nova_compute[222017]: 2026-01-23 10:45:31.417 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:31.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:32 np0005593233 podman[301670]: 2026-01-23 10:45:32.084205093 +0000 UTC m=+0.090231558 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:45:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:32.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:45:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:33.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:45:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:34.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:35 np0005593233 nova_compute[222017]: 2026-01-23 10:45:35.480 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:35.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:36.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:36 np0005593233 nova_compute[222017]: 2026-01-23 10:45:36.419 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:37.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:38.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:39.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:40.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:40 np0005593233 nova_compute[222017]: 2026-01-23 10:45:40.483 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:41 np0005593233 nova_compute[222017]: 2026-01-23 10:45:41.490 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:41.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:42.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:45:42.708 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:45:42.709 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:45:42.709 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:43.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:44.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:45 np0005593233 nova_compute[222017]: 2026-01-23 10:45:45.487 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:45.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:46.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:46 np0005593233 nova_compute[222017]: 2026-01-23 10:45:46.528 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:47.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:48 np0005593233 podman[301699]: 2026-01-23 10:45:48.058891366 +0000 UTC m=+0.062869346 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:45:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:48.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:49.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:50.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:50 np0005593233 nova_compute[222017]: 2026-01-23 10:45:50.490 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:51 np0005593233 nova_compute[222017]: 2026-01-23 10:45:51.570 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:45:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:51.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:45:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:45:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:52.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:45:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:53.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:54.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:55 np0005593233 nova_compute[222017]: 2026-01-23 10:45:55.494 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:55.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:56.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:56 np0005593233 nova_compute[222017]: 2026-01-23 10:45:56.655 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:57.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:45:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:58.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:45:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:45:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:59.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:00.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:00 np0005593233 nova_compute[222017]: 2026-01-23 10:46:00.498 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:01 np0005593233 nova_compute[222017]: 2026-01-23 10:46:01.656 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:01.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e401 e401: 3 total, 3 up, 3 in
Jan 23 05:46:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:02.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:03 np0005593233 podman[301720]: 2026-01-23 10:46:03.132357899 +0000 UTC m=+0.137577354 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:46:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:03.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:04.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e401 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:05 np0005593233 nova_compute[222017]: 2026-01-23 10:46:05.503 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:46:05.918 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:46:05 np0005593233 nova_compute[222017]: 2026-01-23 10:46:05.918 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:46:05.921 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:46:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:46:05.922 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:46:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:05.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:06.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:06 np0005593233 nova_compute[222017]: 2026-01-23 10:46:06.659 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e402 e402: 3 total, 3 up, 3 in
Jan 23 05:46:07 np0005593233 nova_compute[222017]: 2026-01-23 10:46:07.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:07 np0005593233 nova_compute[222017]: 2026-01-23 10:46:07.434 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:07 np0005593233 nova_compute[222017]: 2026-01-23 10:46:07.434 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:07 np0005593233 nova_compute[222017]: 2026-01-23 10:46:07.435 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:07 np0005593233 nova_compute[222017]: 2026-01-23 10:46:07.435 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:46:07 np0005593233 nova_compute[222017]: 2026-01-23 10:46:07.435 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e403 e403: 3 total, 3 up, 3 in
Jan 23 05:46:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:46:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1638950774' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:46:07 np0005593233 nova_compute[222017]: 2026-01-23 10:46:07.906 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:07.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.146 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.148 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4219MB free_disk=20.942649841308594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.149 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.149 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.251 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.252 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.272 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.305 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.306 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.327 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.359 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.390 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:46:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:46:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:08.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:46:08 np0005593233 ovn_controller[130653]: 2026-01-23T10:46:08Z|00856|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 23 05:46:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:46:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/502297218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.884 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.894 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.944 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.948 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:46:08 np0005593233 nova_compute[222017]: 2026-01-23 10:46:08.949 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:46:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e404 e404: 3 total, 3 up, 3 in
Jan 23 05:46:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:46:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:09.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:46:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:10.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:10 np0005593233 nova_compute[222017]: 2026-01-23 10:46:10.506 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:11 np0005593233 nova_compute[222017]: 2026-01-23 10:46:11.662 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:11.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:11 np0005593233 nova_compute[222017]: 2026-01-23 10:46:11.950 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:46:11 np0005593233 nova_compute[222017]: 2026-01-23 10:46:11.951 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:46:11 np0005593233 nova_compute[222017]: 2026-01-23 10:46:11.951 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:46:11 np0005593233 nova_compute[222017]: 2026-01-23 10:46:11.951 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:46:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:12.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:13.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:14.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:15 np0005593233 nova_compute[222017]: 2026-01-23 10:46:15.511 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:15.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:16.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:16 np0005593233 nova_compute[222017]: 2026-01-23 10:46:16.666 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:17 np0005593233 nova_compute[222017]: 2026-01-23 10:46:17.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #178. Immutable memtables: 0.
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:17.914667) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 113] Flushing memtable with next log file: 178
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165177914702, "job": 113, "event": "flush_started", "num_memtables": 1, "num_entries": 1261, "num_deletes": 254, "total_data_size": 2630623, "memory_usage": 2675472, "flush_reason": "Manual Compaction"}
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 113] Level-0 flush table #179: started
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165177930689, "cf_name": "default", "job": 113, "event": "table_file_creation", "file_number": 179, "file_size": 1736357, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 85030, "largest_seqno": 86286, "table_properties": {"data_size": 1730819, "index_size": 2932, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12456, "raw_average_key_size": 20, "raw_value_size": 1719490, "raw_average_value_size": 2814, "num_data_blocks": 128, "num_entries": 611, "num_filter_entries": 611, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165085, "oldest_key_time": 1769165085, "file_creation_time": 1769165177, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 179, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 113] Flush lasted 16143 microseconds, and 5433 cpu microseconds.
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:17.930798) [db/flush_job.cc:967] [default] [JOB 113] Level-0 flush table #179: 1736357 bytes OK
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:17.930835) [db/memtable_list.cc:519] [default] Level-0 commit table #179 started
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:17.933366) [db/memtable_list.cc:722] [default] Level-0 commit table #179: memtable #1 done
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:17.933396) EVENT_LOG_v1 {"time_micros": 1769165177933384, "job": 113, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:17.933424) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 113] Try to delete WAL files size 2624506, prev total WAL file size 2624506, number of live WAL files 2.
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000175.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:17.935094) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037353330' seq:72057594037927935, type:22 .. '7061786F730037373832' seq:0, type:0; will stop at (end)
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 114] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 113 Base level 0, inputs: [179(1695KB)], [177(12MB)]
Jan 23 05:46:17 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165177935142, "job": 114, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [179], "files_L6": [177], "score": -1, "input_data_size": 14967681, "oldest_snapshot_seqno": -1}
Jan 23 05:46:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:17.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 114] Generated table #180: 10403 keys, 13036219 bytes, temperature: kUnknown
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165178103254, "cf_name": "default", "job": 114, "event": "table_file_creation", "file_number": 180, "file_size": 13036219, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12969485, "index_size": 39613, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 275328, "raw_average_key_size": 26, "raw_value_size": 12787862, "raw_average_value_size": 1229, "num_data_blocks": 1504, "num_entries": 10403, "num_filter_entries": 10403, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769165177, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 180, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:18.103557) [db/compaction/compaction_job.cc:1663] [default] [JOB 114] Compacted 1@0 + 1@6 files to L6 => 13036219 bytes
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:18.104932) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 89.0 rd, 77.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 12.6 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(16.1) write-amplify(7.5) OK, records in: 10930, records dropped: 527 output_compression: NoCompression
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:18.104949) EVENT_LOG_v1 {"time_micros": 1769165178104941, "job": 114, "event": "compaction_finished", "compaction_time_micros": 168214, "compaction_time_cpu_micros": 36628, "output_level": 6, "num_output_files": 1, "total_output_size": 13036219, "num_input_records": 10930, "num_output_records": 10403, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000179.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165178105310, "job": 114, "event": "table_file_deletion", "file_number": 179}
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000177.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165178107201, "job": 114, "event": "table_file_deletion", "file_number": 177}
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:17.935003) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:18.107267) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:18.107273) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:18.107276) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:18.107278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:46:18.107280) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593233 nova_compute[222017]: 2026-01-23 10:46:18.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:46:18 np0005593233 nova_compute[222017]: 2026-01-23 10:46:18.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:46:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:18.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:19 np0005593233 podman[301794]: 2026-01-23 10:46:19.05712097 +0000 UTC m=+0.071436008 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:46:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:19.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:20.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:20 np0005593233 nova_compute[222017]: 2026-01-23 10:46:20.514 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:21 np0005593233 nova_compute[222017]: 2026-01-23 10:46:21.673 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:46:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:21.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:46:22 np0005593233 nova_compute[222017]: 2026-01-23 10:46:22.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:46:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:22.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:23 np0005593233 nova_compute[222017]: 2026-01-23 10:46:23.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:46:23 np0005593233 nova_compute[222017]: 2026-01-23 10:46:23.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:46:23 np0005593233 nova_compute[222017]: 2026-01-23 10:46:23.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:46:23 np0005593233 nova_compute[222017]: 2026-01-23 10:46:23.407 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 05:46:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:23.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:24.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:25 np0005593233 nova_compute[222017]: 2026-01-23 10:46:25.518 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:25.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:26.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:26 np0005593233 nova_compute[222017]: 2026-01-23 10:46:26.671 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:27.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:28.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:46:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:46:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:46:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:46:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:29.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e405 e405: 3 total, 3 up, 3 in
Jan 23 05:46:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e405 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:46:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:30.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:46:30 np0005593233 nova_compute[222017]: 2026-01-23 10:46:30.522 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:31 np0005593233 nova_compute[222017]: 2026-01-23 10:46:31.673 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:31.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:32.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:33.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:34 np0005593233 podman[301945]: 2026-01-23 10:46:34.115904217 +0000 UTC m=+0.117323973 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:46:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:34.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e406 e406: 3 total, 3 up, 3 in
Jan 23 05:46:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:35 np0005593233 nova_compute[222017]: 2026-01-23 10:46:35.526 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:35.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:36.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:36 np0005593233 nova_compute[222017]: 2026-01-23 10:46:36.675 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:37.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:38.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:39.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:40.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:40 np0005593233 nova_compute[222017]: 2026-01-23 10:46:40.529 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:41 np0005593233 nova_compute[222017]: 2026-01-23 10:46:41.402 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:41 np0005593233 nova_compute[222017]: 2026-01-23 10:46:41.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:41.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:42.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:46:42.709 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:46:42.709 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:46:42.710 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:43.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:44.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:45 np0005593233 nova_compute[222017]: 2026-01-23 10:46:45.531 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:45.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:46.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:46 np0005593233 nova_compute[222017]: 2026-01-23 10:46:46.678 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:48.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:48.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:50.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:50 np0005593233 podman[302021]: 2026-01-23 10:46:50.073514296 +0000 UTC m=+0.073609739 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:46:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:50.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:50 np0005593233 nova_compute[222017]: 2026-01-23 10:46:50.535 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 e407: 3 total, 3 up, 3 in
Jan 23 05:46:51 np0005593233 nova_compute[222017]: 2026-01-23 10:46:51.682 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:52.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:52.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:54.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:54.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:55 np0005593233 nova_compute[222017]: 2026-01-23 10:46:55.538 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:56.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:46:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:56.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:46:56 np0005593233 nova_compute[222017]: 2026-01-23 10:46:56.735 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:57 np0005593233 nova_compute[222017]: 2026-01-23 10:46:57.826 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "5fde5da5-03bb-4c07-bd48-b634468000ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:57 np0005593233 nova_compute[222017]: 2026-01-23 10:46:57.827 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:57 np0005593233 nova_compute[222017]: 2026-01-23 10:46:57.845 222021 DEBUG nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:46:57 np0005593233 nova_compute[222017]: 2026-01-23 10:46:57.922 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:57 np0005593233 nova_compute[222017]: 2026-01-23 10:46:57.923 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:57 np0005593233 nova_compute[222017]: 2026-01-23 10:46:57.932 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:46:57 np0005593233 nova_compute[222017]: 2026-01-23 10:46:57.932 222021 INFO nova.compute.claims [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:46:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:58 np0005593233 nova_compute[222017]: 2026-01-23 10:46:58.053 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:46:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:46:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:58.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.160 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.169 222021 DEBUG nova.compute.provider_tree [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.190 222021 DEBUG nova.scheduler.client.report [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.214 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.215 222021 DEBUG nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.355 222021 DEBUG nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.356 222021 DEBUG nova.network.neutron [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.390 222021 INFO nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.417 222021 DEBUG nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.479 222021 INFO nova.virt.block_device [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Booting with volume 5e6f1d6f-902b-4bb5-b9af-c82516af3452 at /dev/vda#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.614 222021 DEBUG nova.policy [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb70c3aee8b64273a1930c0c2c231aff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd27c5465284b48a5818ef931d6251c43', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.648 222021 DEBUG os_brick.utils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.651 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.670 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.671 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0f5db9-9023-4e13-84ba-d55ba628f91d]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.673 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.688 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.689 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5b917b-ab7f-441f-9c91-c6644dbacb0b]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.691 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.704 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.704 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb5b63a-f8ff-4add-8069-d6104b9db326]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.706 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[9a2a69f1-66e7-4841-b35a-064b9e906126]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.707 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.761 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "nvme version" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.763 222021 DEBUG os_brick.initiator.connectors.lightos [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.764 222021 DEBUG os_brick.initiator.connectors.lightos [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.764 222021 DEBUG os_brick.initiator.connectors.lightos [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.765 222021 DEBUG os_brick.utils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] <== get_connector_properties: return (116ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:46:59 np0005593233 nova_compute[222017]: 2026-01-23 10:46:59.765 222021 DEBUG nova.virt.block_device [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updating existing volume attachment record: d54e8e76-7777-400a-a968-2fa448f2339a _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:47:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:00.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:47:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:00.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:47:00 np0005593233 nova_compute[222017]: 2026-01-23 10:47:00.541 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:00.916 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:47:00 np0005593233 nova_compute[222017]: 2026-01-23 10:47:00.917 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:00 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:00.918 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.154 222021 DEBUG nova.network.neutron [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Successfully created port: fc0b98b6-65f9-4127-b86d-9dd37c53eb20 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.181 222021 DEBUG nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.182 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.182 222021 INFO nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Creating image(s)#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.183 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.183 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Ensure instance console log exists: /var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.184 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.184 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.184 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.737 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.878 222021 DEBUG nova.network.neutron [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Successfully updated port: fc0b98b6-65f9-4127-b86d-9dd37c53eb20 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.894 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.895 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquired lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.895 222021 DEBUG nova.network.neutron [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:47:01 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:01.920 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.983 222021 DEBUG nova.compute.manager [req-b380b10a-0aed-41e3-9ea9-c10aa7f601da req-8dd5fb0f-269e-41ad-af23-6ea0397666ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received event network-changed-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.983 222021 DEBUG nova.compute.manager [req-b380b10a-0aed-41e3-9ea9-c10aa7f601da req-8dd5fb0f-269e-41ad-af23-6ea0397666ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Refreshing instance network info cache due to event network-changed-fc0b98b6-65f9-4127-b86d-9dd37c53eb20. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:47:01 np0005593233 nova_compute[222017]: 2026-01-23 10:47:01.984 222021 DEBUG oslo_concurrency.lockutils [req-b380b10a-0aed-41e3-9ea9-c10aa7f601da req-8dd5fb0f-269e-41ad-af23-6ea0397666ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:47:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:02.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:02 np0005593233 nova_compute[222017]: 2026-01-23 10:47:02.476 222021 DEBUG nova.network.neutron [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:47:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:02.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.514 222021 DEBUG nova.network.neutron [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updating instance_info_cache with network_info: [{"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.539 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Releasing lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.539 222021 DEBUG nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Instance network_info: |[{"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.539 222021 DEBUG oslo_concurrency.lockutils [req-b380b10a-0aed-41e3-9ea9-c10aa7f601da req-8dd5fb0f-269e-41ad-af23-6ea0397666ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.540 222021 DEBUG nova.network.neutron [req-b380b10a-0aed-41e3-9ea9-c10aa7f601da req-8dd5fb0f-269e-41ad-af23-6ea0397666ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Refreshing network info cache for port fc0b98b6-65f9-4127-b86d-9dd37c53eb20 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.543 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Start _get_guest_xml network_info=[{"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-5e6f1d6f-902b-4bb5-b9af-c82516af3452', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '5e6f1d6f-902b-4bb5-b9af-c82516af3452', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '5fde5da5-03bb-4c07-bd48-b634468000ac', 'attached_at': '', 'detached_at': '', 'volume_id': '5e6f1d6f-902b-4bb5-b9af-c82516af3452', 'serial': '5e6f1d6f-902b-4bb5-b9af-c82516af3452'}, 'delete_on_termination': False, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'd54e8e76-7777-400a-a968-2fa448f2339a', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.552 222021 WARNING nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.559 222021 DEBUG nova.virt.libvirt.host [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.559 222021 DEBUG nova.virt.libvirt.host [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.567 222021 DEBUG nova.virt.libvirt.host [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.567 222021 DEBUG nova.virt.libvirt.host [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.569 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.569 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.570 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.570 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.571 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.571 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.571 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.571 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.572 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.572 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.572 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.573 222021 DEBUG nova.virt.hardware [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.608 222021 DEBUG nova.storage.rbd_utils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image 5fde5da5-03bb-4c07-bd48-b634468000ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:03 np0005593233 nova_compute[222017]: 2026-01-23 10:47:03.613 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:04.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:47:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1259140529' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.078 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.108 222021 DEBUG nova.virt.libvirt.vif [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:46:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-2039370606',display_name='tempest-TestVolumeBootPattern-server-2039370606',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-2039370606',id=207,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIC+r2L0+q9PBY99aBdo5XmxGSuvAYYpgonPD6n/fI0f2obfQ97Vwf3Ee9Eeaa+EcBJLE3HyG34aomsAcNn3g+1JMNr5TfA5Vs6CyFMbO26Wy1l0eWLB9DEd+vOA6lCnxQ==',key_name='tempest-TestVolumeBootPattern-909159234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-jyw18g1l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:46:59Z,user_data=None,user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=5fde5da5-03bb-4c07-bd48-b634468000ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.109 222021 DEBUG nova.network.os_vif_util [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.110 222021 DEBUG nova.network.os_vif_util [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:33:7a,bridge_name='br-int',has_traffic_filtering=True,id=fc0b98b6-65f9-4127-b86d-9dd37c53eb20,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0b98b6-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.111 222021 DEBUG nova.objects.instance [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5fde5da5-03bb-4c07-bd48-b634468000ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.130 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <uuid>5fde5da5-03bb-4c07-bd48-b634468000ac</uuid>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <name>instance-000000cf</name>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestVolumeBootPattern-server-2039370606</nova:name>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:47:03</nova:creationTime>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <nova:user uuid="eb70c3aee8b64273a1930c0c2c231aff">tempest-TestVolumeBootPattern-2139361132-project-member</nova:user>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <nova:project uuid="d27c5465284b48a5818ef931d6251c43">tempest-TestVolumeBootPattern-2139361132</nova:project>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <nova:port uuid="fc0b98b6-65f9-4127-b86d-9dd37c53eb20">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <entry name="serial">5fde5da5-03bb-4c07-bd48-b634468000ac</entry>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <entry name="uuid">5fde5da5-03bb-4c07-bd48-b634468000ac</entry>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/5fde5da5-03bb-4c07-bd48-b634468000ac_disk.config">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-5e6f1d6f-902b-4bb5-b9af-c82516af3452">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <serial>5e6f1d6f-902b-4bb5-b9af-c82516af3452</serial>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:fc:33:7a"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <target dev="tapfc0b98b6-65"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac/console.log" append="off"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:47:04 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:47:04 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:47:04 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:47:04 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.132 222021 DEBUG nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Preparing to wait for external event network-vif-plugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.132 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.132 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.133 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.134 222021 DEBUG nova.virt.libvirt.vif [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:46:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-2039370606',display_name='tempest-TestVolumeBootPattern-server-2039370606',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-2039370606',id=207,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIC+r2L0+q9PBY99aBdo5XmxGSuvAYYpgonPD6n/fI0f2obfQ97Vwf3Ee9Eeaa+EcBJLE3HyG34aomsAcNn3g+1JMNr5TfA5Vs6CyFMbO26Wy1l0eWLB9DEd+vOA6lCnxQ==',key_name='tempest-TestVolumeBootPattern-909159234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-jyw18g1l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:46:59Z,user_data=None,user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=5fde5da5-03bb-4c07-bd48-b634468000ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.134 222021 DEBUG nova.network.os_vif_util [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.135 222021 DEBUG nova.network.os_vif_util [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:33:7a,bridge_name='br-int',has_traffic_filtering=True,id=fc0b98b6-65f9-4127-b86d-9dd37c53eb20,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0b98b6-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.135 222021 DEBUG os_vif [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:33:7a,bridge_name='br-int',has_traffic_filtering=True,id=fc0b98b6-65f9-4127-b86d-9dd37c53eb20,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0b98b6-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.136 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.137 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.137 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.141 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.142 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc0b98b6-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.143 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfc0b98b6-65, col_values=(('external_ids', {'iface-id': 'fc0b98b6-65f9-4127-b86d-9dd37c53eb20', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:33:7a', 'vm-uuid': '5fde5da5-03bb-4c07-bd48-b634468000ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.145 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:04 np0005593233 NetworkManager[48871]: <info>  [1769165224.1467] manager: (tapfc0b98b6-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/386)
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.148 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.157 222021 INFO os_vif [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:33:7a,bridge_name='br-int',has_traffic_filtering=True,id=fc0b98b6-65f9-4127-b86d-9dd37c53eb20,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0b98b6-65')#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.207 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.207 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.207 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No VIF found with MAC fa:16:3e:fc:33:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.208 222021 INFO nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Using config drive#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.232 222021 DEBUG nova.storage.rbd_utils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image 5fde5da5-03bb-4c07-bd48-b634468000ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:04.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.666 222021 INFO nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Creating config drive at /var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac/disk.config#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.671 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf4mhy594 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.814 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf4mhy594" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.847 222021 DEBUG nova.storage.rbd_utils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image 5fde5da5-03bb-4c07-bd48-b634468000ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:04 np0005593233 nova_compute[222017]: 2026-01-23 10:47:04.852 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac/disk.config 5fde5da5-03bb-4c07-bd48-b634468000ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.089 222021 DEBUG nova.network.neutron [req-b380b10a-0aed-41e3-9ea9-c10aa7f601da req-8dd5fb0f-269e-41ad-af23-6ea0397666ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updated VIF entry in instance network info cache for port fc0b98b6-65f9-4127-b86d-9dd37c53eb20. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.090 222021 DEBUG nova.network.neutron [req-b380b10a-0aed-41e3-9ea9-c10aa7f601da req-8dd5fb0f-269e-41ad-af23-6ea0397666ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updating instance_info_cache with network_info: [{"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.115 222021 DEBUG oslo_concurrency.lockutils [req-b380b10a-0aed-41e3-9ea9-c10aa7f601da req-8dd5fb0f-269e-41ad-af23-6ea0397666ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.119 222021 DEBUG oslo_concurrency.processutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac/disk.config 5fde5da5-03bb-4c07-bd48-b634468000ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.266s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.120 222021 INFO nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Deleting local config drive /var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac/disk.config because it was imported into RBD.#033[00m
Jan 23 05:47:05 np0005593233 podman[302167]: 2026-01-23 10:47:05.13120294 +0000 UTC m=+0.143131901 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:47:05 np0005593233 kernel: tapfc0b98b6-65: entered promiscuous mode
Jan 23 05:47:05 np0005593233 NetworkManager[48871]: <info>  [1769165225.2196] manager: (tapfc0b98b6-65): new Tun device (/org/freedesktop/NetworkManager/Devices/387)
Jan 23 05:47:05 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:05Z|00857|binding|INFO|Claiming lport fc0b98b6-65f9-4127-b86d-9dd37c53eb20 for this chassis.
Jan 23 05:47:05 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:05Z|00858|binding|INFO|fc0b98b6-65f9-4127-b86d-9dd37c53eb20: Claiming fa:16:3e:fc:33:7a 10.100.0.4
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.221 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.236 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.239 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 NetworkManager[48871]: <info>  [1769165225.2424] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/388)
Jan 23 05:47:05 np0005593233 NetworkManager[48871]: <info>  [1769165225.2447] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/389)
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.242 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:33:7a 10.100.0.4'], port_security=['fa:16:3e:fc:33:7a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '5fde5da5-03bb-4c07-bd48-b634468000ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dbb54d44-fc85-485d-96a6-e6e12258a95a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=fc0b98b6-65f9-4127-b86d-9dd37c53eb20) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.244 140224 INFO neutron.agent.ovn.metadata.agent [-] Port fc0b98b6-65f9-4127-b86d-9dd37c53eb20 in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 bound to our chassis#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.245 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 72854481-c2f9-4651-8ba1-fe321a8a5546#033[00m
Jan 23 05:47:05 np0005593233 systemd-udevd[302208]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.267 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2a02f7e0-d77d-4b06-870d-cde8c7a331db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.269 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap72854481-c1 in ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.271 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap72854481-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.272 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1aaaca-076b-40c0-8d89-c3ceb383243d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.273 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[401040c3-9995-4394-9a14-aaed0d6b5886]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 systemd-machined[190954]: New machine qemu-93-instance-000000cf.
Jan 23 05:47:05 np0005593233 NetworkManager[48871]: <info>  [1769165225.2788] device (tapfc0b98b6-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:47:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:05 np0005593233 NetworkManager[48871]: <info>  [1769165225.2796] device (tapfc0b98b6-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.288 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[4d845fc8-4b13-4063-ad40-844f8726206a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.324 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b2027592-3131-476a-8f96-96f46e60734c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.362 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f9caad-8a41-4d7d-b993-1ac94f6cb537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.372 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0c86faa4-21b4-4441-89a2-83bf039c08fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 NetworkManager[48871]: <info>  [1769165225.3755] manager: (tap72854481-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/390)
Jan 23 05:47:05 np0005593233 systemd[1]: Started Virtual Machine qemu-93-instance-000000cf.
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.413 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.431 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[0079dc35-e15e-4ea4-932f-4c1d9921d7c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.437 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.437 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ff83c8d1-b8db-4741-a13e-514c66be3717]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:05Z|00859|binding|INFO|Setting lport fc0b98b6-65f9-4127-b86d-9dd37c53eb20 ovn-installed in OVS
Jan 23 05:47:05 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:05Z|00860|binding|INFO|Setting lport fc0b98b6-65f9-4127-b86d-9dd37c53eb20 up in Southbound
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.449 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 NetworkManager[48871]: <info>  [1769165225.4705] device (tap72854481-c0): carrier: link connected
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.480 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[77b6bab6-b138-4f96-a305-bca075f504bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.511 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f687fa09-b943-404a-85c6-b774b0b43497]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 257], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 913810, 'reachable_time': 31273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302242, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.533 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[47e81246-75e0-4e66-aef1-6ff8fc017fd1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe81:b660'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 913810, 'tstamp': 913810}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302243, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.557 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f232c3e3-5d72-4fdc-96c3-0fd913c0db12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 257], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 913810, 'reachable_time': 31273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 302244, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.599 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[29ea2b52-5504-4a3c-8cbe-27874b2b7713]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.688 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d3da6edd-2992-49db-9b1d-1e411d840d97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.690 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.691 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.691 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72854481-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.693 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 NetworkManager[48871]: <info>  [1769165225.6946] manager: (tap72854481-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/391)
Jan 23 05:47:05 np0005593233 kernel: tap72854481-c0: entered promiscuous mode
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.698 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.708 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap72854481-c0, col_values=(('external_ids', {'iface-id': '6b08537e-a263-4eec-b987-1e42878f483a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.710 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:05Z|00861|binding|INFO|Releasing lport 6b08537e-a263-4eec-b987-1e42878f483a from this chassis (sb_readonly=0)
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.712 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.734 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.735 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[677e56c1-7130-4579-b07f-3c4864564d25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.737 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.737 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-72854481-c2f9-4651-8ba1-fe321a8a5546
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 72854481-c2f9-4651-8ba1-fe321a8a5546
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:47:05 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:05.738 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'env', 'PROCESS_TAG=haproxy-72854481-c2f9-4651-8ba1-fe321a8a5546', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/72854481-c2f9-4651-8ba1-fe321a8a5546.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.868 222021 DEBUG nova.compute.manager [req-eff564ff-60df-432e-aa36-719ed5c3f087 req-33fb1358-aede-42a8-9a24-3658cd4a16c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received event network-vif-plugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.869 222021 DEBUG oslo_concurrency.lockutils [req-eff564ff-60df-432e-aa36-719ed5c3f087 req-33fb1358-aede-42a8-9a24-3658cd4a16c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.869 222021 DEBUG oslo_concurrency.lockutils [req-eff564ff-60df-432e-aa36-719ed5c3f087 req-33fb1358-aede-42a8-9a24-3658cd4a16c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.869 222021 DEBUG oslo_concurrency.lockutils [req-eff564ff-60df-432e-aa36-719ed5c3f087 req-33fb1358-aede-42a8-9a24-3658cd4a16c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:05 np0005593233 nova_compute[222017]: 2026-01-23 10:47:05.869 222021 DEBUG nova.compute.manager [req-eff564ff-60df-432e-aa36-719ed5c3f087 req-33fb1358-aede-42a8-9a24-3658cd4a16c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Processing event network-vif-plugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:47:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:06.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:06 np0005593233 podman[302295]: 2026-01-23 10:47:06.176847597 +0000 UTC m=+0.076268924 container create b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:47:06 np0005593233 podman[302295]: 2026-01-23 10:47:06.135299694 +0000 UTC m=+0.034721101 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:47:06 np0005593233 systemd[1]: Started libpod-conmon-b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e.scope.
Jan 23 05:47:06 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:47:06 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/669a3dd7a125cc4c3738a13d746b4c0309b8be9ff8b0a45001073576afacfe31/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:47:06 np0005593233 podman[302295]: 2026-01-23 10:47:06.311764555 +0000 UTC m=+0.211185972 container init b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:47:06 np0005593233 podman[302295]: 2026-01-23 10:47:06.318787163 +0000 UTC m=+0.218208520 container start b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:47:06 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[302310]: [NOTICE]   (302314) : New worker (302316) forked
Jan 23 05:47:06 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[302310]: [NOTICE]   (302314) : Loading success.
Jan 23 05:47:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:06.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:06 np0005593233 nova_compute[222017]: 2026-01-23 10:47:06.740 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.168 222021 DEBUG nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.170 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165227.1679585, 5fde5da5-03bb-4c07-bd48-b634468000ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.170 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] VM Started (Lifecycle Event)#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.181 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.186 222021 INFO nova.virt.libvirt.driver [-] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Instance spawned successfully.#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.186 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.224 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.231 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.234 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.235 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.235 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.236 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.236 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.236 222021 DEBUG nova.virt.libvirt.driver [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.269 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.270 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165227.1725152, 5fde5da5-03bb-4c07-bd48-b634468000ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.270 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.293 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.298 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165227.1737418, 5fde5da5-03bb-4c07-bd48-b634468000ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.298 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.309 222021 INFO nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Took 6.13 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.309 222021 DEBUG nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.339 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.344 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.376 222021 INFO nova.compute.manager [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Took 9.48 seconds to build instance.#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.390 222021 DEBUG oslo_concurrency.lockutils [None req-5a480a21-c54b-4254-a24c-af8680198248 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.958 222021 DEBUG nova.compute.manager [req-46efdbda-6baa-4b6c-9894-967ca360dbaf req-f20e5874-36e7-49bf-ac99-784cec5f297e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received event network-vif-plugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.959 222021 DEBUG oslo_concurrency.lockutils [req-46efdbda-6baa-4b6c-9894-967ca360dbaf req-f20e5874-36e7-49bf-ac99-784cec5f297e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.959 222021 DEBUG oslo_concurrency.lockutils [req-46efdbda-6baa-4b6c-9894-967ca360dbaf req-f20e5874-36e7-49bf-ac99-784cec5f297e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.960 222021 DEBUG oslo_concurrency.lockutils [req-46efdbda-6baa-4b6c-9894-967ca360dbaf req-f20e5874-36e7-49bf-ac99-784cec5f297e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.960 222021 DEBUG nova.compute.manager [req-46efdbda-6baa-4b6c-9894-967ca360dbaf req-f20e5874-36e7-49bf-ac99-784cec5f297e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] No waiting events found dispatching network-vif-plugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:47:07 np0005593233 nova_compute[222017]: 2026-01-23 10:47:07.960 222021 WARNING nova.compute.manager [req-46efdbda-6baa-4b6c-9894-967ca360dbaf req-f20e5874-36e7-49bf-ac99-784cec5f297e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received unexpected event network-vif-plugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:47:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:47:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:08.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:47:08 np0005593233 nova_compute[222017]: 2026-01-23 10:47:08.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:08 np0005593233 nova_compute[222017]: 2026-01-23 10:47:08.412 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:08 np0005593233 nova_compute[222017]: 2026-01-23 10:47:08.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:08 np0005593233 nova_compute[222017]: 2026-01-23 10:47:08.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:08 np0005593233 nova_compute[222017]: 2026-01-23 10:47:08.414 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:47:08 np0005593233 nova_compute[222017]: 2026-01-23 10:47:08.414 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:08.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:47:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2787387171' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:47:08 np0005593233 nova_compute[222017]: 2026-01-23 10:47:08.881 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:08 np0005593233 nova_compute[222017]: 2026-01-23 10:47:08.977 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000cf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:47:08 np0005593233 nova_compute[222017]: 2026-01-23 10:47:08.978 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000cf as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.147 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.183 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.184 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4141MB free_disk=20.94232177734375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.185 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.185 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.355 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 5fde5da5-03bb-4c07-bd48-b634468000ac actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.355 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.356 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.519 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:47:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1938978645' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.943 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.952 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:47:09 np0005593233 nova_compute[222017]: 2026-01-23 10:47:09.973 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:47:10 np0005593233 nova_compute[222017]: 2026-01-23 10:47:10.000 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:47:10 np0005593233 nova_compute[222017]: 2026-01-23 10:47:10.001 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:10.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:10.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:11 np0005593233 nova_compute[222017]: 2026-01-23 10:47:11.741 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:12 np0005593233 nova_compute[222017]: 2026-01-23 10:47:12.002 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:12.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:12 np0005593233 nova_compute[222017]: 2026-01-23 10:47:12.152 222021 DEBUG nova.compute.manager [req-ef198f52-6de6-4529-95f5-e2bb821479ca req-f12f8a95-0cff-4db9-a317-0cafc6d0f2a0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received event network-changed-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:12 np0005593233 nova_compute[222017]: 2026-01-23 10:47:12.152 222021 DEBUG nova.compute.manager [req-ef198f52-6de6-4529-95f5-e2bb821479ca req-f12f8a95-0cff-4db9-a317-0cafc6d0f2a0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Refreshing instance network info cache due to event network-changed-fc0b98b6-65f9-4127-b86d-9dd37c53eb20. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:47:12 np0005593233 nova_compute[222017]: 2026-01-23 10:47:12.152 222021 DEBUG oslo_concurrency.lockutils [req-ef198f52-6de6-4529-95f5-e2bb821479ca req-f12f8a95-0cff-4db9-a317-0cafc6d0f2a0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:47:12 np0005593233 nova_compute[222017]: 2026-01-23 10:47:12.153 222021 DEBUG oslo_concurrency.lockutils [req-ef198f52-6de6-4529-95f5-e2bb821479ca req-f12f8a95-0cff-4db9-a317-0cafc6d0f2a0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:47:12 np0005593233 nova_compute[222017]: 2026-01-23 10:47:12.153 222021 DEBUG nova.network.neutron [req-ef198f52-6de6-4529-95f5-e2bb821479ca req-f12f8a95-0cff-4db9-a317-0cafc6d0f2a0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Refreshing network info cache for port fc0b98b6-65f9-4127-b86d-9dd37c53eb20 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:47:12 np0005593233 nova_compute[222017]: 2026-01-23 10:47:12.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:12 np0005593233 nova_compute[222017]: 2026-01-23 10:47:12.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:12 np0005593233 nova_compute[222017]: 2026-01-23 10:47:12.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:47:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:12.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:13 np0005593233 nova_compute[222017]: 2026-01-23 10:47:13.530 222021 DEBUG nova.network.neutron [req-ef198f52-6de6-4529-95f5-e2bb821479ca req-f12f8a95-0cff-4db9-a317-0cafc6d0f2a0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updated VIF entry in instance network info cache for port fc0b98b6-65f9-4127-b86d-9dd37c53eb20. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:47:13 np0005593233 nova_compute[222017]: 2026-01-23 10:47:13.531 222021 DEBUG nova.network.neutron [req-ef198f52-6de6-4529-95f5-e2bb821479ca req-f12f8a95-0cff-4db9-a317-0cafc6d0f2a0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updating instance_info_cache with network_info: [{"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:47:13 np0005593233 nova_compute[222017]: 2026-01-23 10:47:13.554 222021 DEBUG oslo_concurrency.lockutils [req-ef198f52-6de6-4529-95f5-e2bb821479ca req-f12f8a95-0cff-4db9-a317-0cafc6d0f2a0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:47:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:14.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:14 np0005593233 nova_compute[222017]: 2026-01-23 10:47:14.151 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:14.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:47:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:16.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:47:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:16.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:16 np0005593233 nova_compute[222017]: 2026-01-23 10:47:16.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:17 np0005593233 nova_compute[222017]: 2026-01-23 10:47:17.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:18.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:18 np0005593233 nova_compute[222017]: 2026-01-23 10:47:18.393 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:18.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:19 np0005593233 nova_compute[222017]: 2026-01-23 10:47:19.157 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:47:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:20.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:47:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:20 np0005593233 nova_compute[222017]: 2026-01-23 10:47:20.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:20.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:21 np0005593233 podman[302397]: 2026-01-23 10:47:21.071049689 +0000 UTC m=+0.079416683 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:47:21 np0005593233 nova_compute[222017]: 2026-01-23 10:47:21.787 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:22.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:22 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:22Z|00122|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.6 does not match offer 10.100.0.4
Jan 23 05:47:22 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:22Z|00123|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:fc:33:7a 10.100.0.4
Jan 23 05:47:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:22.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:23 np0005593233 nova_compute[222017]: 2026-01-23 10:47:23.382 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:24.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:24 np0005593233 nova_compute[222017]: 2026-01-23 10:47:24.162 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:24 np0005593233 nova_compute[222017]: 2026-01-23 10:47:24.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:24 np0005593233 nova_compute[222017]: 2026-01-23 10:47:24.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:47:24 np0005593233 nova_compute[222017]: 2026-01-23 10:47:24.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:47:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:24.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:24 np0005593233 nova_compute[222017]: 2026-01-23 10:47:24.606 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:47:24 np0005593233 nova_compute[222017]: 2026-01-23 10:47:24.606 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:47:24 np0005593233 nova_compute[222017]: 2026-01-23 10:47:24.607 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:47:24 np0005593233 nova_compute[222017]: 2026-01-23 10:47:24.607 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5fde5da5-03bb-4c07-bd48-b634468000ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:47:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:25 np0005593233 nova_compute[222017]: 2026-01-23 10:47:25.827 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updating instance_info_cache with network_info: [{"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:47:25 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:25Z|00124|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.6 does not match offer 10.100.0.4
Jan 23 05:47:25 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:25Z|00125|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:fc:33:7a 10.100.0.4
Jan 23 05:47:25 np0005593233 nova_compute[222017]: 2026-01-23 10:47:25.843 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:47:25 np0005593233 nova_compute[222017]: 2026-01-23 10:47:25.844 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:47:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:26.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:26.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:26 np0005593233 nova_compute[222017]: 2026-01-23 10:47:26.789 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:27 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:27Z|00126|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fc:33:7a 10.100.0.4
Jan 23 05:47:27 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:27Z|00127|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fc:33:7a 10.100.0.4
Jan 23 05:47:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:28.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:28.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:29 np0005593233 nova_compute[222017]: 2026-01-23 10:47:29.164 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:30.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:30.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:31 np0005593233 nova_compute[222017]: 2026-01-23 10:47:31.792 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:47:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:32.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:47:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:47:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:32.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:47:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:34.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:34 np0005593233 nova_compute[222017]: 2026-01-23 10:47:34.193 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:34.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:36.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:36 np0005593233 podman[302418]: 2026-01-23 10:47:36.083458918 +0000 UTC m=+0.095139277 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:47:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:36.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:36 np0005593233 nova_compute[222017]: 2026-01-23 10:47:36.797 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:47:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:38.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:47:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:47:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:47:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:38.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:39 np0005593233 nova_compute[222017]: 2026-01-23 10:47:39.197 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:40.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:40.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:41 np0005593233 nova_compute[222017]: 2026-01-23 10:47:41.798 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:42.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:42.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:42.710 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:42.711 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:42.713 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:44.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:44 np0005593233 nova_compute[222017]: 2026-01-23 10:47:44.224 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:44.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:47:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:47:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:46.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:47:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:46.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:47:46 np0005593233 nova_compute[222017]: 2026-01-23 10:47:46.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:47 np0005593233 nova_compute[222017]: 2026-01-23 10:47:47.904 222021 DEBUG nova.compute.manager [req-2f26a65b-a995-4873-9397-8079310bcff0 req-2712e864-68fe-4f37-9ee2-e7f1a8c1acbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received event network-changed-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:47 np0005593233 nova_compute[222017]: 2026-01-23 10:47:47.905 222021 DEBUG nova.compute.manager [req-2f26a65b-a995-4873-9397-8079310bcff0 req-2712e864-68fe-4f37-9ee2-e7f1a8c1acbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Refreshing instance network info cache due to event network-changed-fc0b98b6-65f9-4127-b86d-9dd37c53eb20. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:47:47 np0005593233 nova_compute[222017]: 2026-01-23 10:47:47.905 222021 DEBUG oslo_concurrency.lockutils [req-2f26a65b-a995-4873-9397-8079310bcff0 req-2712e864-68fe-4f37-9ee2-e7f1a8c1acbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:47:47 np0005593233 nova_compute[222017]: 2026-01-23 10:47:47.906 222021 DEBUG oslo_concurrency.lockutils [req-2f26a65b-a995-4873-9397-8079310bcff0 req-2712e864-68fe-4f37-9ee2-e7f1a8c1acbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:47:47 np0005593233 nova_compute[222017]: 2026-01-23 10:47:47.906 222021 DEBUG nova.network.neutron [req-2f26a65b-a995-4873-9397-8079310bcff0 req-2712e864-68fe-4f37-9ee2-e7f1a8c1acbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Refreshing network info cache for port fc0b98b6-65f9-4127-b86d-9dd37c53eb20 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:47:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:48.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:48 np0005593233 nova_compute[222017]: 2026-01-23 10:47:48.490 222021 DEBUG oslo_concurrency.lockutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "5fde5da5-03bb-4c07-bd48-b634468000ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:48 np0005593233 nova_compute[222017]: 2026-01-23 10:47:48.491 222021 DEBUG oslo_concurrency.lockutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:48 np0005593233 nova_compute[222017]: 2026-01-23 10:47:48.492 222021 DEBUG oslo_concurrency.lockutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:48 np0005593233 nova_compute[222017]: 2026-01-23 10:47:48.493 222021 DEBUG oslo_concurrency.lockutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:48 np0005593233 nova_compute[222017]: 2026-01-23 10:47:48.493 222021 DEBUG oslo_concurrency.lockutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:48 np0005593233 nova_compute[222017]: 2026-01-23 10:47:48.497 222021 INFO nova.compute.manager [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Terminating instance#033[00m
Jan 23 05:47:48 np0005593233 nova_compute[222017]: 2026-01-23 10:47:48.500 222021 DEBUG nova.compute.manager [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:47:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:48.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:48 np0005593233 kernel: tapfc0b98b6-65 (unregistering): left promiscuous mode
Jan 23 05:47:48 np0005593233 NetworkManager[48871]: <info>  [1769165268.9737] device (tapfc0b98b6-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:47:48 np0005593233 nova_compute[222017]: 2026-01-23 10:47:48.996 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:48Z|00862|binding|INFO|Releasing lport fc0b98b6-65f9-4127-b86d-9dd37c53eb20 from this chassis (sb_readonly=0)
Jan 23 05:47:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:48Z|00863|binding|INFO|Setting lport fc0b98b6-65f9-4127-b86d-9dd37c53eb20 down in Southbound
Jan 23 05:47:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:47:48Z|00864|binding|INFO|Removing iface tapfc0b98b6-65 ovn-installed in OVS
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:48.999 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.019 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:33:7a 10.100.0.4'], port_security=['fa:16:3e:fc:33:7a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '5fde5da5-03bb-4c07-bd48-b634468000ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dbb54d44-fc85-485d-96a6-e6e12258a95a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=fc0b98b6-65f9-4127-b86d-9dd37c53eb20) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.021 140224 INFO neutron.agent.ovn.metadata.agent [-] Port fc0b98b6-65f9-4127-b86d-9dd37c53eb20 in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 unbound from our chassis#033[00m
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.024 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 72854481-c2f9-4651-8ba1-fe321a8a5546, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.027 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3633ec-8fe9-496c-849d-c8a2304fd4d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.028 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 namespace which is not needed anymore#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.044 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:49 np0005593233 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000cf.scope: Deactivated successfully.
Jan 23 05:47:49 np0005593233 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000cf.scope: Consumed 16.337s CPU time.
Jan 23 05:47:49 np0005593233 systemd-machined[190954]: Machine qemu-93-instance-000000cf terminated.
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.144 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.153 222021 INFO nova.virt.libvirt.driver [-] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Instance destroyed successfully.#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.154 222021 DEBUG nova.objects.instance [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'resources' on Instance uuid 5fde5da5-03bb-4c07-bd48-b634468000ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.203 222021 DEBUG nova.virt.libvirt.vif [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:46:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-2039370606',display_name='tempest-TestVolumeBootPattern-server-2039370606',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-2039370606',id=207,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIC+r2L0+q9PBY99aBdo5XmxGSuvAYYpgonPD6n/fI0f2obfQ97Vwf3Ee9Eeaa+EcBJLE3HyG34aomsAcNn3g+1JMNr5TfA5Vs6CyFMbO26Wy1l0eWLB9DEd+vOA6lCnxQ==',key_name='tempest-TestVolumeBootPattern-909159234',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:47:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-jyw18g1l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:47:07Z,user_data=None,user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=5fde5da5-03bb-4c07-bd48-b634468000ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.205 222021 DEBUG nova.network.os_vif_util [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.207 222021 DEBUG nova.network.os_vif_util [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fc:33:7a,bridge_name='br-int',has_traffic_filtering=True,id=fc0b98b6-65f9-4127-b86d-9dd37c53eb20,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0b98b6-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.208 222021 DEBUG os_vif [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:33:7a,bridge_name='br-int',has_traffic_filtering=True,id=fc0b98b6-65f9-4127-b86d-9dd37c53eb20,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0b98b6-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.212 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.213 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc0b98b6-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.217 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.219 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.225 222021 INFO os_vif [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:33:7a,bridge_name='br-int',has_traffic_filtering=True,id=fc0b98b6-65f9-4127-b86d-9dd37c53eb20,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0b98b6-65')#033[00m
Jan 23 05:47:49 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[302310]: [NOTICE]   (302314) : haproxy version is 2.8.14-c23fe91
Jan 23 05:47:49 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[302310]: [NOTICE]   (302314) : path to executable is /usr/sbin/haproxy
Jan 23 05:47:49 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[302310]: [WARNING]  (302314) : Exiting Master process...
Jan 23 05:47:49 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[302310]: [ALERT]    (302314) : Current worker (302316) exited with code 143 (Terminated)
Jan 23 05:47:49 np0005593233 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[302310]: [WARNING]  (302314) : All workers exited. Exiting... (0)
Jan 23 05:47:49 np0005593233 systemd[1]: libpod-b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e.scope: Deactivated successfully.
Jan 23 05:47:49 np0005593233 podman[302661]: 2026-01-23 10:47:49.252082851 +0000 UTC m=+0.061484487 container died b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:47:49 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e-userdata-shm.mount: Deactivated successfully.
Jan 23 05:47:49 np0005593233 systemd[1]: var-lib-containers-storage-overlay-669a3dd7a125cc4c3738a13d746b4c0309b8be9ff8b0a45001073576afacfe31-merged.mount: Deactivated successfully.
Jan 23 05:47:49 np0005593233 podman[302661]: 2026-01-23 10:47:49.297963896 +0000 UTC m=+0.107365522 container cleanup b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:47:49 np0005593233 systemd[1]: libpod-conmon-b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e.scope: Deactivated successfully.
Jan 23 05:47:49 np0005593233 podman[302711]: 2026-01-23 10:47:49.384695024 +0000 UTC m=+0.059363076 container remove b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.393 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d0344157-67e1-4732-bdad-cd562c08f2a8]: (4, ('Fri Jan 23 10:47:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 (b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e)\nb080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e\nFri Jan 23 10:47:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 (b080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e)\nb080877ed8d8d443ef25292d8f62ee3604efcfc7cf9e1b309a9c065dd83e429e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.395 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[95f49b7d-069e-4487-973c-481d76f2e66d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.396 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.398 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:47:49 np0005593233 kernel: tap72854481-c0: left promiscuous mode
Jan 23 05:47:49 np0005593233 nova_compute[222017]: 2026-01-23 10:47:49.410 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.413 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4125bd-9fed-4c4d-86a9-7a496314c1ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.434 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ab8bfb0f-3865-474e-a978-f5df858a8166]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.436 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b21635b6-a4d3-42aa-ab34-730bddc94411]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.463 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[267f3aa0-3159-4c41-be4f-0cf3efb8cad8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 913798, 'reachable_time': 37600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302726, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.467 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 05:47:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:47:49.468 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[2723e110-3a7f-413c-bf01-68f1fa2bb24f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:47:49 np0005593233 systemd[1]: run-netns-ovnmeta\x2d72854481\x2dc2f9\x2d4651\x2d8ba1\x2dfe321a8a5546.mount: Deactivated successfully.
Jan 23 05:47:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:50.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:50.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.763 222021 INFO nova.virt.libvirt.driver [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Deleting instance files /var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac_del
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.764 222021 INFO nova.virt.libvirt.driver [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Deletion of /var/lib/nova/instances/5fde5da5-03bb-4c07-bd48-b634468000ac_del complete
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.879 222021 DEBUG nova.compute.manager [req-219df8bf-b0e1-4e69-9cec-966f2c94024a req-a1d808a0-71a3-49fa-a517-1ba539c377d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received event network-vif-unplugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.880 222021 DEBUG oslo_concurrency.lockutils [req-219df8bf-b0e1-4e69-9cec-966f2c94024a req-a1d808a0-71a3-49fa-a517-1ba539c377d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.881 222021 DEBUG oslo_concurrency.lockutils [req-219df8bf-b0e1-4e69-9cec-966f2c94024a req-a1d808a0-71a3-49fa-a517-1ba539c377d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.881 222021 DEBUG oslo_concurrency.lockutils [req-219df8bf-b0e1-4e69-9cec-966f2c94024a req-a1d808a0-71a3-49fa-a517-1ba539c377d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.881 222021 DEBUG nova.compute.manager [req-219df8bf-b0e1-4e69-9cec-966f2c94024a req-a1d808a0-71a3-49fa-a517-1ba539c377d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] No waiting events found dispatching network-vif-unplugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.881 222021 DEBUG nova.compute.manager [req-219df8bf-b0e1-4e69-9cec-966f2c94024a req-a1d808a0-71a3-49fa-a517-1ba539c377d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received event network-vif-unplugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.883 222021 INFO nova.compute.manager [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Took 2.38 seconds to destroy the instance on the hypervisor.
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.883 222021 DEBUG oslo.service.loopingcall [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.883 222021 DEBUG nova.compute.manager [-] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 05:47:50 np0005593233 nova_compute[222017]: 2026-01-23 10:47:50.884 222021 DEBUG nova.network.neutron [-] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 05:47:51 np0005593233 nova_compute[222017]: 2026-01-23 10:47:51.802 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:47:52 np0005593233 podman[302728]: 2026-01-23 10:47:52.072972239 +0000 UTC m=+0.070522732 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 23 05:47:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:52.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:52.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.346 222021 DEBUG nova.compute.manager [req-f40dff53-204f-4fc9-8a78-ba15a71e9565 req-f03e0fdd-522a-452c-823e-10ab03e06154 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received event network-vif-plugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.347 222021 DEBUG oslo_concurrency.lockutils [req-f40dff53-204f-4fc9-8a78-ba15a71e9565 req-f03e0fdd-522a-452c-823e-10ab03e06154 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.347 222021 DEBUG oslo_concurrency.lockutils [req-f40dff53-204f-4fc9-8a78-ba15a71e9565 req-f03e0fdd-522a-452c-823e-10ab03e06154 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.347 222021 DEBUG oslo_concurrency.lockutils [req-f40dff53-204f-4fc9-8a78-ba15a71e9565 req-f03e0fdd-522a-452c-823e-10ab03e06154 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.347 222021 DEBUG nova.compute.manager [req-f40dff53-204f-4fc9-8a78-ba15a71e9565 req-f03e0fdd-522a-452c-823e-10ab03e06154 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] No waiting events found dispatching network-vif-plugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.347 222021 WARNING nova.compute.manager [req-f40dff53-204f-4fc9-8a78-ba15a71e9565 req-f03e0fdd-522a-452c-823e-10ab03e06154 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received unexpected event network-vif-plugged-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 for instance with vm_state active and task_state deleting.
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.416 222021 DEBUG nova.network.neutron [-] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.453 222021 INFO nova.compute.manager [-] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Took 2.57 seconds to deallocate network for instance.
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.611 222021 DEBUG nova.compute.manager [req-b4035e93-f396-48f2-8ea7-31b56fb6d2db req-6433a985-e505-4a6b-82d7-a8764f526393 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Received event network-vif-deleted-fc0b98b6-65f9-4127-b86d-9dd37c53eb20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:47:53 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.927 222021 INFO nova.compute.manager [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Took 0.47 seconds to detach 1 volumes for instance.
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:53.999 222021 DEBUG oslo_concurrency.lockutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.001 222021 DEBUG oslo_concurrency.lockutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.079 222021 DEBUG oslo_concurrency.processutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:47:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:54.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.220 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:47:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:47:54 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/159730390' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:47:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:54.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.582 222021 DEBUG oslo_concurrency.processutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.593 222021 DEBUG nova.compute.provider_tree [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.627 222021 DEBUG nova.scheduler.client.report [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.672 222021 DEBUG oslo_concurrency.lockutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.695 222021 DEBUG nova.network.neutron [req-2f26a65b-a995-4873-9397-8079310bcff0 req-2712e864-68fe-4f37-9ee2-e7f1a8c1acbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updated VIF entry in instance network info cache for port fc0b98b6-65f9-4127-b86d-9dd37c53eb20. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.696 222021 DEBUG nova.network.neutron [req-2f26a65b-a995-4873-9397-8079310bcff0 req-2712e864-68fe-4f37-9ee2-e7f1a8c1acbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Updating instance_info_cache with network_info: [{"id": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "address": "fa:16:3e:fc:33:7a", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0b98b6-65", "ovs_interfaceid": "fc0b98b6-65f9-4127-b86d-9dd37c53eb20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.745 222021 DEBUG oslo_concurrency.lockutils [req-2f26a65b-a995-4873-9397-8079310bcff0 req-2712e864-68fe-4f37-9ee2-e7f1a8c1acbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-5fde5da5-03bb-4c07-bd48-b634468000ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.772 222021 INFO nova.scheduler.client.report [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Deleted allocations for instance 5fde5da5-03bb-4c07-bd48-b634468000ac#033[00m
Jan 23 05:47:54 np0005593233 nova_compute[222017]: 2026-01-23 10:47:54.864 222021 DEBUG oslo_concurrency.lockutils [None req-87777d93-0398-4461-9b0a-ca719ef9adb0 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "5fde5da5-03bb-4c07-bd48-b634468000ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:56.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:56.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:56 np0005593233 nova_compute[222017]: 2026-01-23 10:47:56.843 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:47:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:58.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:47:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:47:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:47:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:58.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:47:59 np0005593233 nova_compute[222017]: 2026-01-23 10:47:59.257 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:00.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:00.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e408 e408: 3 total, 3 up, 3 in
Jan 23 05:48:01 np0005593233 nova_compute[222017]: 2026-01-23 10:48:01.846 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:48:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:02.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:48:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:48:02.517 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:48:02 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:48:02.519 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:48:02 np0005593233 nova_compute[222017]: 2026-01-23 10:48:02.552 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:02.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:04.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:04 np0005593233 nova_compute[222017]: 2026-01-23 10:48:04.150 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165269.1491637, 5fde5da5-03bb-4c07-bd48-b634468000ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:48:04 np0005593233 nova_compute[222017]: 2026-01-23 10:48:04.151 222021 INFO nova.compute.manager [-] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:48:04 np0005593233 nova_compute[222017]: 2026-01-23 10:48:04.193 222021 DEBUG nova.compute.manager [None req-21a62a23-6a68-4601-a531-6b5f9b678f31 - - - - - -] [instance: 5fde5da5-03bb-4c07-bd48-b634468000ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:48:04 np0005593233 nova_compute[222017]: 2026-01-23 10:48:04.261 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:04.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e408 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:06.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:48:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:06.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:48:06 np0005593233 nova_compute[222017]: 2026-01-23 10:48:06.849 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:07 np0005593233 podman[302773]: 2026-01-23 10:48:07.182454327 +0000 UTC m=+0.171128172 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:48:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:48:07.522 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:08.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:08.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:09 np0005593233 nova_compute[222017]: 2026-01-23 10:48:09.304 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e409 e409: 3 total, 3 up, 3 in
Jan 23 05:48:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:10.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e409 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:10 np0005593233 nova_compute[222017]: 2026-01-23 10:48:10.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:10 np0005593233 nova_compute[222017]: 2026-01-23 10:48:10.451 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:10 np0005593233 nova_compute[222017]: 2026-01-23 10:48:10.451 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:10 np0005593233 nova_compute[222017]: 2026-01-23 10:48:10.452 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:10 np0005593233 nova_compute[222017]: 2026-01-23 10:48:10.452 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:48:10 np0005593233 nova_compute[222017]: 2026-01-23 10:48:10.453 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:10.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e410 e410: 3 total, 3 up, 3 in
Jan 23 05:48:10 np0005593233 nova_compute[222017]: 2026-01-23 10:48:10.975 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.220 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.222 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4268MB free_disk=20.942638397216797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.223 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.223 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.346 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.346 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.390 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.896 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:48:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2697468869' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.943 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.951 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:48:11 np0005593233 nova_compute[222017]: 2026-01-23 10:48:11.970 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:48:12 np0005593233 nova_compute[222017]: 2026-01-23 10:48:12.000 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:48:12 np0005593233 nova_compute[222017]: 2026-01-23 10:48:12.000 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:12.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:12.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:13 np0005593233 nova_compute[222017]: 2026-01-23 10:48:13.000 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:14.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:14 np0005593233 nova_compute[222017]: 2026-01-23 10:48:14.309 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:14 np0005593233 nova_compute[222017]: 2026-01-23 10:48:14.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:14 np0005593233 nova_compute[222017]: 2026-01-23 10:48:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:14 np0005593233 nova_compute[222017]: 2026-01-23 10:48:14.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:48:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:14.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e411 e411: 3 total, 3 up, 3 in
Jan 23 05:48:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:16.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:16.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e412 e412: 3 total, 3 up, 3 in
Jan 23 05:48:16 np0005593233 nova_compute[222017]: 2026-01-23 10:48:16.921 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:18.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:18 np0005593233 nova_compute[222017]: 2026-01-23 10:48:18.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:18.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:19 np0005593233 nova_compute[222017]: 2026-01-23 10:48:19.359 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:20.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e412 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:20 np0005593233 nova_compute[222017]: 2026-01-23 10:48:20.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:20.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:21 np0005593233 nova_compute[222017]: 2026-01-23 10:48:21.564 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:21 np0005593233 nova_compute[222017]: 2026-01-23 10:48:21.760 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:21 np0005593233 nova_compute[222017]: 2026-01-23 10:48:21.955 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:22.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:22 np0005593233 nova_compute[222017]: 2026-01-23 10:48:22.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:22.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:23 np0005593233 podman[302848]: 2026-01-23 10:48:23.090448739 +0000 UTC m=+0.092737599 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:48:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:24.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:24 np0005593233 nova_compute[222017]: 2026-01-23 10:48:24.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:24 np0005593233 nova_compute[222017]: 2026-01-23 10:48:24.397 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:24.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e413 e413: 3 total, 3 up, 3 in
Jan 23 05:48:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:48:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:26.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:48:26 np0005593233 nova_compute[222017]: 2026-01-23 10:48:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:26 np0005593233 nova_compute[222017]: 2026-01-23 10:48:26.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:48:26 np0005593233 nova_compute[222017]: 2026-01-23 10:48:26.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:48:26 np0005593233 nova_compute[222017]: 2026-01-23 10:48:26.416 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:48:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.005000142s ======
Jan 23 05:48:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:26.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000142s
Jan 23 05:48:26 np0005593233 nova_compute[222017]: 2026-01-23 10:48:26.957 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:28.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:28.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:29 np0005593233 nova_compute[222017]: 2026-01-23 10:48:29.439 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:48:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:30.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:48:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:30.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:48:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1395396203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:48:32 np0005593233 nova_compute[222017]: 2026-01-23 10:48:32.006 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:32.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:32.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:34.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:34 np0005593233 nova_compute[222017]: 2026-01-23 10:48:34.484 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:48:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:34.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:48:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:36.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:36.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:37 np0005593233 nova_compute[222017]: 2026-01-23 10:48:37.009 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:38.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:38 np0005593233 podman[302867]: 2026-01-23 10:48:38.193440504 +0000 UTC m=+0.190177490 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:48:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:48:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:38.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:48:39 np0005593233 nova_compute[222017]: 2026-01-23 10:48:39.486 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:40.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:40.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:42 np0005593233 nova_compute[222017]: 2026-01-23 10:48:42.011 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:42.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:42.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:48:42.711 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:48:42.711 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:48:42.712 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:43 np0005593233 nova_compute[222017]: 2026-01-23 10:48:43.411 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:44.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:44 np0005593233 nova_compute[222017]: 2026-01-23 10:48:44.490 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e414 e414: 3 total, 3 up, 3 in
Jan 23 05:48:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:44.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e414 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:48:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:48:45 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:48:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:46.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:46.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:47 np0005593233 nova_compute[222017]: 2026-01-23 10:48:47.015 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:48:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:48.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:48:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:48.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:49 np0005593233 nova_compute[222017]: 2026-01-23 10:48:49.494 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 e415: 3 total, 3 up, 3 in
Jan 23 05:48:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:50.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:50.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:51 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:48:51 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:48:52 np0005593233 nova_compute[222017]: 2026-01-23 10:48:52.018 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:52.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:52.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:54 np0005593233 podman[303076]: 2026-01-23 10:48:54.087320285 +0000 UTC m=+0.086599735 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:48:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:54.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:54 np0005593233 nova_compute[222017]: 2026-01-23 10:48:54.519 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:54.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:48:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:56.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:48:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:56.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:57 np0005593233 nova_compute[222017]: 2026-01-23 10:48:57.022 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:57 np0005593233 ovn_controller[130653]: 2026-01-23T10:48:57Z|00865|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 23 05:48:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:58.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:48:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:58.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:59 np0005593233 nova_compute[222017]: 2026-01-23 10:48:59.522 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:00.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:49:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:00.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:49:02 np0005593233 nova_compute[222017]: 2026-01-23 10:49:02.061 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:02.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:02.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:04.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:04 np0005593233 nova_compute[222017]: 2026-01-23 10:49:04.564 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:04.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:06.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:06.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:07 np0005593233 nova_compute[222017]: 2026-01-23 10:49:07.063 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:07 np0005593233 nova_compute[222017]: 2026-01-23 10:49:07.862 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:49:07.862 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=84, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=83) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:49:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:49:07.864 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:49:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:08.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:08.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:09 np0005593233 podman[303096]: 2026-01-23 10:49:09.10457651 +0000 UTC m=+0.116680605 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 05:49:09 np0005593233 nova_compute[222017]: 2026-01-23 10:49:09.607 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #181. Immutable memtables: 0.
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.673590) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 115] Flushing memtable with next log file: 181
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349673658, "job": 115, "event": "flush_started", "num_memtables": 1, "num_entries": 2086, "num_deletes": 254, "total_data_size": 4891565, "memory_usage": 4965448, "flush_reason": "Manual Compaction"}
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 115] Level-0 flush table #182: started
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349694013, "cf_name": "default", "job": 115, "event": "table_file_creation", "file_number": 182, "file_size": 1972973, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 86291, "largest_seqno": 88372, "table_properties": {"data_size": 1966413, "index_size": 3441, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 17287, "raw_average_key_size": 21, "raw_value_size": 1951870, "raw_average_value_size": 2409, "num_data_blocks": 152, "num_entries": 810, "num_filter_entries": 810, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165178, "oldest_key_time": 1769165178, "file_creation_time": 1769165349, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 182, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 115] Flush lasted 20597 microseconds, and 9849 cpu microseconds.
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.694176) [db/flush_job.cc:967] [default] [JOB 115] Level-0 flush table #182: 1972973 bytes OK
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.694214) [db/memtable_list.cc:519] [default] Level-0 commit table #182 started
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.695616) [db/memtable_list.cc:722] [default] Level-0 commit table #182: memtable #1 done
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.695633) EVENT_LOG_v1 {"time_micros": 1769165349695628, "job": 115, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.695653) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 115] Try to delete WAL files size 4882049, prev total WAL file size 4882049, number of live WAL files 2.
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000178.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.697181) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033303039' seq:72057594037927935, type:22 .. '6D6772737461740033323630' seq:0, type:0; will stop at (end)
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 116] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 115 Base level 0, inputs: [182(1926KB)], [180(12MB)]
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349697294, "job": 116, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [182], "files_L6": [180], "score": -1, "input_data_size": 15009192, "oldest_snapshot_seqno": -1}
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 116] Generated table #183: 10767 keys, 12382095 bytes, temperature: kUnknown
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349818885, "cf_name": "default", "job": 116, "event": "table_file_creation", "file_number": 183, "file_size": 12382095, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12315056, "index_size": 39022, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26949, "raw_key_size": 283242, "raw_average_key_size": 26, "raw_value_size": 12129291, "raw_average_value_size": 1126, "num_data_blocks": 1483, "num_entries": 10767, "num_filter_entries": 10767, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769165349, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 183, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.819316) [db/compaction/compaction_job.cc:1663] [default] [JOB 116] Compacted 1@0 + 1@6 files to L6 => 12382095 bytes
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.820875) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 123.3 rd, 101.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.4 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(13.9) write-amplify(6.3) OK, records in: 11213, records dropped: 446 output_compression: NoCompression
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.820898) EVENT_LOG_v1 {"time_micros": 1769165349820886, "job": 116, "event": "compaction_finished", "compaction_time_micros": 121768, "compaction_time_cpu_micros": 56385, "output_level": 6, "num_output_files": 1, "total_output_size": 12382095, "num_input_records": 11213, "num_output_records": 10767, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000182.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349821849, "job": 116, "event": "table_file_deletion", "file_number": 182}
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000180.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349825077, "job": 116, "event": "table_file_deletion", "file_number": 180}
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.697036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.825119) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.825125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.825127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.825129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:09.825131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:10.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:10 np0005593233 nova_compute[222017]: 2026-01-23 10:49:10.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:10 np0005593233 nova_compute[222017]: 2026-01-23 10:49:10.430 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:10 np0005593233 nova_compute[222017]: 2026-01-23 10:49:10.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:10 np0005593233 nova_compute[222017]: 2026-01-23 10:49:10.431 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:10 np0005593233 nova_compute[222017]: 2026-01-23 10:49:10.431 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:49:10 np0005593233 nova_compute[222017]: 2026-01-23 10:49:10.432 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:49:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:10.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:49:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3001044165' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:49:10 np0005593233 nova_compute[222017]: 2026-01-23 10:49:10.938 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.165 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.167 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4285MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.167 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.167 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.292 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.293 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.315 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:49:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:49:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2455323866' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.812 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.819 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.841 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.842 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:49:11 np0005593233 nova_compute[222017]: 2026-01-23 10:49:11.843 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:49:11.866 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '84'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:49:12 np0005593233 nova_compute[222017]: 2026-01-23 10:49:12.067 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:49:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:12.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:49:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:49:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:12.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:49:13 np0005593233 nova_compute[222017]: 2026-01-23 10:49:13.843 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:14.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:14 np0005593233 nova_compute[222017]: 2026-01-23 10:49:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:14 np0005593233 nova_compute[222017]: 2026-01-23 10:49:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:14 np0005593233 nova_compute[222017]: 2026-01-23 10:49:14.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:49:14 np0005593233 nova_compute[222017]: 2026-01-23 10:49:14.637 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:14.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:16.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:49:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:16.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:49:17 np0005593233 nova_compute[222017]: 2026-01-23 10:49:17.070 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:18.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:18 np0005593233 nova_compute[222017]: 2026-01-23 10:49:18.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:18.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:19 np0005593233 nova_compute[222017]: 2026-01-23 10:49:19.641 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:20.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:20 np0005593233 nova_compute[222017]: 2026-01-23 10:49:20.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:49:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:20.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:49:22 np0005593233 nova_compute[222017]: 2026-01-23 10:49:22.073 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:49:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:22.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:49:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:22.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:24.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:24 np0005593233 nova_compute[222017]: 2026-01-23 10:49:24.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:24 np0005593233 nova_compute[222017]: 2026-01-23 10:49:24.683 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:49:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:24.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:49:25 np0005593233 podman[303167]: 2026-01-23 10:49:25.081758681 +0000 UTC m=+0.086201814 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:49:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:25 np0005593233 nova_compute[222017]: 2026-01-23 10:49:25.381 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:26.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:26.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:27 np0005593233 nova_compute[222017]: 2026-01-23 10:49:27.076 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:28.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:28 np0005593233 nova_compute[222017]: 2026-01-23 10:49:28.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:28 np0005593233 nova_compute[222017]: 2026-01-23 10:49:28.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:49:28 np0005593233 nova_compute[222017]: 2026-01-23 10:49:28.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:49:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 05:49:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:28.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 05:49:29 np0005593233 nova_compute[222017]: 2026-01-23 10:49:29.397 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:49:29 np0005593233 nova_compute[222017]: 2026-01-23 10:49:29.714 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:30.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:30.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:32 np0005593233 nova_compute[222017]: 2026-01-23 10:49:32.077 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:32.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:49:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:32.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:49:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:34.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000057s ======
Jan 23 05:49:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:34.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Jan 23 05:49:34 np0005593233 nova_compute[222017]: 2026-01-23 10:49:34.901 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:36.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:36.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:37 np0005593233 nova_compute[222017]: 2026-01-23 10:49:37.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:38.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:38.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:39 np0005593233 nova_compute[222017]: 2026-01-23 10:49:39.906 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:40 np0005593233 podman[303186]: 2026-01-23 10:49:40.15784581 +0000 UTC m=+0.164779792 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 23 05:49:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:40.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:40.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e416 e416: 3 total, 3 up, 3 in
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #184. Immutable memtables: 0.
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.705726) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 117] Flushing memtable with next log file: 184
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381705771, "job": 117, "event": "flush_started", "num_memtables": 1, "num_entries": 567, "num_deletes": 251, "total_data_size": 835513, "memory_usage": 846728, "flush_reason": "Manual Compaction"}
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 117] Level-0 flush table #185: started
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381712142, "cf_name": "default", "job": 117, "event": "table_file_creation", "file_number": 185, "file_size": 551401, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 88377, "largest_seqno": 88939, "table_properties": {"data_size": 548474, "index_size": 898, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7009, "raw_average_key_size": 19, "raw_value_size": 542600, "raw_average_value_size": 1478, "num_data_blocks": 40, "num_entries": 367, "num_filter_entries": 367, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165349, "oldest_key_time": 1769165349, "file_creation_time": 1769165381, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 185, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 117] Flush lasted 6466 microseconds, and 3131 cpu microseconds.
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.712189) [db/flush_job.cc:967] [default] [JOB 117] Level-0 flush table #185: 551401 bytes OK
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.712212) [db/memtable_list.cc:519] [default] Level-0 commit table #185 started
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.714239) [db/memtable_list.cc:722] [default] Level-0 commit table #185: memtable #1 done
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.714748) EVENT_LOG_v1 {"time_micros": 1769165381714252, "job": 117, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.714768) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 117] Try to delete WAL files size 832241, prev total WAL file size 832241, number of live WAL files 2.
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000181.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.715302) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037373831' seq:72057594037927935, type:22 .. '7061786F730038303333' seq:0, type:0; will stop at (end)
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 118] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 117 Base level 0, inputs: [185(538KB)], [183(11MB)]
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381715328, "job": 118, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [185], "files_L6": [183], "score": -1, "input_data_size": 12933496, "oldest_snapshot_seqno": -1}
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 118] Generated table #186: 10620 keys, 11027383 bytes, temperature: kUnknown
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381804280, "cf_name": "default", "job": 118, "event": "table_file_creation", "file_number": 186, "file_size": 11027383, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10962514, "index_size": 37236, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26565, "raw_key_size": 280889, "raw_average_key_size": 26, "raw_value_size": 10780469, "raw_average_value_size": 1015, "num_data_blocks": 1401, "num_entries": 10620, "num_filter_entries": 10620, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769165381, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 186, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.804738) [db/compaction/compaction_job.cc:1663] [default] [JOB 118] Compacted 1@0 + 1@6 files to L6 => 11027383 bytes
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.806530) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 145.2 rd, 123.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 11.8 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(43.5) write-amplify(20.0) OK, records in: 11134, records dropped: 514 output_compression: NoCompression
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.806572) EVENT_LOG_v1 {"time_micros": 1769165381806553, "job": 118, "event": "compaction_finished", "compaction_time_micros": 89094, "compaction_time_cpu_micros": 36209, "output_level": 6, "num_output_files": 1, "total_output_size": 11027383, "num_input_records": 11134, "num_output_records": 10620, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000185.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381807214, "job": 118, "event": "table_file_deletion", "file_number": 185}
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000183.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381813000, "job": 118, "event": "table_file_deletion", "file_number": 183}
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.715254) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.813173) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.813184) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.813187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.813191) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:49:41.813194) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:42 np0005593233 nova_compute[222017]: 2026-01-23 10:49:42.282 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:42.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:49:42.712 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:49:42.713 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:49:42.713 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e417 e417: 3 total, 3 up, 3 in
Jan 23 05:49:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:42.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:44.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:44 np0005593233 nova_compute[222017]: 2026-01-23 10:49:44.907 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:44.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:46.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:46.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:47 np0005593233 nova_compute[222017]: 2026-01-23 10:49:47.285 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:48.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:48.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 e418: 3 total, 3 up, 3 in
Jan 23 05:49:49 np0005593233 nova_compute[222017]: 2026-01-23 10:49:49.910 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:50.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:50.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:52 np0005593233 nova_compute[222017]: 2026-01-23 10:49:52.287 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:52.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:52 np0005593233 podman[303384]: 2026-01-23 10:49:52.726725961 +0000 UTC m=+0.087055777 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 23 05:49:52 np0005593233 podman[303384]: 2026-01-23 10:49:52.835526292 +0000 UTC m=+0.195856128 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:49:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:52.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:49:53 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:49:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:54.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:49:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:49:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:49:54 np0005593233 nova_compute[222017]: 2026-01-23 10:49:54.914 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:54.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:55 np0005593233 nova_compute[222017]: 2026-01-23 10:49:55.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:56 np0005593233 podman[303640]: 2026-01-23 10:49:56.089267319 +0000 UTC m=+0.087076859 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:49:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:56.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:56.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:57 np0005593233 nova_compute[222017]: 2026-01-23 10:49:57.325 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:58 np0005593233 ovn_controller[130653]: 2026-01-23T10:49:58Z|00866|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Jan 23 05:49:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:49:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:58.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:49:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:49:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:49:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:58.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:49:59 np0005593233 nova_compute[222017]: 2026-01-23 10:49:59.919 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 05:50:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:00.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:00.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:02.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:02 np0005593233 nova_compute[222017]: 2026-01-23 10:50:02.328 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:50:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:50:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:02.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:03 np0005593233 nova_compute[222017]: 2026-01-23 10:50:03.406 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:03 np0005593233 nova_compute[222017]: 2026-01-23 10:50:03.407 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:50:03 np0005593233 nova_compute[222017]: 2026-01-23 10:50:03.465 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:50:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:04.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:04 np0005593233 nova_compute[222017]: 2026-01-23 10:50:04.923 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:04.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:06.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:06 np0005593233 nova_compute[222017]: 2026-01-23 10:50:06.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:06 np0005593233 nova_compute[222017]: 2026-01-23 10:50:06.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:50:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:06.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:07 np0005593233 nova_compute[222017]: 2026-01-23 10:50:07.384 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:08.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:08.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:09 np0005593233 nova_compute[222017]: 2026-01-23 10:50:09.926 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:10.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:10 np0005593233 nova_compute[222017]: 2026-01-23 10:50:10.404 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:10 np0005593233 nova_compute[222017]: 2026-01-23 10:50:10.618 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:10 np0005593233 nova_compute[222017]: 2026-01-23 10:50:10.620 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:10 np0005593233 nova_compute[222017]: 2026-01-23 10:50:10.621 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:50:10 np0005593233 nova_compute[222017]: 2026-01-23 10:50:10.621 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:50:10 np0005593233 nova_compute[222017]: 2026-01-23 10:50:10.622 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:50:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:10.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:50:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3029450616' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:50:11 np0005593233 podman[303729]: 2026-01-23 10:50:11.181942914 +0000 UTC m=+0.179177459 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:50:11 np0005593233 nova_compute[222017]: 2026-01-23 10:50:11.191 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.569s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:50:11 np0005593233 nova_compute[222017]: 2026-01-23 10:50:11.434 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:50:11 np0005593233 nova_compute[222017]: 2026-01-23 10:50:11.435 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4289MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:50:11 np0005593233 nova_compute[222017]: 2026-01-23 10:50:11.436 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:11 np0005593233 nova_compute[222017]: 2026-01-23 10:50:11.436 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:11 np0005593233 nova_compute[222017]: 2026-01-23 10:50:11.541 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:50:11 np0005593233 nova_compute[222017]: 2026-01-23 10:50:11.542 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:50:11 np0005593233 nova_compute[222017]: 2026-01-23 10:50:11.572 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:50:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:50:12 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3262512820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:50:12 np0005593233 nova_compute[222017]: 2026-01-23 10:50:12.094 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:50:12 np0005593233 nova_compute[222017]: 2026-01-23 10:50:12.102 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:50:12 np0005593233 nova_compute[222017]: 2026-01-23 10:50:12.144 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:50:12 np0005593233 nova_compute[222017]: 2026-01-23 10:50:12.147 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:50:12 np0005593233 nova_compute[222017]: 2026-01-23 10:50:12.147 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.711s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:50:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:12.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:12 np0005593233 nova_compute[222017]: 2026-01-23 10:50:12.388 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:12.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:14 np0005593233 nova_compute[222017]: 2026-01-23 10:50:14.128 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:14.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:14 np0005593233 nova_compute[222017]: 2026-01-23 10:50:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:14 np0005593233 nova_compute[222017]: 2026-01-23 10:50:14.930 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:14.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:15 np0005593233 nova_compute[222017]: 2026-01-23 10:50:15.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:15 np0005593233 nova_compute[222017]: 2026-01-23 10:50:15.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:50:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:16.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:16.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:17 np0005593233 nova_compute[222017]: 2026-01-23 10:50:17.412 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:18.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:18.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:19 np0005593233 nova_compute[222017]: 2026-01-23 10:50:19.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:19 np0005593233 nova_compute[222017]: 2026-01-23 10:50:19.933 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:20.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:50:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:20.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:50:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:22.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:22 np0005593233 nova_compute[222017]: 2026-01-23 10:50:22.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:22 np0005593233 nova_compute[222017]: 2026-01-23 10:50:22.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:50:22.917 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=85, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=84) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:50:22 np0005593233 nova_compute[222017]: 2026-01-23 10:50:22.917 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:50:22.918 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:50:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:22.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:50:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:24.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:50:24 np0005593233 nova_compute[222017]: 2026-01-23 10:50:24.936 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:24.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:25 np0005593233 nova_compute[222017]: 2026-01-23 10:50:25.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:26.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:26 np0005593233 nova_compute[222017]: 2026-01-23 10:50:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:26.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:27 np0005593233 podman[303779]: 2026-01-23 10:50:27.060977015 +0000 UTC m=+0.072999222 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 23 05:50:27 np0005593233 nova_compute[222017]: 2026-01-23 10:50:27.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:28.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:28.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:29 np0005593233 nova_compute[222017]: 2026-01-23 10:50:29.951 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:50:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:30.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:50:30 np0005593233 nova_compute[222017]: 2026-01-23 10:50:30.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:30 np0005593233 nova_compute[222017]: 2026-01-23 10:50:30.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:50:30 np0005593233 nova_compute[222017]: 2026-01-23 10:50:30.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:50:30 np0005593233 nova_compute[222017]: 2026-01-23 10:50:30.444 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:50:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:50:30.920 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '85'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:50:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:30.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:50:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:32.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:50:32 np0005593233 nova_compute[222017]: 2026-01-23 10:50:32.419 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:32.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:34.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:34 np0005593233 nova_compute[222017]: 2026-01-23 10:50:34.954 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:50:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:35.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:50:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:36.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:37.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:37 np0005593233 nova_compute[222017]: 2026-01-23 10:50:37.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:38.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:39.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:39 np0005593233 nova_compute[222017]: 2026-01-23 10:50:39.957 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:50:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:40.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:50:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:41.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:42 np0005593233 podman[303800]: 2026-01-23 10:50:42.159648787 +0000 UTC m=+0.162298322 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:50:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:42.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:42 np0005593233 nova_compute[222017]: 2026-01-23 10:50:42.423 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:50:42.713 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:50:42.713 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:50:42.714 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:50:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:43.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:44.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:50:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2557681845' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:50:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:50:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2557681845' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:50:44 np0005593233 nova_compute[222017]: 2026-01-23 10:50:44.961 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:50:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:45.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:50:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:46.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:46 np0005593233 nova_compute[222017]: 2026-01-23 10:50:46.438 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:47.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:47 np0005593233 nova_compute[222017]: 2026-01-23 10:50:47.427 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:48.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:49.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:49 np0005593233 nova_compute[222017]: 2026-01-23 10:50:49.964 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:50.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:51.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:52.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:52 np0005593233 nova_compute[222017]: 2026-01-23 10:50:52.429 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:53.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:50:53Z|00867|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Jan 23 05:50:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:54.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:54 np0005593233 nova_compute[222017]: 2026-01-23 10:50:54.968 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:55.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:50:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:56.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:50:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:57.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:57 np0005593233 nova_compute[222017]: 2026-01-23 10:50:57.431 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:58 np0005593233 podman[303828]: 2026-01-23 10:50:58.069089846 +0000 UTC m=+0.076722357 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:50:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:50:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:58.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:50:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:50:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:59.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:59 np0005593233 nova_compute[222017]: 2026-01-23 10:50:59.970 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:00.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:01.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:02.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:02 np0005593233 nova_compute[222017]: 2026-01-23 10:51:02.434 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:03.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:04.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:04 np0005593233 nova_compute[222017]: 2026-01-23 10:51:04.975 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:05.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:51:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:06.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:51:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:51:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:07.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:51:07 np0005593233 nova_compute[222017]: 2026-01-23 10:51:07.438 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:08.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:51:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:09.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:51:09 np0005593233 nova_compute[222017]: 2026-01-23 10:51:09.977 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:10 np0005593233 nova_compute[222017]: 2026-01-23 10:51:10.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:10.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:10 np0005593233 nova_compute[222017]: 2026-01-23 10:51:10.538 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:10 np0005593233 nova_compute[222017]: 2026-01-23 10:51:10.538 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:10 np0005593233 nova_compute[222017]: 2026-01-23 10:51:10.539 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:10 np0005593233 nova_compute[222017]: 2026-01-23 10:51:10.539 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:51:10 np0005593233 nova_compute[222017]: 2026-01-23 10:51:10.540 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:51:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2175884603' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:51:11 np0005593233 nova_compute[222017]: 2026-01-23 10:51:11.061 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:11.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:11 np0005593233 nova_compute[222017]: 2026-01-23 10:51:11.278 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:51:11 np0005593233 nova_compute[222017]: 2026-01-23 10:51:11.280 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4299MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:51:11 np0005593233 nova_compute[222017]: 2026-01-23 10:51:11.280 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:11 np0005593233 nova_compute[222017]: 2026-01-23 10:51:11.280 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.285 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.285 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.300 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.323 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.323 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.339 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.362 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.381 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.441 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:12.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:51:12 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3295298713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.868 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.876 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.916 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.917 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:51:12 np0005593233 nova_compute[222017]: 2026-01-23 10:51:12.917 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:13.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:13 np0005593233 podman[304022]: 2026-01-23 10:51:13.107448197 +0000 UTC m=+0.122303423 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:51:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:14.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:15 np0005593233 nova_compute[222017]: 2026-01-23 10:51:15.022 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:15.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:16.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:16 np0005593233 nova_compute[222017]: 2026-01-23 10:51:16.917 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:16 np0005593233 nova_compute[222017]: 2026-01-23 10:51:16.918 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:16 np0005593233 nova_compute[222017]: 2026-01-23 10:51:16.919 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:16 np0005593233 nova_compute[222017]: 2026-01-23 10:51:16.919 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:51:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:17.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:17 np0005593233 nova_compute[222017]: 2026-01-23 10:51:17.470 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:51:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.8 total, 600.0 interval#012Cumulative writes: 65K writes, 247K keys, 65K commit groups, 1.0 writes per commit group, ingest: 0.23 GB, 0.04 MB/s#012Cumulative WAL: 65K writes, 24K syncs, 2.63 writes per sync, written: 0.23 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4600 writes, 13K keys, 4600 commit groups, 1.0 writes per commit group, ingest: 11.03 MB, 0.02 MB/s#012Interval WAL: 4600 writes, 1959 syncs, 2.35 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:51:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:51:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:18.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:51:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:19.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #187. Immutable memtables: 0.
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:19.877474) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 119] Flushing memtable with next log file: 187
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165479877554, "job": 119, "event": "flush_started", "num_memtables": 1, "num_entries": 1293, "num_deletes": 256, "total_data_size": 2777493, "memory_usage": 2809088, "flush_reason": "Manual Compaction"}
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 119] Level-0 flush table #188: started
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165479897660, "cf_name": "default", "job": 119, "event": "table_file_creation", "file_number": 188, "file_size": 1812004, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 88944, "largest_seqno": 90232, "table_properties": {"data_size": 1806451, "index_size": 2884, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12372, "raw_average_key_size": 19, "raw_value_size": 1794978, "raw_average_value_size": 2876, "num_data_blocks": 127, "num_entries": 624, "num_filter_entries": 624, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165382, "oldest_key_time": 1769165382, "file_creation_time": 1769165479, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 188, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 119] Flush lasted 20244 microseconds, and 9671 cpu microseconds.
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:19.897719) [db/flush_job.cc:967] [default] [JOB 119] Level-0 flush table #188: 1812004 bytes OK
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:19.897741) [db/memtable_list.cc:519] [default] Level-0 commit table #188 started
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:19.900787) [db/memtable_list.cc:722] [default] Level-0 commit table #188: memtable #1 done
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:19.900802) EVENT_LOG_v1 {"time_micros": 1769165479900797, "job": 119, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:19.900822) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 119] Try to delete WAL files size 2771222, prev total WAL file size 2771222, number of live WAL files 2.
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000184.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:19.902068) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353139' seq:72057594037927935, type:22 .. '6C6F676D0033373731' seq:0, type:0; will stop at (end)
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 120] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 119 Base level 0, inputs: [188(1769KB)], [186(10MB)]
Jan 23 05:51:19 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165479902206, "job": 120, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [188], "files_L6": [186], "score": -1, "input_data_size": 12839387, "oldest_snapshot_seqno": -1}
Jan 23 05:51:20 np0005593233 nova_compute[222017]: 2026-01-23 10:51:20.024 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 120] Generated table #189: 10713 keys, 12699769 bytes, temperature: kUnknown
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165480316833, "cf_name": "default", "job": 120, "event": "table_file_creation", "file_number": 189, "file_size": 12699769, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12632306, "index_size": 39574, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26821, "raw_key_size": 283816, "raw_average_key_size": 26, "raw_value_size": 12446596, "raw_average_value_size": 1161, "num_data_blocks": 1499, "num_entries": 10713, "num_filter_entries": 10713, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769165479, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 189, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:20.317343) [db/compaction/compaction_job.cc:1663] [default] [JOB 120] Compacted 1@0 + 1@6 files to L6 => 12699769 bytes
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:20.319162) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 31.0 rd, 30.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.5 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(14.1) write-amplify(7.0) OK, records in: 11244, records dropped: 531 output_compression: NoCompression
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:20.319197) EVENT_LOG_v1 {"time_micros": 1769165480319180, "job": 120, "event": "compaction_finished", "compaction_time_micros": 414816, "compaction_time_cpu_micros": 55315, "output_level": 6, "num_output_files": 1, "total_output_size": 12699769, "num_input_records": 11244, "num_output_records": 10713, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000188.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165480320010, "job": 120, "event": "table_file_deletion", "file_number": 188}
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000186.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165480323864, "job": 120, "event": "table_file_deletion", "file_number": 186}
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:19.901826) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:20.323952) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:20.323958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:20.323961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:20.323964) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:51:20.323967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:20.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:21.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:21 np0005593233 nova_compute[222017]: 2026-01-23 10:51:21.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:22 np0005593233 nova_compute[222017]: 2026-01-23 10:51:22.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:22.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:22 np0005593233 nova_compute[222017]: 2026-01-23 10:51:22.473 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:23.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:24.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:25 np0005593233 nova_compute[222017]: 2026-01-23 10:51:25.025 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:25.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:26 np0005593233 nova_compute[222017]: 2026-01-23 10:51:26.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:26.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:51:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:27.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:51:27 np0005593233 nova_compute[222017]: 2026-01-23 10:51:27.474 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:28 np0005593233 nova_compute[222017]: 2026-01-23 10:51:28.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:28.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:29 np0005593233 podman[304099]: 2026-01-23 10:51:29.085893514 +0000 UTC m=+0.084138286 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:51:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:29.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:30 np0005593233 nova_compute[222017]: 2026-01-23 10:51:30.029 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:30.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:31.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:31 np0005593233 nova_compute[222017]: 2026-01-23 10:51:31.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:31 np0005593233 nova_compute[222017]: 2026-01-23 10:51:31.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:51:31 np0005593233 nova_compute[222017]: 2026-01-23 10:51:31.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:51:31 np0005593233 nova_compute[222017]: 2026-01-23 10:51:31.537 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:51:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:32.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:32 np0005593233 nova_compute[222017]: 2026-01-23 10:51:32.477 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:33.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:34.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:35 np0005593233 nova_compute[222017]: 2026-01-23 10:51:35.032 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:35.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:36.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:51:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:37.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:51:37 np0005593233 nova_compute[222017]: 2026-01-23 10:51:37.479 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:38.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:39.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:40 np0005593233 nova_compute[222017]: 2026-01-23 10:51:40.036 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:40.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:51:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:41.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:51:42 np0005593233 nova_compute[222017]: 2026-01-23 10:51:42.480 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:42.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:51:42.714 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:51:42.715 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:51:42.715 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:43.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:44 np0005593233 podman[304119]: 2026-01-23 10:51:44.082492284 +0000 UTC m=+0.093282324 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 05:51:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:44.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:45 np0005593233 nova_compute[222017]: 2026-01-23 10:51:45.037 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:45.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:46.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:47.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:47 np0005593233 nova_compute[222017]: 2026-01-23 10:51:47.482 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:48.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:49.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:50 np0005593233 nova_compute[222017]: 2026-01-23 10:51:50.041 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:50.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:51.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:52.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:52 np0005593233 nova_compute[222017]: 2026-01-23 10:51:52.524 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:53.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:51:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:54.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:51:55 np0005593233 nova_compute[222017]: 2026-01-23 10:51:55.044 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:55.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:56.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:51:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.0 total, 600.0 interval#012Cumulative writes: 17K writes, 90K keys, 17K commit groups, 1.0 writes per commit group, ingest: 0.18 GB, 0.03 MB/s#012Cumulative WAL: 17K writes, 17K syncs, 1.00 writes per sync, written: 0.18 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1545 writes, 7858 keys, 1545 commit groups, 1.0 writes per commit group, ingest: 16.08 MB, 0.03 MB/s#012Interval WAL: 1545 writes, 1545 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     54.1      2.08              0.51        60    0.035       0      0       0.0       0.0#012  L6      1/0   12.11 MB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   5.3     76.7     65.7      9.12              2.16        59    0.154    457K    31K       0.0       0.0#012 Sum      1/0   12.11 MB   0.0      0.7     0.1      0.6       0.7      0.1       0.0   6.3     62.5     63.5     11.19              2.67       119    0.094    457K    31K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.0     59.7     61.0      1.32              0.34        12    0.110     66K   3078       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   0.0     76.7     65.7      9.12              2.16        59    0.154    457K    31K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     54.2      2.08              0.51        59    0.035       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6600.0 total, 600.0 interval#012Flush(GB): cumulative 0.110, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.69 GB write, 0.11 MB/s write, 0.68 GB read, 0.11 MB/s read, 11.2 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 1.3 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 76.12 MB table_size: 0 occupancy: 18446744073709551615 collections: 12 last_copies: 0 last_secs: 0.000554 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(4329,72.89 MB,23.9773%) FilterBlock(119,1.24 MB,0.407786%) IndexBlock(119,1.98 MB,0.652951%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:51:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:57.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:57 np0005593233 nova_compute[222017]: 2026-01-23 10:51:57.525 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:51:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:58.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:51:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:51:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:59.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:00 np0005593233 nova_compute[222017]: 2026-01-23 10:52:00.046 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:00 np0005593233 podman[304146]: 2026-01-23 10:52:00.069259255 +0000 UTC m=+0.072296891 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:52:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:00.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:01.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:02.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:02 np0005593233 nova_compute[222017]: 2026-01-23 10:52:02.528 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:03.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:04.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:05 np0005593233 nova_compute[222017]: 2026-01-23 10:52:05.083 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:05.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:06.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:07.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:07 np0005593233 nova_compute[222017]: 2026-01-23 10:52:07.572 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:07 np0005593233 nova_compute[222017]: 2026-01-23 10:52:07.575 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:07.575 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=86, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=85) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:52:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:07.577 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:52:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:08.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:09.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:10 np0005593233 nova_compute[222017]: 2026-01-23 10:52:10.086 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:10.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:11.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:11 np0005593233 nova_compute[222017]: 2026-01-23 10:52:11.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:11 np0005593233 nova_compute[222017]: 2026-01-23 10:52:11.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:11 np0005593233 nova_compute[222017]: 2026-01-23 10:52:11.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:11 np0005593233 nova_compute[222017]: 2026-01-23 10:52:11.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:11 np0005593233 nova_compute[222017]: 2026-01-23 10:52:11.415 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:52:11 np0005593233 nova_compute[222017]: 2026-01-23 10:52:11.415 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:52:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1988045669' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:52:11 np0005593233 nova_compute[222017]: 2026-01-23 10:52:11.868 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.018 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.019 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4297MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.019 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.020 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.156 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.156 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.221 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:12.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.579 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:52:12 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/153341235' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.676 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.681 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.701 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.702 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:52:12 np0005593233 nova_compute[222017]: 2026-01-23 10:52:12.702 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:13.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:13 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:13.579 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '86'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:52:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:14.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:15 np0005593233 nova_compute[222017]: 2026-01-23 10:52:15.087 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:15 np0005593233 podman[304209]: 2026-01-23 10:52:15.141837153 +0000 UTC m=+0.139285463 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:52:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:15.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:16.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:16 np0005593233 nova_compute[222017]: 2026-01-23 10:52:16.703 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:16 np0005593233 nova_compute[222017]: 2026-01-23 10:52:16.706 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:16 np0005593233 nova_compute[222017]: 2026-01-23 10:52:16.706 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:16 np0005593233 nova_compute[222017]: 2026-01-23 10:52:16.706 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:52:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:52:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:52:16 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:52:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:17.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:17 np0005593233 nova_compute[222017]: 2026-01-23 10:52:17.583 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:18.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:19.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:20 np0005593233 nova_compute[222017]: 2026-01-23 10:52:20.090 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:20.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:21.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:22.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:22 np0005593233 nova_compute[222017]: 2026-01-23 10:52:22.586 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:23.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:23 np0005593233 nova_compute[222017]: 2026-01-23 10:52:23.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:23 np0005593233 nova_compute[222017]: 2026-01-23 10:52:23.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:24.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:25 np0005593233 nova_compute[222017]: 2026-01-23 10:52:25.139 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:25.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:52:25 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:52:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:26.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:27.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:27 np0005593233 nova_compute[222017]: 2026-01-23 10:52:27.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:27 np0005593233 nova_compute[222017]: 2026-01-23 10:52:27.589 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:28 np0005593233 nova_compute[222017]: 2026-01-23 10:52:28.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:28.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:29.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:30 np0005593233 nova_compute[222017]: 2026-01-23 10:52:30.142 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:30.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:31 np0005593233 podman[304416]: 2026-01-23 10:52:31.117445781 +0000 UTC m=+0.099772628 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 05:52:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:31.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:31 np0005593233 nova_compute[222017]: 2026-01-23 10:52:31.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:31 np0005593233 nova_compute[222017]: 2026-01-23 10:52:31.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:52:31 np0005593233 nova_compute[222017]: 2026-01-23 10:52:31.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:52:31 np0005593233 nova_compute[222017]: 2026-01-23 10:52:31.425 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:52:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:32.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:32 np0005593233 nova_compute[222017]: 2026-01-23 10:52:32.591 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:33.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:34.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:35 np0005593233 nova_compute[222017]: 2026-01-23 10:52:35.184 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:35.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:36.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:37.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:37 np0005593233 nova_compute[222017]: 2026-01-23 10:52:37.631 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:38.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:39.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:40 np0005593233 nova_compute[222017]: 2026-01-23 10:52:40.187 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:40 np0005593233 nova_compute[222017]: 2026-01-23 10:52:40.476 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:40 np0005593233 nova_compute[222017]: 2026-01-23 10:52:40.477 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:40 np0005593233 nova_compute[222017]: 2026-01-23 10:52:40.501 222021 DEBUG nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:52:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:40.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:40 np0005593233 nova_compute[222017]: 2026-01-23 10:52:40.592 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:40 np0005593233 nova_compute[222017]: 2026-01-23 10:52:40.593 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:40 np0005593233 nova_compute[222017]: 2026-01-23 10:52:40.603 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:52:40 np0005593233 nova_compute[222017]: 2026-01-23 10:52:40.604 222021 INFO nova.compute.claims [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:52:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:40 np0005593233 nova_compute[222017]: 2026-01-23 10:52:40.776 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:41.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:52:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/185312020' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:52:41 np0005593233 nova_compute[222017]: 2026-01-23 10:52:41.287 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:41 np0005593233 nova_compute[222017]: 2026-01-23 10:52:41.296 222021 DEBUG nova.compute.provider_tree [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:52:41 np0005593233 nova_compute[222017]: 2026-01-23 10:52:41.314 222021 DEBUG nova.scheduler.client.report [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:52:41 np0005593233 nova_compute[222017]: 2026-01-23 10:52:41.348 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.755s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:41 np0005593233 nova_compute[222017]: 2026-01-23 10:52:41.349 222021 DEBUG nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:52:41 np0005593233 nova_compute[222017]: 2026-01-23 10:52:41.541 222021 DEBUG nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:52:41 np0005593233 nova_compute[222017]: 2026-01-23 10:52:41.542 222021 DEBUG nova.network.neutron [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:52:41 np0005593233 nova_compute[222017]: 2026-01-23 10:52:41.891 222021 INFO nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:52:41 np0005593233 nova_compute[222017]: 2026-01-23 10:52:41.953 222021 DEBUG nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.082 222021 DEBUG nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.084 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.085 222021 INFO nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Creating image(s)#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.129 222021 DEBUG nova.storage.rbd_utils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.177 222021 DEBUG nova.storage.rbd_utils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.235 222021 DEBUG nova.storage.rbd_utils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.241 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.287 222021 DEBUG nova.policy [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.346 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.347 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.349 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.349 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.391 222021 DEBUG nova.storage.rbd_utils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.397 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:42.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.634 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:42.715 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:42.716 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:42.716 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.867 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:42 np0005593233 nova_compute[222017]: 2026-01-23 10:52:42.970 222021 DEBUG nova.storage.rbd_utils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:52:43 np0005593233 nova_compute[222017]: 2026-01-23 10:52:43.138 222021 DEBUG nova.objects.instance [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid 21edef97-3531-4772-8aa5-a3feeb9ff3f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:52:43 np0005593233 nova_compute[222017]: 2026-01-23 10:52:43.163 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:52:43 np0005593233 nova_compute[222017]: 2026-01-23 10:52:43.164 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Ensure instance console log exists: /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:52:43 np0005593233 nova_compute[222017]: 2026-01-23 10:52:43.165 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:43 np0005593233 nova_compute[222017]: 2026-01-23 10:52:43.165 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:43 np0005593233 nova_compute[222017]: 2026-01-23 10:52:43.166 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:43.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:44.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:45 np0005593233 nova_compute[222017]: 2026-01-23 10:52:45.189 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:45 np0005593233 nova_compute[222017]: 2026-01-23 10:52:45.248 222021 DEBUG nova.network.neutron [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Successfully created port: d08a5642-c043-410d-8d3a-e63134c79cd2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:52:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:45.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:46 np0005593233 podman[304622]: 2026-01-23 10:52:46.142492924 +0000 UTC m=+0.146315661 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:52:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:46.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:47.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:47 np0005593233 nova_compute[222017]: 2026-01-23 10:52:47.636 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:48 np0005593233 nova_compute[222017]: 2026-01-23 10:52:48.283 222021 DEBUG nova.network.neutron [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Successfully updated port: d08a5642-c043-410d-8d3a-e63134c79cd2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:52:48 np0005593233 nova_compute[222017]: 2026-01-23 10:52:48.390 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:52:48 np0005593233 nova_compute[222017]: 2026-01-23 10:52:48.390 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:52:48 np0005593233 nova_compute[222017]: 2026-01-23 10:52:48.390 222021 DEBUG nova.network.neutron [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:52:48 np0005593233 nova_compute[222017]: 2026-01-23 10:52:48.494 222021 DEBUG nova.compute.manager [req-19893cca-d0ca-4159-9617-19e62df61e4a req-473e83ca-ae0b-4dc9-961b-94733b7ea3d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received event network-changed-d08a5642-c043-410d-8d3a-e63134c79cd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:52:48 np0005593233 nova_compute[222017]: 2026-01-23 10:52:48.496 222021 DEBUG nova.compute.manager [req-19893cca-d0ca-4159-9617-19e62df61e4a req-473e83ca-ae0b-4dc9-961b-94733b7ea3d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Refreshing instance network info cache due to event network-changed-d08a5642-c043-410d-8d3a-e63134c79cd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:52:48 np0005593233 nova_compute[222017]: 2026-01-23 10:52:48.497 222021 DEBUG oslo_concurrency.lockutils [req-19893cca-d0ca-4159-9617-19e62df61e4a req-473e83ca-ae0b-4dc9-961b-94733b7ea3d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:52:48 np0005593233 nova_compute[222017]: 2026-01-23 10:52:48.549 222021 DEBUG nova.network.neutron [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:52:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:48.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:49.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.329 222021 DEBUG nova.network.neutron [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Updating instance_info_cache with network_info: [{"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.350 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.350 222021 DEBUG nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Instance network_info: |[{"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.351 222021 DEBUG oslo_concurrency.lockutils [req-19893cca-d0ca-4159-9617-19e62df61e4a req-473e83ca-ae0b-4dc9-961b-94733b7ea3d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.351 222021 DEBUG nova.network.neutron [req-19893cca-d0ca-4159-9617-19e62df61e4a req-473e83ca-ae0b-4dc9-961b-94733b7ea3d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Refreshing network info cache for port d08a5642-c043-410d-8d3a-e63134c79cd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.356 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Start _get_guest_xml network_info=[{"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.363 222021 WARNING nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.369 222021 DEBUG nova.virt.libvirt.host [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.370 222021 DEBUG nova.virt.libvirt.host [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.381 222021 DEBUG nova.virt.libvirt.host [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.382 222021 DEBUG nova.virt.libvirt.host [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.384 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.384 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.385 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.386 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.386 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.387 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.387 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.388 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.388 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.389 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.389 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.390 222021 DEBUG nova.virt.hardware [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.396 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.445 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:52:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/105798800' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.881 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.931 222021 DEBUG nova.storage.rbd_utils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:52:49 np0005593233 nova_compute[222017]: 2026-01-23 10:52:49.939 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.193 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:50.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:52:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/426036556' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.680 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.742s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.683 222021 DEBUG nova.virt.libvirt.vif [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:52:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1044974910',display_name='tempest-TestNetworkAdvancedServerOps-server-1044974910',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1044974910',id=210,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN1Au+/nbNJXswn+/lBT/NUlG+C4qWpKV8LfTLcEIq/JQHMntJ0r6AZpvvHSolbGhgNEDJ2I0R+q+8ASoXZhoeZdCjKoEqwhuN6XwpPG1I72EeOI415/JreWIAcqSMa5Mw==',key_name='tempest-TestNetworkAdvancedServerOps-840850991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-70lt2fy5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:52:42Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=21edef97-3531-4772-8aa5-a3feeb9ff3f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.684 222021 DEBUG nova.network.os_vif_util [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.686 222021 DEBUG nova.network.os_vif_util [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.688 222021 DEBUG nova.objects.instance [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid 21edef97-3531-4772-8aa5-a3feeb9ff3f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.711 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <uuid>21edef97-3531-4772-8aa5-a3feeb9ff3f5</uuid>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <name>instance-000000d2</name>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1044974910</nova:name>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:52:49</nova:creationTime>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <nova:port uuid="d08a5642-c043-410d-8d3a-e63134c79cd2">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <entry name="serial">21edef97-3531-4772-8aa5-a3feeb9ff3f5</entry>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <entry name="uuid">21edef97-3531-4772-8aa5-a3feeb9ff3f5</entry>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk.config">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:59:40:79"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <target dev="tapd08a5642-c0"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/console.log" append="off"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:52:50 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:52:50 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:52:50 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:52:50 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.713 222021 DEBUG nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Preparing to wait for external event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.714 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.715 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.715 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.716 222021 DEBUG nova.virt.libvirt.vif [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:52:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1044974910',display_name='tempest-TestNetworkAdvancedServerOps-server-1044974910',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1044974910',id=210,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN1Au+/nbNJXswn+/lBT/NUlG+C4qWpKV8LfTLcEIq/JQHMntJ0r6AZpvvHSolbGhgNEDJ2I0R+q+8ASoXZhoeZdCjKoEqwhuN6XwpPG1I72EeOI415/JreWIAcqSMa5Mw==',key_name='tempest-TestNetworkAdvancedServerOps-840850991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-70lt2fy5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:52:42Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=21edef97-3531-4772-8aa5-a3feeb9ff3f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.717 222021 DEBUG nova.network.os_vif_util [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.718 222021 DEBUG nova.network.os_vif_util [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.718 222021 DEBUG os_vif [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.719 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.720 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.721 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.726 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.727 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd08a5642-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.728 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd08a5642-c0, col_values=(('external_ids', {'iface-id': 'd08a5642-c043-410d-8d3a-e63134c79cd2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:59:40:79', 'vm-uuid': '21edef97-3531-4772-8aa5-a3feeb9ff3f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.730 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:52:50 np0005593233 NetworkManager[48871]: <info>  [1769165570.7322] manager: (tapd08a5642-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/392)
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.734 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.742 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.743 222021 INFO os_vif [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0')
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.785 222021 DEBUG nova.network.neutron [req-19893cca-d0ca-4159-9617-19e62df61e4a req-473e83ca-ae0b-4dc9-961b-94733b7ea3d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Updated VIF entry in instance network info cache for port d08a5642-c043-410d-8d3a-e63134c79cd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.785 222021 DEBUG nova.network.neutron [req-19893cca-d0ca-4159-9617-19e62df61e4a req-473e83ca-ae0b-4dc9-961b-94733b7ea3d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Updating instance_info_cache with network_info: [{"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:52:50 np0005593233 nova_compute[222017]: 2026-01-23 10:52:50.805 222021 DEBUG oslo_concurrency.lockutils [req-19893cca-d0ca-4159-9617-19e62df61e4a req-473e83ca-ae0b-4dc9-961b-94733b7ea3d4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:52:51 np0005593233 nova_compute[222017]: 2026-01-23 10:52:51.052 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:52:51 np0005593233 nova_compute[222017]: 2026-01-23 10:52:51.052 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:52:51 np0005593233 nova_compute[222017]: 2026-01-23 10:52:51.053 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:59:40:79, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 05:52:51 np0005593233 nova_compute[222017]: 2026-01-23 10:52:51.053 222021 INFO nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Using config drive
Jan 23 05:52:51 np0005593233 nova_compute[222017]: 2026-01-23 10:52:51.085 222021 DEBUG nova.storage.rbd_utils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:52:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:51.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:52.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:52 np0005593233 nova_compute[222017]: 2026-01-23 10:52:52.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.064 222021 INFO nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Creating config drive at /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/disk.config
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.074 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8v2lz4mw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.233 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8v2lz4mw" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:52:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:53.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.280 222021 DEBUG nova.storage.rbd_utils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.285 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/disk.config 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.497 222021 DEBUG oslo_concurrency.processutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/disk.config 21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.499 222021 INFO nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Deleting local config drive /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/disk.config because it was imported into RBD.
Jan 23 05:52:53 np0005593233 kernel: tapd08a5642-c0: entered promiscuous mode
Jan 23 05:52:53 np0005593233 NetworkManager[48871]: <info>  [1769165573.5748] manager: (tapd08a5642-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/393)
Jan 23 05:52:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:52:53Z|00868|binding|INFO|Claiming lport d08a5642-c043-410d-8d3a-e63134c79cd2 for this chassis.
Jan 23 05:52:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:52:53Z|00869|binding|INFO|d08a5642-c043-410d-8d3a-e63134c79cd2: Claiming fa:16:3e:59:40:79 10.100.0.12
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.617 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.621 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.631 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:40:79 10.100.0.12'], port_security=['fa:16:3e:59:40:79 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '21edef97-3531-4772-8aa5-a3feeb9ff3f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fae0cfd0-9ee7-400c-bda6-94fd3af3625d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dcfc02e2-e46d-4519-82a3-86ab6ef1b36e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=272555c7-6cc4-4fe4-972e-530a53d5843d, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=d08a5642-c043-410d-8d3a-e63134c79cd2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.632 140224 INFO neutron.agent.ovn.metadata.agent [-] Port d08a5642-c043-410d-8d3a-e63134c79cd2 in datapath fae0cfd0-9ee7-400c-bda6-94fd3af3625d bound to our chassis
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.633 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fae0cfd0-9ee7-400c-bda6-94fd3af3625d
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.647 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c6dbd9-8fe7-4cb9-b322-d170f211326a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.649 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfae0cfd0-91 in ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 05:52:53 np0005593233 systemd-udevd[304786]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.652 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfae0cfd0-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.652 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a2527950-c56c-4d42-807d-5f4c6a6c6a36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.654 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5055c4b4-3646-4a7d-8d56-b08a083b8fb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:52:53 np0005593233 systemd-machined[190954]: New machine qemu-94-instance-000000d2.
Jan 23 05:52:53 np0005593233 NetworkManager[48871]: <info>  [1769165573.6646] device (tapd08a5642-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:52:53 np0005593233 NetworkManager[48871]: <info>  [1769165573.6651] device (tapd08a5642-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.667 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5bb593-da2b-4cef-885a-023282f04a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:52:53Z|00870|binding|INFO|Setting lport d08a5642-c043-410d-8d3a-e63134c79cd2 ovn-installed in OVS
Jan 23 05:52:53 np0005593233 ovn_controller[130653]: 2026-01-23T10:52:53Z|00871|binding|INFO|Setting lport d08a5642-c043-410d-8d3a-e63134c79cd2 up in Southbound
Jan 23 05:52:53 np0005593233 nova_compute[222017]: 2026-01-23 10:52:53.681 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:53 np0005593233 systemd[1]: Started Virtual Machine qemu-94-instance-000000d2.
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.697 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a8cf2a-e2b0-44f6-8add-b470570ce807]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.738 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb442ce-1917-4440-897e-e5fba7455249]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 NetworkManager[48871]: <info>  [1769165573.7487] manager: (tapfae0cfd0-90): new Veth device (/org/freedesktop/NetworkManager/Devices/394)
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.747 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fb33bbad-4b79-4037-97db-02a806fc7bef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.784 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9fff0691-9cc4-4e12-b1a4-ff45a111284a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.787 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5efedb1a-d1b1-4854-8e4c-4f263f5b9aad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 NetworkManager[48871]: <info>  [1769165573.8165] device (tapfae0cfd0-90): carrier: link connected
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.821 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[5d39ca53-77bb-4f4d-a7bd-2c126d060743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.845 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[819c951f-3d16-4b96-b026-adfa95299392]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfae0cfd0-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:ca:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 260], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 948644, 'reachable_time': 23697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304818, 'error': None, 'target': 'ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.867 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[33def9c6-a446-4c7d-a568-54fefaa6e540]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4c:ca82'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 948644, 'tstamp': 948644}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304819, 'error': None, 'target': 'ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.888 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[63c1f43c-46d7-4cf8-92d7-f8d22bcf2a3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfae0cfd0-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:ca:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 260], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 948644, 'reachable_time': 23697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304820, 'error': None, 'target': 'ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:53.931 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c9afe4f7-c7ad-4607-bd09-26548adaa4c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:54.018 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2c381271-8dd0-4cfe-8150-ff8923d2685b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:54.020 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfae0cfd0-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:54.020 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:54.021 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfae0cfd0-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.023 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:54 np0005593233 kernel: tapfae0cfd0-90: entered promiscuous mode
Jan 23 05:52:54 np0005593233 NetworkManager[48871]: <info>  [1769165574.0243] manager: (tapfae0cfd0-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/395)
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:54.027 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfae0cfd0-90, col_values=(('external_ids', {'iface-id': '60ff24a2-99cd-4fae-9593-d07701636aa7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:52:54 np0005593233 ovn_controller[130653]: 2026-01-23T10:52:54Z|00872|binding|INFO|Releasing lport 60ff24a2-99cd-4fae-9593-d07701636aa7 from this chassis (sb_readonly=0)
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.054 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:54.055 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fae0cfd0-9ee7-400c-bda6-94fd3af3625d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fae0cfd0-9ee7-400c-bda6-94fd3af3625d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:54.056 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[22bd4de6-9a2c-403f-b3f5-119aeef0e7b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:54.057 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-fae0cfd0-9ee7-400c-bda6-94fd3af3625d
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/fae0cfd0-9ee7-400c-bda6-94fd3af3625d.pid.haproxy
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID fae0cfd0-9ee7-400c-bda6-94fd3af3625d
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:52:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:52:54.058 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d', 'env', 'PROCESS_TAG=haproxy-fae0cfd0-9ee7-400c-bda6-94fd3af3625d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fae0cfd0-9ee7-400c-bda6-94fd3af3625d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.415 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165574.4144714, 21edef97-3531-4772-8aa5-a3feeb9ff3f5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.416 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] VM Started (Lifecycle Event)#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.446 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.452 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165574.414712, 21edef97-3531-4772-8aa5-a3feeb9ff3f5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.453 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.486 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.491 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.519 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:52:54 np0005593233 podman[304894]: 2026-01-23 10:52:54.453079246 +0000 UTC m=+0.032000785 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:52:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:52:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:54.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.670 222021 DEBUG nova.compute.manager [req-631ae3c8-0888-4f93-9778-94cfd187fe58 req-a370cce1-1c04-4911-ad13-d44bfe32b8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.671 222021 DEBUG oslo_concurrency.lockutils [req-631ae3c8-0888-4f93-9778-94cfd187fe58 req-a370cce1-1c04-4911-ad13-d44bfe32b8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.672 222021 DEBUG oslo_concurrency.lockutils [req-631ae3c8-0888-4f93-9778-94cfd187fe58 req-a370cce1-1c04-4911-ad13-d44bfe32b8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.673 222021 DEBUG oslo_concurrency.lockutils [req-631ae3c8-0888-4f93-9778-94cfd187fe58 req-a370cce1-1c04-4911-ad13-d44bfe32b8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.674 222021 DEBUG nova.compute.manager [req-631ae3c8-0888-4f93-9778-94cfd187fe58 req-a370cce1-1c04-4911-ad13-d44bfe32b8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Processing event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.676 222021 DEBUG nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.681 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165574.68084, 21edef97-3531-4772-8aa5-a3feeb9ff3f5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.682 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.686 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.690 222021 INFO nova.virt.libvirt.driver [-] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Instance spawned successfully.#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.690 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.717 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.722 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.725 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.726 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.726 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.726 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.727 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.727 222021 DEBUG nova.virt.libvirt.driver [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.756 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.783 222021 INFO nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Took 12.70 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.785 222021 DEBUG nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.859 222021 INFO nova.compute.manager [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Took 14.30 seconds to build instance.#033[00m
Jan 23 05:52:54 np0005593233 nova_compute[222017]: 2026-01-23 10:52:54.881 222021 DEBUG oslo_concurrency.lockutils [None req-23491dd2-3b3a-4687-84e8-493460e5a9f0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:55 np0005593233 podman[304894]: 2026-01-23 10:52:55.160822864 +0000 UTC m=+0.739744353 container create 0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:52:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:55.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:55 np0005593233 systemd[1]: Started libpod-conmon-0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320.scope.
Jan 23 05:52:55 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:52:55 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcc2611cbfd1cbcf78b0061423fc3b5b4711cda90e5cef63a64e58c7dc8790a9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:52:55 np0005593233 podman[304894]: 2026-01-23 10:52:55.549911946 +0000 UTC m=+1.128833465 container init 0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:52:55 np0005593233 podman[304894]: 2026-01-23 10:52:55.560416673 +0000 UTC m=+1.139338122 container start 0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:52:55 np0005593233 neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d[304909]: [NOTICE]   (304913) : New worker (304915) forked
Jan 23 05:52:55 np0005593233 neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d[304909]: [NOTICE]   (304913) : Loading success.
Jan 23 05:52:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:55 np0005593233 nova_compute[222017]: 2026-01-23 10:52:55.730 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:56.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:56 np0005593233 nova_compute[222017]: 2026-01-23 10:52:56.807 222021 DEBUG nova.compute.manager [req-32eac6a6-1354-408b-a310-7fcd92a43735 req-d198d3f9-304b-4abb-a7e6-e699d857f597 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:52:56 np0005593233 nova_compute[222017]: 2026-01-23 10:52:56.808 222021 DEBUG oslo_concurrency.lockutils [req-32eac6a6-1354-408b-a310-7fcd92a43735 req-d198d3f9-304b-4abb-a7e6-e699d857f597 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:56 np0005593233 nova_compute[222017]: 2026-01-23 10:52:56.808 222021 DEBUG oslo_concurrency.lockutils [req-32eac6a6-1354-408b-a310-7fcd92a43735 req-d198d3f9-304b-4abb-a7e6-e699d857f597 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:56 np0005593233 nova_compute[222017]: 2026-01-23 10:52:56.809 222021 DEBUG oslo_concurrency.lockutils [req-32eac6a6-1354-408b-a310-7fcd92a43735 req-d198d3f9-304b-4abb-a7e6-e699d857f597 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:56 np0005593233 nova_compute[222017]: 2026-01-23 10:52:56.809 222021 DEBUG nova.compute.manager [req-32eac6a6-1354-408b-a310-7fcd92a43735 req-d198d3f9-304b-4abb-a7e6-e699d857f597 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] No waiting events found dispatching network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:52:56 np0005593233 nova_compute[222017]: 2026-01-23 10:52:56.810 222021 WARNING nova.compute.manager [req-32eac6a6-1354-408b-a310-7fcd92a43735 req-d198d3f9-304b-4abb-a7e6-e699d857f597 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received unexpected event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:52:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:52:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:57.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:52:57 np0005593233 nova_compute[222017]: 2026-01-23 10:52:57.641 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:58.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:59 np0005593233 ovn_controller[130653]: 2026-01-23T10:52:59Z|00873|binding|INFO|Releasing lport 60ff24a2-99cd-4fae-9593-d07701636aa7 from this chassis (sb_readonly=0)
Jan 23 05:52:59 np0005593233 nova_compute[222017]: 2026-01-23 10:52:59.030 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:59 np0005593233 NetworkManager[48871]: <info>  [1769165579.0315] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/396)
Jan 23 05:52:59 np0005593233 NetworkManager[48871]: <info>  [1769165579.0332] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/397)
Jan 23 05:52:59 np0005593233 ovn_controller[130653]: 2026-01-23T10:52:59Z|00874|binding|INFO|Releasing lport 60ff24a2-99cd-4fae-9593-d07701636aa7 from this chassis (sb_readonly=0)
Jan 23 05:52:59 np0005593233 nova_compute[222017]: 2026-01-23 10:52:59.092 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:59 np0005593233 nova_compute[222017]: 2026-01-23 10:52:59.103 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:52:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:59.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:59 np0005593233 nova_compute[222017]: 2026-01-23 10:52:59.353 222021 DEBUG nova.compute.manager [req-a22f27ef-5a03-4e7f-9e7c-33440b1e49f1 req-b5ad8dfe-7069-4ebe-81d6-93fd3a5f89cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received event network-changed-d08a5642-c043-410d-8d3a-e63134c79cd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:52:59 np0005593233 nova_compute[222017]: 2026-01-23 10:52:59.354 222021 DEBUG nova.compute.manager [req-a22f27ef-5a03-4e7f-9e7c-33440b1e49f1 req-b5ad8dfe-7069-4ebe-81d6-93fd3a5f89cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Refreshing instance network info cache due to event network-changed-d08a5642-c043-410d-8d3a-e63134c79cd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:52:59 np0005593233 nova_compute[222017]: 2026-01-23 10:52:59.355 222021 DEBUG oslo_concurrency.lockutils [req-a22f27ef-5a03-4e7f-9e7c-33440b1e49f1 req-b5ad8dfe-7069-4ebe-81d6-93fd3a5f89cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:52:59 np0005593233 nova_compute[222017]: 2026-01-23 10:52:59.355 222021 DEBUG oslo_concurrency.lockutils [req-a22f27ef-5a03-4e7f-9e7c-33440b1e49f1 req-b5ad8dfe-7069-4ebe-81d6-93fd3a5f89cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:52:59 np0005593233 nova_compute[222017]: 2026-01-23 10:52:59.356 222021 DEBUG nova.network.neutron [req-a22f27ef-5a03-4e7f-9e7c-33440b1e49f1 req-b5ad8dfe-7069-4ebe-81d6-93fd3a5f89cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Refreshing network info cache for port d08a5642-c043-410d-8d3a-e63134c79cd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:53:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:00.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:00 np0005593233 nova_compute[222017]: 2026-01-23 10:53:00.732 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:01 np0005593233 nova_compute[222017]: 2026-01-23 10:53:01.195 222021 DEBUG nova.network.neutron [req-a22f27ef-5a03-4e7f-9e7c-33440b1e49f1 req-b5ad8dfe-7069-4ebe-81d6-93fd3a5f89cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Updated VIF entry in instance network info cache for port d08a5642-c043-410d-8d3a-e63134c79cd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:53:01 np0005593233 nova_compute[222017]: 2026-01-23 10:53:01.196 222021 DEBUG nova.network.neutron [req-a22f27ef-5a03-4e7f-9e7c-33440b1e49f1 req-b5ad8dfe-7069-4ebe-81d6-93fd3a5f89cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Updating instance_info_cache with network_info: [{"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:53:01 np0005593233 nova_compute[222017]: 2026-01-23 10:53:01.216 222021 DEBUG oslo_concurrency.lockutils [req-a22f27ef-5a03-4e7f-9e7c-33440b1e49f1 req-b5ad8dfe-7069-4ebe-81d6-93fd3a5f89cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:53:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:01.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:02 np0005593233 podman[304925]: 2026-01-23 10:53:02.09038883 +0000 UTC m=+0.102244037 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:53:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:02.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:02 np0005593233 nova_compute[222017]: 2026-01-23 10:53:02.681 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:03.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:04.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:05.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:05 np0005593233 nova_compute[222017]: 2026-01-23 10:53:05.734 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:06.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:07.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:07 np0005593233 nova_compute[222017]: 2026-01-23 10:53:07.682 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:08.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:09.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:53:10Z|00128|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:59:40:79 10.100.0.12
Jan 23 05:53:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:53:10Z|00129|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:59:40:79 10.100.0.12
Jan 23 05:53:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:53:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:10.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:53:10 np0005593233 nova_compute[222017]: 2026-01-23 10:53:10.737 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:11.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:12 np0005593233 nova_compute[222017]: 2026-01-23 10:53:12.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:12 np0005593233 nova_compute[222017]: 2026-01-23 10:53:12.420 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:12 np0005593233 nova_compute[222017]: 2026-01-23 10:53:12.420 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:12 np0005593233 nova_compute[222017]: 2026-01-23 10:53:12.421 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:12 np0005593233 nova_compute[222017]: 2026-01-23 10:53:12.421 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:53:12 np0005593233 nova_compute[222017]: 2026-01-23 10:53:12.421 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:53:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:12.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:12 np0005593233 nova_compute[222017]: 2026-01-23 10:53:12.685 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:53:12 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/648035263' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:53:12 np0005593233 nova_compute[222017]: 2026-01-23 10:53:12.979 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.558s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.072 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.073 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:53:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:13.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.338 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.339 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4061MB free_disk=20.954879760742188GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.339 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.339 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.439 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 21edef97-3531-4772-8aa5-a3feeb9ff3f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.439 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.440 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.503 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:53:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:53:13 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1754697079' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.952 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.962 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:53:13 np0005593233 nova_compute[222017]: 2026-01-23 10:53:13.983 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:53:14 np0005593233 nova_compute[222017]: 2026-01-23 10:53:14.014 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:53:14 np0005593233 nova_compute[222017]: 2026-01-23 10:53:14.015 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:14.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 23 05:53:15 np0005593233 nova_compute[222017]: 2026-01-23 10:53:15.262 222021 INFO nova.compute.manager [None req-9a55b0be-6b4c-4213-af42-810270099579 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Get console output#033[00m
Jan 23 05:53:15 np0005593233 nova_compute[222017]: 2026-01-23 10:53:15.275 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:53:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:15.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:15 np0005593233 nova_compute[222017]: 2026-01-23 10:53:15.767 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #190. Immutable memtables: 0.
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.006852) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 121] Flushing memtable with next log file: 190
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596007018, "job": 121, "event": "flush_started", "num_memtables": 1, "num_entries": 1306, "num_deletes": 251, "total_data_size": 2947612, "memory_usage": 2987632, "flush_reason": "Manual Compaction"}
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 121] Level-0 flush table #191: started
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596026860, "cf_name": "default", "job": 121, "event": "table_file_creation", "file_number": 191, "file_size": 1947382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 90237, "largest_seqno": 91538, "table_properties": {"data_size": 1941751, "index_size": 3025, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12135, "raw_average_key_size": 19, "raw_value_size": 1930427, "raw_average_value_size": 3164, "num_data_blocks": 135, "num_entries": 610, "num_filter_entries": 610, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165480, "oldest_key_time": 1769165480, "file_creation_time": 1769165596, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 191, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 121] Flush lasted 20221 microseconds, and 6715 cpu microseconds.
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.027080) [db/flush_job.cc:967] [default] [JOB 121] Level-0 flush table #191: 1947382 bytes OK
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.027114) [db/memtable_list.cc:519] [default] Level-0 commit table #191 started
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.030098) [db/memtable_list.cc:722] [default] Level-0 commit table #191: memtable #1 done
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.030121) EVENT_LOG_v1 {"time_micros": 1769165596030114, "job": 121, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.030144) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 121] Try to delete WAL files size 2941363, prev total WAL file size 2941363, number of live WAL files 2.
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000187.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.031896) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038303332' seq:72057594037927935, type:22 .. '7061786F730038323834' seq:0, type:0; will stop at (end)
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 122] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 121 Base level 0, inputs: [191(1901KB)], [189(12MB)]
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596032040, "job": 122, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [191], "files_L6": [189], "score": -1, "input_data_size": 14647151, "oldest_snapshot_seqno": -1}
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 122] Generated table #192: 10808 keys, 12731918 bytes, temperature: kUnknown
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596273400, "cf_name": "default", "job": 122, "event": "table_file_creation", "file_number": 192, "file_size": 12731918, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12663900, "index_size": 39873, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27077, "raw_key_size": 286489, "raw_average_key_size": 26, "raw_value_size": 12476665, "raw_average_value_size": 1154, "num_data_blocks": 1506, "num_entries": 10808, "num_filter_entries": 10808, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769165596, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 192, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.274019) [db/compaction/compaction_job.cc:1663] [default] [JOB 122] Compacted 1@0 + 1@6 files to L6 => 12731918 bytes
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.327516) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 60.7 rd, 52.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.1 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(14.1) write-amplify(6.5) OK, records in: 11323, records dropped: 515 output_compression: NoCompression
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.327616) EVENT_LOG_v1 {"time_micros": 1769165596327593, "job": 122, "event": "compaction_finished", "compaction_time_micros": 241459, "compaction_time_cpu_micros": 60723, "output_level": 6, "num_output_files": 1, "total_output_size": 12731918, "num_input_records": 11323, "num_output_records": 10808, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000191.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596328540, "job": 122, "event": "table_file_deletion", "file_number": 191}
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000189.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596332766, "job": 122, "event": "table_file_deletion", "file_number": 189}
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.031741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.333030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.333041) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.333044) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.333047) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:53:16.333050) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:16.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:17.108 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=87, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=86) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:53:17 np0005593233 nova_compute[222017]: 2026-01-23 10:53:17.109 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:17.111 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:53:17 np0005593233 podman[304992]: 2026-01-23 10:53:17.143950448 +0000 UTC m=+0.137575304 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:53:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:17.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:17 np0005593233 nova_compute[222017]: 2026-01-23 10:53:17.687 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:18 np0005593233 nova_compute[222017]: 2026-01-23 10:53:18.016 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:18 np0005593233 nova_compute[222017]: 2026-01-23 10:53:18.016 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:18 np0005593233 nova_compute[222017]: 2026-01-23 10:53:18.017 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:18 np0005593233 nova_compute[222017]: 2026-01-23 10:53:18.017 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:53:18 np0005593233 nova_compute[222017]: 2026-01-23 10:53:18.548 222021 INFO nova.compute.manager [None req-3579551c-6aee-4294-b497-9d45999103df 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Get console output#033[00m
Jan 23 05:53:18 np0005593233 nova_compute[222017]: 2026-01-23 10:53:18.556 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:53:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:18.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:19.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:20.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:20 np0005593233 nova_compute[222017]: 2026-01-23 10:53:20.771 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:21.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:22.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:22 np0005593233 nova_compute[222017]: 2026-01-23 10:53:22.690 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:23 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:23.113 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '87'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:53:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:23.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:24 np0005593233 nova_compute[222017]: 2026-01-23 10:53:24.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:24.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:24 np0005593233 nova_compute[222017]: 2026-01-23 10:53:24.841 222021 DEBUG oslo_concurrency.lockutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:53:24 np0005593233 nova_compute[222017]: 2026-01-23 10:53:24.842 222021 DEBUG oslo_concurrency.lockutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquired lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:53:24 np0005593233 nova_compute[222017]: 2026-01-23 10:53:24.842 222021 DEBUG nova.network.neutron [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:53:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:25.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:25 np0005593233 nova_compute[222017]: 2026-01-23 10:53:25.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:25 np0005593233 nova_compute[222017]: 2026-01-23 10:53:25.773 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:26.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:27.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:27 np0005593233 nova_compute[222017]: 2026-01-23 10:53:27.692 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:53:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:53:28 np0005593233 nova_compute[222017]: 2026-01-23 10:53:28.239 222021 DEBUG nova.network.neutron [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Updating instance_info_cache with network_info: [{"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:53:28 np0005593233 nova_compute[222017]: 2026-01-23 10:53:28.325 222021 DEBUG oslo_concurrency.lockutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Releasing lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:53:28 np0005593233 nova_compute[222017]: 2026-01-23 10:53:28.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:28 np0005593233 nova_compute[222017]: 2026-01-23 10:53:28.457 222021 DEBUG nova.virt.libvirt.driver [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 05:53:28 np0005593233 nova_compute[222017]: 2026-01-23 10:53:28.458 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Creating file /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/cd13c9d4079d469d9f9e95d51eaa4a26.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 05:53:28 np0005593233 nova_compute[222017]: 2026-01-23 10:53:28.458 222021 DEBUG oslo_concurrency.processutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/cd13c9d4079d469d9f9e95d51eaa4a26.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:53:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:28.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:29 np0005593233 nova_compute[222017]: 2026-01-23 10:53:29.020 222021 DEBUG oslo_concurrency.processutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/cd13c9d4079d469d9f9e95d51eaa4a26.tmp" returned: 1 in 0.561s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:53:29 np0005593233 nova_compute[222017]: 2026-01-23 10:53:29.021 222021 DEBUG oslo_concurrency.processutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5/cd13c9d4079d469d9f9e95d51eaa4a26.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 05:53:29 np0005593233 nova_compute[222017]: 2026-01-23 10:53:29.022 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Creating directory /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 05:53:29 np0005593233 nova_compute[222017]: 2026-01-23 10:53:29.022 222021 DEBUG oslo_concurrency.processutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:53:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:53:29Z|00875|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 23 05:53:29 np0005593233 nova_compute[222017]: 2026-01-23 10:53:29.305 222021 DEBUG oslo_concurrency.processutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/21edef97-3531-4772-8aa5-a3feeb9ff3f5" returned: 0 in 0.283s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:53:29 np0005593233 nova_compute[222017]: 2026-01-23 10:53:29.313 222021 DEBUG nova.virt.libvirt.driver [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:53:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:29.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:30 np0005593233 nova_compute[222017]: 2026-01-23 10:53:30.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:30.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:30 np0005593233 nova_compute[222017]: 2026-01-23 10:53:30.776 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:31.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:31 np0005593233 kernel: tapd08a5642-c0 (unregistering): left promiscuous mode
Jan 23 05:53:31 np0005593233 NetworkManager[48871]: <info>  [1769165611.6915] device (tapd08a5642-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:53:31 np0005593233 nova_compute[222017]: 2026-01-23 10:53:31.707 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:53:31Z|00876|binding|INFO|Releasing lport d08a5642-c043-410d-8d3a-e63134c79cd2 from this chassis (sb_readonly=0)
Jan 23 05:53:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:53:31Z|00877|binding|INFO|Setting lport d08a5642-c043-410d-8d3a-e63134c79cd2 down in Southbound
Jan 23 05:53:31 np0005593233 ovn_controller[130653]: 2026-01-23T10:53:31Z|00878|binding|INFO|Removing iface tapd08a5642-c0 ovn-installed in OVS
Jan 23 05:53:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:31.721 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:40:79 10.100.0.12'], port_security=['fa:16:3e:59:40:79 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '21edef97-3531-4772-8aa5-a3feeb9ff3f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fae0cfd0-9ee7-400c-bda6-94fd3af3625d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dcfc02e2-e46d-4519-82a3-86ab6ef1b36e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=272555c7-6cc4-4fe4-972e-530a53d5843d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=d08a5642-c043-410d-8d3a-e63134c79cd2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:53:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:31.723 140224 INFO neutron.agent.ovn.metadata.agent [-] Port d08a5642-c043-410d-8d3a-e63134c79cd2 in datapath fae0cfd0-9ee7-400c-bda6-94fd3af3625d unbound from our chassis#033[00m
Jan 23 05:53:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:31.725 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fae0cfd0-9ee7-400c-bda6-94fd3af3625d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:53:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:31.727 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f82869bd-c5de-4ac4-8747-d04f72ebc73d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:53:31 np0005593233 nova_compute[222017]: 2026-01-23 10:53:31.729 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:31.729 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d namespace which is not needed anymore#033[00m
Jan 23 05:53:31 np0005593233 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000d2.scope: Deactivated successfully.
Jan 23 05:53:31 np0005593233 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000d2.scope: Consumed 15.611s CPU time.
Jan 23 05:53:31 np0005593233 systemd-machined[190954]: Machine qemu-94-instance-000000d2 terminated.
Jan 23 05:53:31 np0005593233 neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d[304909]: [NOTICE]   (304913) : haproxy version is 2.8.14-c23fe91
Jan 23 05:53:31 np0005593233 neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d[304909]: [NOTICE]   (304913) : path to executable is /usr/sbin/haproxy
Jan 23 05:53:31 np0005593233 neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d[304909]: [WARNING]  (304913) : Exiting Master process...
Jan 23 05:53:31 np0005593233 neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d[304909]: [ALERT]    (304913) : Current worker (304915) exited with code 143 (Terminated)
Jan 23 05:53:31 np0005593233 neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d[304909]: [WARNING]  (304913) : All workers exited. Exiting... (0)
Jan 23 05:53:31 np0005593233 systemd[1]: libpod-0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320.scope: Deactivated successfully.
Jan 23 05:53:31 np0005593233 podman[305176]: 2026-01-23 10:53:31.952130721 +0000 UTC m=+0.075617586 container died 0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:53:31 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320-userdata-shm.mount: Deactivated successfully.
Jan 23 05:53:31 np0005593233 systemd[1]: var-lib-containers-storage-overlay-dcc2611cbfd1cbcf78b0061423fc3b5b4711cda90e5cef63a64e58c7dc8790a9-merged.mount: Deactivated successfully.
Jan 23 05:53:32 np0005593233 podman[305176]: 2026-01-23 10:53:32.006246828 +0000 UTC m=+0.129733673 container cleanup 0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:53:32 np0005593233 systemd[1]: libpod-conmon-0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320.scope: Deactivated successfully.
Jan 23 05:53:32 np0005593233 podman[305216]: 2026-01-23 10:53:32.087656776 +0000 UTC m=+0.050416204 container remove 0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:53:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:32.094 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[97004d0d-a3d7-4456-a62e-8b43abdb1452]: (4, ('Fri Jan 23 10:53:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d (0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320)\n0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320\nFri Jan 23 10:53:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d (0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320)\n0867f19aaf8301b10a59a449ca099f0b355681532dfe8f8d5f6b39d758d06320\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:53:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:32.096 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[457f8013-612c-4e7a-b321-de27f4c6f344]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:53:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:32.097 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfae0cfd0-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.099 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:32 np0005593233 kernel: tapfae0cfd0-90: left promiscuous mode
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.129 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:32.133 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3063fc3c-bcda-4d9a-b952-eb7de86d7110]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:53:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:32.156 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c63b1870-31ba-454e-b1da-e5fed586ad36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:53:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:32.158 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc9b9ed-2c19-4d20-97ad-55e80185db5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:53:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:32.195 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[308aab8d-f262-4df0-9367-49494e16a265]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 948636, 'reachable_time': 36554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305240, 'error': None, 'target': 'ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:53:32 np0005593233 systemd[1]: run-netns-ovnmeta\x2dfae0cfd0\x2d9ee7\x2d400c\x2dbda6\x2d94fd3af3625d.mount: Deactivated successfully.
Jan 23 05:53:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:32.201 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fae0cfd0-9ee7-400c-bda6-94fd3af3625d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:53:32 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:32.202 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[24497a04-b1bc-4cbb-b248-3cc669261f7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:53:32 np0005593233 podman[305230]: 2026-01-23 10:53:32.236665483 +0000 UTC m=+0.081759369 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.339 222021 INFO nova.virt.libvirt.driver [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.347 222021 INFO nova.virt.libvirt.driver [-] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Instance destroyed successfully.#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.349 222021 DEBUG nova.virt.libvirt.vif [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:52:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1044974910',display_name='tempest-TestNetworkAdvancedServerOps-server-1044974910',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1044974910',id=210,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN1Au+/nbNJXswn+/lBT/NUlG+C4qWpKV8LfTLcEIq/JQHMntJ0r6AZpvvHSolbGhgNEDJ2I0R+q+8ASoXZhoeZdCjKoEqwhuN6XwpPG1I72EeOI415/JreWIAcqSMa5Mw==',key_name='tempest-TestNetworkAdvancedServerOps-840850991',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:52:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-70lt2fy5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:53:23Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=21edef97-3531-4772-8aa5-a3feeb9ff3f5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1773492812", "vif_mac": "fa:16:3e:59:40:79"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.350 222021 DEBUG nova.network.os_vif_util [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converting VIF {"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1773492812", "vif_mac": "fa:16:3e:59:40:79"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.351 222021 DEBUG nova.network.os_vif_util [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.352 222021 DEBUG os_vif [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.355 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.355 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd08a5642-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.364 222021 INFO os_vif [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0')#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.371 222021 DEBUG nova.virt.libvirt.driver [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] skipping disk for instance-000000d2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.372 222021 DEBUG nova.virt.libvirt.driver [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] skipping disk for instance-000000d2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:53:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:32.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:32 np0005593233 nova_compute[222017]: 2026-01-23 10:53:32.695 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.092 222021 DEBUG neutronclient.v2_0.client [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d08a5642-c043-410d-8d3a-e63134c79cd2 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.236 222021 DEBUG oslo_concurrency.lockutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.237 222021 DEBUG oslo_concurrency.lockutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.237 222021 DEBUG oslo_concurrency.lockutils [None req-4761d977-e2ca-41cc-84df-3adf7f03a56c 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:33.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.461 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.687 222021 DEBUG nova.compute.manager [req-adea3133-33e8-4155-be19-d024bc44aff8 req-df20ec4f-b402-4a12-b4b2-800ed1d6276f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received event network-vif-unplugged-d08a5642-c043-410d-8d3a-e63134c79cd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.689 222021 DEBUG oslo_concurrency.lockutils [req-adea3133-33e8-4155-be19-d024bc44aff8 req-df20ec4f-b402-4a12-b4b2-800ed1d6276f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.689 222021 DEBUG oslo_concurrency.lockutils [req-adea3133-33e8-4155-be19-d024bc44aff8 req-df20ec4f-b402-4a12-b4b2-800ed1d6276f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.689 222021 DEBUG oslo_concurrency.lockutils [req-adea3133-33e8-4155-be19-d024bc44aff8 req-df20ec4f-b402-4a12-b4b2-800ed1d6276f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.690 222021 DEBUG nova.compute.manager [req-adea3133-33e8-4155-be19-d024bc44aff8 req-df20ec4f-b402-4a12-b4b2-800ed1d6276f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] No waiting events found dispatching network-vif-unplugged-d08a5642-c043-410d-8d3a-e63134c79cd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:53:33 np0005593233 nova_compute[222017]: 2026-01-23 10:53:33.690 222021 WARNING nova.compute.manager [req-adea3133-33e8-4155-be19-d024bc44aff8 req-df20ec4f-b402-4a12-b4b2-800ed1d6276f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received unexpected event network-vif-unplugged-d08a5642-c043-410d-8d3a-e63134c79cd2 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:53:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:34.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:35.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:35 np0005593233 nova_compute[222017]: 2026-01-23 10:53:35.804 222021 DEBUG nova.compute.manager [req-2b6b2ba7-984c-4ea1-9870-80c30e11d9da req-fdeb1e0a-b050-4a89-b8d2-f3f78362a9d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:53:35 np0005593233 nova_compute[222017]: 2026-01-23 10:53:35.806 222021 DEBUG oslo_concurrency.lockutils [req-2b6b2ba7-984c-4ea1-9870-80c30e11d9da req-fdeb1e0a-b050-4a89-b8d2-f3f78362a9d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:35 np0005593233 nova_compute[222017]: 2026-01-23 10:53:35.806 222021 DEBUG oslo_concurrency.lockutils [req-2b6b2ba7-984c-4ea1-9870-80c30e11d9da req-fdeb1e0a-b050-4a89-b8d2-f3f78362a9d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:35 np0005593233 nova_compute[222017]: 2026-01-23 10:53:35.806 222021 DEBUG oslo_concurrency.lockutils [req-2b6b2ba7-984c-4ea1-9870-80c30e11d9da req-fdeb1e0a-b050-4a89-b8d2-f3f78362a9d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:35 np0005593233 nova_compute[222017]: 2026-01-23 10:53:35.806 222021 DEBUG nova.compute.manager [req-2b6b2ba7-984c-4ea1-9870-80c30e11d9da req-fdeb1e0a-b050-4a89-b8d2-f3f78362a9d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] No waiting events found dispatching network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:53:35 np0005593233 nova_compute[222017]: 2026-01-23 10:53:35.806 222021 WARNING nova.compute.manager [req-2b6b2ba7-984c-4ea1-9870-80c30e11d9da req-fdeb1e0a-b050-4a89-b8d2-f3f78362a9d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received unexpected event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:53:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:36.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:37 np0005593233 nova_compute[222017]: 2026-01-23 10:53:37.318 222021 DEBUG nova.compute.manager [req-117bd410-eb86-40f7-ab92-c914af82a08d req-eefc5a7c-470a-498f-b6a5-6f8a1a0a70a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received event network-changed-d08a5642-c043-410d-8d3a-e63134c79cd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:53:37 np0005593233 nova_compute[222017]: 2026-01-23 10:53:37.319 222021 DEBUG nova.compute.manager [req-117bd410-eb86-40f7-ab92-c914af82a08d req-eefc5a7c-470a-498f-b6a5-6f8a1a0a70a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Refreshing instance network info cache due to event network-changed-d08a5642-c043-410d-8d3a-e63134c79cd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:53:37 np0005593233 nova_compute[222017]: 2026-01-23 10:53:37.319 222021 DEBUG oslo_concurrency.lockutils [req-117bd410-eb86-40f7-ab92-c914af82a08d req-eefc5a7c-470a-498f-b6a5-6f8a1a0a70a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:53:37 np0005593233 nova_compute[222017]: 2026-01-23 10:53:37.320 222021 DEBUG oslo_concurrency.lockutils [req-117bd410-eb86-40f7-ab92-c914af82a08d req-eefc5a7c-470a-498f-b6a5-6f8a1a0a70a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:53:37 np0005593233 nova_compute[222017]: 2026-01-23 10:53:37.320 222021 DEBUG nova.network.neutron [req-117bd410-eb86-40f7-ab92-c914af82a08d req-eefc5a7c-470a-498f-b6a5-6f8a1a0a70a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Refreshing network info cache for port d08a5642-c043-410d-8d3a-e63134c79cd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:53:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:37.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:37 np0005593233 nova_compute[222017]: 2026-01-23 10:53:37.358 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:37 np0005593233 nova_compute[222017]: 2026-01-23 10:53:37.697 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:38.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:39.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:40 np0005593233 nova_compute[222017]: 2026-01-23 10:53:40.413 222021 DEBUG nova.network.neutron [req-117bd410-eb86-40f7-ab92-c914af82a08d req-eefc5a7c-470a-498f-b6a5-6f8a1a0a70a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Updated VIF entry in instance network info cache for port d08a5642-c043-410d-8d3a-e63134c79cd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:53:40 np0005593233 nova_compute[222017]: 2026-01-23 10:53:40.414 222021 DEBUG nova.network.neutron [req-117bd410-eb86-40f7-ab92-c914af82a08d req-eefc5a7c-470a-498f-b6a5-6f8a1a0a70a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Updating instance_info_cache with network_info: [{"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:53:40 np0005593233 nova_compute[222017]: 2026-01-23 10:53:40.456 222021 DEBUG oslo_concurrency.lockutils [req-117bd410-eb86-40f7-ab92-c914af82a08d req-eefc5a7c-470a-498f-b6a5-6f8a1a0a70a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:53:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e419 e419: 3 total, 3 up, 3 in
Jan 23 05:53:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:40.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:41.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:42 np0005593233 nova_compute[222017]: 2026-01-23 10:53:42.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:42 np0005593233 nova_compute[222017]: 2026-01-23 10:53:42.386 222021 DEBUG nova.compute.manager [req-a2b460ea-8260-4b38-affa-965f26eff5ac req-64259559-744e-4c2a-8eca-db432cf7068e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:53:42 np0005593233 nova_compute[222017]: 2026-01-23 10:53:42.387 222021 DEBUG oslo_concurrency.lockutils [req-a2b460ea-8260-4b38-affa-965f26eff5ac req-64259559-744e-4c2a-8eca-db432cf7068e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:42 np0005593233 nova_compute[222017]: 2026-01-23 10:53:42.387 222021 DEBUG oslo_concurrency.lockutils [req-a2b460ea-8260-4b38-affa-965f26eff5ac req-64259559-744e-4c2a-8eca-db432cf7068e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:42 np0005593233 nova_compute[222017]: 2026-01-23 10:53:42.388 222021 DEBUG oslo_concurrency.lockutils [req-a2b460ea-8260-4b38-affa-965f26eff5ac req-64259559-744e-4c2a-8eca-db432cf7068e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:42 np0005593233 nova_compute[222017]: 2026-01-23 10:53:42.388 222021 DEBUG nova.compute.manager [req-a2b460ea-8260-4b38-affa-965f26eff5ac req-64259559-744e-4c2a-8eca-db432cf7068e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] No waiting events found dispatching network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:53:42 np0005593233 nova_compute[222017]: 2026-01-23 10:53:42.388 222021 WARNING nova.compute.manager [req-a2b460ea-8260-4b38-affa-965f26eff5ac req-64259559-744e-4c2a-8eca-db432cf7068e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received unexpected event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 23 05:53:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:42.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:42 np0005593233 nova_compute[222017]: 2026-01-23 10:53:42.699 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:42.717 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:42.717 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:53:42.717 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:43.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:44 np0005593233 nova_compute[222017]: 2026-01-23 10:53:44.509 222021 DEBUG nova.compute.manager [req-4185b033-d0cd-410a-8fc6-8b9826f00c17 req-18f9d693-3451-4455-aaf2-56ab66b1e017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:53:44 np0005593233 nova_compute[222017]: 2026-01-23 10:53:44.510 222021 DEBUG oslo_concurrency.lockutils [req-4185b033-d0cd-410a-8fc6-8b9826f00c17 req-18f9d693-3451-4455-aaf2-56ab66b1e017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:44 np0005593233 nova_compute[222017]: 2026-01-23 10:53:44.510 222021 DEBUG oslo_concurrency.lockutils [req-4185b033-d0cd-410a-8fc6-8b9826f00c17 req-18f9d693-3451-4455-aaf2-56ab66b1e017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:44 np0005593233 nova_compute[222017]: 2026-01-23 10:53:44.510 222021 DEBUG oslo_concurrency.lockutils [req-4185b033-d0cd-410a-8fc6-8b9826f00c17 req-18f9d693-3451-4455-aaf2-56ab66b1e017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:44 np0005593233 nova_compute[222017]: 2026-01-23 10:53:44.511 222021 DEBUG nova.compute.manager [req-4185b033-d0cd-410a-8fc6-8b9826f00c17 req-18f9d693-3451-4455-aaf2-56ab66b1e017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] No waiting events found dispatching network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:53:44 np0005593233 nova_compute[222017]: 2026-01-23 10:53:44.511 222021 WARNING nova.compute.manager [req-4185b033-d0cd-410a-8fc6-8b9826f00c17 req-18f9d693-3451-4455-aaf2-56ab66b1e017 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Received unexpected event network-vif-plugged-d08a5642-c043-410d-8d3a-e63134c79cd2 for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:53:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:44.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:45 np0005593233 nova_compute[222017]: 2026-01-23 10:53:45.305 222021 DEBUG oslo_concurrency.lockutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:45 np0005593233 nova_compute[222017]: 2026-01-23 10:53:45.306 222021 DEBUG oslo_concurrency.lockutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:45 np0005593233 nova_compute[222017]: 2026-01-23 10:53:45.306 222021 DEBUG nova.compute.manager [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Going to confirm migration 21 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 23 05:53:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:45.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:46 np0005593233 nova_compute[222017]: 2026-01-23 10:53:46.314 222021 DEBUG neutronclient.v2_0.client [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d08a5642-c043-410d-8d3a-e63134c79cd2 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 05:53:46 np0005593233 nova_compute[222017]: 2026-01-23 10:53:46.315 222021 DEBUG oslo_concurrency.lockutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:53:46 np0005593233 nova_compute[222017]: 2026-01-23 10:53:46.316 222021 DEBUG oslo_concurrency.lockutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:53:46 np0005593233 nova_compute[222017]: 2026-01-23 10:53:46.316 222021 DEBUG nova.network.neutron [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:53:46 np0005593233 nova_compute[222017]: 2026-01-23 10:53:46.316 222021 DEBUG nova.objects.instance [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'info_cache' on Instance uuid 21edef97-3531-4772-8aa5-a3feeb9ff3f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:53:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:46.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:46 np0005593233 nova_compute[222017]: 2026-01-23 10:53:46.957 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165611.9559774, 21edef97-3531-4772-8aa5-a3feeb9ff3f5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:53:46 np0005593233 nova_compute[222017]: 2026-01-23 10:53:46.957 222021 INFO nova.compute.manager [-] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:53:46 np0005593233 nova_compute[222017]: 2026-01-23 10:53:46.978 222021 DEBUG nova.compute.manager [None req-4aebc55c-c20b-47b4-865f-8620dca571e5 - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:53:46 np0005593233 nova_compute[222017]: 2026-01-23 10:53:46.984 222021 DEBUG nova.compute.manager [None req-4aebc55c-c20b-47b4-865f-8620dca571e5 - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:53:47 np0005593233 nova_compute[222017]: 2026-01-23 10:53:47.005 222021 INFO nova.compute.manager [None req-4aebc55c-c20b-47b4-865f-8620dca571e5 - - - - - -] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 23 05:53:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:47.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:47 np0005593233 nova_compute[222017]: 2026-01-23 10:53:47.362 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:47 np0005593233 nova_compute[222017]: 2026-01-23 10:53:47.702 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:48 np0005593233 nova_compute[222017]: 2026-01-23 10:53:48.112 222021 DEBUG nova.network.neutron [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 21edef97-3531-4772-8aa5-a3feeb9ff3f5] Updating instance_info_cache with network_info: [{"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:53:48 np0005593233 nova_compute[222017]: 2026-01-23 10:53:48.135 222021 DEBUG oslo_concurrency.lockutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-21edef97-3531-4772-8aa5-a3feeb9ff3f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:53:48 np0005593233 nova_compute[222017]: 2026-01-23 10:53:48.135 222021 DEBUG nova.objects.instance [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid 21edef97-3531-4772-8aa5-a3feeb9ff3f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:53:48 np0005593233 podman[305303]: 2026-01-23 10:53:48.150303733 +0000 UTC m=+0.152223478 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:53:48 np0005593233 nova_compute[222017]: 2026-01-23 10:53:48.270 222021 DEBUG nova.storage.rbd_utils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] removing snapshot(nova-resize) on rbd image(21edef97-3531-4772-8aa5-a3feeb9ff3f5_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:53:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:53:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:48.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:53:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:49.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e420 e420: 3 total, 3 up, 3 in
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.681 222021 DEBUG nova.virt.libvirt.vif [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:52:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1044974910',display_name='tempest-TestNetworkAdvancedServerOps-server-1044974910',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1044974910',id=210,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN1Au+/nbNJXswn+/lBT/NUlG+C4qWpKV8LfTLcEIq/JQHMntJ0r6AZpvvHSolbGhgNEDJ2I0R+q+8ASoXZhoeZdCjKoEqwhuN6XwpPG1I72EeOI415/JreWIAcqSMa5Mw==',key_name='tempest-TestNetworkAdvancedServerOps-840850991',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:53:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-70lt2fy5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:53:42Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=21edef97-3531-4772-8aa5-a3feeb9ff3f5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.682 222021 DEBUG nova.network.os_vif_util [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "d08a5642-c043-410d-8d3a-e63134c79cd2", "address": "fa:16:3e:59:40:79", "network": {"id": "fae0cfd0-9ee7-400c-bda6-94fd3af3625d", "bridge": "br-int", "label": "tempest-network-smoke--1773492812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd08a5642-c0", "ovs_interfaceid": "d08a5642-c043-410d-8d3a-e63134c79cd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.683 222021 DEBUG nova.network.os_vif_util [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.684 222021 DEBUG os_vif [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.687 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd08a5642-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.687 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.691 222021 INFO os_vif [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:40:79,bridge_name='br-int',has_traffic_filtering=True,id=d08a5642-c043-410d-8d3a-e63134c79cd2,network=Network(fae0cfd0-9ee7-400c-bda6-94fd3af3625d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd08a5642-c0')#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.691 222021 DEBUG oslo_concurrency.lockutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.692 222021 DEBUG oslo_concurrency.lockutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:49 np0005593233 nova_compute[222017]: 2026-01-23 10:53:49.774 222021 DEBUG oslo_concurrency.processutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:53:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:53:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1668620695' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:53:50 np0005593233 nova_compute[222017]: 2026-01-23 10:53:50.274 222021 DEBUG oslo_concurrency.processutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:53:50 np0005593233 nova_compute[222017]: 2026-01-23 10:53:50.284 222021 DEBUG nova.compute.provider_tree [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:53:50 np0005593233 nova_compute[222017]: 2026-01-23 10:53:50.304 222021 DEBUG nova.scheduler.client.report [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:53:50 np0005593233 nova_compute[222017]: 2026-01-23 10:53:50.368 222021 DEBUG oslo_concurrency.lockutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:50 np0005593233 nova_compute[222017]: 2026-01-23 10:53:50.462 222021 INFO nova.scheduler.client.report [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocation for migration 8ab60eb4-5a21-4192-9b1a-3ef645cee3f1#033[00m
Jan 23 05:53:50 np0005593233 nova_compute[222017]: 2026-01-23 10:53:50.516 222021 DEBUG oslo_concurrency.lockutils [None req-7a3a3e3f-f22c-43bb-807e-1d03d287048a 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "21edef97-3531-4772-8aa5-a3feeb9ff3f5" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e420 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:50.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:51.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:52 np0005593233 nova_compute[222017]: 2026-01-23 10:53:52.365 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:52 np0005593233 nova_compute[222017]: 2026-01-23 10:53:52.736 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:53:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:52.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:53:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:53.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:54 np0005593233 nova_compute[222017]: 2026-01-23 10:53:54.415 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:54.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 e421: 3 total, 3 up, 3 in
Jan 23 05:53:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:55.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:53:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:56.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:53:57 np0005593233 nova_compute[222017]: 2026-01-23 10:53:57.368 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:57.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:57 np0005593233 nova_compute[222017]: 2026-01-23 10:53:57.767 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:58.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:53:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:59.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:00 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 23 05:54:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:00.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:01.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:01 np0005593233 ovn_controller[130653]: 2026-01-23T10:54:01Z|00879|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 23 05:54:02 np0005593233 nova_compute[222017]: 2026-01-23 10:54:02.370 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:02.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:02 np0005593233 nova_compute[222017]: 2026-01-23 10:54:02.769 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:03 np0005593233 podman[305390]: 2026-01-23 10:54:03.113015739 +0000 UTC m=+0.114539824 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:54:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:03.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:54:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:04.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:54:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:05.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:06.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:07 np0005593233 nova_compute[222017]: 2026-01-23 10:54:07.372 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:07.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:07 np0005593233 nova_compute[222017]: 2026-01-23 10:54:07.772 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:08.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:09.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:11.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:11 np0005593233 nova_compute[222017]: 2026-01-23 10:54:11.515 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:11 np0005593233 nova_compute[222017]: 2026-01-23 10:54:11.576 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:54:12.234 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=88, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=87) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:54:12 np0005593233 nova_compute[222017]: 2026-01-23 10:54:12.234 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:12 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:54:12.235 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:54:12 np0005593233 nova_compute[222017]: 2026-01-23 10:54:12.374 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:12 np0005593233 nova_compute[222017]: 2026-01-23 10:54:12.775 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:12.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:13.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:14 np0005593233 nova_compute[222017]: 2026-01-23 10:54:14.388 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:14 np0005593233 nova_compute[222017]: 2026-01-23 10:54:14.419 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:54:14 np0005593233 nova_compute[222017]: 2026-01-23 10:54:14.419 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:54:14 np0005593233 nova_compute[222017]: 2026-01-23 10:54:14.420 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:54:14 np0005593233 nova_compute[222017]: 2026-01-23 10:54:14.420 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:54:14 np0005593233 nova_compute[222017]: 2026-01-23 10:54:14.421 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:54:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:54:14 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/640579765' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:54:14 np0005593233 nova_compute[222017]: 2026-01-23 10:54:14.900 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:54:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:14.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:15 np0005593233 nova_compute[222017]: 2026-01-23 10:54:15.160 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:54:15 np0005593233 nova_compute[222017]: 2026-01-23 10:54:15.162 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4274MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:54:15 np0005593233 nova_compute[222017]: 2026-01-23 10:54:15.162 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:54:15 np0005593233 nova_compute[222017]: 2026-01-23 10:54:15.162 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:54:15 np0005593233 nova_compute[222017]: 2026-01-23 10:54:15.236 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:54:15 np0005593233 nova_compute[222017]: 2026-01-23 10:54:15.237 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:54:15 np0005593233 nova_compute[222017]: 2026-01-23 10:54:15.254 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:54:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:54:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:15.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:54:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:54:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4186340069' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:54:15 np0005593233 nova_compute[222017]: 2026-01-23 10:54:15.750 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:54:15 np0005593233 nova_compute[222017]: 2026-01-23 10:54:15.760 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:54:16 np0005593233 nova_compute[222017]: 2026-01-23 10:54:16.133 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:54:16 np0005593233 nova_compute[222017]: 2026-01-23 10:54:16.198 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:54:16 np0005593233 nova_compute[222017]: 2026-01-23 10:54:16.199 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:54:16 np0005593233 nova_compute[222017]: 2026-01-23 10:54:16.201 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:16.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:17 np0005593233 nova_compute[222017]: 2026-01-23 10:54:17.377 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:54:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:17.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:54:17 np0005593233 nova_compute[222017]: 2026-01-23 10:54:17.821 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:18 np0005593233 nova_compute[222017]: 2026-01-23 10:54:18.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:18 np0005593233 nova_compute[222017]: 2026-01-23 10:54:18.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:18 np0005593233 nova_compute[222017]: 2026-01-23 10:54:18.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:54:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:18.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:19 np0005593233 podman[305456]: 2026-01-23 10:54:19.15215569 +0000 UTC m=+0.152081814 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 05:54:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:54:19.236 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '88'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:54:19 np0005593233 nova_compute[222017]: 2026-01-23 10:54:19.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:54:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:19.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:54:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:20.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:21.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:22 np0005593233 nova_compute[222017]: 2026-01-23 10:54:22.380 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:22 np0005593233 nova_compute[222017]: 2026-01-23 10:54:22.824 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:22.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:23.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:24 np0005593233 nova_compute[222017]: 2026-01-23 10:54:24.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:24.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:25.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:26 np0005593233 nova_compute[222017]: 2026-01-23 10:54:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:26.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:27 np0005593233 nova_compute[222017]: 2026-01-23 10:54:27.382 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:27.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:27 np0005593233 nova_compute[222017]: 2026-01-23 10:54:27.825 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:28 np0005593233 nova_compute[222017]: 2026-01-23 10:54:28.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:28.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:29.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:30.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:31 np0005593233 nova_compute[222017]: 2026-01-23 10:54:31.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:31.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:32 np0005593233 nova_compute[222017]: 2026-01-23 10:54:32.384 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:32 np0005593233 nova_compute[222017]: 2026-01-23 10:54:32.827 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:32.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:33.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:34 np0005593233 podman[305485]: 2026-01-23 10:54:34.052777963 +0000 UTC m=+0.063564466 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:54:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:34.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:35 np0005593233 nova_compute[222017]: 2026-01-23 10:54:35.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:35 np0005593233 nova_compute[222017]: 2026-01-23 10:54:35.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:54:35 np0005593233 nova_compute[222017]: 2026-01-23 10:54:35.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:54:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:35.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:36.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:54:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:54:37 np0005593233 nova_compute[222017]: 2026-01-23 10:54:37.387 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:37.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:37 np0005593233 nova_compute[222017]: 2026-01-23 10:54:37.771 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:54:37 np0005593233 nova_compute[222017]: 2026-01-23 10:54:37.829 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:38.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:39.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:40.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:41.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:42 np0005593233 nova_compute[222017]: 2026-01-23 10:54:42.389 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:54:42.718 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:54:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:54:42.719 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:54:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:54:42.719 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:54:42 np0005593233 nova_compute[222017]: 2026-01-23 10:54:42.831 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:42.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:43.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:44.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:45.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:46.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:47 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:47 np0005593233 nova_compute[222017]: 2026-01-23 10:54:47.391 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:47.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:47 np0005593233 nova_compute[222017]: 2026-01-23 10:54:47.834 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:48.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:49.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:50 np0005593233 podman[305685]: 2026-01-23 10:54:50.098203762 +0000 UTC m=+0.105871589 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:54:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:54:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:50.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:54:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:51.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:52 np0005593233 nova_compute[222017]: 2026-01-23 10:54:52.395 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:52 np0005593233 nova_compute[222017]: 2026-01-23 10:54:52.765 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:52 np0005593233 nova_compute[222017]: 2026-01-23 10:54:52.897 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:52.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:53.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:54.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:55.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:56.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:57 np0005593233 nova_compute[222017]: 2026-01-23 10:54:57.397 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:57.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:57 np0005593233 nova_compute[222017]: 2026-01-23 10:54:57.900 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:58.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:54:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:54:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:54:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:59.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:00.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:01.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:02 np0005593233 nova_compute[222017]: 2026-01-23 10:55:02.404 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:02 np0005593233 nova_compute[222017]: 2026-01-23 10:55:02.903 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:02.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:03.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:04.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:05 np0005593233 podman[305712]: 2026-01-23 10:55:05.085712338 +0000 UTC m=+0.087718698 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 05:55:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:05.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:06.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:07 np0005593233 nova_compute[222017]: 2026-01-23 10:55:07.406 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:07.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:07 np0005593233 nova_compute[222017]: 2026-01-23 10:55:07.948 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:08 np0005593233 nova_compute[222017]: 2026-01-23 10:55:08.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:55:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:08.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:55:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:09.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:09 np0005593233 nova_compute[222017]: 2026-01-23 10:55:09.880 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:09 np0005593233 nova_compute[222017]: 2026-01-23 10:55:09.881 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:10 np0005593233 nova_compute[222017]: 2026-01-23 10:55:10.458 222021 DEBUG nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:55:10 np0005593233 nova_compute[222017]: 2026-01-23 10:55:10.660 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:10 np0005593233 nova_compute[222017]: 2026-01-23 10:55:10.661 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:10 np0005593233 nova_compute[222017]: 2026-01-23 10:55:10.676 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:55:10 np0005593233 nova_compute[222017]: 2026-01-23 10:55:10.677 222021 INFO nova.compute.claims [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:55:10 np0005593233 nova_compute[222017]: 2026-01-23 10:55:10.846 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:10.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:55:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1149503978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:55:11 np0005593233 nova_compute[222017]: 2026-01-23 10:55:11.339 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:11 np0005593233 nova_compute[222017]: 2026-01-23 10:55:11.349 222021 DEBUG nova.compute.provider_tree [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:55:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:11.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:11 np0005593233 nova_compute[222017]: 2026-01-23 10:55:11.605 222021 DEBUG nova.scheduler.client.report [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:55:11 np0005593233 nova_compute[222017]: 2026-01-23 10:55:11.646 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:11 np0005593233 nova_compute[222017]: 2026-01-23 10:55:11.647 222021 DEBUG nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:55:11 np0005593233 nova_compute[222017]: 2026-01-23 10:55:11.811 222021 DEBUG nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:55:11 np0005593233 nova_compute[222017]: 2026-01-23 10:55:11.812 222021 DEBUG nova.network.neutron [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:55:11 np0005593233 nova_compute[222017]: 2026-01-23 10:55:11.905 222021 INFO nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:55:11 np0005593233 nova_compute[222017]: 2026-01-23 10:55:11.956 222021 DEBUG nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.207 222021 DEBUG nova.policy [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.242 222021 DEBUG nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.244 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.244 222021 INFO nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Creating image(s)#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.282 222021 DEBUG nova.storage.rbd_utils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.320 222021 DEBUG nova.storage.rbd_utils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.355 222021 DEBUG nova.storage.rbd_utils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.360 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.408 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.456 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.457 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.458 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.458 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.543 222021 DEBUG nova.storage.rbd_utils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.547 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:12 np0005593233 nova_compute[222017]: 2026-01-23 10:55:12.952 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:13.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:55:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:13.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:55:13 np0005593233 nova_compute[222017]: 2026-01-23 10:55:13.698 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:13 np0005593233 nova_compute[222017]: 2026-01-23 10:55:13.808 222021 DEBUG nova.storage.rbd_utils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:55:13 np0005593233 nova_compute[222017]: 2026-01-23 10:55:13.966 222021 DEBUG nova.objects.instance [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.002 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.002 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Ensure instance console log exists: /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.003 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.003 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.003 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.436 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.518 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.519 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.519 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.520 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.520 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:55:14 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4255612814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:55:14 np0005593233 nova_compute[222017]: 2026-01-23 10:55:14.977 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:15.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:15 np0005593233 nova_compute[222017]: 2026-01-23 10:55:15.150 222021 DEBUG nova.network.neutron [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Successfully created port: 62f573cf-0476-448d-b148-040cec7b1042 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:55:15 np0005593233 nova_compute[222017]: 2026-01-23 10:55:15.216 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:55:15 np0005593233 nova_compute[222017]: 2026-01-23 10:55:15.217 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4248MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:55:15 np0005593233 nova_compute[222017]: 2026-01-23 10:55:15.217 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:15 np0005593233 nova_compute[222017]: 2026-01-23 10:55:15.217 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:15.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:15 np0005593233 nova_compute[222017]: 2026-01-23 10:55:15.642 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance fcb93bcf-9612-4dc7-9996-238d2739d8cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:55:15 np0005593233 nova_compute[222017]: 2026-01-23 10:55:15.642 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:55:15 np0005593233 nova_compute[222017]: 2026-01-23 10:55:15.642 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:55:15 np0005593233 nova_compute[222017]: 2026-01-23 10:55:15.708 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:55:16 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1674186411' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.168 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.175 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.296 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.342 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.343 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.344 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.345 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.366 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.366 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:16 np0005593233 nova_compute[222017]: 2026-01-23 10:55:16.367 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:55:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:17.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:17 np0005593233 nova_compute[222017]: 2026-01-23 10:55:17.411 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:17.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:17 np0005593233 nova_compute[222017]: 2026-01-23 10:55:17.955 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:18 np0005593233 nova_compute[222017]: 2026-01-23 10:55:18.795 222021 DEBUG nova.network.neutron [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Successfully updated port: 62f573cf-0476-448d-b148-040cec7b1042 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:55:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:19.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:19 np0005593233 nova_compute[222017]: 2026-01-23 10:55:19.054 222021 DEBUG nova.compute.manager [req-579803fa-d36d-4e95-ae11-d1ffa42ddf3e req-d76831de-2919-4e18-87a0-d27ce42bf884 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-changed-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:55:19 np0005593233 nova_compute[222017]: 2026-01-23 10:55:19.054 222021 DEBUG nova.compute.manager [req-579803fa-d36d-4e95-ae11-d1ffa42ddf3e req-d76831de-2919-4e18-87a0-d27ce42bf884 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing instance network info cache due to event network-changed-62f573cf-0476-448d-b148-040cec7b1042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:55:19 np0005593233 nova_compute[222017]: 2026-01-23 10:55:19.055 222021 DEBUG oslo_concurrency.lockutils [req-579803fa-d36d-4e95-ae11-d1ffa42ddf3e req-d76831de-2919-4e18-87a0-d27ce42bf884 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:55:19 np0005593233 nova_compute[222017]: 2026-01-23 10:55:19.055 222021 DEBUG oslo_concurrency.lockutils [req-579803fa-d36d-4e95-ae11-d1ffa42ddf3e req-d76831de-2919-4e18-87a0-d27ce42bf884 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:55:19 np0005593233 nova_compute[222017]: 2026-01-23 10:55:19.056 222021 DEBUG nova.network.neutron [req-579803fa-d36d-4e95-ae11-d1ffa42ddf3e req-d76831de-2919-4e18-87a0-d27ce42bf884 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing network info cache for port 62f573cf-0476-448d-b148-040cec7b1042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:55:19 np0005593233 nova_compute[222017]: 2026-01-23 10:55:19.068 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:55:19 np0005593233 nova_compute[222017]: 2026-01-23 10:55:19.227 222021 DEBUG nova.network.neutron [req-579803fa-d36d-4e95-ae11-d1ffa42ddf3e req-d76831de-2919-4e18-87a0-d27ce42bf884 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:55:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:19.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:20 np0005593233 nova_compute[222017]: 2026-01-23 10:55:20.385 222021 DEBUG nova.network.neutron [req-579803fa-d36d-4e95-ae11-d1ffa42ddf3e req-d76831de-2919-4e18-87a0-d27ce42bf884 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:55:20 np0005593233 nova_compute[222017]: 2026-01-23 10:55:20.406 222021 DEBUG oslo_concurrency.lockutils [req-579803fa-d36d-4e95-ae11-d1ffa42ddf3e req-d76831de-2919-4e18-87a0-d27ce42bf884 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:55:20 np0005593233 nova_compute[222017]: 2026-01-23 10:55:20.408 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:55:20 np0005593233 nova_compute[222017]: 2026-01-23 10:55:20.408 222021 DEBUG nova.network.neutron [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:55:20 np0005593233 nova_compute[222017]: 2026-01-23 10:55:20.947 222021 DEBUG nova.network.neutron [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:55:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:21.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:21 np0005593233 podman[305963]: 2026-01-23 10:55:21.135024137 +0000 UTC m=+0.137539223 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 05:55:21 np0005593233 nova_compute[222017]: 2026-01-23 10:55:21.195 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:21 np0005593233 nova_compute[222017]: 2026-01-23 10:55:21.196 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:21 np0005593233 nova_compute[222017]: 2026-01-23 10:55:21.196 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:21 np0005593233 nova_compute[222017]: 2026-01-23 10:55:21.196 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:55:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:21.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.414 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.416 222021 DEBUG nova.network.neutron [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.518 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.518 222021 DEBUG nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance network_info: |[{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.521 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Start _get_guest_xml network_info=[{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.527 222021 WARNING nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.531 222021 DEBUG nova.virt.libvirt.host [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.532 222021 DEBUG nova.virt.libvirt.host [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.535 222021 DEBUG nova.virt.libvirt.host [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.536 222021 DEBUG nova.virt.libvirt.host [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.537 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.538 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.539 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.539 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.540 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.540 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.540 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.540 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.541 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.541 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.541 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.542 222021 DEBUG nova.virt.hardware [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.546 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:22 np0005593233 nova_compute[222017]: 2026-01-23 10:55:22.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:55:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1014336156' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:55:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:23.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.034 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.077 222021 DEBUG nova.storage.rbd_utils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.082 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:23.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:55:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3861363938' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.539 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.542 222021 DEBUG nova.virt.libvirt.vif [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:55:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-660546175',display_name='tempest-TestNetworkAdvancedServerOps-server-660546175',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-660546175',id=211,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHhNBRegjioPAmva8qyqPMyd4dn+3hiBNwe4BWzp1VDgZFgQ+g4FdHcnXo+cwpLDWgKnm4yCRqf2eKNNhFM/EbeI6EnjlmNiu32pnRKGBZGgO4FKlvjQptQtJfMEpsL1DQ==',key_name='tempest-TestNetworkAdvancedServerOps-1958820888',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-5dvavyin',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:55:11Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=fcb93bcf-9612-4dc7-9996-238d2739d8cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.542 222021 DEBUG nova.network.os_vif_util [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.543 222021 DEBUG nova.network.os_vif_util [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.545 222021 DEBUG nova.objects.instance [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.633 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <uuid>fcb93bcf-9612-4dc7-9996-238d2739d8cb</uuid>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <name>instance-000000d3</name>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-660546175</nova:name>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:55:22</nova:creationTime>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <nova:port uuid="62f573cf-0476-448d-b148-040cec7b1042">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <entry name="serial">fcb93bcf-9612-4dc7-9996-238d2739d8cb</entry>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <entry name="uuid">fcb93bcf-9612-4dc7-9996-238d2739d8cb</entry>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk.config">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:f9:18:47"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <target dev="tap62f573cf-04"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/console.log" append="off"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:55:23 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:55:23 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:55:23 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:55:23 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.635 222021 DEBUG nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Preparing to wait for external event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.635 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.635 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.635 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.636 222021 DEBUG nova.virt.libvirt.vif [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:55:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-660546175',display_name='tempest-TestNetworkAdvancedServerOps-server-660546175',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-660546175',id=211,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHhNBRegjioPAmva8qyqPMyd4dn+3hiBNwe4BWzp1VDgZFgQ+g4FdHcnXo+cwpLDWgKnm4yCRqf2eKNNhFM/EbeI6EnjlmNiu32pnRKGBZGgO4FKlvjQptQtJfMEpsL1DQ==',key_name='tempest-TestNetworkAdvancedServerOps-1958820888',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-5dvavyin',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:55:11Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=fcb93bcf-9612-4dc7-9996-238d2739d8cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.637 222021 DEBUG nova.network.os_vif_util [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.637 222021 DEBUG nova.network.os_vif_util [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.638 222021 DEBUG os_vif [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.639 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.639 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.645 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.646 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62f573cf-04, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.647 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap62f573cf-04, col_values=(('external_ids', {'iface-id': '62f573cf-0476-448d-b148-040cec7b1042', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:18:47', 'vm-uuid': 'fcb93bcf-9612-4dc7-9996-238d2739d8cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.650 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:23 np0005593233 NetworkManager[48871]: <info>  [1769165723.6529] manager: (tap62f573cf-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/398)
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.653 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.659 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.660 222021 INFO os_vif [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04')#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.980 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.981 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.981 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:f9:18:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:55:23 np0005593233 nova_compute[222017]: 2026-01-23 10:55:23.982 222021 INFO nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Using config drive#033[00m
Jan 23 05:55:24 np0005593233 nova_compute[222017]: 2026-01-23 10:55:24.024 222021 DEBUG nova.storage.rbd_utils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:55:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:25.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:25 np0005593233 nova_compute[222017]: 2026-01-23 10:55:25.148 222021 INFO nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Creating config drive at /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/disk.config#033[00m
Jan 23 05:55:25 np0005593233 nova_compute[222017]: 2026-01-23 10:55:25.157 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkjhjml5i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:25 np0005593233 nova_compute[222017]: 2026-01-23 10:55:25.327 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkjhjml5i" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:25 np0005593233 nova_compute[222017]: 2026-01-23 10:55:25.378 222021 DEBUG nova.storage.rbd_utils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:55:25 np0005593233 nova_compute[222017]: 2026-01-23 10:55:25.383 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/disk.config fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:25 np0005593233 nova_compute[222017]: 2026-01-23 10:55:25.421 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:25.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:27.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:27 np0005593233 nova_compute[222017]: 2026-01-23 10:55:27.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:27 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:27Z|00880|memory_trim|INFO|Detected inactivity (last active 30048 ms ago): trimming memory
Jan 23 05:55:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:27.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:27 np0005593233 nova_compute[222017]: 2026-01-23 10:55:27.962 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:28 np0005593233 nova_compute[222017]: 2026-01-23 10:55:28.656 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:28 np0005593233 nova_compute[222017]: 2026-01-23 10:55:28.701 222021 DEBUG oslo_concurrency.processutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/disk.config fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.317s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:28 np0005593233 nova_compute[222017]: 2026-01-23 10:55:28.702 222021 INFO nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Deleting local config drive /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/disk.config because it was imported into RBD.#033[00m
Jan 23 05:55:28 np0005593233 kernel: tap62f573cf-04: entered promiscuous mode
Jan 23 05:55:28 np0005593233 NetworkManager[48871]: <info>  [1769165728.7739] manager: (tap62f573cf-04): new Tun device (/org/freedesktop/NetworkManager/Devices/399)
Jan 23 05:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:28Z|00881|binding|INFO|Claiming lport 62f573cf-0476-448d-b148-040cec7b1042 for this chassis.
Jan 23 05:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:28Z|00882|binding|INFO|62f573cf-0476-448d-b148-040cec7b1042: Claiming fa:16:3e:f9:18:47 10.100.0.14
Jan 23 05:55:28 np0005593233 nova_compute[222017]: 2026-01-23 10:55:28.776 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:28 np0005593233 nova_compute[222017]: 2026-01-23 10:55:28.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:28 np0005593233 systemd-machined[190954]: New machine qemu-95-instance-000000d3.
Jan 23 05:55:28 np0005593233 nova_compute[222017]: 2026-01-23 10:55:28.836 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:28 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:28Z|00883|binding|INFO|Setting lport 62f573cf-0476-448d-b148-040cec7b1042 ovn-installed in OVS
Jan 23 05:55:28 np0005593233 nova_compute[222017]: 2026-01-23 10:55:28.844 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:28 np0005593233 systemd[1]: Started Virtual Machine qemu-95-instance-000000d3.
Jan 23 05:55:28 np0005593233 systemd-udevd[306126]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:55:28 np0005593233 NetworkManager[48871]: <info>  [1769165728.8681] device (tap62f573cf-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:55:28 np0005593233 NetworkManager[48871]: <info>  [1769165728.8706] device (tap62f573cf-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:55:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:55:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:29.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:55:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:29Z|00884|binding|INFO|Setting lport 62f573cf-0476-448d-b148-040cec7b1042 up in Southbound
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.111 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:18:47 10.100.0.14'], port_security=['fa:16:3e:f9:18:47 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fcb93bcf-9612-4dc7-9996-238d2739d8cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5afed19d-3ff6-4459-b8a0-c5fc6a279e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=866a455a-94b4-4bbd-a367-b902a726ce2f, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=62f573cf-0476-448d-b148-040cec7b1042) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.113 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 62f573cf-0476-448d-b148-040cec7b1042 in datapath 6c737d6f-3e00-482b-aed5-4f8eabd246f2 bound to our chassis#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.115 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c737d6f-3e00-482b-aed5-4f8eabd246f2#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.130 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[48364f14-0b07-40b1-b60d-dfcbab4d8c7f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.131 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c737d6f-31 in ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.134 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c737d6f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.134 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[67a407a7-05e9-4c13-b9e9-d486317e0be0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.136 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3f5f95c8-f4b7-46fe-bc7a-7af0c2868d6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.155 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[480ab53c-3a32-4c30-ad99-d2b43b8ac95c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.182 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4f5404-3ce3-4f2a-a19b-cc76bb197788]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.229 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8bb419-0ebc-453b-bcb8-4995a21f1b42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 NetworkManager[48871]: <info>  [1769165729.2375] manager: (tap6c737d6f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/400)
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.235 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f8e0c8-c48a-48a3-a6f2-8d65722670f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.286 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[ff6b25ac-63ff-4158-9a54-d50e80bd62d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.291 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f722df32-46fe-4dba-9330-2ab5c3ca7db3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 NetworkManager[48871]: <info>  [1769165729.3241] device (tap6c737d6f-30): carrier: link connected
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.337 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4f1d28-ef46-456f-afee-cd0c5f56cda4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.365 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eed0da3c-1794-4cd5-8458-dfb3843ba45b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c737d6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:05:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 263], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 964195, 'reachable_time': 44296, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306159, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.391 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b2909457-7388-4bc6-95cb-22c8d20bb965]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:508'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 964195, 'tstamp': 964195}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 306167, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.422 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5342e7ff-54ad-43a6-95fe-876a75c23226]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c737d6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:05:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 263], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 964195, 'reachable_time': 44296, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 306168, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.475 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[df7a0653-1277-4432-a48c-2fec01b213d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:55:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:29.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:55:29 np0005593233 nova_compute[222017]: 2026-01-23 10:55:29.543 222021 DEBUG nova.compute.manager [req-1acb36f1-e52c-4c3a-9434-410da85b4965 req-f197a1be-0c6b-4573-93eb-e42f30d1a89e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:55:29 np0005593233 nova_compute[222017]: 2026-01-23 10:55:29.544 222021 DEBUG oslo_concurrency.lockutils [req-1acb36f1-e52c-4c3a-9434-410da85b4965 req-f197a1be-0c6b-4573-93eb-e42f30d1a89e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:29 np0005593233 nova_compute[222017]: 2026-01-23 10:55:29.545 222021 DEBUG oslo_concurrency.lockutils [req-1acb36f1-e52c-4c3a-9434-410da85b4965 req-f197a1be-0c6b-4573-93eb-e42f30d1a89e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:29 np0005593233 nova_compute[222017]: 2026-01-23 10:55:29.546 222021 DEBUG oslo_concurrency.lockutils [req-1acb36f1-e52c-4c3a-9434-410da85b4965 req-f197a1be-0c6b-4573-93eb-e42f30d1a89e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:29 np0005593233 nova_compute[222017]: 2026-01-23 10:55:29.546 222021 DEBUG nova.compute.manager [req-1acb36f1-e52c-4c3a-9434-410da85b4965 req-f197a1be-0c6b-4573-93eb-e42f30d1a89e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Processing event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.556 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[cada08c0-6647-4278-aace-d789859a9f4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.558 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c737d6f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.558 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.558 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c737d6f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:55:29 np0005593233 nova_compute[222017]: 2026-01-23 10:55:29.561 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:29 np0005593233 kernel: tap6c737d6f-30: entered promiscuous mode
Jan 23 05:55:29 np0005593233 NetworkManager[48871]: <info>  [1769165729.5624] manager: (tap6c737d6f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.564 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c737d6f-30, col_values=(('external_ids', {'iface-id': '8bc5480b-7bdc-475b-b309-693291ebc39a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:55:29 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:29Z|00885|binding|INFO|Releasing lport 8bc5480b-7bdc-475b-b309-693291ebc39a from this chassis (sb_readonly=0)
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.578 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c737d6f-3e00-482b-aed5-4f8eabd246f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c737d6f-3e00-482b-aed5-4f8eabd246f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:55:29 np0005593233 nova_compute[222017]: 2026-01-23 10:55:29.578 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.579 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e89acb36-e430-48ae-9c5c-f572818841fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.580 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-6c737d6f-3e00-482b-aed5-4f8eabd246f2
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/6c737d6f-3e00-482b-aed5-4f8eabd246f2.pid.haproxy
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 6c737d6f-3e00-482b-aed5-4f8eabd246f2
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:55:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:29.580 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'env', 'PROCESS_TAG=haproxy-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c737d6f-3e00-482b-aed5-4f8eabd246f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:55:30 np0005593233 podman[306198]: 2026-01-23 10:55:29.989540891 +0000 UTC m=+0.047916454 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:55:30 np0005593233 nova_compute[222017]: 2026-01-23 10:55:30.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:31.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:31.388 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=89, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=88) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:55:31 np0005593233 nova_compute[222017]: 2026-01-23 10:55:31.429 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:31.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:31 np0005593233 nova_compute[222017]: 2026-01-23 10:55:31.661 222021 DEBUG nova.compute.manager [req-eabe916f-28a7-41ba-ac79-8ee33391d951 req-1054e838-24fb-47a8-bf17-602db7e5abd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:55:31 np0005593233 nova_compute[222017]: 2026-01-23 10:55:31.662 222021 DEBUG oslo_concurrency.lockutils [req-eabe916f-28a7-41ba-ac79-8ee33391d951 req-1054e838-24fb-47a8-bf17-602db7e5abd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:31 np0005593233 nova_compute[222017]: 2026-01-23 10:55:31.663 222021 DEBUG oslo_concurrency.lockutils [req-eabe916f-28a7-41ba-ac79-8ee33391d951 req-1054e838-24fb-47a8-bf17-602db7e5abd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:31 np0005593233 nova_compute[222017]: 2026-01-23 10:55:31.664 222021 DEBUG oslo_concurrency.lockutils [req-eabe916f-28a7-41ba-ac79-8ee33391d951 req-1054e838-24fb-47a8-bf17-602db7e5abd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:31 np0005593233 nova_compute[222017]: 2026-01-23 10:55:31.664 222021 DEBUG nova.compute.manager [req-eabe916f-28a7-41ba-ac79-8ee33391d951 req-1054e838-24fb-47a8-bf17-602db7e5abd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:55:31 np0005593233 nova_compute[222017]: 2026-01-23 10:55:31.665 222021 WARNING nova.compute.manager [req-eabe916f-28a7-41ba-ac79-8ee33391d951 req-1054e838-24fb-47a8-bf17-602db7e5abd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 05:55:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:32 np0005593233 nova_compute[222017]: 2026-01-23 10:55:32.978 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:55:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:33.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.265 222021 DEBUG nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.266 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165733.2641659, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.266 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Started (Lifecycle Event)#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.271 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.276 222021 INFO nova.virt.libvirt.driver [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance spawned successfully.#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.277 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.293 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.303 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.308 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.309 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.310 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.311 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.311 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.312 222021 DEBUG nova.virt.libvirt.driver [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.343 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.344 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165733.2674036, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.344 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.381 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.387 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165733.2705727, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.387 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.403 222021 INFO nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Took 21.16 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.403 222021 DEBUG nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.450 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.455 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.500 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.520 222021 INFO nova.compute.manager [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Took 22.92 seconds to build instance.#033[00m
Jan 23 05:55:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:33.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.584 222021 DEBUG oslo_concurrency.lockutils [None req-cc5ee045-905b-42c1-91fa-12f0b1c50cbf 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:33 np0005593233 podman[306198]: 2026-01-23 10:55:33.701893112 +0000 UTC m=+3.760268665 container create a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 05:55:33 np0005593233 nova_compute[222017]: 2026-01-23 10:55:33.708 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:33 np0005593233 systemd[1]: Started libpod-conmon-a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4.scope.
Jan 23 05:55:33 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:55:33 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85a461cd56444edcde3ba6bc3911471a4a5f2f98c51d2c8507372a10dfaa7886/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:55:33 np0005593233 podman[306198]: 2026-01-23 10:55:33.98449362 +0000 UTC m=+4.042869213 container init a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:55:33 np0005593233 podman[306198]: 2026-01-23 10:55:33.994876793 +0000 UTC m=+4.053252306 container start a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:55:34 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[306249]: [NOTICE]   (306253) : New worker (306255) forked
Jan 23 05:55:34 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[306249]: [NOTICE]   (306253) : Loading success.
Jan 23 05:55:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:34.078 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:55:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:35.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:35 np0005593233 podman[306266]: 2026-01-23 10:55:35.38948317 +0000 UTC m=+0.081749089 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:55:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:35.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:36 np0005593233 nova_compute[222017]: 2026-01-23 10:55:36.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:36 np0005593233 nova_compute[222017]: 2026-01-23 10:55:36.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:55:36 np0005593233 nova_compute[222017]: 2026-01-23 10:55:36.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:55:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:37.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:37.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:37 np0005593233 nova_compute[222017]: 2026-01-23 10:55:37.617 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:55:37 np0005593233 nova_compute[222017]: 2026-01-23 10:55:37.618 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:55:37 np0005593233 nova_compute[222017]: 2026-01-23 10:55:37.619 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:55:37 np0005593233 nova_compute[222017]: 2026-01-23 10:55:37.619 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:55:37 np0005593233 nova_compute[222017]: 2026-01-23 10:55:37.983 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:38 np0005593233 nova_compute[222017]: 2026-01-23 10:55:38.712 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:39.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:39.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:41.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:41.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:42.719 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:42.720 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:42.721 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:42 np0005593233 nova_compute[222017]: 2026-01-23 10:55:42.986 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:43.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:43.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:43 np0005593233 nova_compute[222017]: 2026-01-23 10:55:43.628 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:43 np0005593233 NetworkManager[48871]: <info>  [1769165743.6301] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/402)
Jan 23 05:55:43 np0005593233 NetworkManager[48871]: <info>  [1769165743.6322] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/403)
Jan 23 05:55:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:43Z|00886|binding|INFO|Releasing lport 8bc5480b-7bdc-475b-b309-693291ebc39a from this chassis (sb_readonly=0)
Jan 23 05:55:43 np0005593233 nova_compute[222017]: 2026-01-23 10:55:43.670 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:43 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:43Z|00887|binding|INFO|Releasing lport 8bc5480b-7bdc-475b-b309-693291ebc39a from this chassis (sb_readonly=0)
Jan 23 05:55:43 np0005593233 nova_compute[222017]: 2026-01-23 10:55:43.675 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:43 np0005593233 nova_compute[222017]: 2026-01-23 10:55:43.715 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:44 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:55:44.080 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '89'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:55:44 np0005593233 nova_compute[222017]: 2026-01-23 10:55:44.186 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:55:44 np0005593233 nova_compute[222017]: 2026-01-23 10:55:44.492 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:55:44 np0005593233 nova_compute[222017]: 2026-01-23 10:55:44.493 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:55:44 np0005593233 nova_compute[222017]: 2026-01-23 10:55:44.497 222021 DEBUG nova.compute.manager [req-7e871614-f049-4a73-8610-ce3268e8e224 req-79db3cc4-25bc-40d9-98a4-0e3f507e5c55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-changed-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:55:44 np0005593233 nova_compute[222017]: 2026-01-23 10:55:44.498 222021 DEBUG nova.compute.manager [req-7e871614-f049-4a73-8610-ce3268e8e224 req-79db3cc4-25bc-40d9-98a4-0e3f507e5c55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing instance network info cache due to event network-changed-62f573cf-0476-448d-b148-040cec7b1042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:55:44 np0005593233 nova_compute[222017]: 2026-01-23 10:55:44.498 222021 DEBUG oslo_concurrency.lockutils [req-7e871614-f049-4a73-8610-ce3268e8e224 req-79db3cc4-25bc-40d9-98a4-0e3f507e5c55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:55:44 np0005593233 nova_compute[222017]: 2026-01-23 10:55:44.499 222021 DEBUG oslo_concurrency.lockutils [req-7e871614-f049-4a73-8610-ce3268e8e224 req-79db3cc4-25bc-40d9-98a4-0e3f507e5c55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:55:44 np0005593233 nova_compute[222017]: 2026-01-23 10:55:44.499 222021 DEBUG nova.network.neutron [req-7e871614-f049-4a73-8610-ce3268e8e224 req-79db3cc4-25bc-40d9-98a4-0e3f507e5c55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing network info cache for port 62f573cf-0476-448d-b148-040cec7b1042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:55:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:45.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:45.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:45 np0005593233 nova_compute[222017]: 2026-01-23 10:55:45.900 222021 DEBUG nova.network.neutron [req-7e871614-f049-4a73-8610-ce3268e8e224 req-79db3cc4-25bc-40d9-98a4-0e3f507e5c55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updated VIF entry in instance network info cache for port 62f573cf-0476-448d-b148-040cec7b1042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:55:45 np0005593233 nova_compute[222017]: 2026-01-23 10:55:45.902 222021 DEBUG nova.network.neutron [req-7e871614-f049-4a73-8610-ce3268e8e224 req-79db3cc4-25bc-40d9-98a4-0e3f507e5c55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:55:45 np0005593233 nova_compute[222017]: 2026-01-23 10:55:45.952 222021 DEBUG oslo_concurrency.lockutils [req-7e871614-f049-4a73-8610-ce3268e8e224 req-79db3cc4-25bc-40d9-98a4-0e3f507e5c55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:55:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:47.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:47.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:48 np0005593233 nova_compute[222017]: 2026-01-23 10:55:48.021 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:55:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:55:48 np0005593233 nova_compute[222017]: 2026-01-23 10:55:48.717 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 05:55:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 05:55:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:49.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:50 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:50Z|00130|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f9:18:47 10.100.0.14
Jan 23 05:55:50 np0005593233 ovn_controller[130653]: 2026-01-23T10:55:50Z|00131|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:18:47 10.100.0.14
Jan 23 05:55:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:55:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:55:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:55:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:51.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:51.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:52 np0005593233 podman[306541]: 2026-01-23 10:55:52.091181542 +0000 UTC m=+0.107697171 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 05:55:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:53 np0005593233 nova_compute[222017]: 2026-01-23 10:55:53.024 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:53.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:53.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:53 np0005593233 nova_compute[222017]: 2026-01-23 10:55:53.738 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:55:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:55.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:55:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:55.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:57.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:57.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:58 np0005593233 nova_compute[222017]: 2026-01-23 10:55:58.027 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:58 np0005593233 nova_compute[222017]: 2026-01-23 10:55:58.368 222021 INFO nova.compute.manager [None req-157f1fdc-537d-473f-9a15-e5aa9c95f8a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Get console output#033[00m
Jan 23 05:55:58 np0005593233 nova_compute[222017]: 2026-01-23 10:55:58.377 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:55:58 np0005593233 nova_compute[222017]: 2026-01-23 10:55:58.740 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:59.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:55:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:59.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:01.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:01.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:02 np0005593233 nova_compute[222017]: 2026-01-23 10:56:02.993 222021 INFO nova.compute.manager [None req-6b16c301-3b7c-46d8-b930-ed6f3dc3dc17 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Get console output#033[00m
Jan 23 05:56:03 np0005593233 nova_compute[222017]: 2026-01-23 10:56:03.002 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:56:03 np0005593233 nova_compute[222017]: 2026-01-23 10:56:03.031 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:03.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:03.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:03 np0005593233 nova_compute[222017]: 2026-01-23 10:56:03.743 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:05.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:05.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:06 np0005593233 podman[306568]: 2026-01-23 10:56:06.085706805 +0000 UTC m=+0.085220307 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:56:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:07.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:07.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:08 np0005593233 nova_compute[222017]: 2026-01-23 10:56:08.052 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:08 np0005593233 nova_compute[222017]: 2026-01-23 10:56:08.731 222021 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:56:08 np0005593233 nova_compute[222017]: 2026-01-23 10:56:08.732 222021 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:56:08 np0005593233 nova_compute[222017]: 2026-01-23 10:56:08.732 222021 DEBUG nova.network.neutron [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:56:08 np0005593233 nova_compute[222017]: 2026-01-23 10:56:08.747 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:09.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:09 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:56:09 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:56:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:09.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:11.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:11.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.056 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:13.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.196 222021 DEBUG nova.network.neutron [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.218 222021 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.337 222021 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.338 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Creating file /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/87612b19de63466f84c993526342245d.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.338 222021 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/87612b19de63466f84c993526342245d.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:13.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:13 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:13Z|00888|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.749 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.775 222021 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/87612b19de63466f84c993526342245d.tmp" returned: 1 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.776 222021 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/87612b19de63466f84c993526342245d.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.777 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Creating directory /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 05:56:13 np0005593233 nova_compute[222017]: 2026-01-23 10:56:13.777 222021 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:14 np0005593233 nova_compute[222017]: 2026-01-23 10:56:14.014 222021 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb" returned: 0 in 0.236s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:14 np0005593233 nova_compute[222017]: 2026-01-23 10:56:14.023 222021 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:56:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:15.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:15 np0005593233 nova_compute[222017]: 2026-01-23 10:56:15.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:15.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:16 np0005593233 nova_compute[222017]: 2026-01-23 10:56:16.849 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:16 np0005593233 nova_compute[222017]: 2026-01-23 10:56:16.849 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:16 np0005593233 nova_compute[222017]: 2026-01-23 10:56:16.850 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:16 np0005593233 nova_compute[222017]: 2026-01-23 10:56:16.850 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:56:16 np0005593233 nova_compute[222017]: 2026-01-23 10:56:16.850 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:17.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:56:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3364572878' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.372 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.455 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.455 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:56:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:17.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.638 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.639 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4053MB free_disk=20.942729949951172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.639 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.640 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.691 222021 INFO nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating resource usage from migration 55b5cf0b-3cad-4f1d-86af-6b08513cd259#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.716 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Migration 55b5cf0b-3cad-4f1d-86af-6b08513cd259 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.717 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.717 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.737 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.755 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.755 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.768 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.791 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:56:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:17 np0005593233 nova_compute[222017]: 2026-01-23 10:56:17.857 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.055 222021 INFO nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance shutdown successfully after 4 seconds.#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.057 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:56:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/617111945' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.365 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.375 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:56:18 np0005593233 kernel: tap62f573cf-04 (unregistering): left promiscuous mode
Jan 23 05:56:18 np0005593233 NetworkManager[48871]: <info>  [1769165778.5124] device (tap62f573cf-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:56:18 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:18Z|00889|binding|INFO|Releasing lport 62f573cf-0476-448d-b148-040cec7b1042 from this chassis (sb_readonly=0)
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.530 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:18 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:18Z|00890|binding|INFO|Setting lport 62f573cf-0476-448d-b148-040cec7b1042 down in Southbound
Jan 23 05:56:18 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:18Z|00891|binding|INFO|Removing iface tap62f573cf-04 ovn-installed in OVS
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.535 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.610 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:18 np0005593233 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000d3.scope: Deactivated successfully.
Jan 23 05:56:18 np0005593233 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000d3.scope: Consumed 16.588s CPU time.
Jan 23 05:56:18 np0005593233 systemd-machined[190954]: Machine qemu-95-instance-000000d3 terminated.
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.711 222021 INFO nova.virt.libvirt.driver [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance destroyed successfully.#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.713 222021 DEBUG nova.virt.libvirt.vif [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:55:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-660546175',display_name='tempest-TestNetworkAdvancedServerOps-server-660546175',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-660546175',id=211,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHhNBRegjioPAmva8qyqPMyd4dn+3hiBNwe4BWzp1VDgZFgQ+g4FdHcnXo+cwpLDWgKnm4yCRqf2eKNNhFM/EbeI6EnjlmNiu32pnRKGBZGgO4FKlvjQptQtJfMEpsL1DQ==',key_name='tempest-TestNetworkAdvancedServerOps-1958820888',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:55:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-5dvavyin',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:56:07Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=fcb93bcf-9612-4dc7-9996-238d2739d8cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--878952243", "vif_mac": "fa:16:3e:f9:18:47"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.713 222021 DEBUG nova.network.os_vif_util [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converting VIF {"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--878952243", "vif_mac": "fa:16:3e:f9:18:47"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.714 222021 DEBUG nova.network.os_vif_util [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.715 222021 DEBUG os_vif [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.718 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.718 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62f573cf-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.720 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.723 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.726 222021 INFO os_vif [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04')#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.734 222021 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:56:18 np0005593233 nova_compute[222017]: 2026-01-23 10:56:18.735 222021 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:56:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:19.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:19 np0005593233 nova_compute[222017]: 2026-01-23 10:56:19.258 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:56:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:19.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.103 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:18:47 10.100.0.14'], port_security=['fa:16:3e:f9:18:47 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fcb93bcf-9612-4dc7-9996-238d2739d8cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5afed19d-3ff6-4459-b8a0-c5fc6a279e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=866a455a-94b4-4bbd-a367-b902a726ce2f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=62f573cf-0476-448d-b148-040cec7b1042) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.106 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 62f573cf-0476-448d-b148-040cec7b1042 in datapath 6c737d6f-3e00-482b-aed5-4f8eabd246f2 unbound from our chassis#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.109 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c737d6f-3e00-482b-aed5-4f8eabd246f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.111 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8054d5-1272-4f92-93e8-b80bfd282b49]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.113 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 namespace which is not needed anymore#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.161 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.162 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.206 222021 DEBUG neutronclient.v2_0.client [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 62f573cf-0476-448d-b148-040cec7b1042 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.320 222021 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.320 222021 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.321 222021 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.443 222021 DEBUG nova.compute.manager [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.443 222021 DEBUG oslo_concurrency.lockutils [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.444 222021 DEBUG oslo_concurrency.lockutils [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.444 222021 DEBUG oslo_concurrency.lockutils [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.445 222021 DEBUG nova.compute.manager [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.445 222021 WARNING nova.compute.manager [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:56:20 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[306249]: [NOTICE]   (306253) : haproxy version is 2.8.14-c23fe91
Jan 23 05:56:20 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[306249]: [NOTICE]   (306253) : path to executable is /usr/sbin/haproxy
Jan 23 05:56:20 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[306249]: [WARNING]  (306253) : Exiting Master process...
Jan 23 05:56:20 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[306249]: [ALERT]    (306253) : Current worker (306255) exited with code 143 (Terminated)
Jan 23 05:56:20 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[306249]: [WARNING]  (306253) : All workers exited. Exiting... (0)
Jan 23 05:56:20 np0005593233 systemd[1]: libpod-a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4.scope: Deactivated successfully.
Jan 23 05:56:20 np0005593233 podman[306719]: 2026-01-23 10:56:20.615465329 +0000 UTC m=+0.372773604 container died a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:56:20 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4-userdata-shm.mount: Deactivated successfully.
Jan 23 05:56:20 np0005593233 systemd[1]: var-lib-containers-storage-overlay-85a461cd56444edcde3ba6bc3911471a4a5f2f98c51d2c8507372a10dfaa7886-merged.mount: Deactivated successfully.
Jan 23 05:56:20 np0005593233 podman[306719]: 2026-01-23 10:56:20.730006462 +0000 UTC m=+0.487314717 container cleanup a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 05:56:20 np0005593233 systemd[1]: libpod-conmon-a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4.scope: Deactivated successfully.
Jan 23 05:56:20 np0005593233 podman[306750]: 2026-01-23 10:56:20.845339688 +0000 UTC m=+0.089289262 container remove a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.856 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[898419d7-1df5-43d7-b217-f6f03b9b5de0]: (4, ('Fri Jan 23 10:56:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 (a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4)\na7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4\nFri Jan 23 10:56:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 (a7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4)\na7d84971aa3884b3a408ec3b59707b397d2341d7e07c9656f8b19c96d7a3d1a4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.858 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2a145925-19a2-4ca8-b092-6930373f4cc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.859 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c737d6f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.861 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:20 np0005593233 kernel: tap6c737d6f-30: left promiscuous mode
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.864 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.868 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[12529c1b-e629-4e9a-936e-60e6959cbc38]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:20 np0005593233 nova_compute[222017]: 2026-01-23 10:56:20.890 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.898 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c702f3d9-c432-48c2-81b7-170c670ceef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.900 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[39cd5ac8-127f-49a3-a323-dbbb3a6a2c9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.914 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ff517660-d147-4d0d-92f4-2d50ddfaff5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 964185, 'reachable_time': 34621, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306766, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.917 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:56:20 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:20.917 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[08c5462a-3a29-41da-a729-60e280edc163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:20 np0005593233 systemd[1]: run-netns-ovnmeta\x2d6c737d6f\x2d3e00\x2d482b\x2daed5\x2d4f8eabd246f2.mount: Deactivated successfully.
Jan 23 05:56:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:21.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:21.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.059 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:23 np0005593233 podman[306767]: 2026-01-23 10:56:23.104958412 +0000 UTC m=+0.115542014 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:56:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:23.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.276 222021 DEBUG nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.277 222021 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.277 222021 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.278 222021 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.278 222021 DEBUG nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.278 222021 WARNING nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.278 222021 DEBUG nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-changed-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.279 222021 DEBUG nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing instance network info cache due to event network-changed-62f573cf-0476-448d-b148-040cec7b1042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.280 222021 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.280 222021 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.280 222021 DEBUG nova.network.neutron [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing network info cache for port 62f573cf-0476-448d-b148-040cec7b1042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:56:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:23.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:23 np0005593233 nova_compute[222017]: 2026-01-23 10:56:23.722 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:24 np0005593233 nova_compute[222017]: 2026-01-23 10:56:24.163 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:24 np0005593233 nova_compute[222017]: 2026-01-23 10:56:24.167 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:24 np0005593233 nova_compute[222017]: 2026-01-23 10:56:24.167 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:24 np0005593233 nova_compute[222017]: 2026-01-23 10:56:24.168 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:56:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:25.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e422 e422: 3 total, 3 up, 3 in
Jan 23 05:56:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:25.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:25 np0005593233 nova_compute[222017]: 2026-01-23 10:56:25.662 222021 DEBUG nova.network.neutron [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updated VIF entry in instance network info cache for port 62f573cf-0476-448d-b148-040cec7b1042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:56:25 np0005593233 nova_compute[222017]: 2026-01-23 10:56:25.663 222021 DEBUG nova.network.neutron [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:56:25 np0005593233 nova_compute[222017]: 2026-01-23 10:56:25.834 222021 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:56:26 np0005593233 nova_compute[222017]: 2026-01-23 10:56:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:27.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:27.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:28 np0005593233 nova_compute[222017]: 2026-01-23 10:56:28.061 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:28 np0005593233 nova_compute[222017]: 2026-01-23 10:56:28.318 222021 DEBUG nova.compute.manager [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:28 np0005593233 nova_compute[222017]: 2026-01-23 10:56:28.318 222021 DEBUG oslo_concurrency.lockutils [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:28 np0005593233 nova_compute[222017]: 2026-01-23 10:56:28.318 222021 DEBUG oslo_concurrency.lockutils [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:28 np0005593233 nova_compute[222017]: 2026-01-23 10:56:28.319 222021 DEBUG oslo_concurrency.lockutils [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:28 np0005593233 nova_compute[222017]: 2026-01-23 10:56:28.319 222021 DEBUG nova.compute.manager [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:28 np0005593233 nova_compute[222017]: 2026-01-23 10:56:28.319 222021 WARNING nova.compute.manager [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 23 05:56:28 np0005593233 nova_compute[222017]: 2026-01-23 10:56:28.725 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:29.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:29 np0005593233 nova_compute[222017]: 2026-01-23 10:56:29.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:29.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:30 np0005593233 nova_compute[222017]: 2026-01-23 10:56:30.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:31.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:31 np0005593233 nova_compute[222017]: 2026-01-23 10:56:31.366 222021 DEBUG nova.compute.manager [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:31 np0005593233 nova_compute[222017]: 2026-01-23 10:56:31.367 222021 DEBUG oslo_concurrency.lockutils [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:31 np0005593233 nova_compute[222017]: 2026-01-23 10:56:31.367 222021 DEBUG oslo_concurrency.lockutils [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:31 np0005593233 nova_compute[222017]: 2026-01-23 10:56:31.368 222021 DEBUG oslo_concurrency.lockutils [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:31 np0005593233 nova_compute[222017]: 2026-01-23 10:56:31.368 222021 DEBUG nova.compute.manager [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:31 np0005593233 nova_compute[222017]: 2026-01-23 10:56:31.368 222021 WARNING nova.compute.manager [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:56:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:31.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:33 np0005593233 nova_compute[222017]: 2026-01-23 10:56:33.064 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:33.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:33.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:33 np0005593233 nova_compute[222017]: 2026-01-23 10:56:33.709 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165778.7080212, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:56:33 np0005593233 nova_compute[222017]: 2026-01-23 10:56:33.709 222021 INFO nova.compute.manager [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:56:33 np0005593233 nova_compute[222017]: 2026-01-23 10:56:33.727 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:33 np0005593233 nova_compute[222017]: 2026-01-23 10:56:33.729 222021 DEBUG nova.compute.manager [None req-8329c5ed-ca0f-4301-b915-d33fa9cecacc - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:56:33 np0005593233 nova_compute[222017]: 2026-01-23 10:56:33.732 222021 DEBUG nova.compute.manager [None req-8329c5ed-ca0f-4301-b915-d33fa9cecacc - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:56:33 np0005593233 nova_compute[222017]: 2026-01-23 10:56:33.752 222021 INFO nova.compute.manager [None req-8329c5ed-ca0f-4301-b915-d33fa9cecacc - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 23 05:56:34 np0005593233 nova_compute[222017]: 2026-01-23 10:56:34.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:35.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:35.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:36 np0005593233 nova_compute[222017]: 2026-01-23 10:56:36.577 222021 DEBUG nova.compute.manager [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:36 np0005593233 nova_compute[222017]: 2026-01-23 10:56:36.578 222021 DEBUG oslo_concurrency.lockutils [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:56:36 np0005593233 nova_compute[222017]: 2026-01-23 10:56:36.578 222021 DEBUG oslo_concurrency.lockutils [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:56:36 np0005593233 nova_compute[222017]: 2026-01-23 10:56:36.578 222021 DEBUG oslo_concurrency.lockutils [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:56:36 np0005593233 nova_compute[222017]: 2026-01-23 10:56:36.579 222021 DEBUG nova.compute.manager [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:56:36 np0005593233 nova_compute[222017]: 2026-01-23 10:56:36.579 222021 WARNING nova.compute.manager [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state resized and task_state resize_reverting.
Jan 23 05:56:37 np0005593233 nova_compute[222017]: 2026-01-23 10:56:37.060 222021 INFO nova.compute.manager [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Swapping old allocation on dict_keys(['929812a2-38ca-4ee7-9f24-090d633cb42b']) held by migration 55b5cf0b-3cad-4f1d-86af-6b08513cd259 for instance
Jan 23 05:56:37 np0005593233 podman[306793]: 2026-01-23 10:56:37.080400123 +0000 UTC m=+0.082808309 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:56:37 np0005593233 nova_compute[222017]: 2026-01-23 10:56:37.095 222021 DEBUG nova.scheduler.client.report [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Overwriting current allocation {'allocations': {'89873210-bee9-46e9-9f9d-0cd7a156c3a8': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, 'generation': 94}}, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'consumer_generation': 1} on consumer fcb93bcf-9612-4dc7-9996-238d2739d8cb move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Jan 23 05:56:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:37.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:37.232 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=90, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=89) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:56:37 np0005593233 nova_compute[222017]: 2026-01-23 10:56:37.233 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:56:37 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:37.234 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:56:37 np0005593233 nova_compute[222017]: 2026-01-23 10:56:37.302 222021 INFO nova.network.neutron [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating port 62f573cf-0476-448d-b148-040cec7b1042 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 23 05:56:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:37.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.118 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:56:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.459 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.459 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.460 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.460 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.708 222021 DEBUG nova.compute.manager [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.709 222021 DEBUG oslo_concurrency.lockutils [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.709 222021 DEBUG oslo_concurrency.lockutils [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.709 222021 DEBUG oslo_concurrency.lockutils [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.709 222021 DEBUG nova.compute.manager [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.710 222021 WARNING nova.compute.manager [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state resized and task_state resize_reverting.
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.729 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:56:38 np0005593233 nova_compute[222017]: 2026-01-23 10:56:38.768 222021 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:56:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:39.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:39 np0005593233 nova_compute[222017]: 2026-01-23 10:56:39.545 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:56:39 np0005593233 nova_compute[222017]: 2026-01-23 10:56:39.565 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:56:39 np0005593233 nova_compute[222017]: 2026-01-23 10:56:39.566 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 05:56:39 np0005593233 nova_compute[222017]: 2026-01-23 10:56:39.566 222021 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:56:39 np0005593233 nova_compute[222017]: 2026-01-23 10:56:39.566 222021 DEBUG nova.network.neutron [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:56:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:39.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:41 np0005593233 nova_compute[222017]: 2026-01-23 10:56:41.036 222021 DEBUG nova.compute.manager [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-changed-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:56:41 np0005593233 nova_compute[222017]: 2026-01-23 10:56:41.037 222021 DEBUG nova.compute.manager [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing instance network info cache due to event network-changed-62f573cf-0476-448d-b148-040cec7b1042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:56:41 np0005593233 nova_compute[222017]: 2026-01-23 10:56:41.037 222021 DEBUG oslo_concurrency.lockutils [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:56:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:41.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:41.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:41 np0005593233 nova_compute[222017]: 2026-01-23 10:56:41.816 222021 DEBUG nova.network.neutron [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:56:41 np0005593233 nova_compute[222017]: 2026-01-23 10:56:41.867 222021 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:56:41 np0005593233 nova_compute[222017]: 2026-01-23 10:56:41.868 222021 DEBUG nova.virt.libvirt.driver [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Jan 23 05:56:41 np0005593233 nova_compute[222017]: 2026-01-23 10:56:41.919 222021 DEBUG oslo_concurrency.lockutils [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:56:41 np0005593233 nova_compute[222017]: 2026-01-23 10:56:41.920 222021 DEBUG nova.network.neutron [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing network info cache for port 62f573cf-0476-448d-b148-040cec7b1042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:56:42 np0005593233 nova_compute[222017]: 2026-01-23 10:56:42.227 222021 DEBUG nova.storage.rbd_utils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rolling back rbd image(fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505
Jan 23 05:56:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:42.722 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:56:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:42.723 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:56:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:42.723 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:56:42 np0005593233 nova_compute[222017]: 2026-01-23 10:56:42.824 222021 DEBUG nova.storage.rbd_utils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] removing snapshot(nova-resize) on rbd image(fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 23 05:56:43 np0005593233 nova_compute[222017]: 2026-01-23 10:56:43.123 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:56:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:43.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:43 np0005593233 nova_compute[222017]: 2026-01-23 10:56:43.517 222021 DEBUG nova.network.neutron [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updated VIF entry in instance network info cache for port 62f573cf-0476-448d-b148-040cec7b1042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:56:43 np0005593233 nova_compute[222017]: 2026-01-23 10:56:43.518 222021 DEBUG nova.network.neutron [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:56:43 np0005593233 nova_compute[222017]: 2026-01-23 10:56:43.540 222021 DEBUG oslo_concurrency.lockutils [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:56:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:43.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:43 np0005593233 nova_compute[222017]: 2026-01-23 10:56:43.732 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:56:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:45.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e423 e423: 3 total, 3 up, 3 in
Jan 23 05:56:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:45.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:47.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:47.237 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '90'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:56:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:47.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.125 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:56:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e423 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.444 222021 DEBUG nova.virt.libvirt.driver [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Start _get_guest_xml network_info=[{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.450 222021 WARNING nova.virt.libvirt.driver [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.455 222021 DEBUG nova.virt.libvirt.host [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.456 222021 DEBUG nova.virt.libvirt.host [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.459 222021 DEBUG nova.virt.libvirt.host [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.460 222021 DEBUG nova.virt.libvirt.host [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.461 222021 DEBUG nova.virt.libvirt.driver [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.461 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.461 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.461 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.462 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.462 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.462 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.462 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.462 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.462 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.462 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.463 222021 DEBUG nova.virt.hardware [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.463 222021 DEBUG nova.objects.instance [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'vcpu_model' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:56:48 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:48Z|00892|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 23 05:56:48 np0005593233 nova_compute[222017]: 2026-01-23 10:56:48.734 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:56:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:49.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:56:49 np0005593233 nova_compute[222017]: 2026-01-23 10:56:49.332 222021 DEBUG oslo_concurrency.processutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:49.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:56:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3466841033' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:56:49 np0005593233 nova_compute[222017]: 2026-01-23 10:56:49.996 222021 DEBUG oslo_concurrency.processutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.664s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.059 222021 DEBUG oslo_concurrency.processutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:56:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/963713923' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.566 222021 DEBUG oslo_concurrency.processutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.570 222021 DEBUG nova.virt.libvirt.vif [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:55:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-660546175',display_name='tempest-TestNetworkAdvancedServerOps-server-660546175',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-660546175',id=211,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHhNBRegjioPAmva8qyqPMyd4dn+3hiBNwe4BWzp1VDgZFgQ+g4FdHcnXo+cwpLDWgKnm4yCRqf2eKNNhFM/EbeI6EnjlmNiu32pnRKGBZGgO4FKlvjQptQtJfMEpsL1DQ==',key_name='tempest-TestNetworkAdvancedServerOps-1958820888',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:56:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-5dvavyin',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:56:32Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=fcb93bcf-9612-4dc7-9996-238d2739d8cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.571 222021 DEBUG nova.network.os_vif_util [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.572 222021 DEBUG nova.network.os_vif_util [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.578 222021 DEBUG nova.virt.libvirt.driver [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <uuid>fcb93bcf-9612-4dc7-9996-238d2739d8cb</uuid>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <name>instance-000000d3</name>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-660546175</nova:name>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:56:48</nova:creationTime>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <nova:port uuid="62f573cf-0476-448d-b148-040cec7b1042">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <entry name="serial">fcb93bcf-9612-4dc7-9996-238d2739d8cb</entry>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <entry name="uuid">fcb93bcf-9612-4dc7-9996-238d2739d8cb</entry>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk.config">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:f9:18:47"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <target dev="tap62f573cf-04"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/console.log" append="off"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <input type="keyboard" bus="usb"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:56:50 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:56:50 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:56:50 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:56:50 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.580 222021 DEBUG nova.compute.manager [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Preparing to wait for external event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.581 222021 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.582 222021 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.582 222021 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.584 222021 DEBUG nova.virt.libvirt.vif [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:55:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-660546175',display_name='tempest-TestNetworkAdvancedServerOps-server-660546175',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-660546175',id=211,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHhNBRegjioPAmva8qyqPMyd4dn+3hiBNwe4BWzp1VDgZFgQ+g4FdHcnXo+cwpLDWgKnm4yCRqf2eKNNhFM/EbeI6EnjlmNiu32pnRKGBZGgO4FKlvjQptQtJfMEpsL1DQ==',key_name='tempest-TestNetworkAdvancedServerOps-1958820888',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:56:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-5dvavyin',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:56:32Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=fcb93bcf-9612-4dc7-9996-238d2739d8cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.584 222021 DEBUG nova.network.os_vif_util [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.586 222021 DEBUG nova.network.os_vif_util [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.586 222021 DEBUG os_vif [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.588 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.589 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.590 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.596 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.597 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62f573cf-04, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.598 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap62f573cf-04, col_values=(('external_ids', {'iface-id': '62f573cf-0476-448d-b148-040cec7b1042', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:18:47', 'vm-uuid': 'fcb93bcf-9612-4dc7-9996-238d2739d8cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:50 np0005593233 NetworkManager[48871]: <info>  [1769165810.6024] manager: (tap62f573cf-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/404)
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.601 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.607 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.609 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.611 222021 INFO os_vif [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04')#033[00m
Jan 23 05:56:50 np0005593233 kernel: tap62f573cf-04: entered promiscuous mode
Jan 23 05:56:50 np0005593233 NetworkManager[48871]: <info>  [1769165810.8810] manager: (tap62f573cf-04): new Tun device (/org/freedesktop/NetworkManager/Devices/405)
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.889 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:50 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:50Z|00893|binding|INFO|Claiming lport 62f573cf-0476-448d-b148-040cec7b1042 for this chassis.
Jan 23 05:56:50 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:50Z|00894|binding|INFO|62f573cf-0476-448d-b148-040cec7b1042: Claiming fa:16:3e:f9:18:47 10.100.0.14
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.903 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:18:47 10.100.0.14'], port_security=['fa:16:3e:f9:18:47 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fcb93bcf-9612-4dc7-9996-238d2739d8cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '5afed19d-3ff6-4459-b8a0-c5fc6a279e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=866a455a-94b4-4bbd-a367-b902a726ce2f, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=62f573cf-0476-448d-b148-040cec7b1042) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.906 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 62f573cf-0476-448d-b148-040cec7b1042 in datapath 6c737d6f-3e00-482b-aed5-4f8eabd246f2 bound to our chassis#033[00m
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.908 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c737d6f-3e00-482b-aed5-4f8eabd246f2#033[00m
Jan 23 05:56:50 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:50Z|00895|binding|INFO|Setting lport 62f573cf-0476-448d-b148-040cec7b1042 ovn-installed in OVS
Jan 23 05:56:50 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:50Z|00896|binding|INFO|Setting lport 62f573cf-0476-448d-b148-040cec7b1042 up in Southbound
Jan 23 05:56:50 np0005593233 systemd-udevd[306943]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:56:50 np0005593233 nova_compute[222017]: 2026-01-23 10:56:50.922 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.931 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[03c57aad-58ba-4b87-add7-4296248d8c66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.932 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c737d6f-31 in ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:56:50 np0005593233 NetworkManager[48871]: <info>  [1769165810.9360] device (tap62f573cf-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:56:50 np0005593233 NetworkManager[48871]: <info>  [1769165810.9367] device (tap62f573cf-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.936 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c737d6f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.937 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[51afebbf-71e0-4eaf-9266-581d86a5c555]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.938 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2a92ee7b-523f-4b76-8861-51e4b6a828d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:50 np0005593233 systemd-machined[190954]: New machine qemu-96-instance-000000d3.
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.950 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[bba0c230-5e9e-4a5f-a8f1-370cb5fc0913]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:50 np0005593233 systemd[1]: Started Virtual Machine qemu-96-instance-000000d3.
Jan 23 05:56:50 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:50.975 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3093db4d-7168-4152-8255-115d6c43355d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.018 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[487f66f6-8bc5-491e-b941-b85df5b4db86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.026 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6136c8bd-1b1d-492f-9a7d-99dbfe4241d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 systemd-udevd[306946]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:56:51 np0005593233 NetworkManager[48871]: <info>  [1769165811.0275] manager: (tap6c737d6f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/406)
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.063 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[658ddc01-434a-4936-a587-9e4a3f786d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.067 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[478fa6ba-0c24-4322-8704-0fa4a08d3a45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 NetworkManager[48871]: <info>  [1769165811.0952] device (tap6c737d6f-30): carrier: link connected
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.102 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[898af2a8-d651-4e09-9f48-5acde008f960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.128 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[13aa3c92-bb13-4094-8c54-f2ce1bb64147]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c737d6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:05:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 266], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 972372, 'reachable_time': 15876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306976, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.155 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[59b56326-4437-4835-94f2-f2b99ed2b705]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:508'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 972372, 'tstamp': 972372}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 306977, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:51.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.183 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[51ef933b-bdd4-4879-9256-504c436d4664]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c737d6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:05:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 266], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 972372, 'reachable_time': 15876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 306978, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.234 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d5489d10-8c1a-43a5-a162-d4de80711036]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.328 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d08d532f-e805-4ddb-b419-e89de3f361be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.329 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c737d6f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.330 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.330 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c737d6f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.332 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:51 np0005593233 NetworkManager[48871]: <info>  [1769165811.3331] manager: (tap6c737d6f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/407)
Jan 23 05:56:51 np0005593233 kernel: tap6c737d6f-30: entered promiscuous mode
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.335 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.336 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c737d6f-30, col_values=(('external_ids', {'iface-id': '8bc5480b-7bdc-475b-b309-693291ebc39a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:51 np0005593233 ovn_controller[130653]: 2026-01-23T10:56:51Z|00897|binding|INFO|Releasing lport 8bc5480b-7bdc-475b-b309-693291ebc39a from this chassis (sb_readonly=0)
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.338 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.351 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.352 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c737d6f-3e00-482b-aed5-4f8eabd246f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c737d6f-3e00-482b-aed5-4f8eabd246f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.353 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d74d233b-4aff-4559-b051-581f2ab51141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.354 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-6c737d6f-3e00-482b-aed5-4f8eabd246f2
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/6c737d6f-3e00-482b-aed5-4f8eabd246f2.pid.haproxy
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 6c737d6f-3e00-482b-aed5-4f8eabd246f2
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:56:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:56:51.355 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'env', 'PROCESS_TAG=haproxy-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c737d6f-3e00-482b-aed5-4f8eabd246f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.379 222021 DEBUG nova.compute.manager [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.379 222021 DEBUG oslo_concurrency.lockutils [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.379 222021 DEBUG oslo_concurrency.lockutils [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.380 222021 DEBUG oslo_concurrency.lockutils [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.380 222021 DEBUG nova.compute.manager [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Processing event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:56:51 np0005593233 nova_compute[222017]: 2026-01-23 10:56:51.561 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:51.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:51 np0005593233 podman[307028]: 2026-01-23 10:56:51.771815039 +0000 UTC m=+0.038575260 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.181 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165812.1811764, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.182 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Started (Lifecycle Event)#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.184 222021 DEBUG nova.compute.manager [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.190 222021 INFO nova.virt.libvirt.driver [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance running successfully.#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.191 222021 DEBUG nova.virt.libvirt.driver [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.221 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.229 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.256 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.257 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165812.1813657, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.257 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.275 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.279 222021 INFO nova.compute.manager [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance to original state: 'active'#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.282 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165812.1870353, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.282 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.316 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:56:52 np0005593233 nova_compute[222017]: 2026-01-23 10:56:52.320 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:56:52 np0005593233 podman[307028]: 2026-01-23 10:56:52.871869301 +0000 UTC m=+1.138622432 container create c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:56:52 np0005593233 systemd[1]: Started libpod-conmon-c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a.scope.
Jan 23 05:56:52 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:56:53 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db369e6091bf0bf78e5c9afcd0816a8c39775580e85c81cf6c03cbebe895624f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:56:53 np0005593233 podman[307028]: 2026-01-23 10:56:53.034627736 +0000 UTC m=+1.301380817 container init c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 05:56:53 np0005593233 podman[307028]: 2026-01-23 10:56:53.042283312 +0000 UTC m=+1.309036393 container start c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 05:56:53 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[307067]: [NOTICE]   (307071) : New worker (307073) forked
Jan 23 05:56:53 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[307067]: [NOTICE]   (307071) : Loading success.
Jan 23 05:56:53 np0005593233 nova_compute[222017]: 2026-01-23 10:56:53.172 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:53.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e423 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:53 np0005593233 nova_compute[222017]: 2026-01-23 10:56:53.586 222021 DEBUG nova.compute.manager [req-2fe7b0b1-cb21-4178-8da5-bc845a252c90 req-d6b2498a-08bf-4e0e-b99b-197bcc98ef6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:53 np0005593233 nova_compute[222017]: 2026-01-23 10:56:53.588 222021 DEBUG oslo_concurrency.lockutils [req-2fe7b0b1-cb21-4178-8da5-bc845a252c90 req-d6b2498a-08bf-4e0e-b99b-197bcc98ef6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:53 np0005593233 nova_compute[222017]: 2026-01-23 10:56:53.588 222021 DEBUG oslo_concurrency.lockutils [req-2fe7b0b1-cb21-4178-8da5-bc845a252c90 req-d6b2498a-08bf-4e0e-b99b-197bcc98ef6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:53 np0005593233 nova_compute[222017]: 2026-01-23 10:56:53.588 222021 DEBUG oslo_concurrency.lockutils [req-2fe7b0b1-cb21-4178-8da5-bc845a252c90 req-d6b2498a-08bf-4e0e-b99b-197bcc98ef6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:53 np0005593233 nova_compute[222017]: 2026-01-23 10:56:53.589 222021 DEBUG nova.compute.manager [req-2fe7b0b1-cb21-4178-8da5-bc845a252c90 req-d6b2498a-08bf-4e0e-b99b-197bcc98ef6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:53 np0005593233 nova_compute[222017]: 2026-01-23 10:56:53.589 222021 WARNING nova.compute.manager [req-2fe7b0b1-cb21-4178-8da5-bc845a252c90 req-d6b2498a-08bf-4e0e-b99b-197bcc98ef6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:56:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:53.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:53 np0005593233 podman[307084]: 2026-01-23 10:56:53.863472943 +0000 UTC m=+0.120674058 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 23 05:56:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:55.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:55 np0005593233 nova_compute[222017]: 2026-01-23 10:56:55.647 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:55.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:57.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #193. Immutable memtables: 0.
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.604678) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 123] Flushing memtable with next log file: 193
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817604727, "job": 123, "event": "flush_started", "num_memtables": 1, "num_entries": 2360, "num_deletes": 252, "total_data_size": 5880781, "memory_usage": 5942464, "flush_reason": "Manual Compaction"}
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 123] Level-0 flush table #194: started
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817673256, "cf_name": "default", "job": 123, "event": "table_file_creation", "file_number": 194, "file_size": 3852097, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 91543, "largest_seqno": 93898, "table_properties": {"data_size": 3842288, "index_size": 6238, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19971, "raw_average_key_size": 20, "raw_value_size": 3822848, "raw_average_value_size": 3928, "num_data_blocks": 271, "num_entries": 973, "num_filter_entries": 973, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165597, "oldest_key_time": 1769165597, "file_creation_time": 1769165817, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 194, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 123] Flush lasted 68632 microseconds, and 9141 cpu microseconds.
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.673313) [db/flush_job.cc:967] [default] [JOB 123] Level-0 flush table #194: 3852097 bytes OK
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.673335) [db/memtable_list.cc:519] [default] Level-0 commit table #194 started
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.677465) [db/memtable_list.cc:722] [default] Level-0 commit table #194: memtable #1 done
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.677533) EVENT_LOG_v1 {"time_micros": 1769165817677517, "job": 123, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.677565) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 123] Try to delete WAL files size 5870213, prev total WAL file size 5870213, number of live WAL files 2.
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000190.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.679952) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038323833' seq:72057594037927935, type:22 .. '7061786F730038353335' seq:0, type:0; will stop at (end)
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 124] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 123 Base level 0, inputs: [194(3761KB)], [192(12MB)]
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817680039, "job": 124, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [194], "files_L6": [192], "score": -1, "input_data_size": 16584015, "oldest_snapshot_seqno": -1}
Jan 23 05:56:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:57.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 e424: 3 total, 3 up, 3 in
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 124] Generated table #195: 11256 keys, 14633772 bytes, temperature: kUnknown
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817877664, "cf_name": "default", "job": 124, "event": "table_file_creation", "file_number": 195, "file_size": 14633772, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14561154, "index_size": 43385, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28165, "raw_key_size": 296580, "raw_average_key_size": 26, "raw_value_size": 14364430, "raw_average_value_size": 1276, "num_data_blocks": 1651, "num_entries": 11256, "num_filter_entries": 11256, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769165817, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 195, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.878062) [db/compaction/compaction_job.cc:1663] [default] [JOB 124] Compacted 1@0 + 1@6 files to L6 => 14633772 bytes
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.886377) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 83.8 rd, 74.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 12.1 +0.0 blob) out(14.0 +0.0 blob), read-write-amplify(8.1) write-amplify(3.8) OK, records in: 11781, records dropped: 525 output_compression: NoCompression
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.886447) EVENT_LOG_v1 {"time_micros": 1769165817886424, "job": 124, "event": "compaction_finished", "compaction_time_micros": 197820, "compaction_time_cpu_micros": 60239, "output_level": 6, "num_output_files": 1, "total_output_size": 14633772, "num_input_records": 11781, "num_output_records": 11256, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000194.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817888551, "job": 124, "event": "table_file_deletion", "file_number": 194}
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000192.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817891209, "job": 124, "event": "table_file_deletion", "file_number": 192}
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.679780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.891324) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.891332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.891334) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.891336) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:56:57.891338) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:58 np0005593233 nova_compute[222017]: 2026-01-23 10:56:58.215 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:56:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:59.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:56:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:56:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:59.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:00 np0005593233 nova_compute[222017]: 2026-01-23 10:57:00.686 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:01.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:01.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:03.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:03 np0005593233 nova_compute[222017]: 2026-01-23 10:57:03.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:03.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:05.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:05 np0005593233 ovn_controller[130653]: 2026-01-23T10:57:05Z|00132|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:18:47 10.100.0.14
Jan 23 05:57:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:05.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:05 np0005593233 nova_compute[222017]: 2026-01-23 10:57:05.738 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:07.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:07.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:08 np0005593233 podman[307112]: 2026-01-23 10:57:08.093509797 +0000 UTC m=+0.089807306 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:57:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:08 np0005593233 nova_compute[222017]: 2026-01-23 10:57:08.306 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:09.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:09.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:57:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:57:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:10 np0005593233 nova_compute[222017]: 2026-01-23 10:57:10.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:11.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:11 np0005593233 nova_compute[222017]: 2026-01-23 10:57:11.207 222021 INFO nova.compute.manager [None req-cbbf892d-0e90-4e57-b3b3-c5c6546060c4 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Get console output#033[00m
Jan 23 05:57:11 np0005593233 nova_compute[222017]: 2026-01-23 10:57:11.214 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:57:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:57:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:57:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:57:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:11.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:57:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:13.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:13 np0005593233 nova_compute[222017]: 2026-01-23 10:57:13.310 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:13.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:15.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:15.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:15 np0005593233 nova_compute[222017]: 2026-01-23 10:57:15.770 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:17.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:17 np0005593233 nova_compute[222017]: 2026-01-23 10:57:17.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:17.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:18 np0005593233 nova_compute[222017]: 2026-01-23 10:57:18.361 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:57:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:19.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:57:19 np0005593233 nova_compute[222017]: 2026-01-23 10:57:19.622 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:19 np0005593233 nova_compute[222017]: 2026-01-23 10:57:19.622 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:19 np0005593233 nova_compute[222017]: 2026-01-23 10:57:19.622 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:19 np0005593233 nova_compute[222017]: 2026-01-23 10:57:19.622 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:57:19 np0005593233 nova_compute[222017]: 2026-01-23 10:57:19.623 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:57:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:19.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:57:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1088406848' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:57:20 np0005593233 nova_compute[222017]: 2026-01-23 10:57:20.082 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:57:20 np0005593233 nova_compute[222017]: 2026-01-23 10:57:20.775 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:21.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:21.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:23.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:23 np0005593233 nova_compute[222017]: 2026-01-23 10:57:23.364 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:23.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:23 np0005593233 nova_compute[222017]: 2026-01-23 10:57:23.965 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:57:23 np0005593233 nova_compute[222017]: 2026-01-23 10:57:23.965 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d3 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:57:24 np0005593233 podman[307289]: 2026-01-23 10:57:24.144119412 +0000 UTC m=+0.145300413 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 23 05:57:24 np0005593233 nova_compute[222017]: 2026-01-23 10:57:24.189 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:57:24 np0005593233 nova_compute[222017]: 2026-01-23 10:57:24.190 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4062MB free_disk=20.942649841308594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:57:24 np0005593233 nova_compute[222017]: 2026-01-23 10:57:24.190 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:24 np0005593233 nova_compute[222017]: 2026-01-23 10:57:24.190 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:24.392 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=91, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=90) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:57:24 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:24.393 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:57:24 np0005593233 nova_compute[222017]: 2026-01-23 10:57:24.431 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:24 np0005593233 nova_compute[222017]: 2026-01-23 10:57:24.474 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance fcb93bcf-9612-4dc7-9996-238d2739d8cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:57:24 np0005593233 nova_compute[222017]: 2026-01-23 10:57:24.475 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:57:24 np0005593233 nova_compute[222017]: 2026-01-23 10:57:24.475 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:57:24 np0005593233 nova_compute[222017]: 2026-01-23 10:57:24.625 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:57:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:57:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3276376625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:57:25 np0005593233 nova_compute[222017]: 2026-01-23 10:57:25.154 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:57:25 np0005593233 nova_compute[222017]: 2026-01-23 10:57:25.163 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:57:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:25.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:57:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:25.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:57:25 np0005593233 nova_compute[222017]: 2026-01-23 10:57:25.779 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:25 np0005593233 nova_compute[222017]: 2026-01-23 10:57:25.992 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.208 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.209 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.399 222021 DEBUG nova.compute.manager [req-f6c92f57-9708-4fff-b5f5-50e8b6ce9b70 req-f2bc0283-7774-44b2-91e9-3b7c6f655328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-changed-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.400 222021 DEBUG nova.compute.manager [req-f6c92f57-9708-4fff-b5f5-50e8b6ce9b70 req-f2bc0283-7774-44b2-91e9-3b7c6f655328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing instance network info cache due to event network-changed-62f573cf-0476-448d-b148-040cec7b1042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.401 222021 DEBUG oslo_concurrency.lockutils [req-f6c92f57-9708-4fff-b5f5-50e8b6ce9b70 req-f2bc0283-7774-44b2-91e9-3b7c6f655328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.401 222021 DEBUG oslo_concurrency.lockutils [req-f6c92f57-9708-4fff-b5f5-50e8b6ce9b70 req-f2bc0283-7774-44b2-91e9-3b7c6f655328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.402 222021 DEBUG nova.network.neutron [req-f6c92f57-9708-4fff-b5f5-50e8b6ce9b70 req-f2bc0283-7774-44b2-91e9-3b7c6f655328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing network info cache for port 62f573cf-0476-448d-b148-040cec7b1042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.500 222021 DEBUG oslo_concurrency.lockutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.500 222021 DEBUG oslo_concurrency.lockutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.501 222021 DEBUG oslo_concurrency.lockutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.501 222021 DEBUG oslo_concurrency.lockutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.502 222021 DEBUG oslo_concurrency.lockutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.504 222021 INFO nova.compute.manager [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Terminating instance#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.505 222021 DEBUG nova.compute.manager [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:57:26 np0005593233 kernel: tap62f573cf-04 (unregistering): left promiscuous mode
Jan 23 05:57:26 np0005593233 NetworkManager[48871]: <info>  [1769165846.7053] device (tap62f573cf-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:57:26 np0005593233 ovn_controller[130653]: 2026-01-23T10:57:26Z|00898|binding|INFO|Releasing lport 62f573cf-0476-448d-b148-040cec7b1042 from this chassis (sb_readonly=0)
Jan 23 05:57:26 np0005593233 ovn_controller[130653]: 2026-01-23T10:57:26Z|00899|binding|INFO|Setting lport 62f573cf-0476-448d-b148-040cec7b1042 down in Southbound
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.724 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:26 np0005593233 ovn_controller[130653]: 2026-01-23T10:57:26Z|00900|binding|INFO|Removing iface tap62f573cf-04 ovn-installed in OVS
Jan 23 05:57:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:26.731 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:18:47 10.100.0.14'], port_security=['fa:16:3e:f9:18:47 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fcb93bcf-9612-4dc7-9996-238d2739d8cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '12', 'neutron:security_group_ids': '5afed19d-3ff6-4459-b8a0-c5fc6a279e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=866a455a-94b4-4bbd-a367-b902a726ce2f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=62f573cf-0476-448d-b148-040cec7b1042) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:57:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:26.732 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 62f573cf-0476-448d-b148-040cec7b1042 in datapath 6c737d6f-3e00-482b-aed5-4f8eabd246f2 unbound from our chassis#033[00m
Jan 23 05:57:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:26.733 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c737d6f-3e00-482b-aed5-4f8eabd246f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:57:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:26.734 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[048e4b82-e681-4ac6-9b08-62ed913daadc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:57:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:26.735 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 namespace which is not needed anymore#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.761 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:26 np0005593233 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000d3.scope: Deactivated successfully.
Jan 23 05:57:26 np0005593233 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000d3.scope: Consumed 15.705s CPU time.
Jan 23 05:57:26 np0005593233 systemd-machined[190954]: Machine qemu-96-instance-000000d3 terminated.
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.930 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.936 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.950 222021 INFO nova.virt.libvirt.driver [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance destroyed successfully.#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.951 222021 DEBUG nova.objects.instance [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.976 222021 DEBUG nova.virt.libvirt.vif [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:55:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-660546175',display_name='tempest-TestNetworkAdvancedServerOps-server-660546175',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-660546175',id=211,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHhNBRegjioPAmva8qyqPMyd4dn+3hiBNwe4BWzp1VDgZFgQ+g4FdHcnXo+cwpLDWgKnm4yCRqf2eKNNhFM/EbeI6EnjlmNiu32pnRKGBZGgO4FKlvjQptQtJfMEpsL1DQ==',key_name='tempest-TestNetworkAdvancedServerOps-1958820888',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:56:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-5dvavyin',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:56:52Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=fcb93bcf-9612-4dc7-9996-238d2739d8cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.977 222021 DEBUG nova.network.os_vif_util [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.979 222021 DEBUG nova.network.os_vif_util [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.979 222021 DEBUG os_vif [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.983 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.983 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62f573cf-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.985 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.987 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.993 222021 DEBUG nova.compute.manager [req-2106ad3d-42a9-4d6c-b88d-7f05cdac3771 req-d1f29ebb-6ef9-4623-9421-bdc459ee1dcc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.994 222021 DEBUG oslo_concurrency.lockutils [req-2106ad3d-42a9-4d6c-b88d-7f05cdac3771 req-d1f29ebb-6ef9-4623-9421-bdc459ee1dcc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.994 222021 DEBUG oslo_concurrency.lockutils [req-2106ad3d-42a9-4d6c-b88d-7f05cdac3771 req-d1f29ebb-6ef9-4623-9421-bdc459ee1dcc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.995 222021 DEBUG oslo_concurrency.lockutils [req-2106ad3d-42a9-4d6c-b88d-7f05cdac3771 req-d1f29ebb-6ef9-4623-9421-bdc459ee1dcc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.995 222021 DEBUG nova.compute.manager [req-2106ad3d-42a9-4d6c-b88d-7f05cdac3771 req-d1f29ebb-6ef9-4623-9421-bdc459ee1dcc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.996 222021 DEBUG nova.compute.manager [req-2106ad3d-42a9-4d6c-b88d-7f05cdac3771 req-d1f29ebb-6ef9-4623-9421-bdc459ee1dcc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:57:26 np0005593233 nova_compute[222017]: 2026-01-23 10:57:26.998 222021 INFO os_vif [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04')#033[00m
Jan 23 05:57:27 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[307067]: [NOTICE]   (307071) : haproxy version is 2.8.14-c23fe91
Jan 23 05:57:27 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[307067]: [NOTICE]   (307071) : path to executable is /usr/sbin/haproxy
Jan 23 05:57:27 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[307067]: [WARNING]  (307071) : Exiting Master process...
Jan 23 05:57:27 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[307067]: [WARNING]  (307071) : Exiting Master process...
Jan 23 05:57:27 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[307067]: [ALERT]    (307071) : Current worker (307073) exited with code 143 (Terminated)
Jan 23 05:57:27 np0005593233 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[307067]: [WARNING]  (307071) : All workers exited. Exiting... (0)
Jan 23 05:57:27 np0005593233 systemd[1]: libpod-c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a.scope: Deactivated successfully.
Jan 23 05:57:27 np0005593233 podman[307359]: 2026-01-23 10:57:27.184366102 +0000 UTC m=+0.344869646 container died c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:57:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:27.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:27 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a-userdata-shm.mount: Deactivated successfully.
Jan 23 05:57:27 np0005593233 systemd[1]: var-lib-containers-storage-overlay-db369e6091bf0bf78e5c9afcd0816a8c39775580e85c81cf6c03cbebe895624f-merged.mount: Deactivated successfully.
Jan 23 05:57:27 np0005593233 podman[307359]: 2026-01-23 10:57:27.329161678 +0000 UTC m=+0.489665212 container cleanup c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:57:27 np0005593233 systemd[1]: libpod-conmon-c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a.scope: Deactivated successfully.
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.396 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '91'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:57:27 np0005593233 nova_compute[222017]: 2026-01-23 10:57:27.588 222021 DEBUG nova.network.neutron [req-f6c92f57-9708-4fff-b5f5-50e8b6ce9b70 req-f2bc0283-7774-44b2-91e9-3b7c6f655328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updated VIF entry in instance network info cache for port 62f573cf-0476-448d-b148-040cec7b1042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:57:27 np0005593233 nova_compute[222017]: 2026-01-23 10:57:27.589 222021 DEBUG nova.network.neutron [req-f6c92f57-9708-4fff-b5f5-50e8b6ce9b70 req-f2bc0283-7774-44b2-91e9-3b7c6f655328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:57:27 np0005593233 podman[307416]: 2026-01-23 10:57:27.605645813 +0000 UTC m=+0.236502497 container remove c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.613 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b7186347-ae26-49a0-940f-576810e34515]: (4, ('Fri Jan 23 10:57:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 (c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a)\nc479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a\nFri Jan 23 10:57:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 (c479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a)\nc479402acec7cee6853f3eb6491935fa3ef604aad0fc8c4a25152c4da9b2579a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.615 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aef6160d-3cdb-451e-8029-bc0795a07089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.617 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c737d6f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:57:27 np0005593233 nova_compute[222017]: 2026-01-23 10:57:27.619 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:27 np0005593233 kernel: tap6c737d6f-30: left promiscuous mode
Jan 23 05:57:27 np0005593233 nova_compute[222017]: 2026-01-23 10:57:27.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.642 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fb644fd0-a9dd-414b-978a-fdf445b2326d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.666 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[be97c130-eccb-4970-aedd-866cfee22595]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.668 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eea00ad7-ece7-4584-9fca-905bd764aa18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.690 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[f57edfc9-1d89-4215-9b8f-3bdd9d45ee1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 972364, 'reachable_time': 40580, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307431, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.694 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:57:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:27.694 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0d6560-eed1-4e61-9c11-8c395f958160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:57:27 np0005593233 systemd[1]: run-netns-ovnmeta\x2d6c737d6f\x2d3e00\x2d482b\x2daed5\x2d4f8eabd246f2.mount: Deactivated successfully.
Jan 23 05:57:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:27.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:27 np0005593233 nova_compute[222017]: 2026-01-23 10:57:27.893 222021 DEBUG oslo_concurrency.lockutils [req-f6c92f57-9708-4fff-b5f5-50e8b6ce9b70 req-f2bc0283-7774-44b2-91e9-3b7c6f655328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:57:28 np0005593233 nova_compute[222017]: 2026-01-23 10:57:28.210 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:28 np0005593233 nova_compute[222017]: 2026-01-23 10:57:28.210 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:28 np0005593233 nova_compute[222017]: 2026-01-23 10:57:28.211 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:28 np0005593233 nova_compute[222017]: 2026-01-23 10:57:28.211 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:28 np0005593233 nova_compute[222017]: 2026-01-23 10:57:28.211 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:57:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:28 np0005593233 nova_compute[222017]: 2026-01-23 10:57:28.366 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:29.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:29 np0005593233 nova_compute[222017]: 2026-01-23 10:57:29.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:29.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:29 np0005593233 nova_compute[222017]: 2026-01-23 10:57:29.785 222021 DEBUG nova.compute.manager [req-0cbe10e7-7a57-4b73-8794-d8c56b1fc564 req-8b766515-e055-46c5-bf1b-78e1ff55f9a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:57:29 np0005593233 nova_compute[222017]: 2026-01-23 10:57:29.785 222021 DEBUG oslo_concurrency.lockutils [req-0cbe10e7-7a57-4b73-8794-d8c56b1fc564 req-8b766515-e055-46c5-bf1b-78e1ff55f9a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:29 np0005593233 nova_compute[222017]: 2026-01-23 10:57:29.786 222021 DEBUG oslo_concurrency.lockutils [req-0cbe10e7-7a57-4b73-8794-d8c56b1fc564 req-8b766515-e055-46c5-bf1b-78e1ff55f9a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:29 np0005593233 nova_compute[222017]: 2026-01-23 10:57:29.786 222021 DEBUG oslo_concurrency.lockutils [req-0cbe10e7-7a57-4b73-8794-d8c56b1fc564 req-8b766515-e055-46c5-bf1b-78e1ff55f9a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:29 np0005593233 nova_compute[222017]: 2026-01-23 10:57:29.786 222021 DEBUG nova.compute.manager [req-0cbe10e7-7a57-4b73-8794-d8c56b1fc564 req-8b766515-e055-46c5-bf1b-78e1ff55f9a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:57:29 np0005593233 nova_compute[222017]: 2026-01-23 10:57:29.787 222021 WARNING nova.compute.manager [req-0cbe10e7-7a57-4b73-8794-d8c56b1fc564 req-8b766515-e055-46c5-bf1b-78e1ff55f9a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:57:30 np0005593233 nova_compute[222017]: 2026-01-23 10:57:30.635 222021 INFO nova.virt.libvirt.driver [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Deleting instance files /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb_del#033[00m
Jan 23 05:57:30 np0005593233 nova_compute[222017]: 2026-01-23 10:57:30.637 222021 INFO nova.virt.libvirt.driver [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Deletion of /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb_del complete#033[00m
Jan 23 05:57:31 np0005593233 nova_compute[222017]: 2026-01-23 10:57:31.117 222021 INFO nova.compute.manager [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Took 4.61 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:57:31 np0005593233 nova_compute[222017]: 2026-01-23 10:57:31.118 222021 DEBUG oslo.service.loopingcall [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:57:31 np0005593233 nova_compute[222017]: 2026-01-23 10:57:31.118 222021 DEBUG nova.compute.manager [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:57:31 np0005593233 nova_compute[222017]: 2026-01-23 10:57:31.119 222021 DEBUG nova.network.neutron [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:57:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:31.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:31 np0005593233 nova_compute[222017]: 2026-01-23 10:57:31.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:31.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:31 np0005593233 nova_compute[222017]: 2026-01-23 10:57:31.987 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:33.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:33 np0005593233 nova_compute[222017]: 2026-01-23 10:57:33.265 222021 DEBUG nova.network.neutron [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:57:33 np0005593233 nova_compute[222017]: 2026-01-23 10:57:33.312 222021 INFO nova.compute.manager [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Took 2.19 seconds to deallocate network for instance.#033[00m
Jan 23 05:57:33 np0005593233 nova_compute[222017]: 2026-01-23 10:57:33.368 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:33 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:33 np0005593233 nova_compute[222017]: 2026-01-23 10:57:33.615 222021 DEBUG oslo_concurrency.lockutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:33 np0005593233 nova_compute[222017]: 2026-01-23 10:57:33.616 222021 DEBUG oslo_concurrency.lockutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:33 np0005593233 nova_compute[222017]: 2026-01-23 10:57:33.696 222021 DEBUG oslo_concurrency.processutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:57:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:33.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:33 np0005593233 nova_compute[222017]: 2026-01-23 10:57:33.889 222021 DEBUG nova.compute.manager [req-ab0d20dc-0528-4918-aaaf-e3459370e4d4 req-cec8ab15-a189-4e24-9672-19bca5675b69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-deleted-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:57:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:34 np0005593233 nova_compute[222017]: 2026-01-23 10:57:34.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:57:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2068586127' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:57:34 np0005593233 nova_compute[222017]: 2026-01-23 10:57:34.571 222021 DEBUG oslo_concurrency.processutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.875s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:57:34 np0005593233 nova_compute[222017]: 2026-01-23 10:57:34.580 222021 DEBUG nova.compute.provider_tree [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:57:34 np0005593233 nova_compute[222017]: 2026-01-23 10:57:34.715 222021 DEBUG nova.scheduler.client.report [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:57:34 np0005593233 nova_compute[222017]: 2026-01-23 10:57:34.738 222021 DEBUG oslo_concurrency.lockutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:34 np0005593233 nova_compute[222017]: 2026-01-23 10:57:34.851 222021 INFO nova.scheduler.client.report [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocations for instance fcb93bcf-9612-4dc7-9996-238d2739d8cb#033[00m
Jan 23 05:57:34 np0005593233 nova_compute[222017]: 2026-01-23 10:57:34.917 222021 DEBUG oslo_concurrency.lockutils [None req-cce2760f-12a3-4ba3-925f-f02655d1b1e0 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.417s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:35.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:35.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:37 np0005593233 nova_compute[222017]: 2026-01-23 10:57:37.033 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:37.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:37.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:38 np0005593233 nova_compute[222017]: 2026-01-23 10:57:38.407 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:39 np0005593233 podman[307505]: 2026-01-23 10:57:39.124412414 +0000 UTC m=+0.114133932 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:57:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:39.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:39 np0005593233 nova_compute[222017]: 2026-01-23 10:57:39.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:39 np0005593233 nova_compute[222017]: 2026-01-23 10:57:39.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:57:39 np0005593233 nova_compute[222017]: 2026-01-23 10:57:39.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:57:39 np0005593233 nova_compute[222017]: 2026-01-23 10:57:39.408 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:57:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:39.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:41.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:41.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:41 np0005593233 nova_compute[222017]: 2026-01-23 10:57:41.948 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165846.9473252, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:57:41 np0005593233 nova_compute[222017]: 2026-01-23 10:57:41.948 222021 INFO nova.compute.manager [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:57:41 np0005593233 nova_compute[222017]: 2026-01-23 10:57:41.982 222021 DEBUG nova.compute.manager [None req-2adfa261-2e09-4aab-9b3a-6ec363416c91 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:57:42 np0005593233 nova_compute[222017]: 2026-01-23 10:57:42.060 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:42.722 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:42.723 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:57:42.723 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:43.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:43 np0005593233 nova_compute[222017]: 2026-01-23 10:57:43.429 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:43.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:44 np0005593233 nova_compute[222017]: 2026-01-23 10:57:44.687 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:44 np0005593233 nova_compute[222017]: 2026-01-23 10:57:44.767 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:45.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:45.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:47 np0005593233 nova_compute[222017]: 2026-01-23 10:57:47.063 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:47.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:57:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:47.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:57:48 np0005593233 nova_compute[222017]: 2026-01-23 10:57:48.492 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:49.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:49.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:51.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:51.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:52 np0005593233 nova_compute[222017]: 2026-01-23 10:57:52.066 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #196. Immutable memtables: 0.
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.078410) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 125] Flushing memtable with next log file: 196
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165873078544, "job": 125, "event": "flush_started", "num_memtables": 1, "num_entries": 790, "num_deletes": 258, "total_data_size": 1482206, "memory_usage": 1499808, "flush_reason": "Manual Compaction"}
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 125] Level-0 flush table #197: started
Jan 23 05:57:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:53.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165873368588, "cf_name": "default", "job": 125, "event": "table_file_creation", "file_number": 197, "file_size": 673149, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 93903, "largest_seqno": 94688, "table_properties": {"data_size": 669770, "index_size": 1158, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9221, "raw_average_key_size": 21, "raw_value_size": 662569, "raw_average_value_size": 1519, "num_data_blocks": 50, "num_entries": 436, "num_filter_entries": 436, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165817, "oldest_key_time": 1769165817, "file_creation_time": 1769165873, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 197, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 125] Flush lasted 290214 microseconds, and 4987 cpu microseconds.
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:57:53 np0005593233 nova_compute[222017]: 2026-01-23 10:57:53.497 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.368640) [db/flush_job.cc:967] [default] [JOB 125] Level-0 flush table #197: 673149 bytes OK
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.368663) [db/memtable_list.cc:519] [default] Level-0 commit table #197 started
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.547655) [db/memtable_list.cc:722] [default] Level-0 commit table #197: memtable #1 done
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.547750) EVENT_LOG_v1 {"time_micros": 1769165873547728, "job": 125, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.547799) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 125] Try to delete WAL files size 1477978, prev total WAL file size 1477978, number of live WAL files 2.
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000193.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.549235) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033323539' seq:72057594037927935, type:22 .. '6D6772737461740033353137' seq:0, type:0; will stop at (end)
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 126] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 125 Base level 0, inputs: [197(657KB)], [195(13MB)]
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165873549372, "job": 126, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [197], "files_L6": [195], "score": -1, "input_data_size": 15306921, "oldest_snapshot_seqno": -1}
Jan 23 05:57:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:53.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 126] Generated table #198: 11176 keys, 11655635 bytes, temperature: kUnknown
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165873827429, "cf_name": "default", "job": 126, "event": "table_file_creation", "file_number": 198, "file_size": 11655635, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11587833, "index_size": 38761, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27973, "raw_key_size": 295136, "raw_average_key_size": 26, "raw_value_size": 11396844, "raw_average_value_size": 1019, "num_data_blocks": 1460, "num_entries": 11176, "num_filter_entries": 11176, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769165873, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 198, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.827796) [db/compaction/compaction_job.cc:1663] [default] [JOB 126] Compacted 1@0 + 1@6 files to L6 => 11655635 bytes
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.937274) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 55.0 rd, 41.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 14.0 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(40.1) write-amplify(17.3) OK, records in: 11692, records dropped: 516 output_compression: NoCompression
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.937342) EVENT_LOG_v1 {"time_micros": 1769165873937316, "job": 126, "event": "compaction_finished", "compaction_time_micros": 278154, "compaction_time_cpu_micros": 63153, "output_level": 6, "num_output_files": 1, "total_output_size": 11655635, "num_input_records": 11692, "num_output_records": 11176, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000197.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165873937976, "job": 126, "event": "table_file_deletion", "file_number": 197}
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000195.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165873944115, "job": 126, "event": "table_file_deletion", "file_number": 195}
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.549124) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.944270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.944282) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.944287) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.944291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:53 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:57:53.944295) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:55 np0005593233 podman[307526]: 2026-01-23 10:57:55.110437826 +0000 UTC m=+0.122292873 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:57:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:57:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:55.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:57:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:55.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:57 np0005593233 nova_compute[222017]: 2026-01-23 10:57:57.068 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:57.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:57.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:58 np0005593233 nova_compute[222017]: 2026-01-23 10:57:58.529 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:58 np0005593233 nova_compute[222017]: 2026-01-23 10:57:58.627 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:58 np0005593233 nova_compute[222017]: 2026-01-23 10:57:58.628 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:58 np0005593233 nova_compute[222017]: 2026-01-23 10:57:58.662 222021 DEBUG nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:57:58 np0005593233 nova_compute[222017]: 2026-01-23 10:57:58.757 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:58 np0005593233 nova_compute[222017]: 2026-01-23 10:57:58.758 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:58 np0005593233 nova_compute[222017]: 2026-01-23 10:57:58.766 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:57:58 np0005593233 nova_compute[222017]: 2026-01-23 10:57:58.766 222021 INFO nova.compute.claims [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:57:58 np0005593233 nova_compute[222017]: 2026-01-23 10:57:58.965 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:57:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:59.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:57:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2312781954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.463 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.472 222021 DEBUG nova.compute.provider_tree [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.488 222021 DEBUG nova.scheduler.client.report [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.511 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.518 222021 DEBUG nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.569 222021 DEBUG nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.571 222021 DEBUG nova.network.neutron [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.597 222021 INFO nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.618 222021 DEBUG nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.731 222021 DEBUG nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.733 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.733 222021 INFO nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Creating image(s)#033[00m
Jan 23 05:57:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:57:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:57:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:59.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:57:59 np0005593233 nova_compute[222017]: 2026-01-23 10:57:59.810 222021 DEBUG nova.storage.rbd_utils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 575c76d9-7306-429f-baca-4d450f37c388_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.028 222021 DEBUG nova.storage.rbd_utils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 575c76d9-7306-429f-baca-4d450f37c388_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.056 222021 DEBUG nova.storage.rbd_utils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 575c76d9-7306-429f-baca-4d450f37c388_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.059 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.128 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.129 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.130 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.130 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.164 222021 DEBUG nova.storage.rbd_utils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 575c76d9-7306-429f-baca-4d450f37c388_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.169 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 575c76d9-7306-429f-baca-4d450f37c388_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:00 np0005593233 nova_compute[222017]: 2026-01-23 10:58:00.247 222021 DEBUG nova.policy [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:58:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:01.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:01 np0005593233 nova_compute[222017]: 2026-01-23 10:58:01.293 222021 DEBUG nova.network.neutron [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Successfully created port: eb5d7473-2661-43a3-bf30-f8ccc9735ea6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:58:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:58:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:01.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.042 222021 DEBUG nova.network.neutron [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Successfully updated port: eb5d7473-2661-43a3-bf30-f8ccc9735ea6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.056 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.057 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.057 222021 DEBUG nova.network.neutron [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.114 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.145 222021 DEBUG nova.compute.manager [req-7b4616ec-fabb-4a28-bb1f-a2233390407d req-69d2d059-d9be-42c9-a791-cdd5341ec487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-changed-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.146 222021 DEBUG nova.compute.manager [req-7b4616ec-fabb-4a28-bb1f-a2233390407d req-69d2d059-d9be-42c9-a791-cdd5341ec487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Refreshing instance network info cache due to event network-changed-eb5d7473-2661-43a3-bf30-f8ccc9735ea6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.146 222021 DEBUG oslo_concurrency.lockutils [req-7b4616ec-fabb-4a28-bb1f-a2233390407d req-69d2d059-d9be-42c9-a791-cdd5341ec487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.191 222021 DEBUG nova.network.neutron [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.258 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 575c76d9-7306-429f-baca-4d450f37c388_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.370 222021 DEBUG nova.storage.rbd_utils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image 575c76d9-7306-429f-baca-4d450f37c388_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.662 222021 DEBUG nova.objects.instance [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid 575c76d9-7306-429f-baca-4d450f37c388 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.681 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.681 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Ensure instance console log exists: /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.682 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.682 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:02 np0005593233 nova_compute[222017]: 2026-01-23 10:58:02.682 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:58:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:03.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.487 222021 DEBUG nova.network.neutron [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating instance_info_cache with network_info: [{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.510 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.510 222021 DEBUG nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Instance network_info: |[{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.510 222021 DEBUG oslo_concurrency.lockutils [req-7b4616ec-fabb-4a28-bb1f-a2233390407d req-69d2d059-d9be-42c9-a791-cdd5341ec487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.511 222021 DEBUG nova.network.neutron [req-7b4616ec-fabb-4a28-bb1f-a2233390407d req-69d2d059-d9be-42c9-a791-cdd5341ec487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Refreshing network info cache for port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.513 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Start _get_guest_xml network_info=[{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.518 222021 WARNING nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.529 222021 DEBUG nova.virt.libvirt.host [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.530 222021 DEBUG nova.virt.libvirt.host [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.567 222021 DEBUG nova.virt.libvirt.host [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.568 222021 DEBUG nova.virt.libvirt.host [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.570 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.571 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.574 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.574 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.575 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.575 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.575 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.575 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.576 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.576 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.576 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.576 222021 DEBUG nova.virt.hardware [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.579 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:03 np0005593233 nova_compute[222017]: 2026-01-23 10:58:03.605 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:03.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:58:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2252766267' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.052 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.093 222021 DEBUG nova.storage.rbd_utils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 575c76d9-7306-429f-baca-4d450f37c388_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.100 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:04 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:58:04 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4153180939' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.673 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.573s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.674 222021 DEBUG nova.virt.libvirt.vif [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:57:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1747551892',display_name='tempest-TestNetworkAdvancedServerOps-server-1747551892',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1747551892',id=212,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNLmxmIGURR44ldVtZLj5mDiJy9rKp0lzbESRue6Wnd2DycIawGU9GHFwoSAqxCF+VQclApTf6ivzJfIihq8OTikttBlSKyYddrT5smaN5oWB8DMKx3zgAFW+FO8eDOMGA==',key_name='tempest-TestNetworkAdvancedServerOps-647126014',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-pdznqph4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:57:59Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=575c76d9-7306-429f-baca-4d450f37c388,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.675 222021 DEBUG nova.network.os_vif_util [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.676 222021 DEBUG nova.network.os_vif_util [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.677 222021 DEBUG nova.objects.instance [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid 575c76d9-7306-429f-baca-4d450f37c388 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.702 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <uuid>575c76d9-7306-429f-baca-4d450f37c388</uuid>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <name>instance-000000d4</name>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1747551892</nova:name>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 10:58:03</nova:creationTime>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <nova:port uuid="eb5d7473-2661-43a3-bf30-f8ccc9735ea6">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <system>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <entry name="serial">575c76d9-7306-429f-baca-4d450f37c388</entry>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <entry name="uuid">575c76d9-7306-429f-baca-4d450f37c388</entry>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    </system>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <os>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  </os>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <features>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  </features>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  </clock>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  <devices>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/575c76d9-7306-429f-baca-4d450f37c388_disk">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/575c76d9-7306-429f-baca-4d450f37c388_disk.config">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      </source>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      </auth>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    </disk>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:a0:da:56"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <target dev="tapeb5d7473-26"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    </interface>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388/console.log" append="off"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    </serial>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <video>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    </video>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    </rng>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 05:58:04 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 05:58:04 np0005593233 nova_compute[222017]:  </devices>
Jan 23 05:58:04 np0005593233 nova_compute[222017]: </domain>
Jan 23 05:58:04 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.704 222021 DEBUG nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Preparing to wait for external event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.704 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.704 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.705 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.705 222021 DEBUG nova.virt.libvirt.vif [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:57:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1747551892',display_name='tempest-TestNetworkAdvancedServerOps-server-1747551892',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1747551892',id=212,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNLmxmIGURR44ldVtZLj5mDiJy9rKp0lzbESRue6Wnd2DycIawGU9GHFwoSAqxCF+VQclApTf6ivzJfIihq8OTikttBlSKyYddrT5smaN5oWB8DMKx3zgAFW+FO8eDOMGA==',key_name='tempest-TestNetworkAdvancedServerOps-647126014',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-pdznqph4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:57:59Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=575c76d9-7306-429f-baca-4d450f37c388,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.706 222021 DEBUG nova.network.os_vif_util [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.707 222021 DEBUG nova.network.os_vif_util [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.707 222021 DEBUG os_vif [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.708 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.708 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.709 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.713 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.713 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb5d7473-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.713 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeb5d7473-26, col_values=(('external_ids', {'iface-id': 'eb5d7473-2661-43a3-bf30-f8ccc9735ea6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:da:56', 'vm-uuid': '575c76d9-7306-429f-baca-4d450f37c388'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.748 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:58:04 np0005593233 NetworkManager[48871]: <info>  [1769165884.7495] manager: (tapeb5d7473-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/408)
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.751 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.758 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.759 222021 INFO os_vif [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26')
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.826 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.827 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.827 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:a0:da:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.828 222021 INFO nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Using config drive
Jan 23 05:58:04 np0005593233 nova_compute[222017]: 2026-01-23 10:58:04.856 222021 DEBUG nova.storage.rbd_utils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 575c76d9-7306-429f-baca-4d450f37c388_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:58:05 np0005593233 nova_compute[222017]: 2026-01-23 10:58:05.243 222021 DEBUG nova.network.neutron [req-7b4616ec-fabb-4a28-bb1f-a2233390407d req-69d2d059-d9be-42c9-a791-cdd5341ec487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updated VIF entry in instance network info cache for port eb5d7473-2661-43a3-bf30-f8ccc9735ea6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:58:05 np0005593233 nova_compute[222017]: 2026-01-23 10:58:05.244 222021 DEBUG nova.network.neutron [req-7b4616ec-fabb-4a28-bb1f-a2233390407d req-69d2d059-d9be-42c9-a791-cdd5341ec487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating instance_info_cache with network_info: [{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:58:05 np0005593233 nova_compute[222017]: 2026-01-23 10:58:05.265 222021 DEBUG oslo_concurrency.lockutils [req-7b4616ec-fabb-4a28-bb1f-a2233390407d req-69d2d059-d9be-42c9-a791-cdd5341ec487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:58:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:05.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:05 np0005593233 nova_compute[222017]: 2026-01-23 10:58:05.400 222021 INFO nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Creating config drive at /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388/disk.config
Jan 23 05:58:05 np0005593233 nova_compute[222017]: 2026-01-23 10:58:05.411 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc1wxr3td execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:58:05 np0005593233 nova_compute[222017]: 2026-01-23 10:58:05.569 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc1wxr3td" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:58:05 np0005593233 nova_compute[222017]: 2026-01-23 10:58:05.603 222021 DEBUG nova.storage.rbd_utils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 575c76d9-7306-429f-baca-4d450f37c388_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:58:05 np0005593233 nova_compute[222017]: 2026-01-23 10:58:05.606 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388/disk.config 575c76d9-7306-429f-baca-4d450f37c388_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:58:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:05.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:07.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.356 222021 DEBUG oslo_concurrency.processutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388/disk.config 575c76d9-7306-429f-baca-4d450f37c388_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.749s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.357 222021 INFO nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Deleting local config drive /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388/disk.config because it was imported into RBD.
Jan 23 05:58:07 np0005593233 kernel: tapeb5d7473-26: entered promiscuous mode
Jan 23 05:58:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:07Z|00901|binding|INFO|Claiming lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for this chassis.
Jan 23 05:58:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:07Z|00902|binding|INFO|eb5d7473-2661-43a3-bf30-f8ccc9735ea6: Claiming fa:16:3e:a0:da:56 10.100.0.13
Jan 23 05:58:07 np0005593233 NetworkManager[48871]: <info>  [1769165887.4281] manager: (tapeb5d7473-26): new Tun device (/org/freedesktop/NetworkManager/Devices/409)
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.427 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.436 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.446 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:da:56 10.100.0.13'], port_security=['fa:16:3e:a0:da:56 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '575c76d9-7306-429f-baca-4d450f37c388', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6b9b22d8-1c0e-45a7-b51b-d7eb1b043141', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c950e56e-abf1-437a-896b-f8357baa2295, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=eb5d7473-2661-43a3-bf30-f8ccc9735ea6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.447 140224 INFO neutron.agent.ovn.metadata.agent [-] Port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 in datapath 709bd24b-e32e-4388-bc40-5f1d023f1ad4 bound to our chassis
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.448 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 709bd24b-e32e-4388-bc40-5f1d023f1ad4
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.463 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[21988db2-3ec0-4524-9242-d98012bd0457]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.464 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap709bd24b-e1 in ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.466 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap709bd24b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.466 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[365a7400-b8af-4cb1-b86e-4be2ea36f0dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.467 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8e628a3b-ade8-418c-854a-d6d8a7ca6483]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 systemd-udevd[307877]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.478 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4b68e9-bd5a-4079-b506-7065eb45d44a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 systemd-machined[190954]: New machine qemu-97-instance-000000d4.
Jan 23 05:58:07 np0005593233 NetworkManager[48871]: <info>  [1769165887.4867] device (tapeb5d7473-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:58:07 np0005593233 NetworkManager[48871]: <info>  [1769165887.4872] device (tapeb5d7473-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.501 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:58:07 np0005593233 systemd[1]: Started Virtual Machine qemu-97-instance-000000d4.
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.506 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aec37a73-7af3-4f18-ae60-25ed05eb7abf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:07Z|00903|binding|INFO|Setting lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 ovn-installed in OVS
Jan 23 05:58:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:07Z|00904|binding|INFO|Setting lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 up in Southbound
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.510 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.542 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[22fe6c54-8567-4c17-901a-34b5fae57c05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 NetworkManager[48871]: <info>  [1769165887.5477] manager: (tap709bd24b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/410)
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.547 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc3b89b-26ff-425b-9cd2-6ecaa0356fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.581 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[a19dcbd9-f217-4990-8a24-571ae87a181e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.585 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[14db2cf4-8913-42ea-9703-1540f24126a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 NetworkManager[48871]: <info>  [1769165887.6125] device (tap709bd24b-e0): carrier: link connected
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.616 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[88b27bf1-81f6-4ebe-b5e0-66d2dcda22a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.633 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a3b502-5593-4323-936a-6a5133c36b54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap709bd24b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:b0:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 269], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 980024, 'reachable_time': 41584, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307910, 'error': None, 'target': 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.650 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[769aa415-fe19-4bcb-856c-0c98c4eb6315]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:b0f9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 980024, 'tstamp': 980024}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307911, 'error': None, 'target': 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.663 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9d0f17a7-c336-46a3-a3ce-b43dcbacb9e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap709bd24b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:b0:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 269], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 980024, 'reachable_time': 41584, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 307912, 'error': None, 'target': 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.692 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[3f51378f-5345-48b7-94f9-6f0dc4fa1d90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.765 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[20da1da4-0374-4e01-adb5-a4909e21fe46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.766 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap709bd24b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.765 222021 DEBUG nova.compute.manager [req-9236c462-ba5d-4bb5-9fcc-bc19d51338ee req-339d8c7d-e122-4554-b2c8-8efbc745a040 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.766 222021 DEBUG oslo_concurrency.lockutils [req-9236c462-ba5d-4bb5-9fcc-bc19d51338ee req-339d8c7d-e122-4554-b2c8-8efbc745a040 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.766 222021 DEBUG oslo_concurrency.lockutils [req-9236c462-ba5d-4bb5-9fcc-bc19d51338ee req-339d8c7d-e122-4554-b2c8-8efbc745a040 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.766 222021 DEBUG oslo_concurrency.lockutils [req-9236c462-ba5d-4bb5-9fcc-bc19d51338ee req-339d8c7d-e122-4554-b2c8-8efbc745a040 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.766 222021 DEBUG nova.compute.manager [req-9236c462-ba5d-4bb5-9fcc-bc19d51338ee req-339d8c7d-e122-4554-b2c8-8efbc745a040 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Processing event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.768 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.769 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap709bd24b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:07 np0005593233 NetworkManager[48871]: <info>  [1769165887.7721] manager: (tap709bd24b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/411)
Jan 23 05:58:07 np0005593233 kernel: tap709bd24b-e0: entered promiscuous mode
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.775 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap709bd24b-e0, col_values=(('external_ids', {'iface-id': '5311e73f-96f0-4470-83d4-e798caf6a353'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:07 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:07Z|00905|binding|INFO|Releasing lport 5311e73f-96f0-4470-83d4-e798caf6a353 from this chassis (sb_readonly=0)
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:07.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:07 np0005593233 nova_compute[222017]: 2026-01-23 10:58:07.811 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.812 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/709bd24b-e32e-4388-bc40-5f1d023f1ad4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/709bd24b-e32e-4388-bc40-5f1d023f1ad4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.813 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[71331ae9-e84a-48da-a0e6-330fa294fe63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.814 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-709bd24b-e32e-4388-bc40-5f1d023f1ad4
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/709bd24b-e32e-4388-bc40-5f1d023f1ad4.pid.haproxy
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 709bd24b-e32e-4388-bc40-5f1d023f1ad4
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:58:07 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:07.814 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'env', 'PROCESS_TAG=haproxy-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/709bd24b-e32e-4388-bc40-5f1d023f1ad4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.129 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165888.128694, 575c76d9-7306-429f-baca-4d450f37c388 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.130 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] VM Started (Lifecycle Event)#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.132 222021 DEBUG nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.140 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.144 222021 INFO nova.virt.libvirt.driver [-] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Instance spawned successfully.#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.144 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.164 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.169 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.173 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.174 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.174 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.174 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.175 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.175 222021 DEBUG nova.virt.libvirt.driver [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.214 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.214 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165888.1288924, 575c76d9-7306-429f-baca-4d450f37c388 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.215 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.277 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.282 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165888.1343534, 575c76d9-7306-429f-baca-4d450f37c388 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.282 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.300 222021 INFO nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Took 8.57 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.300 222021 DEBUG nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.309 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.312 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:58:08 np0005593233 podman[307986]: 2026-01-23 10:58:08.227203925 +0000 UTC m=+0.043695425 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.355 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.378 222021 INFO nova.compute.manager [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Took 9.65 seconds to build instance.#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.399 222021 DEBUG oslo_concurrency.lockutils [None req-edee322d-4b0d-46a3-8203-08c1cf8469f5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:08 np0005593233 nova_compute[222017]: 2026-01-23 10:58:08.608 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:09 np0005593233 podman[307986]: 2026-01-23 10:58:09.268173949 +0000 UTC m=+1.084665419 container create b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:58:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:09.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:09 np0005593233 systemd[1]: Started libpod-conmon-b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8.scope.
Jan 23 05:58:09 np0005593233 podman[307999]: 2026-01-23 10:58:09.586236588 +0000 UTC m=+0.264491247 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:58:09 np0005593233 systemd[1]: Started libcrun container.
Jan 23 05:58:09 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6848b37b2012fcfa05cf64b239d02d46d09fbe7a4f58069bc9a889769307cbfe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:58:09 np0005593233 podman[307986]: 2026-01-23 10:58:09.727349491 +0000 UTC m=+1.543841001 container init b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:58:09 np0005593233 podman[307986]: 2026-01-23 10:58:09.733816144 +0000 UTC m=+1.550307604 container start b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:58:09 np0005593233 nova_compute[222017]: 2026-01-23 10:58:09.748 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:09 np0005593233 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[308018]: [NOTICE]   (308026) : New worker (308028) forked
Jan 23 05:58:09 np0005593233 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[308018]: [NOTICE]   (308026) : Loading success.
Jan 23 05:58:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:09.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:09 np0005593233 nova_compute[222017]: 2026-01-23 10:58:09.854 222021 DEBUG nova.compute.manager [req-5d195dc1-78cd-412f-9fcf-efdd569ace51 req-e28cd65e-0724-4ac1-bd5e-9acb7e1d742f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:09 np0005593233 nova_compute[222017]: 2026-01-23 10:58:09.854 222021 DEBUG oslo_concurrency.lockutils [req-5d195dc1-78cd-412f-9fcf-efdd569ace51 req-e28cd65e-0724-4ac1-bd5e-9acb7e1d742f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:09 np0005593233 nova_compute[222017]: 2026-01-23 10:58:09.854 222021 DEBUG oslo_concurrency.lockutils [req-5d195dc1-78cd-412f-9fcf-efdd569ace51 req-e28cd65e-0724-4ac1-bd5e-9acb7e1d742f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:09 np0005593233 nova_compute[222017]: 2026-01-23 10:58:09.855 222021 DEBUG oslo_concurrency.lockutils [req-5d195dc1-78cd-412f-9fcf-efdd569ace51 req-e28cd65e-0724-4ac1-bd5e-9acb7e1d742f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:09 np0005593233 nova_compute[222017]: 2026-01-23 10:58:09.855 222021 DEBUG nova.compute.manager [req-5d195dc1-78cd-412f-9fcf-efdd569ace51 req-e28cd65e-0724-4ac1-bd5e-9acb7e1d742f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:09 np0005593233 nova_compute[222017]: 2026-01-23 10:58:09.856 222021 WARNING nova.compute.manager [req-5d195dc1-78cd-412f-9fcf-efdd569ace51 req-e28cd65e-0724-4ac1-bd5e-9acb7e1d742f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received unexpected event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:58:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:10Z|00906|binding|INFO|Releasing lport 5311e73f-96f0-4470-83d4-e798caf6a353 from this chassis (sb_readonly=0)
Jan 23 05:58:10 np0005593233 NetworkManager[48871]: <info>  [1769165890.4855] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/412)
Jan 23 05:58:10 np0005593233 NetworkManager[48871]: <info>  [1769165890.4882] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/413)
Jan 23 05:58:10 np0005593233 nova_compute[222017]: 2026-01-23 10:58:10.492 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:10 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:10Z|00907|binding|INFO|Releasing lport 5311e73f-96f0-4470-83d4-e798caf6a353 from this chassis (sb_readonly=0)
Jan 23 05:58:10 np0005593233 nova_compute[222017]: 2026-01-23 10:58:10.530 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:10 np0005593233 nova_compute[222017]: 2026-01-23 10:58:10.538 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:10 np0005593233 nova_compute[222017]: 2026-01-23 10:58:10.920 222021 DEBUG nova.compute.manager [req-19d2cfb5-05e9-4e53-ab71-ce892c990724 req-e5baa5af-30e9-419a-a323-d71438430c52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-changed-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:10 np0005593233 nova_compute[222017]: 2026-01-23 10:58:10.920 222021 DEBUG nova.compute.manager [req-19d2cfb5-05e9-4e53-ab71-ce892c990724 req-e5baa5af-30e9-419a-a323-d71438430c52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Refreshing instance network info cache due to event network-changed-eb5d7473-2661-43a3-bf30-f8ccc9735ea6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:58:10 np0005593233 nova_compute[222017]: 2026-01-23 10:58:10.921 222021 DEBUG oslo_concurrency.lockutils [req-19d2cfb5-05e9-4e53-ab71-ce892c990724 req-e5baa5af-30e9-419a-a323-d71438430c52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:58:10 np0005593233 nova_compute[222017]: 2026-01-23 10:58:10.921 222021 DEBUG oslo_concurrency.lockutils [req-19d2cfb5-05e9-4e53-ab71-ce892c990724 req-e5baa5af-30e9-419a-a323-d71438430c52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:58:10 np0005593233 nova_compute[222017]: 2026-01-23 10:58:10.922 222021 DEBUG nova.network.neutron [req-19d2cfb5-05e9-4e53-ab71-ce892c990724 req-e5baa5af-30e9-419a-a323-d71438430c52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Refreshing network info cache for port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:58:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:11.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:11.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:12 np0005593233 nova_compute[222017]: 2026-01-23 10:58:12.478 222021 DEBUG nova.network.neutron [req-19d2cfb5-05e9-4e53-ab71-ce892c990724 req-e5baa5af-30e9-419a-a323-d71438430c52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updated VIF entry in instance network info cache for port eb5d7473-2661-43a3-bf30-f8ccc9735ea6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:58:12 np0005593233 nova_compute[222017]: 2026-01-23 10:58:12.479 222021 DEBUG nova.network.neutron [req-19d2cfb5-05e9-4e53-ab71-ce892c990724 req-e5baa5af-30e9-419a-a323-d71438430c52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating instance_info_cache with network_info: [{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:58:12 np0005593233 nova_compute[222017]: 2026-01-23 10:58:12.499 222021 DEBUG oslo_concurrency.lockutils [req-19d2cfb5-05e9-4e53-ab71-ce892c990724 req-e5baa5af-30e9-419a-a323-d71438430c52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:58:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:13.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:13 np0005593233 nova_compute[222017]: 2026-01-23 10:58:13.610 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:13.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:14 np0005593233 nova_compute[222017]: 2026-01-23 10:58:14.789 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:58:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:15.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:58:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:15.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:58:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:17.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:58:17 np0005593233 nova_compute[222017]: 2026-01-23 10:58:17.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:17 np0005593233 nova_compute[222017]: 2026-01-23 10:58:17.411 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:17 np0005593233 nova_compute[222017]: 2026-01-23 10:58:17.411 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:17 np0005593233 nova_compute[222017]: 2026-01-23 10:58:17.412 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:17 np0005593233 nova_compute[222017]: 2026-01-23 10:58:17.412 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:58:17 np0005593233 nova_compute[222017]: 2026-01-23 10:58:17.413 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:17.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:58:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2272411043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:58:17 np0005593233 nova_compute[222017]: 2026-01-23 10:58:17.908 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:17 np0005593233 nova_compute[222017]: 2026-01-23 10:58:17.983 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:58:17 np0005593233 nova_compute[222017]: 2026-01-23 10:58:17.985 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.159 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.161 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4078MB free_disk=20.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.161 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.161 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.258 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance 575c76d9-7306-429f-baca-4d450f37c388 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.259 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.259 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.314 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.652 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:58:18 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3315826958' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.762 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.768 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.784 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.809 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:58:18 np0005593233 nova_compute[222017]: 2026-01-23 10:58:18.810 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:19.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:19 np0005593233 nova_compute[222017]: 2026-01-23 10:58:19.790 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:19.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:58:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:21.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:58:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:21.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:21Z|00133|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a0:da:56 10.100.0.13
Jan 23 05:58:21 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:21Z|00134|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a0:da:56 10.100.0.13
Jan 23 05:58:22 np0005593233 nova_compute[222017]: 2026-01-23 10:58:22.810 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:23.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:23 np0005593233 nova_compute[222017]: 2026-01-23 10:58:23.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:23 np0005593233 nova_compute[222017]: 2026-01-23 10:58:23.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:23 np0005593233 nova_compute[222017]: 2026-01-23 10:58:23.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:58:23 np0005593233 nova_compute[222017]: 2026-01-23 10:58:23.656 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:58:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:23.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:58:24 np0005593233 nova_compute[222017]: 2026-01-23 10:58:24.838 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:25.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:25.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:25 np0005593233 podman[308083]: 2026-01-23 10:58:25.871812554 +0000 UTC m=+0.128672753 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:58:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:27.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:27.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:28 np0005593233 nova_compute[222017]: 2026-01-23 10:58:28.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:28 np0005593233 nova_compute[222017]: 2026-01-23 10:58:28.391 222021 INFO nova.compute.manager [None req-12745fe8-2440-49de-8793-628d54009397 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Get console output#033[00m
Jan 23 05:58:28 np0005593233 nova_compute[222017]: 2026-01-23 10:58:28.401 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:58:28 np0005593233 nova_compute[222017]: 2026-01-23 10:58:28.698 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:29.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:29 np0005593233 nova_compute[222017]: 2026-01-23 10:58:29.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:29 np0005593233 nova_compute[222017]: 2026-01-23 10:58:29.689 222021 INFO nova.compute.manager [None req-857dc30f-f686-45cb-88c3-a97113f0ec37 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Get console output#033[00m
Jan 23 05:58:29 np0005593233 nova_compute[222017]: 2026-01-23 10:58:29.696 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:58:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:29.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:29 np0005593233 nova_compute[222017]: 2026-01-23 10:58:29.889 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:31.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:31.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:33.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:33 np0005593233 nova_compute[222017]: 2026-01-23 10:58:33.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:33 np0005593233 nova_compute[222017]: 2026-01-23 10:58:33.748 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:33.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:58:34 np0005593233 nova_compute[222017]: 2026-01-23 10:58:34.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:34 np0005593233 nova_compute[222017]: 2026-01-23 10:58:34.874 222021 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Check if temp file /var/lib/nova/instances/tmpri2lv6mn exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 23 05:58:34 np0005593233 nova_compute[222017]: 2026-01-23 10:58:34.874 222021 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpri2lv6mn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='575c76d9-7306-429f-baca-4d450f37c388',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 23 05:58:34 np0005593233 nova_compute[222017]: 2026-01-23 10:58:34.891 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:58:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:58:35 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:58:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:35.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:35.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 05:58:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:37.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 05:58:37 np0005593233 nova_compute[222017]: 2026-01-23 10:58:37.835 222021 DEBUG nova.compute.manager [req-17497d63-a1fa-4c14-ba7b-fb50b21ef5b3 req-e163a646-7ed4-4f4e-b91f-42ed6271b065 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:37 np0005593233 nova_compute[222017]: 2026-01-23 10:58:37.835 222021 DEBUG oslo_concurrency.lockutils [req-17497d63-a1fa-4c14-ba7b-fb50b21ef5b3 req-e163a646-7ed4-4f4e-b91f-42ed6271b065 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:37 np0005593233 nova_compute[222017]: 2026-01-23 10:58:37.836 222021 DEBUG oslo_concurrency.lockutils [req-17497d63-a1fa-4c14-ba7b-fb50b21ef5b3 req-e163a646-7ed4-4f4e-b91f-42ed6271b065 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:37 np0005593233 nova_compute[222017]: 2026-01-23 10:58:37.836 222021 DEBUG oslo_concurrency.lockutils [req-17497d63-a1fa-4c14-ba7b-fb50b21ef5b3 req-e163a646-7ed4-4f4e-b91f-42ed6271b065 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:37 np0005593233 nova_compute[222017]: 2026-01-23 10:58:37.837 222021 DEBUG nova.compute.manager [req-17497d63-a1fa-4c14-ba7b-fb50b21ef5b3 req-e163a646-7ed4-4f4e-b91f-42ed6271b065 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:37 np0005593233 nova_compute[222017]: 2026-01-23 10:58:37.837 222021 DEBUG nova.compute.manager [req-17497d63-a1fa-4c14-ba7b-fb50b21ef5b3 req-e163a646-7ed4-4f4e-b91f-42ed6271b065 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:58:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:37.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.289 222021 INFO nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Took 2.53 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.290 222021 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.329 222021 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpri2lv6mn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='575c76d9-7306-429f-baca-4d450f37c388',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(3347d51a-008b-4da5-ab25-6eaec6f2436c),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.333 222021 DEBUG nova.objects.instance [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lazy-loading 'migration_context' on Instance uuid 575c76d9-7306-429f-baca-4d450f37c388 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.335 222021 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.337 222021 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.337 222021 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.357 222021 DEBUG nova.virt.libvirt.vif [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:57:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1747551892',display_name='tempest-TestNetworkAdvancedServerOps-server-1747551892',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1747551892',id=212,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNLmxmIGURR44ldVtZLj5mDiJy9rKp0lzbESRue6Wnd2DycIawGU9GHFwoSAqxCF+VQclApTf6ivzJfIihq8OTikttBlSKyYddrT5smaN5oWB8DMKx3zgAFW+FO8eDOMGA==',key_name='tempest-TestNetworkAdvancedServerOps-647126014',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:58:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-pdznqph4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:58:08Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=575c76d9-7306-429f-baca-4d450f37c388,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.357 222021 DEBUG nova.network.os_vif_util [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converting VIF {"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.359 222021 DEBUG nova.network.os_vif_util [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.360 222021 DEBUG nova.virt.libvirt.migration [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating guest XML with vif config: <interface type="ethernet">
Jan 23 05:58:38 np0005593233 nova_compute[222017]:  <mac address="fa:16:3e:a0:da:56"/>
Jan 23 05:58:38 np0005593233 nova_compute[222017]:  <model type="virtio"/>
Jan 23 05:58:38 np0005593233 nova_compute[222017]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:58:38 np0005593233 nova_compute[222017]:  <mtu size="1442"/>
Jan 23 05:58:38 np0005593233 nova_compute[222017]:  <target dev="tapeb5d7473-26"/>
Jan 23 05:58:38 np0005593233 nova_compute[222017]: </interface>
Jan 23 05:58:38 np0005593233 nova_compute[222017]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.361 222021 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.792 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.840 222021 DEBUG nova.virt.libvirt.migration [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.841 222021 INFO nova.virt.libvirt.migration [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 23 05:58:38 np0005593233 nova_compute[222017]: 2026-01-23 10:58:38.971 222021 INFO nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 23 05:58:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:39.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.410 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.411 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 575c76d9-7306-429f-baca-4d450f37c388 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.474 222021 DEBUG nova.virt.libvirt.migration [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.474 222021 DEBUG nova.virt.libvirt.migration [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 23 05:58:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:39.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.934 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.935 222021 DEBUG nova.compute.manager [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.935 222021 DEBUG oslo_concurrency.lockutils [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.936 222021 DEBUG oslo_concurrency.lockutils [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.936 222021 DEBUG oslo_concurrency.lockutils [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.936 222021 DEBUG nova.compute.manager [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.936 222021 WARNING nova.compute.manager [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received unexpected event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.937 222021 DEBUG nova.compute.manager [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-changed-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.937 222021 DEBUG nova.compute.manager [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Refreshing instance network info cache due to event network-changed-eb5d7473-2661-43a3-bf30-f8ccc9735ea6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.937 222021 DEBUG oslo_concurrency.lockutils [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.978 222021 DEBUG nova.virt.libvirt.migration [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 05:58:39 np0005593233 nova_compute[222017]: 2026-01-23 10:58:39.979 222021 DEBUG nova.virt.libvirt.migration [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 23 05:58:40 np0005593233 podman[308243]: 2026-01-23 10:58:40.046023001 +0000 UTC m=+0.055764845 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.130 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769165920.1299725, 575c76d9-7306-429f-baca-4d450f37c388 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.131 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.148 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.152 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.169 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 23 05:58:40 np0005593233 kernel: tapeb5d7473-26 (unregistering): left promiscuous mode
Jan 23 05:58:40 np0005593233 NetworkManager[48871]: <info>  [1769165920.3146] device (tapeb5d7473-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:58:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:40Z|00908|binding|INFO|Releasing lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 from this chassis (sb_readonly=0)
Jan 23 05:58:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:40Z|00909|binding|INFO|Setting lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 down in Southbound
Jan 23 05:58:40 np0005593233 ovn_controller[130653]: 2026-01-23T10:58:40Z|00910|binding|INFO|Removing iface tapeb5d7473-26 ovn-installed in OVS
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.325 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.328 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.338 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:da:56 10.100.0.13'], port_security=['fa:16:3e:a0:da:56 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3ec410d4-99bb-47ec-9f70-86f8400b2621'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '575c76d9-7306-429f-baca-4d450f37c388', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6b9b22d8-1c0e-45a7-b51b-d7eb1b043141', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.250'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c950e56e-abf1-437a-896b-f8357baa2295, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=eb5d7473-2661-43a3-bf30-f8ccc9735ea6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.341 140224 INFO neutron.agent.ovn.metadata.agent [-] Port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 in datapath 709bd24b-e32e-4388-bc40-5f1d023f1ad4 unbound from our chassis#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.345 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 709bd24b-e32e-4388-bc40-5f1d023f1ad4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.347 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a3715cc5-e295-404a-9753-625a26239e0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.347 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.349 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 namespace which is not needed anymore#033[00m
Jan 23 05:58:40 np0005593233 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000d4.scope: Deactivated successfully.
Jan 23 05:58:40 np0005593233 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000d4.scope: Consumed 15.569s CPU time.
Jan 23 05:58:40 np0005593233 systemd-machined[190954]: Machine qemu-97-instance-000000d4 terminated.
Jan 23 05:58:40 np0005593233 virtqemud[221325]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/575c76d9-7306-429f-baca-4d450f37c388_disk: No such file or directory
Jan 23 05:58:40 np0005593233 virtqemud[221325]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/575c76d9-7306-429f-baca-4d450f37c388_disk: No such file or directory
Jan 23 05:58:40 np0005593233 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[308018]: [NOTICE]   (308026) : haproxy version is 2.8.14-c23fe91
Jan 23 05:58:40 np0005593233 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[308018]: [NOTICE]   (308026) : path to executable is /usr/sbin/haproxy
Jan 23 05:58:40 np0005593233 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[308018]: [WARNING]  (308026) : Exiting Master process...
Jan 23 05:58:40 np0005593233 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[308018]: [ALERT]    (308026) : Current worker (308028) exited with code 143 (Terminated)
Jan 23 05:58:40 np0005593233 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[308018]: [WARNING]  (308026) : All workers exited. Exiting... (0)
Jan 23 05:58:40 np0005593233 systemd[1]: libpod-b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8.scope: Deactivated successfully.
Jan 23 05:58:40 np0005593233 podman[308288]: 2026-01-23 10:58:40.500208872 +0000 UTC m=+0.047236864 container died b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.508 222021 DEBUG nova.virt.libvirt.guest [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.510 222021 INFO nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Migration operation has completed#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.511 222021 INFO nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] _post_live_migration() is started..#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.516 222021 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.517 222021 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.517 222021 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 23 05:58:40 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8-userdata-shm.mount: Deactivated successfully.
Jan 23 05:58:40 np0005593233 systemd[1]: var-lib-containers-storage-overlay-6848b37b2012fcfa05cf64b239d02d46d09fbe7a4f58069bc9a889769307cbfe-merged.mount: Deactivated successfully.
Jan 23 05:58:40 np0005593233 podman[308288]: 2026-01-23 10:58:40.553012212 +0000 UTC m=+0.100040204 container cleanup b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:58:40 np0005593233 systemd[1]: libpod-conmon-b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8.scope: Deactivated successfully.
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.590 222021 DEBUG nova.compute.manager [req-93f91044-87e3-49d5-b611-7af7af7dac14 req-6d075f5a-d37c-4b3a-86ff-e030a12227f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.590 222021 DEBUG oslo_concurrency.lockutils [req-93f91044-87e3-49d5-b611-7af7af7dac14 req-6d075f5a-d37c-4b3a-86ff-e030a12227f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.591 222021 DEBUG oslo_concurrency.lockutils [req-93f91044-87e3-49d5-b611-7af7af7dac14 req-6d075f5a-d37c-4b3a-86ff-e030a12227f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.591 222021 DEBUG oslo_concurrency.lockutils [req-93f91044-87e3-49d5-b611-7af7af7dac14 req-6d075f5a-d37c-4b3a-86ff-e030a12227f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.591 222021 DEBUG nova.compute.manager [req-93f91044-87e3-49d5-b611-7af7af7dac14 req-6d075f5a-d37c-4b3a-86ff-e030a12227f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.592 222021 DEBUG nova.compute.manager [req-93f91044-87e3-49d5-b611-7af7af7dac14 req-6d075f5a-d37c-4b3a-86ff-e030a12227f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:58:40 np0005593233 podman[308332]: 2026-01-23 10:58:40.624570492 +0000 UTC m=+0.047116120 container remove b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.631 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[09a3ffd6-3760-46ea-8d37-e8c549e1728d]: (4, ('Fri Jan 23 10:58:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 (b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8)\nb0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8\nFri Jan 23 10:58:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 (b0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8)\nb0b1bc135f39f7511fd64c9be9bcd01a9f2ac0353345d5af0ed7266c4a3ee6c8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.633 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ca82b0bf-797c-45e1-bfec-e0c6d0541eae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.635 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap709bd24b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.635 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating instance_info_cache with network_info: [{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:58:40 np0005593233 kernel: tap709bd24b-e0: left promiscuous mode
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.638 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.656 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.660 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba4e9c0-c1e0-41ee-a0ec-02ef799e045b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.669 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.669 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.669 222021 DEBUG oslo_concurrency.lockutils [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:58:40 np0005593233 nova_compute[222017]: 2026-01-23 10:58:40.669 222021 DEBUG nova.network.neutron [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Refreshing network info cache for port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.675 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[5cb35b60-fcce-4542-940d-f09bbc6d5728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.676 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff5784d-56d0-4fcc-9289-b60234be657b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.692 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[60a9ff3f-c7a5-4fb3-965d-b69a42e902dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 980016, 'reachable_time': 26299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308374, 'error': None, 'target': 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.694 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:58:40 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:40.694 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[78e6acd2-c38f-49ad-8162-b18ad7b28a6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:40 np0005593233 systemd[1]: run-netns-ovnmeta\x2d709bd24b\x2de32e\x2d4388\x2dbc40\x2d5f1d023f1ad4.mount: Deactivated successfully.
Jan 23 05:58:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.204 222021 DEBUG nova.network.neutron [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Activated binding for port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.204 222021 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.205 222021 DEBUG nova.virt.libvirt.vif [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:57:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1747551892',display_name='tempest-TestNetworkAdvancedServerOps-server-1747551892',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1747551892',id=212,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNLmxmIGURR44ldVtZLj5mDiJy9rKp0lzbESRue6Wnd2DycIawGU9GHFwoSAqxCF+VQclApTf6ivzJfIihq8OTikttBlSKyYddrT5smaN5oWB8DMKx3zgAFW+FO8eDOMGA==',key_name='tempest-TestNetworkAdvancedServerOps-647126014',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:58:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-pdznqph4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:58:30Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=575c76d9-7306-429f-baca-4d450f37c388,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.206 222021 DEBUG nova.network.os_vif_util [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converting VIF {"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.206 222021 DEBUG nova.network.os_vif_util [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.207 222021 DEBUG os_vif [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.209 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.209 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb5d7473-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.215 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.216 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.219 222021 INFO os_vif [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26')#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.220 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.220 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.220 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.220 222021 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.221 222021 INFO nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Deleting instance files /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388_del#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.221 222021 INFO nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Deletion of /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388_del complete#033[00m
Jan 23 05:58:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:41.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.760 222021 DEBUG nova.network.neutron [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updated VIF entry in instance network info cache for port eb5d7473-2661-43a3-bf30-f8ccc9735ea6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.761 222021 DEBUG nova.network.neutron [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating instance_info_cache with network_info: [{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:58:41 np0005593233 nova_compute[222017]: 2026-01-23 10:58:41.788 222021 DEBUG oslo_concurrency.lockutils [req-d5fc74bb-3339-4faf-8d54-da06d9e578de req-1a30202b-cb84-4b1c-a4f2-bf63a1feafcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:58:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:58:41 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:58:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:41.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.021 222021 DEBUG nova.compute.manager [req-fe7ed934-df2d-4dd8-9a5d-bc7cac0049cd req-0c06ee9b-c6db-42ca-a6e7-064cf6a04a41 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.022 222021 DEBUG oslo_concurrency.lockutils [req-fe7ed934-df2d-4dd8-9a5d-bc7cac0049cd req-0c06ee9b-c6db-42ca-a6e7-064cf6a04a41 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.022 222021 DEBUG oslo_concurrency.lockutils [req-fe7ed934-df2d-4dd8-9a5d-bc7cac0049cd req-0c06ee9b-c6db-42ca-a6e7-064cf6a04a41 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.023 222021 DEBUG oslo_concurrency.lockutils [req-fe7ed934-df2d-4dd8-9a5d-bc7cac0049cd req-0c06ee9b-c6db-42ca-a6e7-064cf6a04a41 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.023 222021 DEBUG nova.compute.manager [req-fe7ed934-df2d-4dd8-9a5d-bc7cac0049cd req-0c06ee9b-c6db-42ca-a6e7-064cf6a04a41 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.023 222021 DEBUG nova.compute.manager [req-fe7ed934-df2d-4dd8-9a5d-bc7cac0049cd req-0c06ee9b-c6db-42ca-a6e7-064cf6a04a41 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.676 222021 DEBUG nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.677 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.677 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.678 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.678 222021 DEBUG nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.678 222021 WARNING nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received unexpected event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.679 222021 DEBUG nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.679 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.679 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.680 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.680 222021 DEBUG nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.680 222021 WARNING nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received unexpected event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.681 222021 DEBUG nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.681 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.681 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.682 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.682 222021 DEBUG nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.682 222021 WARNING nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received unexpected event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.683 222021 DEBUG nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.683 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.683 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.684 222021 DEBUG oslo_concurrency.lockutils [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.684 222021 DEBUG nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:42 np0005593233 nova_compute[222017]: 2026-01-23 10:58:42.684 222021 WARNING nova.compute.manager [req-b35c2b28-3602-4595-9eb4-3447ce898c7e req-21424dbc-b9d7-494b-8465-9cad5cbeab90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received unexpected event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 05:58:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:42.723 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:42.724 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:42.724 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:43.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:43 np0005593233 nova_compute[222017]: 2026-01-23 10:58:43.794 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:43.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:45.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:45.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.216 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.304 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.305 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.305 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.326 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.326 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.326 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.327 222021 DEBUG nova.compute.resource_tracker [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.327 222021 DEBUG oslo_concurrency.processutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:58:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/993885068' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:58:46 np0005593233 nova_compute[222017]: 2026-01-23 10:58:46.822 222021 DEBUG oslo_concurrency.processutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.027 222021 WARNING nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.029 222021 DEBUG nova.compute.resource_tracker [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4248MB free_disk=20.942718505859375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.029 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.030 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.077 222021 DEBUG nova.compute.resource_tracker [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Migration for instance 575c76d9-7306-429f-baca-4d450f37c388 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.106 222021 DEBUG nova.compute.resource_tracker [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.139 222021 DEBUG nova.compute.resource_tracker [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Migration 3347d51a-008b-4da5-ab25-6eaec6f2436c is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.140 222021 DEBUG nova.compute.resource_tracker [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.140 222021 DEBUG nova.compute.resource_tracker [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.191 222021 DEBUG oslo_concurrency.processutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:58:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:47.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3466059728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.717 222021 DEBUG oslo_concurrency.processutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.727 222021 DEBUG nova.compute.provider_tree [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.745 222021 DEBUG nova.scheduler.client.report [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.774 222021 DEBUG nova.compute.resource_tracker [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.775 222021 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.779 222021 INFO nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Migrating instance to compute-2.ctlplane.example.com finished successfully.#033[00m
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #199. Immutable memtables: 0.
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.807073) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 127] Flushing memtable with next log file: 199
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927807180, "job": 127, "event": "flush_started", "num_memtables": 1, "num_entries": 778, "num_deletes": 256, "total_data_size": 1474164, "memory_usage": 1495040, "flush_reason": "Manual Compaction"}
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 127] Level-0 flush table #200: started
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927820429, "cf_name": "default", "job": 127, "event": "table_file_creation", "file_number": 200, "file_size": 974112, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 94693, "largest_seqno": 95466, "table_properties": {"data_size": 970325, "index_size": 1566, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8391, "raw_average_key_size": 19, "raw_value_size": 962768, "raw_average_value_size": 2183, "num_data_blocks": 69, "num_entries": 441, "num_filter_entries": 441, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165874, "oldest_key_time": 1769165874, "file_creation_time": 1769165927, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 200, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 127] Flush lasted 13407 microseconds, and 5831 cpu microseconds.
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.820499) [db/flush_job.cc:967] [default] [JOB 127] Level-0 flush table #200: 974112 bytes OK
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.820522) [db/memtable_list.cc:519] [default] Level-0 commit table #200 started
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.822455) [db/memtable_list.cc:722] [default] Level-0 commit table #200: memtable #1 done
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.822470) EVENT_LOG_v1 {"time_micros": 1769165927822464, "job": 127, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.822489) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 127] Try to delete WAL files size 1470010, prev total WAL file size 1475411, number of live WAL files 2.
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000196.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.823263) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373730' seq:72057594037927935, type:22 .. '6C6F676D0034303232' seq:0, type:0; will stop at (end)
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 128] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 127 Base level 0, inputs: [200(951KB)], [198(11MB)]
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927823295, "job": 128, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [200], "files_L6": [198], "score": -1, "input_data_size": 12629747, "oldest_snapshot_seqno": -1}
Jan 23 05:58:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:47.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.876 222021 INFO nova.scheduler.client.report [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Deleted allocation for migration 3347d51a-008b-4da5-ab25-6eaec6f2436c#033[00m
Jan 23 05:58:47 np0005593233 nova_compute[222017]: 2026-01-23 10:58:47.877 222021 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 128] Generated table #201: 11093 keys, 12495527 bytes, temperature: kUnknown
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927930425, "cf_name": "default", "job": 128, "event": "table_file_creation", "file_number": 201, "file_size": 12495527, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12426929, "index_size": 39749, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27781, "raw_key_size": 294372, "raw_average_key_size": 26, "raw_value_size": 12236039, "raw_average_value_size": 1103, "num_data_blocks": 1499, "num_entries": 11093, "num_filter_entries": 11093, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769165927, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 201, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.930764) [db/compaction/compaction_job.cc:1663] [default] [JOB 128] Compacted 1@0 + 1@6 files to L6 => 12495527 bytes
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.959555) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 117.8 rd, 116.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 11.1 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(25.8) write-amplify(12.8) OK, records in: 11617, records dropped: 524 output_compression: NoCompression
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.959610) EVENT_LOG_v1 {"time_micros": 1769165927959591, "job": 128, "event": "compaction_finished", "compaction_time_micros": 107255, "compaction_time_cpu_micros": 29490, "output_level": 6, "num_output_files": 1, "total_output_size": 12495527, "num_input_records": 11617, "num_output_records": 11093, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000200.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927960306, "job": 128, "event": "table_file_deletion", "file_number": 200}
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000198.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927962870, "job": 128, "event": "table_file_deletion", "file_number": 198}
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.823169) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.962956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.962961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.962963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.962965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-10:58:47.962967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:48 np0005593233 nova_compute[222017]: 2026-01-23 10:58:48.850 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:49.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:49.578 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=92, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=91) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:58:49 np0005593233 nova_compute[222017]: 2026-01-23 10:58:49.579 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:49 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:49.580 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:58:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:49.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:51 np0005593233 nova_compute[222017]: 2026-01-23 10:58:51.219 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:51.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:51.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:53.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:53 np0005593233 nova_compute[222017]: 2026-01-23 10:58:53.853 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:53.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:58:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:55.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:58:55 np0005593233 nova_compute[222017]: 2026-01-23 10:58:55.509 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165920.5072463, 575c76d9-7306-429f-baca-4d450f37c388 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:58:55 np0005593233 nova_compute[222017]: 2026-01-23 10:58:55.510 222021 INFO nova.compute.manager [-] [instance: 575c76d9-7306-429f-baca-4d450f37c388] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:58:55 np0005593233 nova_compute[222017]: 2026-01-23 10:58:55.531 222021 DEBUG nova.compute.manager [None req-43fa9d0e-d8d0-47c5-ac97-b097d673fec8 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:58:55 np0005593233 nova_compute[222017]: 2026-01-23 10:58:55.664 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:55.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:56 np0005593233 podman[308449]: 2026-01-23 10:58:56.120857983 +0000 UTC m=+0.114823192 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:58:56 np0005593233 nova_compute[222017]: 2026-01-23 10:58:56.221 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:57 np0005593233 nova_compute[222017]: 2026-01-23 10:58:57.313 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:57 np0005593233 nova_compute[222017]: 2026-01-23 10:58:57.367 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:57.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:57.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:59 np0005593233 nova_compute[222017]: 2026-01-23 10:58:58.856 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:59.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:58:59.581 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '92'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:58:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:59.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:00 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:01 np0005593233 nova_compute[222017]: 2026-01-23 10:59:01.223 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:01.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:59:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:01.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:59:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:03.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:03 np0005593233 nova_compute[222017]: 2026-01-23 10:59:03.858 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:03.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:05.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:05.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:05 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:06 np0005593233 nova_compute[222017]: 2026-01-23 10:59:06.226 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:07.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:07.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:08 np0005593233 nova_compute[222017]: 2026-01-23 10:59:08.907 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:09.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:09.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:11 np0005593233 podman[308479]: 2026-01-23 10:59:11.061638919 +0000 UTC m=+0.072359303 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:59:11 np0005593233 nova_compute[222017]: 2026-01-23 10:59:11.228 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:11.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:11.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:13.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:13.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:13 np0005593233 nova_compute[222017]: 2026-01-23 10:59:13.911 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:15.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:15.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:16 np0005593233 nova_compute[222017]: 2026-01-23 10:59:16.276 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:17.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:17.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:18 np0005593233 nova_compute[222017]: 2026-01-23 10:59:18.941 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:19 np0005593233 nova_compute[222017]: 2026-01-23 10:59:19.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:19 np0005593233 nova_compute[222017]: 2026-01-23 10:59:19.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:19 np0005593233 nova_compute[222017]: 2026-01-23 10:59:19.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:19 np0005593233 nova_compute[222017]: 2026-01-23 10:59:19.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:19 np0005593233 nova_compute[222017]: 2026-01-23 10:59:19.414 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:59:19 np0005593233 nova_compute[222017]: 2026-01-23 10:59:19.414 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:19.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:19 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:59:19 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2210208185' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:59:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:59:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:19.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:59:19 np0005593233 nova_compute[222017]: 2026-01-23 10:59:19.932 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.146 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.148 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4255MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.148 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.148 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.215 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.215 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.230 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:20 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:59:20 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1565660605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.757 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.766 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.788 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.793 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:59:20 np0005593233 nova_compute[222017]: 2026-01-23 10:59:20.794 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:21 np0005593233 nova_compute[222017]: 2026-01-23 10:59:21.318 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:21.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:59:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:21.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:59:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:23.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:23.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:23 np0005593233 nova_compute[222017]: 2026-01-23 10:59:23.944 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:24 np0005593233 nova_compute[222017]: 2026-01-23 10:59:24.796 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:24 np0005593233 nova_compute[222017]: 2026-01-23 10:59:24.797 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:25.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:25 np0005593233 nova_compute[222017]: 2026-01-23 10:59:25.559 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:25 np0005593233 nova_compute[222017]: 2026-01-23 10:59:25.560 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:59:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:25.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:26 np0005593233 nova_compute[222017]: 2026-01-23 10:59:26.321 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:27 np0005593233 podman[308542]: 2026-01-23 10:59:27.172065974 +0000 UTC m=+0.176239336 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 05:59:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:27.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:27.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:29 np0005593233 nova_compute[222017]: 2026-01-23 10:59:28.999 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:29 np0005593233 nova_compute[222017]: 2026-01-23 10:59:29.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:29.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:29.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:30 np0005593233 nova_compute[222017]: 2026-01-23 10:59:30.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:31 np0005593233 nova_compute[222017]: 2026-01-23 10:59:31.366 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:31.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:31.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:33.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:59:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:33.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:59:34 np0005593233 nova_compute[222017]: 2026-01-23 10:59:34.002 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:34 np0005593233 nova_compute[222017]: 2026-01-23 10:59:34.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:35 np0005593233 nova_compute[222017]: 2026-01-23 10:59:35.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:35.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:35.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:36 np0005593233 nova_compute[222017]: 2026-01-23 10:59:36.370 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:37.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:37 np0005593233 ovn_controller[130653]: 2026-01-23T10:59:37Z|00911|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Jan 23 05:59:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:37.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:39 np0005593233 nova_compute[222017]: 2026-01-23 10:59:39.006 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:39 np0005593233 nova_compute[222017]: 2026-01-23 10:59:39.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:39 np0005593233 nova_compute[222017]: 2026-01-23 10:59:39.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:59:39 np0005593233 nova_compute[222017]: 2026-01-23 10:59:39.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:59:39 np0005593233 nova_compute[222017]: 2026-01-23 10:59:39.415 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:59:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:39.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:59:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:39.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:59:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:41 np0005593233 podman[308617]: 2026-01-23 10:59:41.279127376 +0000 UTC m=+0.093533941 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 05:59:41 np0005593233 nova_compute[222017]: 2026-01-23 10:59:41.373 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:41.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:41.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:59:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:59:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:59:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:59:42.724 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:59:42.724 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:59:42.724 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:43.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:43.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:44 np0005593233 nova_compute[222017]: 2026-01-23 10:59:44.006 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 05:59:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:45.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 05:59:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:45.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:46 np0005593233 nova_compute[222017]: 2026-01-23 10:59:46.397 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:47.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:47.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:59:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:59:49 np0005593233 nova_compute[222017]: 2026-01-23 10:59:49.010 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:49.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:49.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:51 np0005593233 nova_compute[222017]: 2026-01-23 10:59:51.400 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:51.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:51.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:53.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:53.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:54 np0005593233 nova_compute[222017]: 2026-01-23 10:59:54.015 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:55.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:55.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:59:56.209 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=93, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=92) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:59:56 np0005593233 nova_compute[222017]: 2026-01-23 10:59:56.210 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:59:56.212 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:59:56 np0005593233 nova_compute[222017]: 2026-01-23 10:59:56.402 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 10:59:57.214 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '93'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:59:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:57.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:57.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:58 np0005593233 podman[308768]: 2026-01-23 10:59:58.115260333 +0000 UTC m=+0.130785023 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:59:59 np0005593233 nova_compute[222017]: 2026-01-23 10:59:59.042 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:59.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 05:59:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 05:59:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 05:59:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:59.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 06:00:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:01 np0005593233 nova_compute[222017]: 2026-01-23 11:00:01.471 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:01.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:01.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:03.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:03.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:04 np0005593233 nova_compute[222017]: 2026-01-23 11:00:04.086 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:05.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:05.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:06 np0005593233 nova_compute[222017]: 2026-01-23 11:00:06.518 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:07.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:08.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:08 np0005593233 nova_compute[222017]: 2026-01-23 11:00:08.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:00:09 np0005593233 nova_compute[222017]: 2026-01-23 11:00:09.088 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:09.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:10.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:11.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:11 np0005593233 nova_compute[222017]: 2026-01-23 11:00:11.570 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:12.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:12 np0005593233 podman[308794]: 2026-01-23 11:00:12.059889699 +0000 UTC m=+0.076449329 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 06:00:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:13.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:14.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:14 np0005593233 nova_compute[222017]: 2026-01-23 11:00:14.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:15.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:00:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:16.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:00:16 np0005593233 nova_compute[222017]: 2026-01-23 11:00:16.573 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:17 np0005593233 nova_compute[222017]: 2026-01-23 11:00:17.398 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:00:17 np0005593233 nova_compute[222017]: 2026-01-23 11:00:17.398 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 06:00:17 np0005593233 nova_compute[222017]: 2026-01-23 11:00:17.416 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 06:00:17 np0005593233 nova_compute[222017]: 2026-01-23 11:00:17.417 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:00:17 np0005593233 nova_compute[222017]: 2026-01-23 11:00:17.417 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 06:00:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:17.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:18.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:19 np0005593233 nova_compute[222017]: 2026-01-23 11:00:19.137 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:00:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:00:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:20.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:21 np0005593233 nova_compute[222017]: 2026-01-23 11:00:21.415 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:00:21 np0005593233 nova_compute[222017]: 2026-01-23 11:00:21.456 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:00:21 np0005593233 nova_compute[222017]: 2026-01-23 11:00:21.457 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:00:21 np0005593233 nova_compute[222017]: 2026-01-23 11:00:21.458 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:00:21 np0005593233 nova_compute[222017]: 2026-01-23 11:00:21.458 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 06:00:21 np0005593233 nova_compute[222017]: 2026-01-23 11:00:21.459 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:00:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:21.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:21 np0005593233 nova_compute[222017]: 2026-01-23 11:00:21.576 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:00:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1427616392' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:00:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:22.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.028 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.570s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.235 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.237 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4271MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.237 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.237 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.297 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.298 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.312 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:00:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:00:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2730887029' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.788 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.793 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.811 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.813 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 06:00:22 np0005593233 nova_compute[222017]: 2026-01-23 11:00:22.813 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:00:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:00:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:23.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:00:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:24.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:24 np0005593233 nova_compute[222017]: 2026-01-23 11:00:24.139 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:25.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:25 np0005593233 nova_compute[222017]: 2026-01-23 11:00:25.785 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:00:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:26.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:26 np0005593233 nova_compute[222017]: 2026-01-23 11:00:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:00:26 np0005593233 nova_compute[222017]: 2026-01-23 11:00:26.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:00:26 np0005593233 nova_compute[222017]: 2026-01-23 11:00:26.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 06:00:26 np0005593233 nova_compute[222017]: 2026-01-23 11:00:26.580 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:27.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:28.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:29 np0005593233 nova_compute[222017]: 2026-01-23 11:00:29.140 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:29 np0005593233 podman[308858]: 2026-01-23 11:00:29.164022444 +0000 UTC m=+0.175030212 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 06:00:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:29.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:30.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:30 np0005593233 nova_compute[222017]: 2026-01-23 11:00:30.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:00:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:31.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:31 np0005593233 nova_compute[222017]: 2026-01-23 11:00:31.582 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:00:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:32.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:00:32 np0005593233 nova_compute[222017]: 2026-01-23 11:00:32.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:00:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:33.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:34.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:34 np0005593233 nova_compute[222017]: 2026-01-23 11:00:34.203 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:34 np0005593233 nova_compute[222017]: 2026-01-23 11:00:34.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:35 np0005593233 nova_compute[222017]: 2026-01-23 11:00:35.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:35.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:36.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:36 np0005593233 nova_compute[222017]: 2026-01-23 11:00:36.623 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:37.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:38.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #202. Immutable memtables: 0.
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.414803) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 129] Flushing memtable with next log file: 202
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038414851, "job": 129, "event": "flush_started", "num_memtables": 1, "num_entries": 1279, "num_deletes": 251, "total_data_size": 2808818, "memory_usage": 2844968, "flush_reason": "Manual Compaction"}
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 129] Level-0 flush table #203: started
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038432170, "cf_name": "default", "job": 129, "event": "table_file_creation", "file_number": 203, "file_size": 1855448, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 95471, "largest_seqno": 96745, "table_properties": {"data_size": 1849926, "index_size": 2916, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11948, "raw_average_key_size": 19, "raw_value_size": 1838809, "raw_average_value_size": 3054, "num_data_blocks": 130, "num_entries": 602, "num_filter_entries": 602, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165927, "oldest_key_time": 1769165927, "file_creation_time": 1769166038, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 203, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 129] Flush lasted 17452 microseconds, and 6307 cpu microseconds.
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.432254) [db/flush_job.cc:967] [default] [JOB 129] Level-0 flush table #203: 1855448 bytes OK
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.432283) [db/memtable_list.cc:519] [default] Level-0 commit table #203 started
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.435672) [db/memtable_list.cc:722] [default] Level-0 commit table #203: memtable #1 done
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.435694) EVENT_LOG_v1 {"time_micros": 1769166038435686, "job": 129, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.435714) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 129] Try to delete WAL files size 2802674, prev total WAL file size 2802674, number of live WAL files 2.
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000199.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.436656) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038353334' seq:72057594037927935, type:22 .. '7061786F730038373836' seq:0, type:0; will stop at (end)
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 130] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 129 Base level 0, inputs: [203(1811KB)], [201(11MB)]
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038436713, "job": 130, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [203], "files_L6": [201], "score": -1, "input_data_size": 14350975, "oldest_snapshot_seqno": -1}
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 130] Generated table #204: 11176 keys, 12422336 bytes, temperature: kUnknown
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038560714, "cf_name": "default", "job": 130, "event": "table_file_creation", "file_number": 204, "file_size": 12422336, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12353190, "index_size": 40105, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27973, "raw_key_size": 296761, "raw_average_key_size": 26, "raw_value_size": 12160871, "raw_average_value_size": 1088, "num_data_blocks": 1509, "num_entries": 11176, "num_filter_entries": 11176, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769166038, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 204, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.561015) [db/compaction/compaction_job.cc:1663] [default] [JOB 130] Compacted 1@0 + 1@6 files to L6 => 12422336 bytes
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.586092) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 115.7 rd, 100.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 11.9 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(14.4) write-amplify(6.7) OK, records in: 11695, records dropped: 519 output_compression: NoCompression
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.586118) EVENT_LOG_v1 {"time_micros": 1769166038586107, "job": 130, "event": "compaction_finished", "compaction_time_micros": 124085, "compaction_time_cpu_micros": 29531, "output_level": 6, "num_output_files": 1, "total_output_size": 12422336, "num_input_records": 11695, "num_output_records": 11176, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000203.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038586534, "job": 130, "event": "table_file_deletion", "file_number": 203}
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000201.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038588578, "job": 130, "event": "table_file_deletion", "file_number": 201}
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.436603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.588653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.588662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.588665) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.588668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:00:38.588671) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:39 np0005593233 nova_compute[222017]: 2026-01-23 11:00:39.241 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:39.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:40.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:41 np0005593233 nova_compute[222017]: 2026-01-23 11:00:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:41 np0005593233 nova_compute[222017]: 2026-01-23 11:00:41.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:00:41 np0005593233 nova_compute[222017]: 2026-01-23 11:00:41.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:00:41 np0005593233 nova_compute[222017]: 2026-01-23 11:00:41.424 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:00:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:41.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:41 np0005593233 nova_compute[222017]: 2026-01-23 11:00:41.625 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:42.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:00:42.724 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:00:42.725 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:00:42.725 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:43 np0005593233 podman[308886]: 2026-01-23 11:00:43.083764195 +0000 UTC m=+0.087538843 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 06:00:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:43.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:44.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:44 np0005593233 nova_compute[222017]: 2026-01-23 11:00:44.242 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:45.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:46.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:46 np0005593233 nova_compute[222017]: 2026-01-23 11:00:46.666 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:47.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:48.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:49 np0005593233 nova_compute[222017]: 2026-01-23 11:00:49.245 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:49 np0005593233 podman[309076]: 2026-01-23 11:00:49.568623241 +0000 UTC m=+0.097375411 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 06:00:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:49.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:49 np0005593233 podman[309076]: 2026-01-23 11:00:49.696487767 +0000 UTC m=+0.225239887 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 06:00:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:50.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:51 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:00:51 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:00:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:51.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:51 np0005593233 nova_compute[222017]: 2026-01-23 11:00:51.669 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:52.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:00:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:00:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:00:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:53.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:54.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:54 np0005593233 nova_compute[222017]: 2026-01-23 11:00:54.248 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:55.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:56.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:56 np0005593233 nova_compute[222017]: 2026-01-23 11:00:56.673 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:00:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:57.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:00:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:58.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:59 np0005593233 nova_compute[222017]: 2026-01-23 11:00:59.250 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:59 np0005593233 nova_compute[222017]: 2026-01-23 11:00:59.418 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:00:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:59.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:59 np0005593233 podman[309355]: 2026-01-23 11:00:59.71137512 +0000 UTC m=+0.142649603 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 06:01:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:00.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:01:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:01:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 06:01:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:01.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 06:01:01 np0005593233 nova_compute[222017]: 2026-01-23 11:01:01.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:02.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:03.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:04.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:04 np0005593233 nova_compute[222017]: 2026-01-23 11:01:04.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:05.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:06.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:06 np0005593233 nova_compute[222017]: 2026-01-23 11:01:06.680 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:07.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:08.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:09 np0005593233 nova_compute[222017]: 2026-01-23 11:01:09.299 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:09.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:10.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:11.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:11 np0005593233 nova_compute[222017]: 2026-01-23 11:01:11.683 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:12.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:13.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:14 np0005593233 podman[309418]: 2026-01-23 11:01:14.092097272 +0000 UTC m=+0.091733721 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 06:01:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:14.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:14 np0005593233 nova_compute[222017]: 2026-01-23 11:01:14.303 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:15.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:16.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:16 np0005593233 nova_compute[222017]: 2026-01-23 11:01:16.715 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:17.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:01:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.8 total, 600.0 interval#012Cumulative writes: 67K writes, 256K keys, 67K commit groups, 1.0 writes per commit group, ingest: 0.24 GB, 0.03 MB/s#012Cumulative WAL: 67K writes, 25K syncs, 2.63 writes per sync, written: 0.24 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1932 writes, 8335 keys, 1932 commit groups, 1.0 writes per commit group, ingest: 7.98 MB, 0.01 MB/s#012Interval WAL: 1932 writes, 726 syncs, 2.66 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:01:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:18.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:19 np0005593233 nova_compute[222017]: 2026-01-23 11:01:19.345 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:19.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:20.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:01:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:21.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:01:21 np0005593233 nova_compute[222017]: 2026-01-23 11:01:21.754 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:22.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:22 np0005593233 nova_compute[222017]: 2026-01-23 11:01:22.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:22 np0005593233 nova_compute[222017]: 2026-01-23 11:01:22.434 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:22 np0005593233 nova_compute[222017]: 2026-01-23 11:01:22.435 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:22 np0005593233 nova_compute[222017]: 2026-01-23 11:01:22.435 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:22 np0005593233 nova_compute[222017]: 2026-01-23 11:01:22.436 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:01:22 np0005593233 nova_compute[222017]: 2026-01-23 11:01:22.436 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:01:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:01:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2487296406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:01:22 np0005593233 nova_compute[222017]: 2026-01-23 11:01:22.977 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.541s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.218 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.219 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4272MB free_disk=20.942676544189453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.219 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.220 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.304 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.305 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.321 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.338 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.338 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.357 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.385 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.403 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:01:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:23.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:01:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3195443617' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.894 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.904 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.928 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.931 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:01:23 np0005593233 nova_compute[222017]: 2026-01-23 11:01:23.932 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:24.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:24 np0005593233 nova_compute[222017]: 2026-01-23 11:01:24.349 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:25.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:25 np0005593233 nova_compute[222017]: 2026-01-23 11:01:25.941 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:01:25.945 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=94, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=93) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:01:25 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:01:25.946 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:01:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:26.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:26 np0005593233 nova_compute[222017]: 2026-01-23 11:01:26.757 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:27.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:27 np0005593233 nova_compute[222017]: 2026-01-23 11:01:27.933 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:28.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:28 np0005593233 nova_compute[222017]: 2026-01-23 11:01:28.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:28 np0005593233 nova_compute[222017]: 2026-01-23 11:01:28.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:28 np0005593233 nova_compute[222017]: 2026-01-23 11:01:28.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:01:29 np0005593233 nova_compute[222017]: 2026-01-23 11:01:29.352 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:29.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:30 np0005593233 podman[309481]: 2026-01-23 11:01:30.117279584 +0000 UTC m=+0.116410946 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 06:01:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:30.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:31.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:31 np0005593233 nova_compute[222017]: 2026-01-23 11:01:31.760 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:32.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:32 np0005593233 nova_compute[222017]: 2026-01-23 11:01:32.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:32 np0005593233 nova_compute[222017]: 2026-01-23 11:01:32.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:33.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:34.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:34 np0005593233 nova_compute[222017]: 2026-01-23 11:01:34.388 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:34 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:01:34.948 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '94'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:01:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:35.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:36.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:36 np0005593233 nova_compute[222017]: 2026-01-23 11:01:36.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:36 np0005593233 nova_compute[222017]: 2026-01-23 11:01:36.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:36 np0005593233 nova_compute[222017]: 2026-01-23 11:01:36.806 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:37.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:38.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:39 np0005593233 nova_compute[222017]: 2026-01-23 11:01:39.390 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:39.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:40.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:41.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:41 np0005593233 nova_compute[222017]: 2026-01-23 11:01:41.834 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:01:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:42.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:01:42 np0005593233 nova_compute[222017]: 2026-01-23 11:01:42.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:42 np0005593233 nova_compute[222017]: 2026-01-23 11:01:42.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:01:42 np0005593233 nova_compute[222017]: 2026-01-23 11:01:42.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:01:42 np0005593233 nova_compute[222017]: 2026-01-23 11:01:42.414 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:01:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:01:42.725 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:01:42.726 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:01:42.726 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:43.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:44.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:44 np0005593233 nova_compute[222017]: 2026-01-23 11:01:44.392 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:45 np0005593233 podman[309507]: 2026-01-23 11:01:45.073019391 +0000 UTC m=+0.082984875 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 06:01:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:45.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:46.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:46 np0005593233 nova_compute[222017]: 2026-01-23 11:01:46.859 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:47.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:48.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:49 np0005593233 nova_compute[222017]: 2026-01-23 11:01:49.426 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:49.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:50.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:51.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:51 np0005593233 nova_compute[222017]: 2026-01-23 11:01:51.909 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:52.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:53.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:54.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:54 np0005593233 nova_compute[222017]: 2026-01-23 11:01:54.478 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:55.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:56.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:01:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.0 total, 600.0 interval#012Cumulative writes: 18K writes, 97K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.19 GB, 0.03 MB/s#012Cumulative WAL: 18K writes, 18K syncs, 1.00 writes per sync, written: 0.19 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1395 writes, 6933 keys, 1395 commit groups, 1.0 writes per commit group, ingest: 15.00 MB, 0.03 MB/s#012Interval WAL: 1395 writes, 1395 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     48.8      2.49              0.55        65    0.038       0      0       0.0       0.0#012  L6      1/0   11.85 MB   0.0      0.8     0.1      0.6       0.6      0.0       0.0   5.4     76.4     65.5     10.06              2.40        64    0.157    515K    34K       0.0       0.0#012 Sum      1/0   11.85 MB   0.0      0.8     0.1      0.6       0.8      0.1       0.0   6.4     61.3     62.2     12.55              2.95       129    0.097    515K    34K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9     51.6     51.4      1.36              0.28        10    0.136     58K   2599       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.8     0.1      0.6       0.6      0.0       0.0   0.0     76.4     65.5     10.06              2.40        64    0.157    515K    34K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     48.8      2.49              0.55        64    0.039       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 7200.0 total, 600.0 interval#012Flush(GB): cumulative 0.118, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.76 GB write, 0.11 MB/s write, 0.75 GB read, 0.11 MB/s read, 12.6 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 84.06 MB table_size: 0 occupancy: 18446744073709551615 collections: 13 last_copies: 0 last_secs: 0.001103 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(4783,80.45 MB,26.4648%) FilterBlock(129,1.40 MB,0.462015%) IndexBlock(129,2.20 MB,0.724998%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 06:01:56 np0005593233 nova_compute[222017]: 2026-01-23 11:01:56.917 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:01:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:57.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:01:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:58.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:01:59 np0005593233 nova_compute[222017]: 2026-01-23 11:01:59.479 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:01:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:01:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:59.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:00.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:00 np0005593233 podman[309625]: 2026-01-23 11:02:00.298863047 +0000 UTC m=+0.110516350 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:02:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:01.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:01 np0005593233 nova_compute[222017]: 2026-01-23 11:02:01.922 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:02:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:02.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:03.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:04.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:04 np0005593233 nova_compute[222017]: 2026-01-23 11:02:04.480 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:05 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:05.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:06.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:02:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:02:06 np0005593233 nova_compute[222017]: 2026-01-23 11:02:06.934 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:02:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:07.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:08.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:09 np0005593233 nova_compute[222017]: 2026-01-23 11:02:09.481 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:02:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:09.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:10.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:11.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:11 np0005593233 nova_compute[222017]: 2026-01-23 11:02:11.937 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:02:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:12.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:13.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:02:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:14.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:02:14 np0005593233 nova_compute[222017]: 2026-01-23 11:02:14.484 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:02:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:14 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:15.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:16 np0005593233 podman[309852]: 2026-01-23 11:02:16.046404597 +0000 UTC m=+0.064250758 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 06:02:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:16.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:16 np0005593233 nova_compute[222017]: 2026-01-23 11:02:16.982 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:02:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:17.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:18.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:19 np0005593233 nova_compute[222017]: 2026-01-23 11:02:19.487 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:02:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:19.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:20.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:21.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:21 np0005593233 nova_compute[222017]: 2026-01-23 11:02:21.986 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:02:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:22.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:22 np0005593233 nova_compute[222017]: 2026-01-23 11:02:22.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:02:22 np0005593233 nova_compute[222017]: 2026-01-23 11:02:22.421 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:02:22 np0005593233 nova_compute[222017]: 2026-01-23 11:02:22.422 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:02:22 np0005593233 nova_compute[222017]: 2026-01-23 11:02:22.422 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:02:22 np0005593233 nova_compute[222017]: 2026-01-23 11:02:22.422 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 06:02:22 np0005593233 nova_compute[222017]: 2026-01-23 11:02:22.422 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:02:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:02:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1525243383' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:02:22 np0005593233 nova_compute[222017]: 2026-01-23 11:02:22.928 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.164 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.166 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4279MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.167 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.167 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.258 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.259 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.280 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:02:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:23.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:02:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1105649424' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.808 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.816 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.852 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.855 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 06:02:23 np0005593233 nova_compute[222017]: 2026-01-23 11:02:23.855 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:02:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:24.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:24 np0005593233 nova_compute[222017]: 2026-01-23 11:02:24.491 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:25.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:26.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:26 np0005593233 nova_compute[222017]: 2026-01-23 11:02:26.989 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:02:27.380 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=95, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=94) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:02:27 np0005593233 nova_compute[222017]: 2026-01-23 11:02:27.380 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:02:27.381 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:02:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:27.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:28.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:29 np0005593233 nova_compute[222017]: 2026-01-23 11:02:29.493 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:29.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:29 np0005593233 nova_compute[222017]: 2026-01-23 11:02:29.856 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:29 np0005593233 nova_compute[222017]: 2026-01-23 11:02:29.856 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:29 np0005593233 nova_compute[222017]: 2026-01-23 11:02:29.857 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:29 np0005593233 nova_compute[222017]: 2026-01-23 11:02:29.857 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:02:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:30.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:31 np0005593233 podman[309917]: 2026-01-23 11:02:31.155340225 +0000 UTC m=+0.153658804 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:02:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:02:31.383 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '95'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:31.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:31 np0005593233 nova_compute[222017]: 2026-01-23 11:02:31.991 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:32.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:32 np0005593233 nova_compute[222017]: 2026-01-23 11:02:32.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:32 np0005593233 nova_compute[222017]: 2026-01-23 11:02:32.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:33.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:34.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:34 np0005593233 nova_compute[222017]: 2026-01-23 11:02:34.496 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:35 np0005593233 ceph-mgr[81930]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 06:02:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:35.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:36.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:36 np0005593233 nova_compute[222017]: 2026-01-23 11:02:36.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:36 np0005593233 nova_compute[222017]: 2026-01-23 11:02:36.994 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:37.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:38.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:38 np0005593233 nova_compute[222017]: 2026-01-23 11:02:38.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:38 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 06:02:38 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 06:02:39 np0005593233 nova_compute[222017]: 2026-01-23 11:02:39.498 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:39.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:40.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:41.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:41 np0005593233 nova_compute[222017]: 2026-01-23 11:02:41.997 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:42.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:42 np0005593233 nova_compute[222017]: 2026-01-23 11:02:42.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:42 np0005593233 nova_compute[222017]: 2026-01-23 11:02:42.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:02:42 np0005593233 nova_compute[222017]: 2026-01-23 11:02:42.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:02:42 np0005593233 nova_compute[222017]: 2026-01-23 11:02:42.409 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:02:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:02:42.726 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:02:42.726 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:02:42.726 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:43.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:44.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:44 np0005593233 nova_compute[222017]: 2026-01-23 11:02:44.544 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:45.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:46.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:47 np0005593233 nova_compute[222017]: 2026-01-23 11:02:46.999 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:47 np0005593233 podman[309945]: 2026-01-23 11:02:47.057988778 +0000 UTC m=+0.068427896 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 06:02:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:47.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:48.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:49 np0005593233 nova_compute[222017]: 2026-01-23 11:02:49.547 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:49.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:50.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:51.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:52 np0005593233 nova_compute[222017]: 2026-01-23 11:02:52.003 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:52.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:53.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:54.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:54 np0005593233 nova_compute[222017]: 2026-01-23 11:02:54.550 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:55.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:02:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:02:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:57 np0005593233 nova_compute[222017]: 2026-01-23 11:02:57.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:57.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:58.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:59 np0005593233 nova_compute[222017]: 2026-01-23 11:02:59.552 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:02:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:59.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:00.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:01.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:02 np0005593233 nova_compute[222017]: 2026-01-23 11:03:02.010 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:02 np0005593233 podman[309964]: 2026-01-23 11:03:02.112870805 +0000 UTC m=+0.117067754 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:03:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:02.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:03.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:04.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:04 np0005593233 nova_compute[222017]: 2026-01-23 11:03:04.403 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:04 np0005593233 nova_compute[222017]: 2026-01-23 11:03:04.555 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:05.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:03:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:06.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:03:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:07 np0005593233 nova_compute[222017]: 2026-01-23 11:03:07.013 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:07.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:08.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:09 np0005593233 nova_compute[222017]: 2026-01-23 11:03:09.556 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:09.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:10.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:11.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:12 np0005593233 nova_compute[222017]: 2026-01-23 11:03:12.016 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:12.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:13.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:14.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:14 np0005593233 nova_compute[222017]: 2026-01-23 11:03:14.556 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 06:03:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 06:03:14 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 23 06:03:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 23 06:03:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 06:03:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 23 06:03:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 23 06:03:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:15.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:16 np0005593233 podman[310260]: 2026-01-23 11:03:16.015707704 +0000 UTC m=+0.108950055 container create 85a9f4bd14e4a6ab61e65868014a41247bc40cf45832dfd750d57f62dbd4715d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hopper, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 23 06:03:16 np0005593233 podman[310260]: 2026-01-23 11:03:15.935535809 +0000 UTC m=+0.028778240 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 06:03:16 np0005593233 systemd[1]: Started libpod-conmon-85a9f4bd14e4a6ab61e65868014a41247bc40cf45832dfd750d57f62dbd4715d.scope.
Jan 23 06:03:16 np0005593233 systemd[1]: Started libcrun container.
Jan 23 06:03:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:16.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:16 np0005593233 podman[310260]: 2026-01-23 11:03:16.336325413 +0000 UTC m=+0.429567854 container init 85a9f4bd14e4a6ab61e65868014a41247bc40cf45832dfd750d57f62dbd4715d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hopper, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 06:03:16 np0005593233 podman[310260]: 2026-01-23 11:03:16.350099141 +0000 UTC m=+0.443341502 container start 85a9f4bd14e4a6ab61e65868014a41247bc40cf45832dfd750d57f62dbd4715d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hopper, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Jan 23 06:03:16 np0005593233 podman[310260]: 2026-01-23 11:03:16.353429704 +0000 UTC m=+0.446672075 container attach 85a9f4bd14e4a6ab61e65868014a41247bc40cf45832dfd750d57f62dbd4715d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hopper, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 23 06:03:16 np0005593233 suspicious_hopper[310276]: 167 167
Jan 23 06:03:16 np0005593233 systemd[1]: libpod-85a9f4bd14e4a6ab61e65868014a41247bc40cf45832dfd750d57f62dbd4715d.scope: Deactivated successfully.
Jan 23 06:03:16 np0005593233 podman[310260]: 2026-01-23 11:03:16.360066931 +0000 UTC m=+0.453309282 container died 85a9f4bd14e4a6ab61e65868014a41247bc40cf45832dfd750d57f62dbd4715d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hopper, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, OSD_FLAVOR=default)
Jan 23 06:03:16 np0005593233 systemd[1]: var-lib-containers-storage-overlay-0b68d7f9ffcc5319a35807a4bd7889774948717f777e940b432d779fae94f294-merged.mount: Deactivated successfully.
Jan 23 06:03:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:16 np0005593233 podman[310260]: 2026-01-23 11:03:16.589604568 +0000 UTC m=+0.682846949 container remove 85a9f4bd14e4a6ab61e65868014a41247bc40cf45832dfd750d57f62dbd4715d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=suspicious_hopper, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Jan 23 06:03:16 np0005593233 systemd[1]: libpod-conmon-85a9f4bd14e4a6ab61e65868014a41247bc40cf45832dfd750d57f62dbd4715d.scope: Deactivated successfully.
Jan 23 06:03:16 np0005593233 podman[310301]: 2026-01-23 11:03:16.840708041 +0000 UTC m=+0.072294314 container create be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_morse, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 23 06:03:16 np0005593233 systemd[1]: Started libpod-conmon-be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7.scope.
Jan 23 06:03:16 np0005593233 podman[310301]: 2026-01-23 11:03:16.810768769 +0000 UTC m=+0.042355092 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 06:03:16 np0005593233 systemd[1]: Started libcrun container.
Jan 23 06:03:16 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21b92eb8a8fa70ce78ba2b199712d4a4f09fca94a46d972da98fcc5acb7b12d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 06:03:16 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21b92eb8a8fa70ce78ba2b199712d4a4f09fca94a46d972da98fcc5acb7b12d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 06:03:16 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21b92eb8a8fa70ce78ba2b199712d4a4f09fca94a46d972da98fcc5acb7b12d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 06:03:16 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21b92eb8a8fa70ce78ba2b199712d4a4f09fca94a46d972da98fcc5acb7b12d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 06:03:16 np0005593233 podman[310301]: 2026-01-23 11:03:16.941184658 +0000 UTC m=+0.172770971 container init be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_morse, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 06:03:16 np0005593233 podman[310301]: 2026-01-23 11:03:16.954548284 +0000 UTC m=+0.186134557 container start be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_morse, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:03:16 np0005593233 podman[310301]: 2026-01-23 11:03:16.959270267 +0000 UTC m=+0.190856530 container attach be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_morse, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 23 06:03:17 np0005593233 nova_compute[222017]: 2026-01-23 11:03:17.021 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:17.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:18 np0005593233 podman[310332]: 2026-01-23 11:03:18.062080297 +0000 UTC m=+0.071016937 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 06:03:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:18.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:18 np0005593233 trusting_morse[310317]: [
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:    {
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        "available": false,
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        "ceph_device": false,
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        "lsm_data": {},
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        "lvs": [],
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        "path": "/dev/sr0",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        "rejected_reasons": [
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "Has a FileSystem",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "Insufficient space (<5GB)"
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        ],
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        "sys_api": {
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "actuators": null,
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "device_nodes": "sr0",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "devname": "sr0",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "human_readable_size": "482.00 KB",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "id_bus": "ata",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "model": "QEMU DVD-ROM",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "nr_requests": "2",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "parent": "/dev/sr0",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "partitions": {},
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "path": "/dev/sr0",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "removable": "1",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "rev": "2.5+",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "ro": "0",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "rotational": "1",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "sas_address": "",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "sas_device_handle": "",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "scheduler_mode": "mq-deadline",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "sectors": 0,
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "sectorsize": "2048",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "size": 493568.0,
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "support_discard": "2048",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "type": "disk",
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:            "vendor": "QEMU"
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:        }
Jan 23 06:03:18 np0005593233 trusting_morse[310317]:    }
Jan 23 06:03:18 np0005593233 trusting_morse[310317]: ]
Jan 23 06:03:18 np0005593233 systemd[1]: libpod-be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7.scope: Deactivated successfully.
Jan 23 06:03:18 np0005593233 systemd[1]: libpod-be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7.scope: Consumed 1.417s CPU time.
Jan 23 06:03:18 np0005593233 podman[311594]: 2026-01-23 11:03:18.497894117 +0000 UTC m=+0.029957964 container died be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_morse, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Jan 23 06:03:18 np0005593233 systemd[1]: var-lib-containers-storage-overlay-f21b92eb8a8fa70ce78ba2b199712d4a4f09fca94a46d972da98fcc5acb7b12d-merged.mount: Deactivated successfully.
Jan 23 06:03:18 np0005593233 podman[311594]: 2026-01-23 11:03:18.711501105 +0000 UTC m=+0.243564942 container remove be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_morse, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 06:03:18 np0005593233 systemd[1]: libpod-conmon-be44c0b7d87b6c2c9575a11d76f6b1378c8b0b204d836777ff4ae967be563ae7.scope: Deactivated successfully.
Jan 23 06:03:19 np0005593233 nova_compute[222017]: 2026-01-23 11:03:19.558 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:19.614 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=96, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=95) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:03:19 np0005593233 nova_compute[222017]: 2026-01-23 11:03:19.614 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:19 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:19.616 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:03:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:19.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:20.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:21.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:22 np0005593233 nova_compute[222017]: 2026-01-23 11:03:22.057 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:03:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:22 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:03:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:22.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:23.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:03:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:24.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:03:24 np0005593233 nova_compute[222017]: 2026-01-23 11:03:24.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:24 np0005593233 nova_compute[222017]: 2026-01-23 11:03:24.407 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:24 np0005593233 nova_compute[222017]: 2026-01-23 11:03:24.407 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:24 np0005593233 nova_compute[222017]: 2026-01-23 11:03:24.407 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:24 np0005593233 nova_compute[222017]: 2026-01-23 11:03:24.407 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:03:24 np0005593233 nova_compute[222017]: 2026-01-23 11:03:24.408 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:24 np0005593233 nova_compute[222017]: 2026-01-23 11:03:24.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:03:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1315638975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:03:24 np0005593233 nova_compute[222017]: 2026-01-23 11:03:24.978 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.570s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.132 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.133 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4247MB free_disk=20.95954132080078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.133 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.134 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.253 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.254 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.365 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:25.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:03:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1639449038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.878 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.885 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.905 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.906 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:03:25 np0005593233 nova_compute[222017]: 2026-01-23 11:03:25.907 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:26.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:26 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:26.618 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '96'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:27 np0005593233 nova_compute[222017]: 2026-01-23 11:03:27.062 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.003000085s ======
Jan 23 06:03:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:27.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000085s
Jan 23 06:03:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:28.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:29 np0005593233 nova_compute[222017]: 2026-01-23 11:03:29.644 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:29.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:29 np0005593233 nova_compute[222017]: 2026-01-23 11:03:29.907 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:29 np0005593233 nova_compute[222017]: 2026-01-23 11:03:29.908 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:29 np0005593233 nova_compute[222017]: 2026-01-23 11:03:29.908 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:29 np0005593233 nova_compute[222017]: 2026-01-23 11:03:29.908 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:03:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:30.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:31.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:32 np0005593233 nova_compute[222017]: 2026-01-23 11:03:32.064 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:32.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:32 np0005593233 nova_compute[222017]: 2026-01-23 11:03:32.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:33 np0005593233 podman[311703]: 2026-01-23 11:03:33.216102373 +0000 UTC m=+0.219225069 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:03:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:33.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:34.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:34 np0005593233 nova_compute[222017]: 2026-01-23 11:03:34.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:34 np0005593233 nova_compute[222017]: 2026-01-23 11:03:34.653 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:35.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:36.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:37 np0005593233 nova_compute[222017]: 2026-01-23 11:03:37.068 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:37.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:38.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:38 np0005593233 nova_compute[222017]: 2026-01-23 11:03:38.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:39 np0005593233 nova_compute[222017]: 2026-01-23 11:03:39.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:39 np0005593233 nova_compute[222017]: 2026-01-23 11:03:39.656 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:39.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:40.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:41.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:42 np0005593233 nova_compute[222017]: 2026-01-23 11:03:42.070 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:42.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:42.727 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:42.728 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:42.728 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:43 np0005593233 nova_compute[222017]: 2026-01-23 11:03:43.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:43 np0005593233 nova_compute[222017]: 2026-01-23 11:03:43.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:03:43 np0005593233 nova_compute[222017]: 2026-01-23 11:03:43.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:03:43 np0005593233 nova_compute[222017]: 2026-01-23 11:03:43.400 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:03:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:43.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:44.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:44 np0005593233 nova_compute[222017]: 2026-01-23 11:03:44.657 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:45.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:46.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:47 np0005593233 nova_compute[222017]: 2026-01-23 11:03:47.073 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:03:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:47.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:03:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:48.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:48 np0005593233 nova_compute[222017]: 2026-01-23 11:03:48.886 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:48 np0005593233 nova_compute[222017]: 2026-01-23 11:03:48.887 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:48 np0005593233 nova_compute[222017]: 2026-01-23 11:03:48.902 222021 DEBUG nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 06:03:48 np0005593233 nova_compute[222017]: 2026-01-23 11:03:48.969 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:48 np0005593233 nova_compute[222017]: 2026-01-23 11:03:48.969 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:48 np0005593233 nova_compute[222017]: 2026-01-23 11:03:48.975 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:03:48 np0005593233 nova_compute[222017]: 2026-01-23 11:03:48.976 222021 INFO nova.compute.claims [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 06:03:49 np0005593233 podman[311730]: 2026-01-23 11:03:49.085134961 +0000 UTC m=+0.093206333 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.116 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:03:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3908980968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.581 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.588 222021 DEBUG nova.compute.provider_tree [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.605 222021 DEBUG nova.scheduler.client.report [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.627 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.628 222021 DEBUG nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.660 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.674 222021 DEBUG nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.674 222021 DEBUG nova.network.neutron [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.694 222021 INFO nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.712 222021 DEBUG nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.799 222021 DEBUG nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.800 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.801 222021 INFO nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Creating image(s)#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.846 222021 DEBUG nova.storage.rbd_utils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 15466683-985e-412a-b13a-037d70f393ef_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.889 222021 DEBUG nova.storage.rbd_utils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 15466683-985e-412a-b13a-037d70f393ef_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:03:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:49.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.920 222021 DEBUG nova.storage.rbd_utils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 15466683-985e-412a-b13a-037d70f393ef_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:03:49 np0005593233 nova_compute[222017]: 2026-01-23 11:03:49.924 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:50 np0005593233 nova_compute[222017]: 2026-01-23 11:03:50.000 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:50 np0005593233 nova_compute[222017]: 2026-01-23 11:03:50.001 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:50 np0005593233 nova_compute[222017]: 2026-01-23 11:03:50.002 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:50 np0005593233 nova_compute[222017]: 2026-01-23 11:03:50.003 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:50 np0005593233 nova_compute[222017]: 2026-01-23 11:03:50.039 222021 DEBUG nova.storage.rbd_utils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 15466683-985e-412a-b13a-037d70f393ef_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:03:50 np0005593233 nova_compute[222017]: 2026-01-23 11:03:50.045 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 15466683-985e-412a-b13a-037d70f393ef_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:50.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:50 np0005593233 nova_compute[222017]: 2026-01-23 11:03:50.696 222021 DEBUG nova.policy [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 06:03:51 np0005593233 nova_compute[222017]: 2026-01-23 11:03:51.031 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 15466683-985e-412a-b13a-037d70f393ef_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.986s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:51 np0005593233 nova_compute[222017]: 2026-01-23 11:03:51.124 222021 DEBUG nova.storage.rbd_utils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image 15466683-985e-412a-b13a-037d70f393ef_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 06:03:51 np0005593233 nova_compute[222017]: 2026-01-23 11:03:51.416 222021 DEBUG nova.objects.instance [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:03:51 np0005593233 nova_compute[222017]: 2026-01-23 11:03:51.432 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 06:03:51 np0005593233 nova_compute[222017]: 2026-01-23 11:03:51.433 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Ensure instance console log exists: /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:03:51 np0005593233 nova_compute[222017]: 2026-01-23 11:03:51.434 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:51 np0005593233 nova_compute[222017]: 2026-01-23 11:03:51.434 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:51 np0005593233 nova_compute[222017]: 2026-01-23 11:03:51.434 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:51 np0005593233 nova_compute[222017]: 2026-01-23 11:03:51.569 222021 DEBUG nova.network.neutron [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Successfully created port: fa33e1ff-e04f-4862-a822-18bec48babca _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 06:03:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:03:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:51.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:03:52 np0005593233 nova_compute[222017]: 2026-01-23 11:03:52.077 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:52 np0005593233 nova_compute[222017]: 2026-01-23 11:03:52.309 222021 DEBUG nova.network.neutron [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Successfully updated port: fa33e1ff-e04f-4862-a822-18bec48babca _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 06:03:52 np0005593233 nova_compute[222017]: 2026-01-23 11:03:52.330 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:03:52 np0005593233 nova_compute[222017]: 2026-01-23 11:03:52.331 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:03:52 np0005593233 nova_compute[222017]: 2026-01-23 11:03:52.331 222021 DEBUG nova.network.neutron [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:03:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:52.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:52 np0005593233 nova_compute[222017]: 2026-01-23 11:03:52.416 222021 DEBUG nova.compute.manager [req-14ec3928-bf8a-492c-9250-39a6421a2f42 req-e62cfc8e-4f0b-4b4c-b0cf-eac186bd6a4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:03:52 np0005593233 nova_compute[222017]: 2026-01-23 11:03:52.417 222021 DEBUG nova.compute.manager [req-14ec3928-bf8a-492c-9250-39a6421a2f42 req-e62cfc8e-4f0b-4b4c-b0cf-eac186bd6a4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing instance network info cache due to event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:03:52 np0005593233 nova_compute[222017]: 2026-01-23 11:03:52.418 222021 DEBUG oslo_concurrency.lockutils [req-14ec3928-bf8a-492c-9250-39a6421a2f42 req-e62cfc8e-4f0b-4b4c-b0cf-eac186bd6a4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:03:52 np0005593233 nova_compute[222017]: 2026-01-23 11:03:52.467 222021 DEBUG nova.network.neutron [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 06:03:53 np0005593233 nova_compute[222017]: 2026-01-23 11:03:53.618 222021 DEBUG nova.network.neutron [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:03:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:53.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:54.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.662 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.888 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.889 222021 DEBUG nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance network_info: |[{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.890 222021 DEBUG oslo_concurrency.lockutils [req-14ec3928-bf8a-492c-9250-39a6421a2f42 req-e62cfc8e-4f0b-4b4c-b0cf-eac186bd6a4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.891 222021 DEBUG nova.network.neutron [req-14ec3928-bf8a-492c-9250-39a6421a2f42 req-e62cfc8e-4f0b-4b4c-b0cf-eac186bd6a4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.897 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Start _get_guest_xml network_info=[{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.905 222021 WARNING nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.911 222021 DEBUG nova.virt.libvirt.host [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.912 222021 DEBUG nova.virt.libvirt.host [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.916 222021 DEBUG nova.virt.libvirt.host [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.917 222021 DEBUG nova.virt.libvirt.host [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.919 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.919 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.920 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.920 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.920 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.920 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.921 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.921 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.921 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.921 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.921 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.922 222021 DEBUG nova.virt.hardware [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:03:54 np0005593233 nova_compute[222017]: 2026-01-23 11:03:54.925 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:03:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4049969727' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:03:55 np0005593233 nova_compute[222017]: 2026-01-23 11:03:55.415 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:55 np0005593233 nova_compute[222017]: 2026-01-23 11:03:55.458 222021 DEBUG nova.storage.rbd_utils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 15466683-985e-412a-b13a-037d70f393ef_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:03:55 np0005593233 nova_compute[222017]: 2026-01-23 11:03:55.465 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:55.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:03:56 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3761765950' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.121 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.656s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.123 222021 DEBUG nova.virt.libvirt.vif [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1089282985',display_name='tempest-TestNetworkAdvancedServerOps-server-1089282985',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1089282985',id=216,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEQHldo3iXIUHcOmtxaL7zVrpPDPH1Yesk4w7Ms5SWolpItN2rDCNRTv1dU1IjdebJkV+f//XdfFq7rpNDTnYMRAq+vDfd2aGH28+aEe0zfJXXxcRZnPFq5MH+XPzShNwQ==',key_name='tempest-TestNetworkAdvancedServerOps-1714633987',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-i3ccnlqn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:03:49Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=15466683-985e-412a-b13a-037d70f393ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.123 222021 DEBUG nova.network.os_vif_util [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.125 222021 DEBUG nova.network.os_vif_util [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.126 222021 DEBUG nova.objects.instance [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.151 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <uuid>15466683-985e-412a-b13a-037d70f393ef</uuid>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <name>instance-000000d8</name>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1089282985</nova:name>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 11:03:54</nova:creationTime>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <nova:port uuid="fa33e1ff-e04f-4862-a822-18bec48babca">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <system>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <entry name="serial">15466683-985e-412a-b13a-037d70f393ef</entry>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <entry name="uuid">15466683-985e-412a-b13a-037d70f393ef</entry>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    </system>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <os>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  </os>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <features>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  </features>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  </clock>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  <devices>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/15466683-985e-412a-b13a-037d70f393ef_disk">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      </source>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      </auth>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    </disk>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/15466683-985e-412a-b13a-037d70f393ef_disk.config">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      </source>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      </auth>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    </disk>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:93:d3:f5"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <target dev="tapfa33e1ff-e0"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    </interface>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/console.log" append="off"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    </serial>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <video>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    </video>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    </rng>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 06:03:56 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 06:03:56 np0005593233 nova_compute[222017]:  </devices>
Jan 23 06:03:56 np0005593233 nova_compute[222017]: </domain>
Jan 23 06:03:56 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.153 222021 DEBUG nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Preparing to wait for external event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.154 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.154 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.154 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.155 222021 DEBUG nova.virt.libvirt.vif [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1089282985',display_name='tempest-TestNetworkAdvancedServerOps-server-1089282985',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1089282985',id=216,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEQHldo3iXIUHcOmtxaL7zVrpPDPH1Yesk4w7Ms5SWolpItN2rDCNRTv1dU1IjdebJkV+f//XdfFq7rpNDTnYMRAq+vDfd2aGH28+aEe0zfJXXxcRZnPFq5MH+XPzShNwQ==',key_name='tempest-TestNetworkAdvancedServerOps-1714633987',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-i3ccnlqn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:03:49Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=15466683-985e-412a-b13a-037d70f393ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.155 222021 DEBUG nova.network.os_vif_util [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.156 222021 DEBUG nova.network.os_vif_util [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.157 222021 DEBUG os_vif [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.157 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.158 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.158 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.163 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.163 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa33e1ff-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.164 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfa33e1ff-e0, col_values=(('external_ids', {'iface-id': 'fa33e1ff-e04f-4862-a822-18bec48babca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:93:d3:f5', 'vm-uuid': '15466683-985e-412a-b13a-037d70f393ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:56 np0005593233 NetworkManager[48871]: <info>  [1769166236.2140] manager: (tapfa33e1ff-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/414)
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.213 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.215 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.224 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.226 222021 INFO os_vif [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0')#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.339 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.340 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.340 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:93:d3:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.340 222021 INFO nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Using config drive#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.367 222021 DEBUG nova.storage.rbd_utils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 15466683-985e-412a-b13a-037d70f393ef_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:03:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:03:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:56.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:03:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.866 222021 INFO nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Creating config drive at /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/disk.config#033[00m
Jan 23 06:03:56 np0005593233 nova_compute[222017]: 2026-01-23 11:03:56.871 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmjqpis37 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.034 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmjqpis37" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.070 222021 DEBUG nova.storage.rbd_utils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 15466683-985e-412a-b13a-037d70f393ef_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.075 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/disk.config 15466683-985e-412a-b13a-037d70f393ef_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.190 222021 DEBUG nova.network.neutron [req-14ec3928-bf8a-492c-9250-39a6421a2f42 req-e62cfc8e-4f0b-4b4c-b0cf-eac186bd6a4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updated VIF entry in instance network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.192 222021 DEBUG nova.network.neutron [req-14ec3928-bf8a-492c-9250-39a6421a2f42 req-e62cfc8e-4f0b-4b4c-b0cf-eac186bd6a4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.226 222021 DEBUG oslo_concurrency.lockutils [req-14ec3928-bf8a-492c-9250-39a6421a2f42 req-e62cfc8e-4f0b-4b4c-b0cf-eac186bd6a4d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.763 222021 DEBUG oslo_concurrency.processutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/disk.config 15466683-985e-412a-b13a-037d70f393ef_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.688s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.764 222021 INFO nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Deleting local config drive /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/disk.config because it was imported into RBD.#033[00m
Jan 23 06:03:57 np0005593233 kernel: tapfa33e1ff-e0: entered promiscuous mode
Jan 23 06:03:57 np0005593233 NetworkManager[48871]: <info>  [1769166237.8446] manager: (tapfa33e1ff-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/415)
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.846 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:57 np0005593233 ovn_controller[130653]: 2026-01-23T11:03:57Z|00912|binding|INFO|Claiming lport fa33e1ff-e04f-4862-a822-18bec48babca for this chassis.
Jan 23 06:03:57 np0005593233 ovn_controller[130653]: 2026-01-23T11:03:57Z|00913|binding|INFO|fa33e1ff-e04f-4862-a822-18bec48babca: Claiming fa:16:3e:93:d3:f5 10.100.0.8
Jan 23 06:03:57 np0005593233 systemd-udevd[312072]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:03:57 np0005593233 systemd-machined[190954]: New machine qemu-98-instance-000000d8.
Jan 23 06:03:57 np0005593233 NetworkManager[48871]: <info>  [1769166237.9045] device (tapfa33e1ff-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:03:57 np0005593233 NetworkManager[48871]: <info>  [1769166237.9053] device (tapfa33e1ff-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:03:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:03:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:57.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:03:57 np0005593233 systemd[1]: Started Virtual Machine qemu-98-instance-000000d8.
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.923 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:57 np0005593233 ovn_controller[130653]: 2026-01-23T11:03:57Z|00914|binding|INFO|Setting lport fa33e1ff-e04f-4862-a822-18bec48babca ovn-installed in OVS
Jan 23 06:03:57 np0005593233 nova_compute[222017]: 2026-01-23 11:03:57.932 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:58 np0005593233 ovn_controller[130653]: 2026-01-23T11:03:58Z|00915|binding|INFO|Setting lport fa33e1ff-e04f-4862-a822-18bec48babca up in Southbound
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.089 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:d3:f5 10.100.0.8'], port_security=['fa:16:3e:93:d3:f5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '15466683-985e-412a-b13a-037d70f393ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '05cfd882-0f82-42fb-b766-52bbef7ec922', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e248dcd5-89a8-4e77-af01-5f9fab92dfca, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=fa33e1ff-e04f-4862-a822-18bec48babca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.090 140224 INFO neutron.agent.ovn.metadata.agent [-] Port fa33e1ff-e04f-4862-a822-18bec48babca in datapath 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 bound to our chassis#033[00m
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.091 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38#033[00m
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.109 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[05e1b213-8260-4062-90bf-0d9cdd76a52c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.110 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3ce7eb8b-e1 in ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.113 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3ce7eb8b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.113 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[76148d12-0782-4814-bd17-a9dd7b7df15f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.114 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8ade06a8-7620-4a58-b70c-5a990fa98a32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.127 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[10e30c64-bfbc-4049-b186-7d23b67ac75d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:58 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:58.155 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[57ebd6e5-ee29-427b-9dac-76f1b4c5a4d7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:58.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:58.804 222021 DEBUG nova.compute.manager [req-365132aa-64f4-4103-a002-1674f7ad3f4d req-90aa4ec6-fc25-4451-b094-6a287f7d82d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:58.804 222021 DEBUG oslo_concurrency.lockutils [req-365132aa-64f4-4103-a002-1674f7ad3f4d req-90aa4ec6-fc25-4451-b094-6a287f7d82d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:58.804 222021 DEBUG oslo_concurrency.lockutils [req-365132aa-64f4-4103-a002-1674f7ad3f4d req-90aa4ec6-fc25-4451-b094-6a287f7d82d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:58.805 222021 DEBUG oslo_concurrency.lockutils [req-365132aa-64f4-4103-a002-1674f7ad3f4d req-90aa4ec6-fc25-4451-b094-6a287f7d82d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:58.805 222021 DEBUG nova.compute.manager [req-365132aa-64f4-4103-a002-1674f7ad3f4d req-90aa4ec6-fc25-4451-b094-6a287f7d82d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Processing event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.188 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[9de1fa33-61b3-4a59-b047-04c83432af64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 NetworkManager[48871]: <info>  [1769166239.1987] manager: (tap3ce7eb8b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/416)
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.197 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9d78cc-9b06-4966-a760-5ab78238b588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.242 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[840c9054-bb74-4ea1-98a3-d3b9c9fd8b76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.247 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[f55586fd-c01e-4ad4-ab91-ee7dc14f412f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 NetworkManager[48871]: <info>  [1769166239.2748] device (tap3ce7eb8b-e0): carrier: link connected
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.282 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[1b1f99df-27fc-4556-bbad-a7031c880b70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.303 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8f450eb0-20f6-4fd5-9b22-88e18064f91b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ce7eb8b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 272], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1015190, 'reachable_time': 30881, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312113, 'error': None, 'target': 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.327 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d66cb1f9-13ee-4566-93a1-b7fce61ec081]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:862'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1015190, 'tstamp': 1015190}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312122, 'error': None, 'target': 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.349 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a59260-0dbb-4b99-af41-6acf1a203361]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ce7eb8b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 272], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1015190, 'reachable_time': 30881, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312126, 'error': None, 'target': 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.401 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[eed45f52-74bf-4117-979f-efb801017c8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.478 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[ab70c612-c371-473b-bd61-a0110746c1fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.481 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ce7eb8b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.481 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.482 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ce7eb8b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:59 np0005593233 NetworkManager[48871]: <info>  [1769166239.4865] manager: (tap3ce7eb8b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/417)
Jan 23 06:03:59 np0005593233 kernel: tap3ce7eb8b-e0: entered promiscuous mode
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.487 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.490 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3ce7eb8b-e0, col_values=(('external_ids', {'iface-id': 'e3d8e09f-6a23-413b-b880-82ad4418b7d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:59 np0005593233 ovn_controller[130653]: 2026-01-23T11:03:59Z|00916|binding|INFO|Releasing lport e3d8e09f-6a23-413b-b880-82ad4418b7d7 from this chassis (sb_readonly=0)
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.492 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.521 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.523 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3ce7eb8b-e719-4e00-bbf7-177b1f60cd38.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3ce7eb8b-e719-4e00-bbf7-177b1f60cd38.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.525 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c881ef75-83b7-4fee-b00e-b2ec7857d5f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.527 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/3ce7eb8b-e719-4e00-bbf7-177b1f60cd38.pid.haproxy
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:03:59 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:03:59.528 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'env', 'PROCESS_TAG=haproxy-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3ce7eb8b-e719-4e00-bbf7-177b1f60cd38.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.590 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769166239.5901475, 15466683-985e-412a-b13a-037d70f393ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.591 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] VM Started (Lifecycle Event)#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.593 222021 DEBUG nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.597 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.600 222021 INFO nova.virt.libvirt.driver [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance spawned successfully.#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.600 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.664 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.804 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.808 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.835 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.837 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769166239.5936983, 15466683-985e-412a-b13a-037d70f393ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.837 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.842 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.843 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.843 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.843 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.844 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.844 222021 DEBUG nova.virt.libvirt.driver [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.881 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.886 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769166239.5959013, 15466683-985e-412a-b13a-037d70f393ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.886 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:03:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:03:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:59.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.920 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.924 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.934 222021 INFO nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Took 10.14 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.935 222021 DEBUG nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.948 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:03:59 np0005593233 nova_compute[222017]: 2026-01-23 11:03:59.998 222021 INFO nova.compute.manager [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Took 11.06 seconds to build instance.#033[00m
Jan 23 06:04:00 np0005593233 nova_compute[222017]: 2026-01-23 11:04:00.014 222021 DEBUG oslo_concurrency.lockutils [None req-0192aea4-d366-49be-9d67-3edb4ac765d6 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:00 np0005593233 podman[312183]: 2026-01-23 11:03:59.939400706 +0000 UTC m=+0.033845263 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:04:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:04:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:00.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:04:00 np0005593233 podman[312183]: 2026-01-23 11:04:00.620522146 +0000 UTC m=+0.714966683 container create d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 06:04:00 np0005593233 systemd[1]: Started libpod-conmon-d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2.scope.
Jan 23 06:04:00 np0005593233 systemd[1]: Started libcrun container.
Jan 23 06:04:00 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1476e069496228ec41dc559e3f7123de5e26fa3af37336f3453b6d28e4bcff8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:04:00 np0005593233 podman[312183]: 2026-01-23 11:04:00.758438155 +0000 UTC m=+0.852882712 container init d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:04:00 np0005593233 podman[312183]: 2026-01-23 11:04:00.76643304 +0000 UTC m=+0.860877597 container start d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:04:00 np0005593233 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[312198]: [NOTICE]   (312202) : New worker (312204) forked
Jan 23 06:04:00 np0005593233 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[312198]: [NOTICE]   (312202) : Loading success.
Jan 23 06:04:01 np0005593233 nova_compute[222017]: 2026-01-23 11:04:01.214 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:01.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:02.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:02 np0005593233 nova_compute[222017]: 2026-01-23 11:04:02.728 222021 DEBUG nova.compute.manager [req-2fcc027c-7780-4c89-8988-2ff45b750368 req-f82bab11-4e17-4b83-9ea8-04d589658afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:02 np0005593233 nova_compute[222017]: 2026-01-23 11:04:02.728 222021 DEBUG oslo_concurrency.lockutils [req-2fcc027c-7780-4c89-8988-2ff45b750368 req-f82bab11-4e17-4b83-9ea8-04d589658afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:02 np0005593233 nova_compute[222017]: 2026-01-23 11:04:02.728 222021 DEBUG oslo_concurrency.lockutils [req-2fcc027c-7780-4c89-8988-2ff45b750368 req-f82bab11-4e17-4b83-9ea8-04d589658afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:02 np0005593233 nova_compute[222017]: 2026-01-23 11:04:02.729 222021 DEBUG oslo_concurrency.lockutils [req-2fcc027c-7780-4c89-8988-2ff45b750368 req-f82bab11-4e17-4b83-9ea8-04d589658afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:02 np0005593233 nova_compute[222017]: 2026-01-23 11:04:02.729 222021 DEBUG nova.compute.manager [req-2fcc027c-7780-4c89-8988-2ff45b750368 req-f82bab11-4e17-4b83-9ea8-04d589658afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:04:02 np0005593233 nova_compute[222017]: 2026-01-23 11:04:02.729 222021 WARNING nova.compute.manager [req-2fcc027c-7780-4c89-8988-2ff45b750368 req-f82bab11-4e17-4b83-9ea8-04d589658afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state active and task_state None.#033[00m
Jan 23 06:04:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:03.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:04 np0005593233 podman[312213]: 2026-01-23 11:04:04.099163798 +0000 UTC m=+0.112924887 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:04:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:04.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:04 np0005593233 nova_compute[222017]: 2026-01-23 11:04:04.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:05 np0005593233 ovn_controller[130653]: 2026-01-23T11:04:05Z|00917|binding|INFO|Releasing lport e3d8e09f-6a23-413b-b880-82ad4418b7d7 from this chassis (sb_readonly=0)
Jan 23 06:04:05 np0005593233 NetworkManager[48871]: <info>  [1769166245.6745] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/418)
Jan 23 06:04:05 np0005593233 NetworkManager[48871]: <info>  [1769166245.6765] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/419)
Jan 23 06:04:05 np0005593233 nova_compute[222017]: 2026-01-23 11:04:05.680 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:05 np0005593233 ovn_controller[130653]: 2026-01-23T11:04:05Z|00918|binding|INFO|Releasing lport e3d8e09f-6a23-413b-b880-82ad4418b7d7 from this chassis (sb_readonly=0)
Jan 23 06:04:05 np0005593233 nova_compute[222017]: 2026-01-23 11:04:05.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:05 np0005593233 nova_compute[222017]: 2026-01-23 11:04:05.755 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:04:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:05.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:04:06 np0005593233 nova_compute[222017]: 2026-01-23 11:04:06.151 222021 DEBUG nova.compute.manager [req-26a046a7-0e80-4d3b-8b17-eb2a055a6663 req-386dc08a-2fa8-4981-ad74-e5598ab1a077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:06 np0005593233 nova_compute[222017]: 2026-01-23 11:04:06.152 222021 DEBUG nova.compute.manager [req-26a046a7-0e80-4d3b-8b17-eb2a055a6663 req-386dc08a-2fa8-4981-ad74-e5598ab1a077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing instance network info cache due to event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:04:06 np0005593233 nova_compute[222017]: 2026-01-23 11:04:06.152 222021 DEBUG oslo_concurrency.lockutils [req-26a046a7-0e80-4d3b-8b17-eb2a055a6663 req-386dc08a-2fa8-4981-ad74-e5598ab1a077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:04:06 np0005593233 nova_compute[222017]: 2026-01-23 11:04:06.153 222021 DEBUG oslo_concurrency.lockutils [req-26a046a7-0e80-4d3b-8b17-eb2a055a6663 req-386dc08a-2fa8-4981-ad74-e5598ab1a077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:04:06 np0005593233 nova_compute[222017]: 2026-01-23 11:04:06.153 222021 DEBUG nova.network.neutron [req-26a046a7-0e80-4d3b-8b17-eb2a055a6663 req-386dc08a-2fa8-4981-ad74-e5598ab1a077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:04:06 np0005593233 nova_compute[222017]: 2026-01-23 11:04:06.273 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:06.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:07 np0005593233 nova_compute[222017]: 2026-01-23 11:04:07.907 222021 DEBUG nova.network.neutron [req-26a046a7-0e80-4d3b-8b17-eb2a055a6663 req-386dc08a-2fa8-4981-ad74-e5598ab1a077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updated VIF entry in instance network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:04:07 np0005593233 nova_compute[222017]: 2026-01-23 11:04:07.908 222021 DEBUG nova.network.neutron [req-26a046a7-0e80-4d3b-8b17-eb2a055a6663 req-386dc08a-2fa8-4981-ad74-e5598ab1a077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:04:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:07.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:07 np0005593233 nova_compute[222017]: 2026-01-23 11:04:07.927 222021 DEBUG oslo_concurrency.lockutils [req-26a046a7-0e80-4d3b-8b17-eb2a055a6663 req-386dc08a-2fa8-4981-ad74-e5598ab1a077 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:04:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:08.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:09 np0005593233 nova_compute[222017]: 2026-01-23 11:04:09.669 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:09.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:04:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:10.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:04:11 np0005593233 nova_compute[222017]: 2026-01-23 11:04:11.276 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:11.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:12.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #205. Immutable memtables: 0.
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.636730) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 131] Flushing memtable with next log file: 205
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252636767, "job": 131, "event": "flush_started", "num_memtables": 1, "num_entries": 2343, "num_deletes": 251, "total_data_size": 6006339, "memory_usage": 6074720, "flush_reason": "Manual Compaction"}
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 131] Level-0 flush table #206: started
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252667056, "cf_name": "default", "job": 131, "event": "table_file_creation", "file_number": 206, "file_size": 3896466, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 96750, "largest_seqno": 99088, "table_properties": {"data_size": 3886792, "index_size": 6167, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19489, "raw_average_key_size": 20, "raw_value_size": 3867713, "raw_average_value_size": 4033, "num_data_blocks": 269, "num_entries": 959, "num_filter_entries": 959, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166039, "oldest_key_time": 1769166039, "file_creation_time": 1769166252, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 206, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 131] Flush lasted 30380 microseconds, and 9069 cpu microseconds.
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.667106) [db/flush_job.cc:967] [default] [JOB 131] Level-0 flush table #206: 3896466 bytes OK
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.667134) [db/memtable_list.cc:519] [default] Level-0 commit table #206 started
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.669805) [db/memtable_list.cc:722] [default] Level-0 commit table #206: memtable #1 done
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.669826) EVENT_LOG_v1 {"time_micros": 1769166252669819, "job": 131, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.669845) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 131] Try to delete WAL files size 5995843, prev total WAL file size 5995843, number of live WAL files 2.
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000202.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.671794) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038373835' seq:72057594037927935, type:22 .. '7061786F730039303337' seq:0, type:0; will stop at (end)
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 132] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 131 Base level 0, inputs: [206(3805KB)], [204(11MB)]
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252671830, "job": 132, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [206], "files_L6": [204], "score": -1, "input_data_size": 16318802, "oldest_snapshot_seqno": -1}
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 132] Generated table #207: 11616 keys, 14318693 bytes, temperature: kUnknown
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252838906, "cf_name": "default", "job": 132, "event": "table_file_creation", "file_number": 207, "file_size": 14318693, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14245046, "index_size": 43518, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29061, "raw_key_size": 306572, "raw_average_key_size": 26, "raw_value_size": 14043509, "raw_average_value_size": 1208, "num_data_blocks": 1651, "num_entries": 11616, "num_filter_entries": 11616, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769166252, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 207, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.839358) [db/compaction/compaction_job.cc:1663] [default] [JOB 132] Compacted 1@0 + 1@6 files to L6 => 14318693 bytes
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.844159) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 97.5 rd, 85.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 11.8 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 12135, records dropped: 519 output_compression: NoCompression
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.844191) EVENT_LOG_v1 {"time_micros": 1769166252844177, "job": 132, "event": "compaction_finished", "compaction_time_micros": 167314, "compaction_time_cpu_micros": 34639, "output_level": 6, "num_output_files": 1, "total_output_size": 14318693, "num_input_records": 12135, "num_output_records": 11616, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000206.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252845464, "job": 132, "event": "table_file_deletion", "file_number": 206}
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000204.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252848697, "job": 132, "event": "table_file_deletion", "file_number": 204}
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.671585) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.848882) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.848893) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.848897) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.848901) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:04:12.848904) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:13.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:14.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:14 np0005593233 ovn_controller[130653]: 2026-01-23T11:04:14Z|00135|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:93:d3:f5 10.100.0.8
Jan 23 06:04:14 np0005593233 ovn_controller[130653]: 2026-01-23T11:04:14Z|00136|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:93:d3:f5 10.100.0.8
Jan 23 06:04:14 np0005593233 nova_compute[222017]: 2026-01-23 11:04:14.672 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:15.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:16 np0005593233 nova_compute[222017]: 2026-01-23 11:04:16.279 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:16.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:17.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:18.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:19 np0005593233 nova_compute[222017]: 2026-01-23 11:04:19.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:19 np0005593233 nova_compute[222017]: 2026-01-23 11:04:19.694 222021 INFO nova.compute.manager [None req-c5752774-4a20-4f6f-a00d-fc8c57a11dd2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Get console output#033[00m
Jan 23 06:04:19 np0005593233 nova_compute[222017]: 2026-01-23 11:04:19.702 264307 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:04:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:19.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:20 np0005593233 podman[312242]: 2026-01-23 11:04:20.036499747 +0000 UTC m=+0.050930044 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:04:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:20.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:21 np0005593233 nova_compute[222017]: 2026-01-23 11:04:21.281 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:04:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:21.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:04:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:22.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:23 np0005593233 nova_compute[222017]: 2026-01-23 11:04:23.221 222021 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:04:23 np0005593233 nova_compute[222017]: 2026-01-23 11:04:23.222 222021 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:04:23 np0005593233 nova_compute[222017]: 2026-01-23 11:04:23.222 222021 DEBUG nova.network.neutron [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:04:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:23.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:24 np0005593233 nova_compute[222017]: 2026-01-23 11:04:24.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:24 np0005593233 nova_compute[222017]: 2026-01-23 11:04:24.422 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:24 np0005593233 nova_compute[222017]: 2026-01-23 11:04:24.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:24 np0005593233 nova_compute[222017]: 2026-01-23 11:04:24.423 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:24 np0005593233 nova_compute[222017]: 2026-01-23 11:04:24.423 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:04:24 np0005593233 nova_compute[222017]: 2026-01-23 11:04:24.423 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:24.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:24 np0005593233 nova_compute[222017]: 2026-01-23 11:04:24.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:04:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/820329585' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:04:24 np0005593233 nova_compute[222017]: 2026-01-23 11:04:24.944 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.015 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.016 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000d8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.176 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.178 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4069MB free_disk=20.942890167236328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.178 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.178 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.235 222021 INFO nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating resource usage from migration 591a7f31-28ee-4ebf-96f8-82c1aa98c447#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.273 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Migration 591a7f31-28ee-4ebf-96f8-82c1aa98c447 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.273 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.273 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.327 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.678 222021 DEBUG nova.network.neutron [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.694 222021 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:04:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:04:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/468712538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.779 222021 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.780 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Creating file /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/9c932153dadd46fd977070f26c818a29.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.780 222021 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/9c932153dadd46fd977070f26c818a29.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.815 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.828 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.846 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.868 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:04:25 np0005593233 nova_compute[222017]: 2026-01-23 11:04:25.869 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:25.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:26 np0005593233 nova_compute[222017]: 2026-01-23 11:04:26.240 222021 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/9c932153dadd46fd977070f26c818a29.tmp" returned: 1 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:26 np0005593233 nova_compute[222017]: 2026-01-23 11:04:26.241 222021 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/9c932153dadd46fd977070f26c818a29.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 06:04:26 np0005593233 nova_compute[222017]: 2026-01-23 11:04:26.241 222021 DEBUG nova.virt.libvirt.volume.remotefs [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Creating directory /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 06:04:26 np0005593233 nova_compute[222017]: 2026-01-23 11:04:26.242 222021 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:26 np0005593233 nova_compute[222017]: 2026-01-23 11:04:26.284 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:26.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:26 np0005593233 nova_compute[222017]: 2026-01-23 11:04:26.470 222021 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:26 np0005593233 nova_compute[222017]: 2026-01-23 11:04:26.475 222021 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 06:04:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:27.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:28.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:29 np0005593233 nova_compute[222017]: 2026-01-23 11:04:29.678 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:29 np0005593233 kernel: tapfa33e1ff-e0 (unregistering): left promiscuous mode
Jan 23 06:04:29 np0005593233 NetworkManager[48871]: <info>  [1769166269.8276] device (tapfa33e1ff-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:04:29 np0005593233 ovn_controller[130653]: 2026-01-23T11:04:29Z|00919|binding|INFO|Releasing lport fa33e1ff-e04f-4862-a822-18bec48babca from this chassis (sb_readonly=0)
Jan 23 06:04:29 np0005593233 ovn_controller[130653]: 2026-01-23T11:04:29Z|00920|binding|INFO|Setting lport fa33e1ff-e04f-4862-a822-18bec48babca down in Southbound
Jan 23 06:04:29 np0005593233 nova_compute[222017]: 2026-01-23 11:04:29.843 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:29 np0005593233 ovn_controller[130653]: 2026-01-23T11:04:29Z|00921|binding|INFO|Removing iface tapfa33e1ff-e0 ovn-installed in OVS
Jan 23 06:04:29 np0005593233 nova_compute[222017]: 2026-01-23 11:04:29.871 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:29 np0005593233 nova_compute[222017]: 2026-01-23 11:04:29.872 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:29 np0005593233 nova_compute[222017]: 2026-01-23 11:04:29.874 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:29 np0005593233 nova_compute[222017]: 2026-01-23 11:04:29.874 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:29 np0005593233 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000d8.scope: Deactivated successfully.
Jan 23 06:04:29 np0005593233 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000d8.scope: Consumed 16.447s CPU time.
Jan 23 06:04:29 np0005593233 systemd-machined[190954]: Machine qemu-98-instance-000000d8 terminated.
Jan 23 06:04:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:29.911 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:d3:f5 10.100.0.8'], port_security=['fa:16:3e:93:d3:f5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '15466683-985e-412a-b13a-037d70f393ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '05cfd882-0f82-42fb-b766-52bbef7ec922', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e248dcd5-89a8-4e77-af01-5f9fab92dfca, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=fa33e1ff-e04f-4862-a822-18bec48babca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:04:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:29.913 140224 INFO neutron.agent.ovn.metadata.agent [-] Port fa33e1ff-e04f-4862-a822-18bec48babca in datapath 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 unbound from our chassis#033[00m
Jan 23 06:04:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:29.914 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:04:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:29.916 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d8897c1a-bbae-4ba0-b624-550e0b045618]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:29 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:29.917 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 namespace which is not needed anymore#033[00m
Jan 23 06:04:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:04:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:29.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.046 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Triggering sync for uuid 15466683-985e-412a-b13a-037d70f393ef _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.048 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.049 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "15466683-985e-412a-b13a-037d70f393ef" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.049 222021 INFO nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] During sync_power_state the instance has a pending task (resize_migrating). Skip.#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.050 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "15466683-985e-412a-b13a-037d70f393ef" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:30 np0005593233 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[312198]: [NOTICE]   (312202) : haproxy version is 2.8.14-c23fe91
Jan 23 06:04:30 np0005593233 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[312198]: [NOTICE]   (312202) : path to executable is /usr/sbin/haproxy
Jan 23 06:04:30 np0005593233 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[312198]: [WARNING]  (312202) : Exiting Master process...
Jan 23 06:04:30 np0005593233 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[312198]: [ALERT]    (312202) : Current worker (312204) exited with code 143 (Terminated)
Jan 23 06:04:30 np0005593233 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[312198]: [WARNING]  (312202) : All workers exited. Exiting... (0)
Jan 23 06:04:30 np0005593233 systemd[1]: libpod-d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2.scope: Deactivated successfully.
Jan 23 06:04:30 np0005593233 podman[312465]: 2026-01-23 11:04:30.118101417 +0000 UTC m=+0.067152520 container died d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 06:04:30 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2-userdata-shm.mount: Deactivated successfully.
Jan 23 06:04:30 np0005593233 systemd[1]: var-lib-containers-storage-overlay-b1476e069496228ec41dc559e3f7123de5e26fa3af37336f3453b6d28e4bcff8-merged.mount: Deactivated successfully.
Jan 23 06:04:30 np0005593233 podman[312465]: 2026-01-23 11:04:30.165254403 +0000 UTC m=+0.114305456 container cleanup d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 06:04:30 np0005593233 systemd[1]: libpod-conmon-d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2.scope: Deactivated successfully.
Jan 23 06:04:30 np0005593233 podman[312504]: 2026-01-23 11:04:30.233317028 +0000 UTC m=+0.042911809 container remove d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:04:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:30.238 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[805e6ca2-799e-43bb-a3ab-c19311a2717b]: (4, ('Fri Jan 23 11:04:30 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 (d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2)\nd68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2\nFri Jan 23 11:04:30 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 (d68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2)\nd68f0733571215f6d9bf0297e5db7daf6134831e27f34fe5a9d0e4071d85f4b2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:30.240 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a57624a0-1d18-4f91-be90-970c1ef35a6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:30.241 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ce7eb8b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.243 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:30 np0005593233 kernel: tap3ce7eb8b-e0: left promiscuous mode
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.258 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:30.261 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[64593a7b-06cc-4d8f-a10d-e751b82d1e69]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:30.276 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[07aa1873-bf9e-4fe9-81cb-1426d3086946]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:30.278 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8b7dda9b-84a1-4f0f-83f5-702d5761ada9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:30.300 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b044aa93-cff1-49e0-a463-6093260ec9e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1015181, 'reachable_time': 19003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312523, 'error': None, 'target': 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:30 np0005593233 systemd[1]: run-netns-ovnmeta\x2d3ce7eb8b\x2de719\x2d4e00\x2dbbf7\x2d177b1f60cd38.mount: Deactivated successfully.
Jan 23 06:04:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:30.305 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:04:30 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:30.306 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[be7585ef-6274-46d9-8d0f-de2ba7bddadc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.410 222021 DEBUG nova.compute.manager [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-unplugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.410 222021 DEBUG oslo_concurrency.lockutils [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.411 222021 DEBUG oslo_concurrency.lockutils [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.411 222021 DEBUG oslo_concurrency.lockutils [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.411 222021 DEBUG nova.compute.manager [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-unplugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.411 222021 WARNING nova.compute.manager [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-unplugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 23 06:04:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:30.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:30 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.501 222021 INFO nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance shutdown successfully after 4 seconds.#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.507 222021 INFO nova.virt.libvirt.driver [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance destroyed successfully.#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.508 222021 DEBUG nova.virt.libvirt.vif [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1089282985',display_name='tempest-TestNetworkAdvancedServerOps-server-1089282985',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1089282985',id=216,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEQHldo3iXIUHcOmtxaL7zVrpPDPH1Yesk4w7Ms5SWolpItN2rDCNRTv1dU1IjdebJkV+f//XdfFq7rpNDTnYMRAq+vDfd2aGH28+aEe0zfJXXxcRZnPFq5MH+XPzShNwQ==',key_name='tempest-TestNetworkAdvancedServerOps-1714633987',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:03:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-i3ccnlqn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:04:22Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=15466683-985e-412a-b13a-037d70f393ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1509915587", "vif_mac": "fa:16:3e:93:d3:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.508 222021 DEBUG nova.network.os_vif_util [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1509915587", "vif_mac": "fa:16:3e:93:d3:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.509 222021 DEBUG nova.network.os_vif_util [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.509 222021 DEBUG os_vif [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.511 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.511 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa33e1ff-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.512 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.515 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.518 222021 INFO os_vif [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0')#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.524 222021 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] skipping disk for instance-000000d8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.524 222021 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] skipping disk for instance-000000d8 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:04:30 np0005593233 nova_compute[222017]: 2026-01-23 11:04:30.677 222021 DEBUG neutronclient.v2_0.client [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port fa33e1ff-e04f-4862-a822-18bec48babca for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 06:04:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:31 np0005593233 nova_compute[222017]: 2026-01-23 11:04:31.802 222021 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:31 np0005593233 nova_compute[222017]: 2026-01-23 11:04:31.802 222021 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:31 np0005593233 nova_compute[222017]: 2026-01-23 11:04:31.803 222021 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:31.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:32 np0005593233 nova_compute[222017]: 2026-01-23 11:04:32.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:32.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:32 np0005593233 nova_compute[222017]: 2026-01-23 11:04:32.565 222021 DEBUG nova.compute.manager [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:32 np0005593233 nova_compute[222017]: 2026-01-23 11:04:32.565 222021 DEBUG oslo_concurrency.lockutils [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:32 np0005593233 nova_compute[222017]: 2026-01-23 11:04:32.566 222021 DEBUG oslo_concurrency.lockutils [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:32 np0005593233 nova_compute[222017]: 2026-01-23 11:04:32.566 222021 DEBUG oslo_concurrency.lockutils [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:32 np0005593233 nova_compute[222017]: 2026-01-23 11:04:32.566 222021 DEBUG nova.compute.manager [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:04:32 np0005593233 nova_compute[222017]: 2026-01-23 11:04:32.567 222021 WARNING nova.compute.manager [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 06:04:33 np0005593233 nova_compute[222017]: 2026-01-23 11:04:33.378 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:33.379 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=97, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=96) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:04:33 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:33.380 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:04:33 np0005593233 nova_compute[222017]: 2026-01-23 11:04:33.757 222021 DEBUG nova.compute.manager [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:33 np0005593233 nova_compute[222017]: 2026-01-23 11:04:33.758 222021 DEBUG nova.compute.manager [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing instance network info cache due to event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:04:33 np0005593233 nova_compute[222017]: 2026-01-23 11:04:33.758 222021 DEBUG oslo_concurrency.lockutils [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:04:33 np0005593233 nova_compute[222017]: 2026-01-23 11:04:33.759 222021 DEBUG oslo_concurrency.lockutils [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:04:33 np0005593233 nova_compute[222017]: 2026-01-23 11:04:33.759 222021 DEBUG nova.network.neutron [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:04:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:33.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:04:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:34 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:04:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:04:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:34.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:04:34 np0005593233 nova_compute[222017]: 2026-01-23 11:04:34.679 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:35 np0005593233 podman[312524]: 2026-01-23 11:04:35.105405417 +0000 UTC m=+0.118945587 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 06:04:35 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:35.382 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '97'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:35 np0005593233 nova_compute[222017]: 2026-01-23 11:04:35.513 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:35.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:35 np0005593233 nova_compute[222017]: 2026-01-23 11:04:35.983 222021 DEBUG nova.network.neutron [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updated VIF entry in instance network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:04:35 np0005593233 nova_compute[222017]: 2026-01-23 11:04:35.984 222021 DEBUG nova.network.neutron [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:04:36 np0005593233 nova_compute[222017]: 2026-01-23 11:04:36.009 222021 DEBUG oslo_concurrency.lockutils [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:04:36 np0005593233 nova_compute[222017]: 2026-01-23 11:04:36.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:36.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e425 e425: 3 total, 3 up, 3 in
Jan 23 06:04:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:37.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:38.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:39 np0005593233 nova_compute[222017]: 2026-01-23 11:04:39.682 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:39.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:40 np0005593233 nova_compute[222017]: 2026-01-23 11:04:40.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:40 np0005593233 nova_compute[222017]: 2026-01-23 11:04:40.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:40.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:40 np0005593233 nova_compute[222017]: 2026-01-23 11:04:40.516 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:41.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:42.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:42 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:42.728 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:42.729 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:04:42.730 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.786 222021 DEBUG nova.compute.manager [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.786 222021 DEBUG oslo_concurrency.lockutils [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.787 222021 DEBUG oslo_concurrency.lockutils [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.787 222021 DEBUG oslo_concurrency.lockutils [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.787 222021 DEBUG nova.compute.manager [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.787 222021 WARNING nova.compute.manager [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state resized and task_state None.#033[00m
Jan 23 06:04:43 np0005593233 nova_compute[222017]: 2026-01-23 11:04:43.808 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:04:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:43.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:44.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:44 np0005593233 nova_compute[222017]: 2026-01-23 11:04:44.685 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:44 np0005593233 nova_compute[222017]: 2026-01-23 11:04:44.959 222021 DEBUG oslo_concurrency.lockutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:44 np0005593233 nova_compute[222017]: 2026-01-23 11:04:44.960 222021 DEBUG oslo_concurrency.lockutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:44 np0005593233 nova_compute[222017]: 2026-01-23 11:04:44.960 222021 DEBUG nova.compute.manager [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Going to confirm migration 24 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 23 06:04:45 np0005593233 nova_compute[222017]: 2026-01-23 11:04:45.079 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166270.0779831, 15466683-985e-412a-b13a-037d70f393ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:04:45 np0005593233 nova_compute[222017]: 2026-01-23 11:04:45.079 222021 INFO nova.compute.manager [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:04:45 np0005593233 nova_compute[222017]: 2026-01-23 11:04:45.254 222021 DEBUG nova.compute.manager [None req-f5bbe915-820e-4732-b377-69022e08b34a - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:04:45 np0005593233 nova_compute[222017]: 2026-01-23 11:04:45.257 222021 DEBUG nova.compute.manager [None req-f5bbe915-820e-4732-b377-69022e08b34a - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:04:45 np0005593233 nova_compute[222017]: 2026-01-23 11:04:45.518 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:45 np0005593233 nova_compute[222017]: 2026-01-23 11:04:45.622 222021 INFO nova.compute.manager [None req-f5bbe915-820e-4732-b377-69022e08b34a - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Jan 23 06:04:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:45.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:46.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.736 222021 DEBUG neutronclient.v2_0.client [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port fa33e1ff-e04f-4862-a822-18bec48babca for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.738 222021 DEBUG oslo_concurrency.lockutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.738 222021 DEBUG oslo_concurrency.lockutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.738 222021 DEBUG nova.network.neutron [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.738 222021 DEBUG nova.objects.instance [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'info_cache' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:04:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.825 222021 DEBUG nova.compute.manager [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.825 222021 DEBUG oslo_concurrency.lockutils [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.826 222021 DEBUG oslo_concurrency.lockutils [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.826 222021 DEBUG oslo_concurrency.lockutils [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.826 222021 DEBUG nova.compute.manager [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:04:46 np0005593233 nova_compute[222017]: 2026-01-23 11:04:46.826 222021 WARNING nova.compute.manager [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state resized and task_state None.#033[00m
Jan 23 06:04:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:47.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:04:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:48.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:04:49 np0005593233 nova_compute[222017]: 2026-01-23 11:04:49.687 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:49.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:50.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:50 np0005593233 nova_compute[222017]: 2026-01-23 11:04:50.520 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:50 np0005593233 nova_compute[222017]: 2026-01-23 11:04:50.691 222021 DEBUG nova.network.neutron [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:04:51 np0005593233 podman[312602]: 2026-01-23 11:04:51.099980657 +0000 UTC m=+0.106278980 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 06:04:51 np0005593233 nova_compute[222017]: 2026-01-23 11:04:51.709 222021 DEBUG oslo_concurrency.lockutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:04:51 np0005593233 nova_compute[222017]: 2026-01-23 11:04:51.710 222021 DEBUG nova.objects.instance [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:04:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:51 np0005593233 nova_compute[222017]: 2026-01-23 11:04:51.968 222021 DEBUG nova.storage.rbd_utils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] removing snapshot(nova-resize) on rbd image(15466683-985e-412a-b13a-037d70f393ef_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 06:04:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:51.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:52.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e426 e426: 3 total, 3 up, 3 in
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.597 222021 DEBUG nova.virt.libvirt.vif [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1089282985',display_name='tempest-TestNetworkAdvancedServerOps-server-1089282985',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1089282985',id=216,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEQHldo3iXIUHcOmtxaL7zVrpPDPH1Yesk4w7Ms5SWolpItN2rDCNRTv1dU1IjdebJkV+f//XdfFq7rpNDTnYMRAq+vDfd2aGH28+aEe0zfJXXxcRZnPFq5MH+XPzShNwQ==',key_name='tempest-TestNetworkAdvancedServerOps-1714633987',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-i3ccnlqn',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:04:41Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=15466683-985e-412a-b13a-037d70f393ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.598 222021 DEBUG nova.network.os_vif_util [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.599 222021 DEBUG nova.network.os_vif_util [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.600 222021 DEBUG os_vif [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.603 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa33e1ff-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.603 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.606 222021 INFO os_vif [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0')#033[00m
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.606 222021 DEBUG oslo_concurrency.lockutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:52 np0005593233 nova_compute[222017]: 2026-01-23 11:04:52.607 222021 DEBUG oslo_concurrency.lockutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:53 np0005593233 nova_compute[222017]: 2026-01-23 11:04:53.013 222021 DEBUG oslo_concurrency.processutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:04:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3071460998' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:04:53 np0005593233 nova_compute[222017]: 2026-01-23 11:04:53.479 222021 DEBUG oslo_concurrency.processutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:53 np0005593233 nova_compute[222017]: 2026-01-23 11:04:53.490 222021 DEBUG nova.compute.provider_tree [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:04:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:54.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:54 np0005593233 nova_compute[222017]: 2026-01-23 11:04:54.251 222021 DEBUG nova.scheduler.client.report [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:04:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:54.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:54 np0005593233 nova_compute[222017]: 2026-01-23 11:04:54.691 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:55 np0005593233 nova_compute[222017]: 2026-01-23 11:04:55.067 222021 DEBUG oslo_concurrency.lockutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 2.460s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:55 np0005593233 nova_compute[222017]: 2026-01-23 11:04:55.237 222021 INFO nova.scheduler.client.report [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocation for migration 591a7f31-28ee-4ebf-96f8-82c1aa98c447#033[00m
Jan 23 06:04:55 np0005593233 nova_compute[222017]: 2026-01-23 11:04:55.522 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:55 np0005593233 nova_compute[222017]: 2026-01-23 11:04:55.864 222021 DEBUG oslo_concurrency.lockutils [None req-2a49e74c-7dbe-421a-96c8-2377378ce17e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 10.904s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:56.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:56.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:04:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:58.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:04:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:04:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:58.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:59 np0005593233 nova_compute[222017]: 2026-01-23 11:04:59.707 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:59 np0005593233 ovn_controller[130653]: 2026-01-23T11:04:59Z|00922|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Jan 23 06:05:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:00.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:00.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:00 np0005593233 nova_compute[222017]: 2026-01-23 11:05:00.525 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:02.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:02.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 e427: 3 total, 3 up, 3 in
Jan 23 06:05:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:04.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:04.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:04 np0005593233 nova_compute[222017]: 2026-01-23 11:05:04.774 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:05 np0005593233 nova_compute[222017]: 2026-01-23 11:05:05.528 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:05:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:06.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:05:06 np0005593233 podman[312680]: 2026-01-23 11:05:06.145691915 +0000 UTC m=+0.147504410 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:05:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:06.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:08.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:08.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:08 np0005593233 nova_compute[222017]: 2026-01-23 11:05:08.802 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:09 np0005593233 nova_compute[222017]: 2026-01-23 11:05:09.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:10.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:10.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:10 np0005593233 nova_compute[222017]: 2026-01-23 11:05:10.531 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:05:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:12.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:05:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:12.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:14.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:14 np0005593233 nova_compute[222017]: 2026-01-23 11:05:14.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:14.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:14 np0005593233 nova_compute[222017]: 2026-01-23 11:05:14.780 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:05:14.805 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=98, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=97) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:05:14 np0005593233 nova_compute[222017]: 2026-01-23 11:05:14.805 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:14 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:05:14.806 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:05:15 np0005593233 nova_compute[222017]: 2026-01-23 11:05:15.534 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:16.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:16.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:05:16.810 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '98'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:05:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:18.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:18.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:18 np0005593233 nova_compute[222017]: 2026-01-23 11:05:18.744 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:18 np0005593233 nova_compute[222017]: 2026-01-23 11:05:18.818 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:19 np0005593233 nova_compute[222017]: 2026-01-23 11:05:19.783 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:20.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:20.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:20 np0005593233 nova_compute[222017]: 2026-01-23 11:05:20.536 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:22.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:22 np0005593233 podman[312709]: 2026-01-23 11:05:22.054452152 +0000 UTC m=+0.063753705 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:05:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:05:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:22.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:05:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:24.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:24.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:24 np0005593233 nova_compute[222017]: 2026-01-23 11:05:24.785 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:24 np0005593233 nova_compute[222017]: 2026-01-23 11:05:24.809 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:24 np0005593233 nova_compute[222017]: 2026-01-23 11:05:24.855 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:24 np0005593233 nova_compute[222017]: 2026-01-23 11:05:24.856 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:24 np0005593233 nova_compute[222017]: 2026-01-23 11:05:24.856 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:24 np0005593233 nova_compute[222017]: 2026-01-23 11:05:24.856 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:05:24 np0005593233 nova_compute[222017]: 2026-01-23 11:05:24.857 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:05:25 np0005593233 nova_compute[222017]: 2026-01-23 11:05:25.340 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:05:25 np0005593233 nova_compute[222017]: 2026-01-23 11:05:25.536 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:05:25 np0005593233 nova_compute[222017]: 2026-01-23 11:05:25.537 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4284MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:05:25 np0005593233 nova_compute[222017]: 2026-01-23 11:05:25.538 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:25 np0005593233 nova_compute[222017]: 2026-01-23 11:05:25.538 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:25 np0005593233 nova_compute[222017]: 2026-01-23 11:05:25.539 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:25 np0005593233 nova_compute[222017]: 2026-01-23 11:05:25.935 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:05:25 np0005593233 nova_compute[222017]: 2026-01-23 11:05:25.936 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:05:25 np0005593233 nova_compute[222017]: 2026-01-23 11:05:25.963 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:05:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:26.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:26.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:05:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3741063465' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:05:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:27 np0005593233 nova_compute[222017]: 2026-01-23 11:05:27.087 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:05:27 np0005593233 nova_compute[222017]: 2026-01-23 11:05:27.097 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:05:27 np0005593233 nova_compute[222017]: 2026-01-23 11:05:27.182 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:05:27 np0005593233 nova_compute[222017]: 2026-01-23 11:05:27.382 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:05:27 np0005593233 nova_compute[222017]: 2026-01-23 11:05:27.382 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:28.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:28.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:29 np0005593233 nova_compute[222017]: 2026-01-23 11:05:29.789 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:30.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:30 np0005593233 nova_compute[222017]: 2026-01-23 11:05:30.541 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:30.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:30 np0005593233 nova_compute[222017]: 2026-01-23 11:05:30.959 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:30 np0005593233 nova_compute[222017]: 2026-01-23 11:05:30.960 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:31 np0005593233 nova_compute[222017]: 2026-01-23 11:05:31.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:31 np0005593233 nova_compute[222017]: 2026-01-23 11:05:31.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 06:05:31 np0005593233 nova_compute[222017]: 2026-01-23 11:05:31.427 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 06:05:31 np0005593233 nova_compute[222017]: 2026-01-23 11:05:31.428 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:31 np0005593233 nova_compute[222017]: 2026-01-23 11:05:31.428 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 06:05:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:32.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:32 np0005593233 nova_compute[222017]: 2026-01-23 11:05:32.448 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:32 np0005593233 nova_compute[222017]: 2026-01-23 11:05:32.449 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:05:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:32.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:34.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:34 np0005593233 nova_compute[222017]: 2026-01-23 11:05:34.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:34.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:34 np0005593233 nova_compute[222017]: 2026-01-23 11:05:34.793 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:35 np0005593233 nova_compute[222017]: 2026-01-23 11:05:35.609 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:36.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:36.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:37 np0005593233 podman[312773]: 2026-01-23 11:05:37.10011565 +0000 UTC m=+0.110571322 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 06:05:37 np0005593233 nova_compute[222017]: 2026-01-23 11:05:37.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:38.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:05:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:38.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:05:39 np0005593233 nova_compute[222017]: 2026-01-23 11:05:39.798 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:40.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:40 np0005593233 nova_compute[222017]: 2026-01-23 11:05:40.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:40.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:40 np0005593233 nova_compute[222017]: 2026-01-23 11:05:40.613 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:41 np0005593233 nova_compute[222017]: 2026-01-23 11:05:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:42.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:42.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:05:42.730 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:05:42.731 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:05:42.731 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:44.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:05:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:44 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:05:44 np0005593233 nova_compute[222017]: 2026-01-23 11:05:44.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:44 np0005593233 nova_compute[222017]: 2026-01-23 11:05:44.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:05:44 np0005593233 nova_compute[222017]: 2026-01-23 11:05:44.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:05:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:44.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:44 np0005593233 nova_compute[222017]: 2026-01-23 11:05:44.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:45 np0005593233 nova_compute[222017]: 2026-01-23 11:05:45.616 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:46.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:46.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:48.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:48.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:48 np0005593233 nova_compute[222017]: 2026-01-23 11:05:48.745 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:05:49 np0005593233 nova_compute[222017]: 2026-01-23 11:05:49.802 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:50.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:05:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:50.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:05:50 np0005593233 nova_compute[222017]: 2026-01-23 11:05:50.619 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:52.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:52.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:53 np0005593233 podman[312930]: 2026-01-23 11:05:53.047103 +0000 UTC m=+0.061780859 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 06:05:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:54.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:05:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:54.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:05:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:54 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:54 np0005593233 nova_compute[222017]: 2026-01-23 11:05:54.805 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:55 np0005593233 nova_compute[222017]: 2026-01-23 11:05:55.667 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:56.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:05:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:56.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:05:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:05:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:58.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:05:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:05:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:58.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:59 np0005593233 nova_compute[222017]: 2026-01-23 11:05:59.808 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:00.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:00 np0005593233 nova_compute[222017]: 2026-01-23 11:06:00.670 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:00.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:02.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:02 np0005593233 ovn_controller[130653]: 2026-01-23T11:06:02Z|00923|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 06:06:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:02.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:04.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #208. Immutable memtables: 0.
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.230647) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 133] Flushing memtable with next log file: 208
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364230751, "job": 133, "event": "flush_started", "num_memtables": 1, "num_entries": 1401, "num_deletes": 258, "total_data_size": 3163171, "memory_usage": 3222544, "flush_reason": "Manual Compaction"}
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 133] Level-0 flush table #209: started
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364248161, "cf_name": "default", "job": 133, "event": "table_file_creation", "file_number": 209, "file_size": 2056536, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 99093, "largest_seqno": 100489, "table_properties": {"data_size": 2050540, "index_size": 3262, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12993, "raw_average_key_size": 19, "raw_value_size": 2038280, "raw_average_value_size": 3107, "num_data_blocks": 144, "num_entries": 656, "num_filter_entries": 656, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166253, "oldest_key_time": 1769166253, "file_creation_time": 1769166364, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 209, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 133] Flush lasted 17549 microseconds, and 7068 cpu microseconds.
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.248218) [db/flush_job.cc:967] [default] [JOB 133] Level-0 flush table #209: 2056536 bytes OK
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.248245) [db/memtable_list.cc:519] [default] Level-0 commit table #209 started
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.249734) [db/memtable_list.cc:722] [default] Level-0 commit table #209: memtable #1 done
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.249751) EVENT_LOG_v1 {"time_micros": 1769166364249746, "job": 133, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.249778) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 133] Try to delete WAL files size 3156466, prev total WAL file size 3172603, number of live WAL files 2.
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000205.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.251472) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303231' seq:72057594037927935, type:22 .. '6C6F676D0034323734' seq:0, type:0; will stop at (end)
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 134] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 133 Base level 0, inputs: [209(2008KB)], [207(13MB)]
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364251574, "job": 134, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [209], "files_L6": [207], "score": -1, "input_data_size": 16375229, "oldest_snapshot_seqno": -1}
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 134] Generated table #210: 11737 keys, 16240537 bytes, temperature: kUnknown
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364497090, "cf_name": "default", "job": 134, "event": "table_file_creation", "file_number": 210, "file_size": 16240537, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16163825, "index_size": 46263, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29381, "raw_key_size": 310042, "raw_average_key_size": 26, "raw_value_size": 15957928, "raw_average_value_size": 1359, "num_data_blocks": 1766, "num_entries": 11737, "num_filter_entries": 11737, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769166364, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 210, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.497602) [db/compaction/compaction_job.cc:1663] [default] [JOB 134] Compacted 1@0 + 1@6 files to L6 => 16240537 bytes
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.499250) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 66.6 rd, 66.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 13.7 +0.0 blob) out(15.5 +0.0 blob), read-write-amplify(15.9) write-amplify(7.9) OK, records in: 12272, records dropped: 535 output_compression: NoCompression
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.499277) EVENT_LOG_v1 {"time_micros": 1769166364499262, "job": 134, "event": "compaction_finished", "compaction_time_micros": 245720, "compaction_time_cpu_micros": 43174, "output_level": 6, "num_output_files": 1, "total_output_size": 16240537, "num_input_records": 12272, "num_output_records": 11737, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000209.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364499932, "job": 134, "event": "table_file_deletion", "file_number": 209}
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000207.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364504264, "job": 134, "event": "table_file_deletion", "file_number": 207}
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.251240) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.504439) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.504446) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.504448) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.504450) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:04.504452) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:06:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:04.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:06:04 np0005593233 nova_compute[222017]: 2026-01-23 11:06:04.822 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:05 np0005593233 nova_compute[222017]: 2026-01-23 11:06:05.672 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:06.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:06.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:08.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:08 np0005593233 podman[313000]: 2026-01-23 11:06:08.144244606 +0000 UTC m=+0.159050815 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 06:06:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:08.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:09 np0005593233 nova_compute[222017]: 2026-01-23 11:06:09.825 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:06:09.967 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=99, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=98) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:06:09 np0005593233 nova_compute[222017]: 2026-01-23 11:06:09.967 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:06:09.969 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:06:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:10.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:10 np0005593233 nova_compute[222017]: 2026-01-23 11:06:10.674 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:10.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:12.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:12.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:14.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:14.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:14 np0005593233 nova_compute[222017]: 2026-01-23 11:06:14.827 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:15 np0005593233 nova_compute[222017]: 2026-01-23 11:06:15.676 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:16.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:16.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:06:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:18.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:06:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:18.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:18 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:06:18.970 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '99'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:19 np0005593233 nova_compute[222017]: 2026-01-23 11:06:19.859 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:20.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:20 np0005593233 nova_compute[222017]: 2026-01-23 11:06:20.678 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:20.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:22.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:22.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #211. Immutable memtables: 0.
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.055183) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 135] Flushing memtable with next log file: 211
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384055354, "job": 135, "event": "flush_started", "num_memtables": 1, "num_entries": 435, "num_deletes": 256, "total_data_size": 555805, "memory_usage": 564912, "flush_reason": "Manual Compaction"}
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 135] Level-0 flush table #212: started
Jan 23 06:06:24 np0005593233 podman[313026]: 2026-01-23 11:06:24.060100332 +0000 UTC m=+0.067633374 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384061995, "cf_name": "default", "job": 135, "event": "table_file_creation", "file_number": 212, "file_size": 290117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 100494, "largest_seqno": 100924, "table_properties": {"data_size": 287744, "index_size": 472, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6387, "raw_average_key_size": 20, "raw_value_size": 283030, "raw_average_value_size": 898, "num_data_blocks": 21, "num_entries": 315, "num_filter_entries": 315, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166364, "oldest_key_time": 1769166364, "file_creation_time": 1769166384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 212, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 135] Flush lasted 6893 microseconds, and 2954 cpu microseconds.
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.062070) [db/flush_job.cc:967] [default] [JOB 135] Level-0 flush table #212: 290117 bytes OK
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.062112) [db/memtable_list.cc:519] [default] Level-0 commit table #212 started
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.063739) [db/memtable_list.cc:722] [default] Level-0 commit table #212: memtable #1 done
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.063761) EVENT_LOG_v1 {"time_micros": 1769166384063753, "job": 135, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.063789) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 135] Try to delete WAL files size 553044, prev total WAL file size 553044, number of live WAL files 2.
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000208.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.064611) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353136' seq:72057594037927935, type:22 .. '6D6772737461740033373733' seq:0, type:0; will stop at (end)
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 136] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 135 Base level 0, inputs: [212(283KB)], [210(15MB)]
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384064706, "job": 136, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [212], "files_L6": [210], "score": -1, "input_data_size": 16530654, "oldest_snapshot_seqno": -1}
Jan 23 06:06:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:24.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 136] Generated table #213: 11537 keys, 12676222 bytes, temperature: kUnknown
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384328503, "cf_name": "default", "job": 136, "event": "table_file_creation", "file_number": 213, "file_size": 12676222, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12605564, "index_size": 40721, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28869, "raw_key_size": 306069, "raw_average_key_size": 26, "raw_value_size": 12407879, "raw_average_value_size": 1075, "num_data_blocks": 1535, "num_entries": 11537, "num_filter_entries": 11537, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769166384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 213, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.328828) [db/compaction/compaction_job.cc:1663] [default] [JOB 136] Compacted 1@0 + 1@6 files to L6 => 12676222 bytes
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.331409) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 62.7 rd, 48.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 15.5 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(100.7) write-amplify(43.7) OK, records in: 12052, records dropped: 515 output_compression: NoCompression
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.331466) EVENT_LOG_v1 {"time_micros": 1769166384331446, "job": 136, "event": "compaction_finished", "compaction_time_micros": 263721, "compaction_time_cpu_micros": 40892, "output_level": 6, "num_output_files": 1, "total_output_size": 12676222, "num_input_records": 12052, "num_output_records": 11537, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000212.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384331796, "job": 136, "event": "table_file_deletion", "file_number": 212}
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000210.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384335566, "job": 136, "event": "table_file_deletion", "file_number": 210}
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.064490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.335606) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.335611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.335613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.335615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:06:24.335616) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:24.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:24 np0005593233 nova_compute[222017]: 2026-01-23 11:06:24.862 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:25 np0005593233 nova_compute[222017]: 2026-01-23 11:06:25.680 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:26.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:26 np0005593233 nova_compute[222017]: 2026-01-23 11:06:26.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:26 np0005593233 nova_compute[222017]: 2026-01-23 11:06:26.568 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:26 np0005593233 nova_compute[222017]: 2026-01-23 11:06:26.569 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:26 np0005593233 nova_compute[222017]: 2026-01-23 11:06:26.569 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:26 np0005593233 nova_compute[222017]: 2026-01-23 11:06:26.569 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:06:26 np0005593233 nova_compute[222017]: 2026-01-23 11:06:26.569 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:06:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:06:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:26.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:06:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:06:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2899577754' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.128 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.559s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:06:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.428 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.429 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4267MB free_disk=20.949779510498047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.429 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.429 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.547 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.548 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.624 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.659 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.660 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.685 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.710 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 06:06:27 np0005593233 nova_compute[222017]: 2026-01-23 11:06:27.733 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:06:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:28.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:06:28 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4149931365' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:06:28 np0005593233 nova_compute[222017]: 2026-01-23 11:06:28.246 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:06:28 np0005593233 nova_compute[222017]: 2026-01-23 11:06:28.253 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:06:28 np0005593233 nova_compute[222017]: 2026-01-23 11:06:28.298 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:06:28 np0005593233 nova_compute[222017]: 2026-01-23 11:06:28.300 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:06:28 np0005593233 nova_compute[222017]: 2026-01-23 11:06:28.301 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:28.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:29 np0005593233 nova_compute[222017]: 2026-01-23 11:06:29.897 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:30.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:30 np0005593233 nova_compute[222017]: 2026-01-23 11:06:30.682 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:30.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:32.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:32 np0005593233 nova_compute[222017]: 2026-01-23 11:06:32.301 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:32 np0005593233 nova_compute[222017]: 2026-01-23 11:06:32.301 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:32.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:33 np0005593233 nova_compute[222017]: 2026-01-23 11:06:33.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:33 np0005593233 nova_compute[222017]: 2026-01-23 11:06:33.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:06:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:34.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:34 np0005593233 nova_compute[222017]: 2026-01-23 11:06:34.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:34.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:34 np0005593233 nova_compute[222017]: 2026-01-23 11:06:34.901 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:35 np0005593233 nova_compute[222017]: 2026-01-23 11:06:35.723 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:36.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:36.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:37 np0005593233 nova_compute[222017]: 2026-01-23 11:06:37.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:38.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:38.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:39 np0005593233 podman[313091]: 2026-01-23 11:06:39.089116109 +0000 UTC m=+0.103441260 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Jan 23 06:06:39 np0005593233 nova_compute[222017]: 2026-01-23 11:06:39.903 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:40.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:40 np0005593233 nova_compute[222017]: 2026-01-23 11:06:40.725 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:40.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:42.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:42 np0005593233 nova_compute[222017]: 2026-01-23 11:06:42.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:06:42.732 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:06:42.732 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:06:42.733 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:42.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:43 np0005593233 nova_compute[222017]: 2026-01-23 11:06:43.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:06:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:44.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:06:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:44.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:44 np0005593233 nova_compute[222017]: 2026-01-23 11:06:44.948 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:45 np0005593233 nova_compute[222017]: 2026-01-23 11:06:45.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:45 np0005593233 nova_compute[222017]: 2026-01-23 11:06:45.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:06:45 np0005593233 nova_compute[222017]: 2026-01-23 11:06:45.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:06:45 np0005593233 nova_compute[222017]: 2026-01-23 11:06:45.728 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:45 np0005593233 nova_compute[222017]: 2026-01-23 11:06:45.809 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:06:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:46.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:46.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:48.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:48.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:49 np0005593233 nova_compute[222017]: 2026-01-23 11:06:49.950 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:50.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:50 np0005593233 nova_compute[222017]: 2026-01-23 11:06:50.730 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:50.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:52.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:52.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:54.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:54 np0005593233 podman[313141]: 2026-01-23 11:06:54.259834405 +0000 UTC m=+0.062578592 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 06:06:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:06:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:54.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:06:54 np0005593233 nova_compute[222017]: 2026-01-23 11:06:54.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:55 np0005593233 nova_compute[222017]: 2026-01-23 11:06:55.753 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:56.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:56.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:58 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:06:58 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:06:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:58.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:06:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:58.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:59 np0005593233 nova_compute[222017]: 2026-01-23 11:06:59.956 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:07:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:00.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:07:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:07:00 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:07:00 np0005593233 nova_compute[222017]: 2026-01-23 11:07:00.756 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:00.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:01 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:07:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:02.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:02.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:04.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:04.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:04 np0005593233 nova_compute[222017]: 2026-01-23 11:07:04.961 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:05 np0005593233 nova_compute[222017]: 2026-01-23 11:07:05.758 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:06.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:06.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:08.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:07:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:08.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:07:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:07:09.458 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=100, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=99) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:07:09 np0005593233 nova_compute[222017]: 2026-01-23 11:07:09.458 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:07:09.459 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:07:09 np0005593233 nova_compute[222017]: 2026-01-23 11:07:09.804 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:09 np0005593233 nova_compute[222017]: 2026-01-23 11:07:09.962 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:10 np0005593233 podman[313386]: 2026-01-23 11:07:10.111170435 +0000 UTC m=+0.108098912 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 06:07:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:10.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:10 np0005593233 nova_compute[222017]: 2026-01-23 11:07:10.760 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:07:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:10.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:07:11 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:07:11.462 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '100'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:12.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:07:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:07:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:12.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:14.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:14.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:14 np0005593233 nova_compute[222017]: 2026-01-23 11:07:14.965 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:15 np0005593233 nova_compute[222017]: 2026-01-23 11:07:15.762 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:16.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:16.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:07:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:18.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:07:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:18.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:19 np0005593233 nova_compute[222017]: 2026-01-23 11:07:19.968 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:07:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:20.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:07:20 np0005593233 nova_compute[222017]: 2026-01-23 11:07:20.765 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:20.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:22.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:22.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:24.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:24.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:24 np0005593233 nova_compute[222017]: 2026-01-23 11:07:24.971 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:25 np0005593233 podman[313463]: 2026-01-23 11:07:25.079829834 +0000 UTC m=+0.084371164 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 06:07:25 np0005593233 nova_compute[222017]: 2026-01-23 11:07:25.768 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:26.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:26.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:27 np0005593233 nova_compute[222017]: 2026-01-23 11:07:27.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:28.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:28 np0005593233 nova_compute[222017]: 2026-01-23 11:07:28.406 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:28 np0005593233 nova_compute[222017]: 2026-01-23 11:07:28.407 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:28 np0005593233 nova_compute[222017]: 2026-01-23 11:07:28.407 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:28 np0005593233 nova_compute[222017]: 2026-01-23 11:07:28.408 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:07:28 np0005593233 nova_compute[222017]: 2026-01-23 11:07:28.408 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:28.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:28 np0005593233 nova_compute[222017]: 2026-01-23 11:07:28.928 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.124 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.126 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4277MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.126 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.127 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.210 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.211 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.228 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:07:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1156905020' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.826 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.598s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.836 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:07:29 np0005593233 nova_compute[222017]: 2026-01-23 11:07:29.973 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:30.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:30 np0005593233 nova_compute[222017]: 2026-01-23 11:07:30.364 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:07:30 np0005593233 nova_compute[222017]: 2026-01-23 11:07:30.366 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:07:30 np0005593233 nova_compute[222017]: 2026-01-23 11:07:30.367 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:30 np0005593233 nova_compute[222017]: 2026-01-23 11:07:30.770 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:30.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:07:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:32.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:07:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:32.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:33 np0005593233 nova_compute[222017]: 2026-01-23 11:07:33.369 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:33 np0005593233 nova_compute[222017]: 2026-01-23 11:07:33.370 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:34.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:34 np0005593233 nova_compute[222017]: 2026-01-23 11:07:34.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:34 np0005593233 nova_compute[222017]: 2026-01-23 11:07:34.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:07:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:34.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:34 np0005593233 nova_compute[222017]: 2026-01-23 11:07:34.975 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:35 np0005593233 nova_compute[222017]: 2026-01-23 11:07:35.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:35 np0005593233 nova_compute[222017]: 2026-01-23 11:07:35.773 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:36.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:07:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:36.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:07:37 np0005593233 nova_compute[222017]: 2026-01-23 11:07:37.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:38.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:07:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:38.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:07:39 np0005593233 nova_compute[222017]: 2026-01-23 11:07:39.978 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:40.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:40 np0005593233 nova_compute[222017]: 2026-01-23 11:07:40.775 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:40.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:41 np0005593233 podman[313526]: 2026-01-23 11:07:41.107107704 +0000 UTC m=+0.109460320 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 06:07:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:07:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:42.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:07:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:07:42.733 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:07:42.733 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:07:42.734 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:42.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:44.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:44 np0005593233 nova_compute[222017]: 2026-01-23 11:07:44.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:44.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:44 np0005593233 nova_compute[222017]: 2026-01-23 11:07:44.980 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:45 np0005593233 nova_compute[222017]: 2026-01-23 11:07:45.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:45 np0005593233 nova_compute[222017]: 2026-01-23 11:07:45.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:07:45 np0005593233 nova_compute[222017]: 2026-01-23 11:07:45.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:07:45 np0005593233 nova_compute[222017]: 2026-01-23 11:07:45.404 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:07:45 np0005593233 nova_compute[222017]: 2026-01-23 11:07:45.404 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:45 np0005593233 nova_compute[222017]: 2026-01-23 11:07:45.777 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:46.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:46.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:48.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #214. Immutable memtables: 0.
Jan 23 06:07:48 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:48.738158) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 137] Flushing memtable with next log file: 214
Jan 23 06:07:48 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166468738205, "job": 137, "event": "flush_started", "num_memtables": 1, "num_entries": 1028, "num_deletes": 251, "total_data_size": 2219871, "memory_usage": 2242176, "flush_reason": "Manual Compaction"}
Jan 23 06:07:48 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 137] Level-0 flush table #215: started
Jan 23 06:07:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:48.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166469028239, "cf_name": "default", "job": 137, "event": "table_file_creation", "file_number": 215, "file_size": 1466995, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 100929, "largest_seqno": 101952, "table_properties": {"data_size": 1462255, "index_size": 2327, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10433, "raw_average_key_size": 19, "raw_value_size": 1452808, "raw_average_value_size": 2772, "num_data_blocks": 102, "num_entries": 524, "num_filter_entries": 524, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166384, "oldest_key_time": 1769166384, "file_creation_time": 1769166468, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 215, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 137] Flush lasted 290155 microseconds, and 4672 cpu microseconds.
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.028305) [db/flush_job.cc:967] [default] [JOB 137] Level-0 flush table #215: 1466995 bytes OK
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.028331) [db/memtable_list.cc:519] [default] Level-0 commit table #215 started
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.128728) [db/memtable_list.cc:722] [default] Level-0 commit table #215: memtable #1 done
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.128803) EVENT_LOG_v1 {"time_micros": 1769166469128787, "job": 137, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.128842) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 137] Try to delete WAL files size 2214704, prev total WAL file size 2214968, number of live WAL files 2.
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000211.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.181970) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039303336' seq:72057594037927935, type:22 .. '7061786F730039323838' seq:0, type:0; will stop at (end)
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 138] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 137 Base level 0, inputs: [215(1432KB)], [213(12MB)]
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166469182034, "job": 138, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [215], "files_L6": [213], "score": -1, "input_data_size": 14143217, "oldest_snapshot_seqno": -1}
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 138] Generated table #216: 11544 keys, 12191321 bytes, temperature: kUnknown
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166469593299, "cf_name": "default", "job": 138, "event": "table_file_creation", "file_number": 216, "file_size": 12191321, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12121070, "index_size": 40314, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28869, "raw_key_size": 306932, "raw_average_key_size": 26, "raw_value_size": 11923645, "raw_average_value_size": 1032, "num_data_blocks": 1512, "num_entries": 11544, "num_filter_entries": 11544, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769166469, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 216, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.593662) [db/compaction/compaction_job.cc:1663] [default] [JOB 138] Compacted 1@0 + 1@6 files to L6 => 12191321 bytes
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.714870) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 34.4 rd, 29.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 12.1 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(18.0) write-amplify(8.3) OK, records in: 12061, records dropped: 517 output_compression: NoCompression
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.714970) EVENT_LOG_v1 {"time_micros": 1769166469714949, "job": 138, "event": "compaction_finished", "compaction_time_micros": 411373, "compaction_time_cpu_micros": 48570, "output_level": 6, "num_output_files": 1, "total_output_size": 12191321, "num_input_records": 12061, "num_output_records": 11544, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000215.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166469715749, "job": 138, "event": "table_file_deletion", "file_number": 215}
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000213.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166469720422, "job": 138, "event": "table_file_deletion", "file_number": 213}
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.181784) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.720502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.720507) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.720508) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.720509) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:49 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:07:49.720511) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:49 np0005593233 nova_compute[222017]: 2026-01-23 11:07:49.982 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:50.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:50 np0005593233 nova_compute[222017]: 2026-01-23 11:07:50.780 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:50.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:52.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:52.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:54.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:54.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:54 np0005593233 nova_compute[222017]: 2026-01-23 11:07:54.985 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:55 np0005593233 nova_compute[222017]: 2026-01-23 11:07:55.782 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:56 np0005593233 podman[313553]: 2026-01-23 11:07:56.054397575 +0000 UTC m=+0.066729778 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:07:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:07:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:56.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:07:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:56.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:58.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:07:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:58.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:00 np0005593233 nova_compute[222017]: 2026-01-23 11:08:00.074 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:00.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:00 np0005593233 nova_compute[222017]: 2026-01-23 11:08:00.784 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:00.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:02.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:02.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:04.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:04.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:05 np0005593233 nova_compute[222017]: 2026-01-23 11:08:05.077 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:05 np0005593233 nova_compute[222017]: 2026-01-23 11:08:05.786 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:06.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:06.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:08.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:08.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:10 np0005593233 nova_compute[222017]: 2026-01-23 11:08:10.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:10.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:10 np0005593233 nova_compute[222017]: 2026-01-23 11:08:10.789 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:10.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:11 np0005593233 podman[313597]: 2026-01-23 11:08:11.741982036 +0000 UTC m=+0.083262713 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 06:08:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:12.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:12.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 06:08:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 06:08:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:13 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:14.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:08:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:14.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:08:15 np0005593233 nova_compute[222017]: 2026-01-23 11:08:15.082 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:15 np0005593233 nova_compute[222017]: 2026-01-23 11:08:15.791 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:08:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:16.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:16.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:08:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:18.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:08:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:18.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:08:20 np0005593233 nova_compute[222017]: 2026-01-23 11:08:20.084 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:20.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:20 np0005593233 nova_compute[222017]: 2026-01-23 11:08:20.793 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:20.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:22.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:22.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:24.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:24.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:25 np0005593233 nova_compute[222017]: 2026-01-23 11:08:25.087 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:25 np0005593233 nova_compute[222017]: 2026-01-23 11:08:25.795 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:26.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:26.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:27 np0005593233 podman[313730]: 2026-01-23 11:08:27.055066207 +0000 UTC m=+0.067856140 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 06:08:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:28.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:28.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:29 np0005593233 nova_compute[222017]: 2026-01-23 11:08:29.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:29 np0005593233 nova_compute[222017]: 2026-01-23 11:08:29.417 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:29 np0005593233 nova_compute[222017]: 2026-01-23 11:08:29.417 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:29 np0005593233 nova_compute[222017]: 2026-01-23 11:08:29.417 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:29 np0005593233 nova_compute[222017]: 2026-01-23 11:08:29.417 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:08:29 np0005593233 nova_compute[222017]: 2026-01-23 11:08:29.418 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:08:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:08:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1685628104' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:08:29 np0005593233 nova_compute[222017]: 2026-01-23 11:08:29.886 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.090 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.117 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.119 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4271MB free_disk=20.942890167236328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.119 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.120 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.250 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.251 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.266 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:08:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:30.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.797 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:08:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2831274365' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.966 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.700s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:08:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:30.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:30 np0005593233 nova_compute[222017]: 2026-01-23 11:08:30.975 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:08:31 np0005593233 nova_compute[222017]: 2026-01-23 11:08:31.001 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:08:31 np0005593233 nova_compute[222017]: 2026-01-23 11:08:31.031 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:08:31 np0005593233 nova_compute[222017]: 2026-01-23 11:08:31.032 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.912s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:08:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:32.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:08:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:32.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:33 np0005593233 nova_compute[222017]: 2026-01-23 11:08:33.031 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:33 np0005593233 nova_compute[222017]: 2026-01-23 11:08:33.032 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:34.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:34.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:35 np0005593233 nova_compute[222017]: 2026-01-23 11:08:35.092 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:35 np0005593233 nova_compute[222017]: 2026-01-23 11:08:35.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:36.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:36 np0005593233 nova_compute[222017]: 2026-01-23 11:08:36.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:36 np0005593233 nova_compute[222017]: 2026-01-23 11:08:36.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:08:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:36.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:37 np0005593233 nova_compute[222017]: 2026-01-23 11:08:37.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:37 np0005593233 nova_compute[222017]: 2026-01-23 11:08:37.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:38.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:38.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:39 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:08:39.005 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=101, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=100) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:08:39 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:08:39.006 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:08:39 np0005593233 nova_compute[222017]: 2026-01-23 11:08:39.018 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:40 np0005593233 nova_compute[222017]: 2026-01-23 11:08:40.097 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:40.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:40 np0005593233 nova_compute[222017]: 2026-01-23 11:08:40.802 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:40.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:42 np0005593233 podman[313844]: 2026-01-23 11:08:42.103170813 +0000 UTC m=+0.113412101 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 06:08:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:42.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:08:42.734 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:08:42.734 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:08:42.734 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:42.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:44.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:44.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:45 np0005593233 nova_compute[222017]: 2026-01-23 11:08:45.099 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:45 np0005593233 nova_compute[222017]: 2026-01-23 11:08:45.805 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:46.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:46 np0005593233 nova_compute[222017]: 2026-01-23 11:08:46.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:47.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:08:47.009 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '101'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:08:47 np0005593233 nova_compute[222017]: 2026-01-23 11:08:47.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:47 np0005593233 nova_compute[222017]: 2026-01-23 11:08:47.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:08:47 np0005593233 nova_compute[222017]: 2026-01-23 11:08:47.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:08:47 np0005593233 nova_compute[222017]: 2026-01-23 11:08:47.683 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:08:47 np0005593233 nova_compute[222017]: 2026-01-23 11:08:47.684 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:08:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:48.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:08:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:49.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:50 np0005593233 nova_compute[222017]: 2026-01-23 11:08:50.149 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:50.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:50 np0005593233 nova_compute[222017]: 2026-01-23 11:08:50.807 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:51.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:52.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:53.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:54.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:08:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:55.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:08:55 np0005593233 nova_compute[222017]: 2026-01-23 11:08:55.151 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:55 np0005593233 nova_compute[222017]: 2026-01-23 11:08:55.809 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:56.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:57.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:58 np0005593233 podman[313870]: 2026-01-23 11:08:58.08985758 +0000 UTC m=+0.098745588 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 06:08:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:58.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:08:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:59.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:00 np0005593233 nova_compute[222017]: 2026-01-23 11:09:00.218 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:00.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:00 np0005593233 nova_compute[222017]: 2026-01-23 11:09:00.811 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:01.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:02.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:03.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:04.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:05.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:05 np0005593233 nova_compute[222017]: 2026-01-23 11:09:05.221 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:05 np0005593233 nova_compute[222017]: 2026-01-23 11:09:05.813 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:06.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:07.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:08.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:09.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:10 np0005593233 nova_compute[222017]: 2026-01-23 11:09:10.224 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:10.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:10 np0005593233 nova_compute[222017]: 2026-01-23 11:09:10.815 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:11.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:12.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:13.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:13 np0005593233 podman[313890]: 2026-01-23 11:09:13.238842844 +0000 UTC m=+0.234151219 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 06:09:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:13 np0005593233 nova_compute[222017]: 2026-01-23 11:09:13.679 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:14.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:15.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:15 np0005593233 nova_compute[222017]: 2026-01-23 11:09:15.227 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:15 np0005593233 nova_compute[222017]: 2026-01-23 11:09:15.817 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:16.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:17.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:18.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:19.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:20 np0005593233 nova_compute[222017]: 2026-01-23 11:09:20.266 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:20.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:20 np0005593233 nova_compute[222017]: 2026-01-23 11:09:20.821 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:21.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:22.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:23.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:24.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:25.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:25 np0005593233 nova_compute[222017]: 2026-01-23 11:09:25.269 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:25 np0005593233 nova_compute[222017]: 2026-01-23 11:09:25.824 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:26.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:27.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:28.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:29.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:29 np0005593233 podman[313917]: 2026-01-23 11:09:29.083673832 +0000 UTC m=+0.093173692 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 06:09:29 np0005593233 nova_compute[222017]: 2026-01-23 11:09:29.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:29 np0005593233 nova_compute[222017]: 2026-01-23 11:09:29.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:09:29 np0005593233 nova_compute[222017]: 2026-01-23 11:09:29.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:09:29 np0005593233 nova_compute[222017]: 2026-01-23 11:09:29.416 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:09:29 np0005593233 nova_compute[222017]: 2026-01-23 11:09:29.417 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:09:29 np0005593233 nova_compute[222017]: 2026-01-23 11:09:29.418 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:09:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:09:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/415830614' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:09:29 np0005593233 nova_compute[222017]: 2026-01-23 11:09:29.924 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.115 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.116 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4274MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.116 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.117 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.271 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.374 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.375 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:09:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:30.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.505 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.826 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:09:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1218113661' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.982 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:09:30 np0005593233 nova_compute[222017]: 2026-01-23 11:09:30.991 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:09:31 np0005593233 nova_compute[222017]: 2026-01-23 11:09:31.012 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:09:31 np0005593233 nova_compute[222017]: 2026-01-23 11:09:31.034 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:09:31 np0005593233 nova_compute[222017]: 2026-01-23 11:09:31.035 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:09:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:31.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:32.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:33 np0005593233 nova_compute[222017]: 2026-01-23 11:09:33.035 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:33.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:33 np0005593233 nova_compute[222017]: 2026-01-23 11:09:33.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:34.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:35.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:35 np0005593233 nova_compute[222017]: 2026-01-23 11:09:35.273 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:35 np0005593233 nova_compute[222017]: 2026-01-23 11:09:35.829 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:36.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:36 np0005593233 nova_compute[222017]: 2026-01-23 11:09:36.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:36 np0005593233 nova_compute[222017]: 2026-01-23 11:09:36.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:09:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:37.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 06:09:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:09:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:09:37 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:09:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:38.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:39.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:39 np0005593233 nova_compute[222017]: 2026-01-23 11:09:39.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:39 np0005593233 nova_compute[222017]: 2026-01-23 11:09:39.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:40 np0005593233 nova_compute[222017]: 2026-01-23 11:09:40.275 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:40.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:40 np0005593233 nova_compute[222017]: 2026-01-23 11:09:40.832 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:41.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:42.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:09:42.735 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:09:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:09:42.735 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:09:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:09:42.735 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:09:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:43.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:44 np0005593233 podman[314112]: 2026-01-23 11:09:44.101998181 +0000 UTC m=+0.103771380 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 06:09:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:44.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:45.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:45 np0005593233 nova_compute[222017]: 2026-01-23 11:09:45.279 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:45 np0005593233 nova_compute[222017]: 2026-01-23 11:09:45.834 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:46.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:47.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:48 np0005593233 nova_compute[222017]: 2026-01-23 11:09:48.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:48.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:49.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:49 np0005593233 nova_compute[222017]: 2026-01-23 11:09:49.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:49 np0005593233 nova_compute[222017]: 2026-01-23 11:09:49.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:09:49 np0005593233 nova_compute[222017]: 2026-01-23 11:09:49.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:09:49 np0005593233 nova_compute[222017]: 2026-01-23 11:09:49.411 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:09:49 np0005593233 nova_compute[222017]: 2026-01-23 11:09:49.414 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:09:50 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:09:50 np0005593233 nova_compute[222017]: 2026-01-23 11:09:50.322 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:50.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:50 np0005593233 nova_compute[222017]: 2026-01-23 11:09:50.835 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:51.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:51 np0005593233 nova_compute[222017]: 2026-01-23 11:09:51.246 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:09:51.246 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=102, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=101) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:09:51 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:09:51.247 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:09:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:52.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:53.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:53 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:54.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:55.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:55 np0005593233 nova_compute[222017]: 2026-01-23 11:09:55.325 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:55 np0005593233 nova_compute[222017]: 2026-01-23 11:09:55.838 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:56.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:57.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:57 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:09:57.249 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '102'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:09:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:09:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:58.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:09:58 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:09:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:09:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:59.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:10:00 np0005593233 podman[314192]: 2026-01-23 11:10:00.039018701 +0000 UTC m=+0.054667329 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 06:10:00 np0005593233 nova_compute[222017]: 2026-01-23 11:10:00.326 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:00 np0005593233 nova_compute[222017]: 2026-01-23 11:10:00.840 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:00.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:01.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:01 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 06:10:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:02.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:03.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:04.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:05.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:05 np0005593233 nova_compute[222017]: 2026-01-23 11:10:05.330 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:05 np0005593233 nova_compute[222017]: 2026-01-23 11:10:05.842 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:06.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:07.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:08.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:09.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:10 np0005593233 nova_compute[222017]: 2026-01-23 11:10:10.332 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:10 np0005593233 nova_compute[222017]: 2026-01-23 11:10:10.844 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:10.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:11.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:12.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:13.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:14.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:15 np0005593233 podman[314211]: 2026-01-23 11:10:15.152521285 +0000 UTC m=+0.167312837 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 06:10:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:15.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:15 np0005593233 nova_compute[222017]: 2026-01-23 11:10:15.334 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:15 np0005593233 nova_compute[222017]: 2026-01-23 11:10:15.847 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:16.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 06:10:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:17.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 06:10:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:18.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:18 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:19.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:19 np0005593233 nova_compute[222017]: 2026-01-23 11:10:19.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:20 np0005593233 nova_compute[222017]: 2026-01-23 11:10:20.337 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:20 np0005593233 nova_compute[222017]: 2026-01-23 11:10:20.849 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:20.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:10:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:21.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:10:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:22.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:23.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:24.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:25.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:25 np0005593233 nova_compute[222017]: 2026-01-23 11:10:25.340 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:25 np0005593233 nova_compute[222017]: 2026-01-23 11:10:25.852 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:26.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:27.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:28.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:28 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:29.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:30 np0005593233 nova_compute[222017]: 2026-01-23 11:10:30.345 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:30 np0005593233 nova_compute[222017]: 2026-01-23 11:10:30.426 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:30 np0005593233 nova_compute[222017]: 2026-01-23 11:10:30.476 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:30 np0005593233 nova_compute[222017]: 2026-01-23 11:10:30.476 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:30 np0005593233 nova_compute[222017]: 2026-01-23 11:10:30.477 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:30 np0005593233 nova_compute[222017]: 2026-01-23 11:10:30.477 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:10:30 np0005593233 nova_compute[222017]: 2026-01-23 11:10:30.477 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:10:30 np0005593233 nova_compute[222017]: 2026-01-23 11:10:30.855 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:10:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4169869043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:10:30 np0005593233 nova_compute[222017]: 2026-01-23 11:10:30.917 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:30.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:31 np0005593233 podman[314260]: 2026-01-23 11:10:31.041116764 +0000 UTC m=+0.057930310 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.114 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.115 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4287MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.115 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.116 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:31.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.661 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance d31f326a-af44-4efe-96ba-07071e3c6059 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.662 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.662 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.739 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "d31f326a-af44-4efe-96ba-07071e3c6059" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.740 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.767 222021 DEBUG nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.813 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:10:31 np0005593233 nova_compute[222017]: 2026-01-23 11:10:31.889 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:10:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:10:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3372971634' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:10:32 np0005593233 nova_compute[222017]: 2026-01-23 11:10:32.485 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.672s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:10:32 np0005593233 nova_compute[222017]: 2026-01-23 11:10:32.493 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 06:10:32 np0005593233 nova_compute[222017]: 2026-01-23 11:10:32.514 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 06:10:32 np0005593233 nova_compute[222017]: 2026-01-23 11:10:32.517 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 06:10:32 np0005593233 nova_compute[222017]: 2026-01-23 11:10:32.517 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.402s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:10:32 np0005593233 nova_compute[222017]: 2026-01-23 11:10:32.518 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:10:32 np0005593233 nova_compute[222017]: 2026-01-23 11:10:32.527 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 06:10:32 np0005593233 nova_compute[222017]: 2026-01-23 11:10:32.528 222021 INFO nova.compute.claims [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Claim successful on node compute-1.ctlplane.example.com
Jan 23 06:10:32 np0005593233 nova_compute[222017]: 2026-01-23 11:10:32.696 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:10:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:10:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:32.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:10:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:10:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4116155229' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.181 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:10:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:33.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.191 222021 DEBUG nova.compute.provider_tree [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.337 222021 DEBUG nova.scheduler.client.report [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.580 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.604 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.605 222021 DEBUG nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.674 222021 DEBUG nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.674 222021 DEBUG nova.network.neutron [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.692 222021 INFO nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.709 222021 DEBUG nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.809 222021 DEBUG nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.810 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.810 222021 INFO nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Creating image(s)
Jan 23 06:10:33 np0005593233 nova_compute[222017]: 2026-01-23 11:10:33.844 222021 DEBUG nova.storage.rbd_utils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image d31f326a-af44-4efe-96ba-07071e3c6059_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:10:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:34 np0005593233 nova_compute[222017]: 2026-01-23 11:10:34.149 222021 DEBUG nova.storage.rbd_utils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image d31f326a-af44-4efe-96ba-07071e3c6059_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:10:34 np0005593233 nova_compute[222017]: 2026-01-23 11:10:34.260 222021 DEBUG nova.storage.rbd_utils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image d31f326a-af44-4efe-96ba-07071e3c6059_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:10:34 np0005593233 nova_compute[222017]: 2026-01-23 11:10:34.267 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:10:34 np0005593233 nova_compute[222017]: 2026-01-23 11:10:34.339 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:10:34 np0005593233 nova_compute[222017]: 2026-01-23 11:10:34.340 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:10:34 np0005593233 nova_compute[222017]: 2026-01-23 11:10:34.341 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:10:34 np0005593233 nova_compute[222017]: 2026-01-23 11:10:34.342 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:10:34 np0005593233 nova_compute[222017]: 2026-01-23 11:10:34.392 222021 DEBUG nova.storage.rbd_utils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image d31f326a-af44-4efe-96ba-07071e3c6059_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:10:34 np0005593233 nova_compute[222017]: 2026-01-23 11:10:34.397 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d31f326a-af44-4efe-96ba-07071e3c6059_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:10:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:34.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:35.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:35 np0005593233 nova_compute[222017]: 2026-01-23 11:10:35.302 222021 DEBUG nova.network.neutron [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Successfully created port: 49296d60-7879-484f-bb75-1d8f5f61ce38 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 06:10:35 np0005593233 nova_compute[222017]: 2026-01-23 11:10:35.349 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:35 np0005593233 nova_compute[222017]: 2026-01-23 11:10:35.580 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:35 np0005593233 nova_compute[222017]: 2026-01-23 11:10:35.858 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:36 np0005593233 nova_compute[222017]: 2026-01-23 11:10:36.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:36 np0005593233 nova_compute[222017]: 2026-01-23 11:10:36.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 06:10:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:36.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:37.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:37 np0005593233 nova_compute[222017]: 2026-01-23 11:10:37.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:37 np0005593233 nova_compute[222017]: 2026-01-23 11:10:37.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 06:10:38 np0005593233 nova_compute[222017]: 2026-01-23 11:10:38.104 222021 DEBUG nova.network.neutron [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Successfully updated port: 49296d60-7879-484f-bb75-1d8f5f61ce38 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 06:10:38 np0005593233 nova_compute[222017]: 2026-01-23 11:10:38.158 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "refresh_cache-d31f326a-af44-4efe-96ba-07071e3c6059" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 06:10:38 np0005593233 nova_compute[222017]: 2026-01-23 11:10:38.159 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquired lock "refresh_cache-d31f326a-af44-4efe-96ba-07071e3c6059" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 06:10:38 np0005593233 nova_compute[222017]: 2026-01-23 11:10:38.159 222021 DEBUG nova.network.neutron [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 06:10:38 np0005593233 nova_compute[222017]: 2026-01-23 11:10:38.222 222021 DEBUG nova.compute.manager [req-e27f6492-3f78-4390-a382-d51b0fe23f0f req-4f6b46fd-dc9f-418c-b5fc-ab3b8d99d590 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Received event network-changed-49296d60-7879-484f-bb75-1d8f5f61ce38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 06:10:38 np0005593233 nova_compute[222017]: 2026-01-23 11:10:38.223 222021 DEBUG nova.compute.manager [req-e27f6492-3f78-4390-a382-d51b0fe23f0f req-4f6b46fd-dc9f-418c-b5fc-ab3b8d99d590 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Refreshing instance network info cache due to event network-changed-49296d60-7879-484f-bb75-1d8f5f61ce38. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 06:10:38 np0005593233 nova_compute[222017]: 2026-01-23 11:10:38.224 222021 DEBUG oslo_concurrency.lockutils [req-e27f6492-3f78-4390-a382-d51b0fe23f0f req-4f6b46fd-dc9f-418c-b5fc-ab3b8d99d590 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d31f326a-af44-4efe-96ba-07071e3c6059" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 06:10:38 np0005593233 nova_compute[222017]: 2026-01-23 11:10:38.334 222021 DEBUG nova.network.neutron [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 06:10:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:38.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:39.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:40 np0005593233 nova_compute[222017]: 2026-01-23 11:10:40.026 222021 DEBUG nova.network.neutron [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Updating instance_info_cache with network_info: [{"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 06:10:40 np0005593233 nova_compute[222017]: 2026-01-23 11:10:40.321 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Releasing lock "refresh_cache-d31f326a-af44-4efe-96ba-07071e3c6059" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 06:10:40 np0005593233 nova_compute[222017]: 2026-01-23 11:10:40.322 222021 DEBUG nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Instance network_info: |[{"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 06:10:40 np0005593233 nova_compute[222017]: 2026-01-23 11:10:40.322 222021 DEBUG oslo_concurrency.lockutils [req-e27f6492-3f78-4390-a382-d51b0fe23f0f req-4f6b46fd-dc9f-418c-b5fc-ab3b8d99d590 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d31f326a-af44-4efe-96ba-07071e3c6059" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 06:10:40 np0005593233 nova_compute[222017]: 2026-01-23 11:10:40.323 222021 DEBUG nova.network.neutron [req-e27f6492-3f78-4390-a382-d51b0fe23f0f req-4f6b46fd-dc9f-418c-b5fc-ab3b8d99d590 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Refreshing network info cache for port 49296d60-7879-484f-bb75-1d8f5f61ce38 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 06:10:40 np0005593233 nova_compute[222017]: 2026-01-23 11:10:40.352 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:40 np0005593233 nova_compute[222017]: 2026-01-23 11:10:40.501 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:40 np0005593233 nova_compute[222017]: 2026-01-23 11:10:40.861 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:10:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:40.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:10:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:41.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:41 np0005593233 nova_compute[222017]: 2026-01-23 11:10:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:42.736 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:42.737 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:42.737 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:42.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:43 np0005593233 nova_compute[222017]: 2026-01-23 11:10:43.023 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d31f326a-af44-4efe-96ba-07071e3c6059_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 8.625s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:43 np0005593233 nova_compute[222017]: 2026-01-23 11:10:43.109 222021 DEBUG nova.storage.rbd_utils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] resizing rbd image d31f326a-af44-4efe-96ba-07071e3c6059_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 06:10:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:43.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:43 np0005593233 nova_compute[222017]: 2026-01-23 11:10:43.484 222021 DEBUG nova.objects.instance [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lazy-loading 'migration_context' on Instance uuid d31f326a-af44-4efe-96ba-07071e3c6059 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:10:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:44.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:45.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:45 np0005593233 nova_compute[222017]: 2026-01-23 11:10:45.354 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:45 np0005593233 nova_compute[222017]: 2026-01-23 11:10:45.864 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:46 np0005593233 podman[314488]: 2026-01-23 11:10:46.108642066 +0000 UTC m=+0.109376307 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.899 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.900 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Ensure instance console log exists: /var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.900 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.900 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.901 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.902 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Start _get_guest_xml network_info=[{"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.908 222021 WARNING nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.914 222021 DEBUG nova.virt.libvirt.host [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.915 222021 DEBUG nova.virt.libvirt.host [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.918 222021 DEBUG nova.virt.libvirt.host [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.918 222021 DEBUG nova.virt.libvirt.host [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.920 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.920 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.920 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.921 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.921 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.921 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.921 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.921 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.922 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.922 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.922 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.922 222021 DEBUG nova.virt.hardware [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:10:46 np0005593233 nova_compute[222017]: 2026-01-23 11:10:46.925 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:10:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:46.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.202 222021 DEBUG nova.network.neutron [req-e27f6492-3f78-4390-a382-d51b0fe23f0f req-4f6b46fd-dc9f-418c-b5fc-ab3b8d99d590 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Updated VIF entry in instance network info cache for port 49296d60-7879-484f-bb75-1d8f5f61ce38. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.203 222021 DEBUG nova.network.neutron [req-e27f6492-3f78-4390-a382-d51b0fe23f0f req-4f6b46fd-dc9f-418c-b5fc-ab3b8d99d590 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Updating instance_info_cache with network_info: [{"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:10:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:47.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:10:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1520339722' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.404 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.434 222021 DEBUG nova.storage.rbd_utils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image d31f326a-af44-4efe-96ba-07071e3c6059_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.439 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:10:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:10:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/718833496' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.888 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.890 222021 DEBUG nova.virt.libvirt.vif [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:10:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-900332348',display_name='tempest-TestServerMultinode-server-900332348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-900332348',id=221,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4691e06029a4b11bbda2856a451bd88',ramdisk_id='',reservation_id='r-44k0cess',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1152571872',owner_user_name='tempest-TestServerMultinode-1152571872-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:10:33Z,user_data=None,user_id='ac51edf400184ec0b11ee5acc335ff21',uuid=d31f326a-af44-4efe-96ba-07071e3c6059,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.891 222021 DEBUG nova.network.os_vif_util [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converting VIF {"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.892 222021 DEBUG nova.network.os_vif_util [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:12:2d,bridge_name='br-int',has_traffic_filtering=True,id=49296d60-7879-484f-bb75-1d8f5f61ce38,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49296d60-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:10:47 np0005593233 nova_compute[222017]: 2026-01-23 11:10:47.893 222021 DEBUG nova.objects.instance [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lazy-loading 'pci_devices' on Instance uuid d31f326a-af44-4efe-96ba-07071e3c6059 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.247 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <uuid>d31f326a-af44-4efe-96ba-07071e3c6059</uuid>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <name>instance-000000dd</name>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestServerMultinode-server-900332348</nova:name>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 11:10:46</nova:creationTime>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <nova:user uuid="ac51edf400184ec0b11ee5acc335ff21">tempest-TestServerMultinode-1152571872-project-admin</nova:user>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <nova:project uuid="d4691e06029a4b11bbda2856a451bd88">tempest-TestServerMultinode-1152571872</nova:project>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <nova:port uuid="49296d60-7879-484f-bb75-1d8f5f61ce38">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <system>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <entry name="serial">d31f326a-af44-4efe-96ba-07071e3c6059</entry>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <entry name="uuid">d31f326a-af44-4efe-96ba-07071e3c6059</entry>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    </system>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <os>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  </os>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <features>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  </features>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  </clock>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  <devices>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/d31f326a-af44-4efe-96ba-07071e3c6059_disk">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      </source>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      </auth>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    </disk>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/d31f326a-af44-4efe-96ba-07071e3c6059_disk.config">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      </source>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      </auth>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    </disk>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:9a:12:2d"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <target dev="tap49296d60-78"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    </interface>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059/console.log" append="off"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    </serial>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <video>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    </video>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    </rng>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 06:10:48 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 06:10:48 np0005593233 nova_compute[222017]:  </devices>
Jan 23 06:10:48 np0005593233 nova_compute[222017]: </domain>
Jan 23 06:10:48 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.248 222021 DEBUG nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Preparing to wait for external event network-vif-plugged-49296d60-7879-484f-bb75-1d8f5f61ce38 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.249 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.249 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.249 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.250 222021 DEBUG nova.virt.libvirt.vif [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:10:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-900332348',display_name='tempest-TestServerMultinode-server-900332348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-900332348',id=221,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4691e06029a4b11bbda2856a451bd88',ramdisk_id='',reservation_id='r-44k0cess',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1152571872',owner_user_name='tempest-TestServerMultinode-1152571872-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:10:33Z,user_data=None,user_id='ac51edf400184ec0b11ee5acc335ff21',uuid=d31f326a-af44-4efe-96ba-07071e3c6059,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.250 222021 DEBUG nova.network.os_vif_util [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converting VIF {"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.251 222021 DEBUG nova.network.os_vif_util [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:12:2d,bridge_name='br-int',has_traffic_filtering=True,id=49296d60-7879-484f-bb75-1d8f5f61ce38,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49296d60-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.251 222021 DEBUG os_vif [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:12:2d,bridge_name='br-int',has_traffic_filtering=True,id=49296d60-7879-484f-bb75-1d8f5f61ce38,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49296d60-78') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.252 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.252 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.253 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.253 222021 DEBUG oslo_concurrency.lockutils [req-e27f6492-3f78-4390-a382-d51b0fe23f0f req-4f6b46fd-dc9f-418c-b5fc-ab3b8d99d590 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d31f326a-af44-4efe-96ba-07071e3c6059" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.257 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.257 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap49296d60-78, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.258 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap49296d60-78, col_values=(('external_ids', {'iface-id': '49296d60-7879-484f-bb75-1d8f5f61ce38', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:12:2d', 'vm-uuid': 'd31f326a-af44-4efe-96ba-07071e3c6059'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.259 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:48 np0005593233 NetworkManager[48871]: <info>  [1769166648.2607] manager: (tap49296d60-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/420)
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.262 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.269 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.271 222021 INFO os_vif [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:12:2d,bridge_name='br-int',has_traffic_filtering=True,id=49296d60-7879-484f-bb75-1d8f5f61ce38,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49296d60-78')#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.918 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.918 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.918 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] No VIF found with MAC fa:16:3e:9a:12:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.919 222021 INFO nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Using config drive#033[00m
Jan 23 06:10:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:48.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:48 np0005593233 nova_compute[222017]: 2026-01-23 11:10:48.957 222021 DEBUG nova.storage.rbd_utils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image d31f326a-af44-4efe-96ba-07071e3c6059_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:10:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:49.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:49 np0005593233 nova_compute[222017]: 2026-01-23 11:10:49.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:49 np0005593233 nova_compute[222017]: 2026-01-23 11:10:49.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:49 np0005593233 nova_compute[222017]: 2026-01-23 11:10:49.991 222021 INFO nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Creating config drive at /var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059/disk.config#033[00m
Jan 23 06:10:49 np0005593233 nova_compute[222017]: 2026-01-23 11:10:49.996 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy7ot1mtq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:10:50 np0005593233 nova_compute[222017]: 2026-01-23 11:10:50.135 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy7ot1mtq" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:50 np0005593233 nova_compute[222017]: 2026-01-23 11:10:50.169 222021 DEBUG nova.storage.rbd_utils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image d31f326a-af44-4efe-96ba-07071e3c6059_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:10:50 np0005593233 nova_compute[222017]: 2026-01-23 11:10:50.173 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059/disk.config d31f326a-af44-4efe-96ba-07071e3c6059_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:10:50 np0005593233 nova_compute[222017]: 2026-01-23 11:10:50.356 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:50 np0005593233 nova_compute[222017]: 2026-01-23 11:10:50.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:50 np0005593233 nova_compute[222017]: 2026-01-23 11:10:50.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:10:50 np0005593233 nova_compute[222017]: 2026-01-23 11:10:50.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:10:50 np0005593233 nova_compute[222017]: 2026-01-23 11:10:50.489 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 06:10:50 np0005593233 nova_compute[222017]: 2026-01-23 11:10:50.489 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:10:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:50.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:51.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:51 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:10:51 np0005593233 nova_compute[222017]: 2026-01-23 11:10:51.714 222021 DEBUG oslo_concurrency.processutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059/disk.config d31f326a-af44-4efe-96ba-07071e3c6059_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.541s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:51 np0005593233 nova_compute[222017]: 2026-01-23 11:10:51.715 222021 INFO nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Deleting local config drive /var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059/disk.config because it was imported into RBD.#033[00m
Jan 23 06:10:51 np0005593233 kernel: tap49296d60-78: entered promiscuous mode
Jan 23 06:10:51 np0005593233 NetworkManager[48871]: <info>  [1769166651.7918] manager: (tap49296d60-78): new Tun device (/org/freedesktop/NetworkManager/Devices/421)
Jan 23 06:10:51 np0005593233 nova_compute[222017]: 2026-01-23 11:10:51.888 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:51 np0005593233 ovn_controller[130653]: 2026-01-23T11:10:51Z|00924|binding|INFO|Claiming lport 49296d60-7879-484f-bb75-1d8f5f61ce38 for this chassis.
Jan 23 06:10:51 np0005593233 ovn_controller[130653]: 2026-01-23T11:10:51Z|00925|binding|INFO|49296d60-7879-484f-bb75-1d8f5f61ce38: Claiming fa:16:3e:9a:12:2d 10.100.0.10
Jan 23 06:10:51 np0005593233 nova_compute[222017]: 2026-01-23 11:10:51.895 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:51 np0005593233 systemd-machined[190954]: New machine qemu-99-instance-000000dd.
Jan 23 06:10:51 np0005593233 systemd-udevd[314783]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:10:51 np0005593233 NetworkManager[48871]: <info>  [1769166651.9297] device (tap49296d60-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:10:51 np0005593233 NetworkManager[48871]: <info>  [1769166651.9303] device (tap49296d60-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:10:51 np0005593233 systemd[1]: Started Virtual Machine qemu-99-instance-000000dd.
Jan 23 06:10:51 np0005593233 ovn_controller[130653]: 2026-01-23T11:10:51Z|00926|binding|INFO|Setting lport 49296d60-7879-484f-bb75-1d8f5f61ce38 ovn-installed in OVS
Jan 23 06:10:51 np0005593233 nova_compute[222017]: 2026-01-23 11:10:51.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:10:52 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:10:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:52.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:53 np0005593233 nova_compute[222017]: 2026-01-23 11:10:53.107 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769166653.1063652, d31f326a-af44-4efe-96ba-07071e3c6059 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:10:53 np0005593233 nova_compute[222017]: 2026-01-23 11:10:53.108 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] VM Started (Lifecycle Event)#033[00m
Jan 23 06:10:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:53.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:53 np0005593233 nova_compute[222017]: 2026-01-23 11:10:53.260 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:53.954 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:12:2d 10.100.0.10'], port_security=['fa:16:3e:9a:12:2d 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd31f326a-af44-4efe-96ba-07071e3c6059', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4691e06029a4b11bbda2856a451bd88', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0ca02cfe-9b98-40f4-8c92-4cc40f5f9499', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9205c727-159c-48df-8bc6-3771f4de4cfc, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=49296d60-7879-484f-bb75-1d8f5f61ce38) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:10:53 np0005593233 ovn_controller[130653]: 2026-01-23T11:10:53Z|00927|binding|INFO|Setting lport 49296d60-7879-484f-bb75-1d8f5f61ce38 up in Southbound
Jan 23 06:10:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:53.955 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 49296d60-7879-484f-bb75-1d8f5f61ce38 in datapath f4706ca2-15b6-4141-8d7b-8d4cab159f24 bound to our chassis#033[00m
Jan 23 06:10:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:53.956 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4706ca2-15b6-4141-8d7b-8d4cab159f24#033[00m
Jan 23 06:10:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:53.970 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[8af3e729-3d0d-496b-849b-73a2a0ee4d71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:53.971 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf4706ca2-11 in ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:10:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:53.974 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf4706ca2-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:10:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:53.974 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[7944ac12-ad3c-49f8-b842-092e4e7598d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:53.975 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[37dd3da0-5c5e-4d95-9e57-9ac720680b58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:53 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:53.987 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[97e1241d-8fd7-4e97-8d63-f8f838c1f9d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.007 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[baa52451-adbc-44de-9587-066bb03d3bd3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.046 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1573fb-a6c3-48b8-8644-d0270c738271]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.053 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[9e535b94-cac5-4b1e-ac7e-695e68aa4eac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 systemd-udevd[314785]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:10:54 np0005593233 NetworkManager[48871]: <info>  [1769166654.0550] manager: (tapf4706ca2-10): new Veth device (/org/freedesktop/NetworkManager/Devices/422)
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.090 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[22e96d9f-bcca-423b-bafd-04f9d98344e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.094 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[767122cb-9ce0-4a57-a180-3ebd49b49ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 NetworkManager[48871]: <info>  [1769166654.1175] device (tapf4706ca2-10): carrier: link connected
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.122 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbf3500-97dc-45af-bed3-04dec96d0d83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.140 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[773edd91-0be5-4148-a4ff-62451e2c7d98]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4706ca2-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:aa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 275], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1056674, 'reachable_time': 28181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314858, 'error': None, 'target': 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 nova_compute[222017]: 2026-01-23 11:10:54.156 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.159 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe6950f-02cb-4024-a452-2126c0c74ab3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:aa5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1056674, 'tstamp': 1056674}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 314859, 'error': None, 'target': 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 nova_compute[222017]: 2026-01-23 11:10:54.161 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769166653.1065822, d31f326a-af44-4efe-96ba-07071e3c6059 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:10:54 np0005593233 nova_compute[222017]: 2026-01-23 11:10:54.161 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.176 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[a1edd626-670d-4cfe-a991-0085bcc2b030]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4706ca2-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:aa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 275], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1056674, 'reachable_time': 28181, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 314860, 'error': None, 'target': 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.212 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[0772b176-6e77-4764-97ac-718d58245b74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.275 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[546ce465-e911-4b90-a330-279288633577]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.277 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4706ca2-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.277 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.278 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4706ca2-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:10:54 np0005593233 nova_compute[222017]: 2026-01-23 11:10:54.279 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:54 np0005593233 NetworkManager[48871]: <info>  [1769166654.2805] manager: (tapf4706ca2-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/423)
Jan 23 06:10:54 np0005593233 kernel: tapf4706ca2-10: entered promiscuous mode
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.284 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4706ca2-10, col_values=(('external_ids', {'iface-id': '5655a848-aba1-4fa8-84e5-387dc4198f8a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:10:54 np0005593233 ovn_controller[130653]: 2026-01-23T11:10:54Z|00928|binding|INFO|Releasing lport 5655a848-aba1-4fa8-84e5-387dc4198f8a from this chassis (sb_readonly=0)
Jan 23 06:10:54 np0005593233 nova_compute[222017]: 2026-01-23 11:10:54.285 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.287 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4706ca2-15b6-4141-8d7b-8d4cab159f24.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4706ca2-15b6-4141-8d7b-8d4cab159f24.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.288 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[bffe13df-fa04-4732-a8f2-a437a86d00a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.289 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-f4706ca2-15b6-4141-8d7b-8d4cab159f24
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/f4706ca2-15b6-4141-8d7b-8d4cab159f24.pid.haproxy
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID f4706ca2-15b6-4141-8d7b-8d4cab159f24
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:10:54 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:54.290 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'env', 'PROCESS_TAG=haproxy-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f4706ca2-15b6-4141-8d7b-8d4cab159f24.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:10:54 np0005593233 nova_compute[222017]: 2026-01-23 11:10:54.299 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:54 np0005593233 nova_compute[222017]: 2026-01-23 11:10:54.370 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:10:54 np0005593233 nova_compute[222017]: 2026-01-23 11:10:54.374 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:10:54 np0005593233 nova_compute[222017]: 2026-01-23 11:10:54.406 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:10:54 np0005593233 podman[314893]: 2026-01-23 11:10:54.733196001 +0000 UTC m=+0.065516934 container create cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 06:10:54 np0005593233 systemd[1]: Started libpod-conmon-cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6.scope.
Jan 23 06:10:54 np0005593233 podman[314893]: 2026-01-23 11:10:54.695782239 +0000 UTC m=+0.028103202 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:10:54 np0005593233 systemd[1]: Started libcrun container.
Jan 23 06:10:54 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1753490187c8df627c0dd85b361bd55be131424feb8e64ab5824c0dff06f8e47/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:10:54 np0005593233 podman[314893]: 2026-01-23 11:10:54.830885839 +0000 UTC m=+0.163206772 container init cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:10:54 np0005593233 podman[314893]: 2026-01-23 11:10:54.837379982 +0000 UTC m=+0.169700915 container start cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 06:10:54 np0005593233 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[314908]: [NOTICE]   (314912) : New worker (314914) forked
Jan 23 06:10:54 np0005593233 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[314908]: [NOTICE]   (314912) : Loading success.
Jan 23 06:10:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:54.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:55.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:55 np0005593233 nova_compute[222017]: 2026-01-23 11:10:55.431 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:56.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:56.968 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=103, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=102) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:10:56 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:10:56.969 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:10:56 np0005593233 nova_compute[222017]: 2026-01-23 11:10:56.969 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:57.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:10:58 np0005593233 nova_compute[222017]: 2026-01-23 11:10:58.263 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:58.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:10:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:10:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:59.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:00 np0005593233 nova_compute[222017]: 2026-01-23 11:11:00.433 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:00.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:01.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:02 np0005593233 podman[314924]: 2026-01-23 11:11:02.073300844 +0000 UTC m=+0.078158599 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.152 222021 DEBUG nova.compute.manager [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Received event network-vif-plugged-49296d60-7879-484f-bb75-1d8f5f61ce38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.153 222021 DEBUG oslo_concurrency.lockutils [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.153 222021 DEBUG oslo_concurrency.lockutils [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.154 222021 DEBUG oslo_concurrency.lockutils [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.154 222021 DEBUG nova.compute.manager [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Processing event network-vif-plugged-49296d60-7879-484f-bb75-1d8f5f61ce38 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.154 222021 DEBUG nova.compute.manager [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Received event network-vif-plugged-49296d60-7879-484f-bb75-1d8f5f61ce38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.154 222021 DEBUG oslo_concurrency.lockutils [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.154 222021 DEBUG oslo_concurrency.lockutils [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.155 222021 DEBUG oslo_concurrency.lockutils [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.155 222021 DEBUG nova.compute.manager [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] No waiting events found dispatching network-vif-plugged-49296d60-7879-484f-bb75-1d8f5f61ce38 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.155 222021 WARNING nova.compute.manager [req-3add463d-3cba-4f8b-b58f-0861c5381939 req-bea1f271-b494-452a-abbb-5c7c8896f4f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Received unexpected event network-vif-plugged-49296d60-7879-484f-bb75-1d8f5f61ce38 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.156 222021 DEBUG nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.160 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769166662.1596844, d31f326a-af44-4efe-96ba-07071e3c6059 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.160 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.162 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.165 222021 INFO nova.virt.libvirt.driver [-] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Instance spawned successfully.#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.166 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.244 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.251 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.251 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.252 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.252 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.252 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.253 222021 DEBUG nova.virt.libvirt.driver [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.257 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.299 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.322 222021 INFO nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Took 28.51 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.323 222021 DEBUG nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.395 222021 INFO nova.compute.manager [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Took 30.53 seconds to build instance.#033[00m
Jan 23 06:11:02 np0005593233 nova_compute[222017]: 2026-01-23 11:11:02.416 222021 DEBUG oslo_concurrency.lockutils [None req-a7edaefc-a9c3-4557-a3c0-6313de59afee ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 30.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:11:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:11:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:02.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:03.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:03 np0005593233 nova_compute[222017]: 2026-01-23 11:11:03.265 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:04.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:05.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:05 np0005593233 nova_compute[222017]: 2026-01-23 11:11:05.437 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:06 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:06.972 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '103'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:11:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:06.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:07.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:08 np0005593233 nova_compute[222017]: 2026-01-23 11:11:08.267 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:08.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:09.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:10 np0005593233 nova_compute[222017]: 2026-01-23 11:11:10.442 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:10.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:11.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:12.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:13.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:13 np0005593233 nova_compute[222017]: 2026-01-23 11:11:13.270 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:14.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:11:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:15.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:11:15 np0005593233 nova_compute[222017]: 2026-01-23 11:11:15.475 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:15 np0005593233 nova_compute[222017]: 2026-01-23 11:11:15.484 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.301 222021 DEBUG oslo_concurrency.lockutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "d31f326a-af44-4efe-96ba-07071e3c6059" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.302 222021 DEBUG oslo_concurrency.lockutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.303 222021 DEBUG oslo_concurrency.lockutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.303 222021 DEBUG oslo_concurrency.lockutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.303 222021 DEBUG oslo_concurrency.lockutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.304 222021 INFO nova.compute.manager [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Terminating instance#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.305 222021 DEBUG nova.compute.manager [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 06:11:16 np0005593233 kernel: tap49296d60-78 (unregistering): left promiscuous mode
Jan 23 06:11:16 np0005593233 NetworkManager[48871]: <info>  [1769166676.4284] device (tap49296d60-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:11:16 np0005593233 ovn_controller[130653]: 2026-01-23T11:11:16Z|00929|binding|INFO|Releasing lport 49296d60-7879-484f-bb75-1d8f5f61ce38 from this chassis (sb_readonly=0)
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.436 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:16 np0005593233 ovn_controller[130653]: 2026-01-23T11:11:16Z|00930|binding|INFO|Setting lport 49296d60-7879-484f-bb75-1d8f5f61ce38 down in Southbound
Jan 23 06:11:16 np0005593233 ovn_controller[130653]: 2026-01-23T11:11:16Z|00931|binding|INFO|Removing iface tap49296d60-78 ovn-installed in OVS
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.439 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:16 np0005593233 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000dd.scope: Deactivated successfully.
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.477 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:16 np0005593233 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000dd.scope: Consumed 14.513s CPU time.
Jan 23 06:11:16 np0005593233 systemd-machined[190954]: Machine qemu-99-instance-000000dd terminated.
Jan 23 06:11:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:16.524 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:12:2d 10.100.0.10'], port_security=['fa:16:3e:9a:12:2d 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd31f326a-af44-4efe-96ba-07071e3c6059', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4691e06029a4b11bbda2856a451bd88', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ca02cfe-9b98-40f4-8c92-4cc40f5f9499', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9205c727-159c-48df-8bc6-3771f4de4cfc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=49296d60-7879-484f-bb75-1d8f5f61ce38) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:11:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:16.526 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 49296d60-7879-484f-bb75-1d8f5f61ce38 in datapath f4706ca2-15b6-4141-8d7b-8d4cab159f24 unbound from our chassis#033[00m
Jan 23 06:11:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:16.527 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4706ca2-15b6-4141-8d7b-8d4cab159f24, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:11:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:16.529 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[2da1c235-f181-4581-b3e7-f7c3efb37251]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:16 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:16.529 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 namespace which is not needed anymore#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.572 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.585 222021 INFO nova.virt.libvirt.driver [-] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Instance destroyed successfully.#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.585 222021 DEBUG nova.objects.instance [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lazy-loading 'resources' on Instance uuid d31f326a-af44-4efe-96ba-07071e3c6059 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:11:16 np0005593233 podman[314995]: 2026-01-23 11:11:16.590563018 +0000 UTC m=+0.133884967 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:11:16 np0005593233 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[314908]: [NOTICE]   (314912) : haproxy version is 2.8.14-c23fe91
Jan 23 06:11:16 np0005593233 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[314908]: [NOTICE]   (314912) : path to executable is /usr/sbin/haproxy
Jan 23 06:11:16 np0005593233 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[314908]: [WARNING]  (314912) : Exiting Master process...
Jan 23 06:11:16 np0005593233 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[314908]: [WARNING]  (314912) : Exiting Master process...
Jan 23 06:11:16 np0005593233 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[314908]: [ALERT]    (314912) : Current worker (314914) exited with code 143 (Terminated)
Jan 23 06:11:16 np0005593233 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[314908]: [WARNING]  (314912) : All workers exited. Exiting... (0)
Jan 23 06:11:16 np0005593233 systemd[1]: libpod-cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6.scope: Deactivated successfully.
Jan 23 06:11:16 np0005593233 podman[315051]: 2026-01-23 11:11:16.68554411 +0000 UTC m=+0.056254094 container died cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.718 222021 DEBUG nova.virt.libvirt.vif [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:10:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-900332348',display_name='tempest-TestServerMultinode-server-900332348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-900332348',id=221,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:11:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4691e06029a4b11bbda2856a451bd88',ramdisk_id='',reservation_id='r-44k0cess',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1152571872',owner_user_name='tempest-TestServerMultinode-1152571872-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:11:02Z,user_data=None,user_id='ac51edf400184ec0b11ee5acc335ff21',uuid=d31f326a-af44-4efe-96ba-07071e3c6059,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.719 222021 DEBUG nova.network.os_vif_util [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converting VIF {"id": "49296d60-7879-484f-bb75-1d8f5f61ce38", "address": "fa:16:3e:9a:12:2d", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49296d60-78", "ovs_interfaceid": "49296d60-7879-484f-bb75-1d8f5f61ce38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.721 222021 DEBUG nova.network.os_vif_util [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:12:2d,bridge_name='br-int',has_traffic_filtering=True,id=49296d60-7879-484f-bb75-1d8f5f61ce38,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49296d60-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.723 222021 DEBUG os_vif [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:12:2d,bridge_name='br-int',has_traffic_filtering=True,id=49296d60-7879-484f-bb75-1d8f5f61ce38,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49296d60-78') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.725 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.726 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49296d60-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.729 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.731 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:11:16 np0005593233 nova_compute[222017]: 2026-01-23 11:11:16.734 222021 INFO os_vif [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:12:2d,bridge_name='br-int',has_traffic_filtering=True,id=49296d60-7879-484f-bb75-1d8f5f61ce38,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49296d60-78')#033[00m
Jan 23 06:11:16 np0005593233 systemd[1]: var-lib-containers-storage-overlay-1753490187c8df627c0dd85b361bd55be131424feb8e64ab5824c0dff06f8e47-merged.mount: Deactivated successfully.
Jan 23 06:11:16 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6-userdata-shm.mount: Deactivated successfully.
Jan 23 06:11:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:11:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:16.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:11:17 np0005593233 podman[315051]: 2026-01-23 11:11:17.013044562 +0000 UTC m=+0.383754576 container cleanup cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 06:11:17 np0005593233 systemd[1]: libpod-conmon-cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6.scope: Deactivated successfully.
Jan 23 06:11:17 np0005593233 podman[315101]: 2026-01-23 11:11:17.089314178 +0000 UTC m=+0.049109703 container remove cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:11:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:17.097 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[346a7b85-aff2-42b5-a333-a4b35c8b6433]: (4, ('Fri Jan 23 11:11:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 (cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6)\ncb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6\nFri Jan 23 11:11:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 (cb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6)\ncb3ffa812aebf950f9b747934de925d68d83042b4d2a250396b46c12b671abd6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:17.099 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[29c71825-5397-42af-89c4-5ff886cf7bed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:17.101 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4706ca2-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.103 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:17 np0005593233 kernel: tapf4706ca2-10: left promiscuous mode
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.116 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:17.118 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6df527cd-a455-40d6-839b-091b6d6f5449]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:17.134 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[d5587db3-cb3a-483c-9162-8d5f6ae085ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:17.135 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[71837e4e-2efb-4ff9-bf9a-cc786a6c3626]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:17.150 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[aaee43de-74b5-4f47-a8ce-8ff2fd3b18d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1056667, 'reachable_time': 32598, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315116, 'error': None, 'target': 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:17 np0005593233 systemd[1]: run-netns-ovnmeta\x2df4706ca2\x2d15b6\x2d4141\x2d8d7b\x2d8d4cab159f24.mount: Deactivated successfully.
Jan 23 06:11:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:17.154 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:11:17 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:17.154 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[4f26c302-8569-45a4-808b-639f337538ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:17.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.652 222021 DEBUG nova.compute.manager [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Received event network-vif-unplugged-49296d60-7879-484f-bb75-1d8f5f61ce38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.653 222021 DEBUG oslo_concurrency.lockutils [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.653 222021 DEBUG oslo_concurrency.lockutils [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.653 222021 DEBUG oslo_concurrency.lockutils [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.654 222021 DEBUG nova.compute.manager [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] No waiting events found dispatching network-vif-unplugged-49296d60-7879-484f-bb75-1d8f5f61ce38 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.654 222021 DEBUG nova.compute.manager [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Received event network-vif-unplugged-49296d60-7879-484f-bb75-1d8f5f61ce38 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.654 222021 DEBUG nova.compute.manager [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Received event network-vif-plugged-49296d60-7879-484f-bb75-1d8f5f61ce38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.654 222021 DEBUG oslo_concurrency.lockutils [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.654 222021 DEBUG oslo_concurrency.lockutils [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.654 222021 DEBUG oslo_concurrency.lockutils [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.655 222021 DEBUG nova.compute.manager [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] No waiting events found dispatching network-vif-plugged-49296d60-7879-484f-bb75-1d8f5f61ce38 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.655 222021 WARNING nova.compute.manager [req-c5cc0e7c-6f09-46e2-b5cc-c5162295b291 req-6ac6ac4f-b849-4ca8-b41f-61ba55ddfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Received unexpected event network-vif-plugged-49296d60-7879-484f-bb75-1d8f5f61ce38 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.881 222021 INFO nova.virt.libvirt.driver [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Deleting instance files /var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059_del#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.883 222021 INFO nova.virt.libvirt.driver [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Deletion of /var/lib/nova/instances/d31f326a-af44-4efe-96ba-07071e3c6059_del complete#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.949 222021 INFO nova.compute.manager [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Took 1.64 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.949 222021 DEBUG oslo.service.loopingcall [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.950 222021 DEBUG nova.compute.manager [-] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:11:17 np0005593233 nova_compute[222017]: 2026-01-23 11:11:17.951 222021 DEBUG nova.network.neutron [-] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:11:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:11:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.8 total, 600.0 interval#012Cumulative writes: 69K writes, 266K keys, 69K commit groups, 1.0 writes per commit group, ingest: 0.25 GB, 0.03 MB/s#012Cumulative WAL: 69K writes, 26K syncs, 2.63 writes per sync, written: 0.25 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2524 writes, 10K keys, 2524 commit groups, 1.0 writes per commit group, ingest: 10.14 MB, 0.02 MB/s#012Interval WAL: 2524 writes, 998 syncs, 2.53 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:11:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:19.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:19.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.295 222021 DEBUG nova.compute.manager [req-5d72e1a0-b3fa-4620-8c2f-bcb634e41eda req-5e8d8f95-6c37-4d67-ad2a-0165ef8af5b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Received event network-vif-deleted-49296d60-7879-484f-bb75-1d8f5f61ce38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.296 222021 INFO nova.compute.manager [req-5d72e1a0-b3fa-4620-8c2f-bcb634e41eda req-5e8d8f95-6c37-4d67-ad2a-0165ef8af5b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Neutron deleted interface 49296d60-7879-484f-bb75-1d8f5f61ce38; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.296 222021 DEBUG nova.network.neutron [req-5d72e1a0-b3fa-4620-8c2f-bcb634e41eda req-5e8d8f95-6c37-4d67-ad2a-0165ef8af5b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.335 222021 DEBUG nova.network.neutron [-] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.352 222021 INFO nova.compute.manager [-] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Took 2.40 seconds to deallocate network for instance.#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.358 222021 DEBUG nova.compute.manager [req-5d72e1a0-b3fa-4620-8c2f-bcb634e41eda req-5e8d8f95-6c37-4d67-ad2a-0165ef8af5b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Detach interface failed, port_id=49296d60-7879-484f-bb75-1d8f5f61ce38, reason: Instance d31f326a-af44-4efe-96ba-07071e3c6059 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.395 222021 DEBUG oslo_concurrency.lockutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.395 222021 DEBUG oslo_concurrency.lockutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.456 222021 DEBUG oslo_concurrency.processutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.501 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.959 222021 DEBUG oslo_concurrency.processutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:11:20 np0005593233 nova_compute[222017]: 2026-01-23 11:11:20.967 222021 DEBUG nova.compute.provider_tree [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:11:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:21.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:21 np0005593233 nova_compute[222017]: 2026-01-23 11:11:21.076 222021 DEBUG nova.scheduler.client.report [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:11:21 np0005593233 nova_compute[222017]: 2026-01-23 11:11:21.101 222021 DEBUG oslo_concurrency.lockutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:21 np0005593233 nova_compute[222017]: 2026-01-23 11:11:21.135 222021 INFO nova.scheduler.client.report [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Deleted allocations for instance d31f326a-af44-4efe-96ba-07071e3c6059#033[00m
Jan 23 06:11:21 np0005593233 nova_compute[222017]: 2026-01-23 11:11:21.208 222021 DEBUG oslo_concurrency.lockutils [None req-741da42c-2a9e-4788-b268-d46484a13101 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "d31f326a-af44-4efe-96ba-07071e3c6059" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.906s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:21.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:21 np0005593233 nova_compute[222017]: 2026-01-23 11:11:21.764 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:23.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:23.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:25.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:25.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:25 np0005593233 nova_compute[222017]: 2026-01-23 11:11:25.479 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:26 np0005593233 nova_compute[222017]: 2026-01-23 11:11:26.768 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:27.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:27.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:29.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:11:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:29.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:11:30 np0005593233 nova_compute[222017]: 2026-01-23 11:11:30.482 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:31.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:31.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:31 np0005593233 nova_compute[222017]: 2026-01-23 11:11:31.584 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166676.5829809, d31f326a-af44-4efe-96ba-07071e3c6059 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:11:31 np0005593233 nova_compute[222017]: 2026-01-23 11:11:31.584 222021 INFO nova.compute.manager [-] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:11:31 np0005593233 nova_compute[222017]: 2026-01-23 11:11:31.660 222021 DEBUG nova.compute.manager [None req-338780b2-e44f-477a-82d2-71599216470a - - - - - -] [instance: d31f326a-af44-4efe-96ba-07071e3c6059] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:11:31 np0005593233 nova_compute[222017]: 2026-01-23 11:11:31.771 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:32 np0005593233 nova_compute[222017]: 2026-01-23 11:11:32.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:32 np0005593233 nova_compute[222017]: 2026-01-23 11:11:32.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:32 np0005593233 nova_compute[222017]: 2026-01-23 11:11:32.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:32 np0005593233 nova_compute[222017]: 2026-01-23 11:11:32.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:32 np0005593233 nova_compute[222017]: 2026-01-23 11:11:32.414 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:32 np0005593233 nova_compute[222017]: 2026-01-23 11:11:32.414 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:11:32 np0005593233 nova_compute[222017]: 2026-01-23 11:11:32.415 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:11:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:11:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2672103795' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:11:32 np0005593233 nova_compute[222017]: 2026-01-23 11:11:32.980 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.566s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:11:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:33.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:33 np0005593233 podman[315162]: 2026-01-23 11:11:33.053938494 +0000 UTC m=+0.068005405 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.085 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.182 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.183 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4233MB free_disk=20.98827362060547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.184 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.184 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.234 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.235 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.250 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.269 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.270 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.297 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 06:11:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:33.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.325 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 06:11:33 np0005593233 nova_compute[222017]: 2026-01-23 11:11:33.341 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:11:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:11:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1715112870' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:11:34 np0005593233 nova_compute[222017]: 2026-01-23 11:11:34.452 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.111s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:11:34 np0005593233 nova_compute[222017]: 2026-01-23 11:11:34.462 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:11:34 np0005593233 nova_compute[222017]: 2026-01-23 11:11:34.478 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:11:34 np0005593233 nova_compute[222017]: 2026-01-23 11:11:34.501 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:11:34 np0005593233 nova_compute[222017]: 2026-01-23 11:11:34.502 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:11:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:35.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:11:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:35.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:35 np0005593233 nova_compute[222017]: 2026-01-23 11:11:35.485 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:36 np0005593233 nova_compute[222017]: 2026-01-23 11:11:36.774 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:37.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:37.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:37 np0005593233 nova_compute[222017]: 2026-01-23 11:11:37.501 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:38 np0005593233 nova_compute[222017]: 2026-01-23 11:11:38.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:38 np0005593233 nova_compute[222017]: 2026-01-23 11:11:38.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:11:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:39.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:39.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:40 np0005593233 nova_compute[222017]: 2026-01-23 11:11:40.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:40 np0005593233 nova_compute[222017]: 2026-01-23 11:11:40.487 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:41.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:41.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:41 np0005593233 nova_compute[222017]: 2026-01-23 11:11:41.777 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:42.737 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:42.738 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:11:42.738 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:43.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:43.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:43 np0005593233 nova_compute[222017]: 2026-01-23 11:11:43.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #217. Immutable memtables: 0.
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:44.335055) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 139] Flushing memtable with next log file: 217
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166704335093, "job": 139, "event": "flush_started", "num_memtables": 1, "num_entries": 2347, "num_deletes": 251, "total_data_size": 5851729, "memory_usage": 5923200, "flush_reason": "Manual Compaction"}
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 139] Level-0 flush table #218: started
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166704464616, "cf_name": "default", "job": 139, "event": "table_file_creation", "file_number": 218, "file_size": 3844408, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 101958, "largest_seqno": 104299, "table_properties": {"data_size": 3834799, "index_size": 6102, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19802, "raw_average_key_size": 20, "raw_value_size": 3815687, "raw_average_value_size": 3941, "num_data_blocks": 265, "num_entries": 968, "num_filter_entries": 968, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166469, "oldest_key_time": 1769166469, "file_creation_time": 1769166704, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 218, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 139] Flush lasted 129617 microseconds, and 8078 cpu microseconds.
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:44.464668) [db/flush_job.cc:967] [default] [JOB 139] Level-0 flush table #218: 3844408 bytes OK
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:44.464689) [db/memtable_list.cc:519] [default] Level-0 commit table #218 started
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:44.577355) [db/memtable_list.cc:722] [default] Level-0 commit table #218: memtable #1 done
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:44.577412) EVENT_LOG_v1 {"time_micros": 1769166704577401, "job": 139, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:44.577437) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 139] Try to delete WAL files size 5841217, prev total WAL file size 5856661, number of live WAL files 2.
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000214.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:44.611393) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039323837' seq:72057594037927935, type:22 .. '7061786F730039353339' seq:0, type:0; will stop at (end)
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 140] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 139 Base level 0, inputs: [218(3754KB)], [216(11MB)]
Jan 23 06:11:44 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166704611501, "job": 140, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [218], "files_L6": [216], "score": -1, "input_data_size": 16035729, "oldest_snapshot_seqno": -1}
Jan 23 06:11:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:45.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:45.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:45 np0005593233 nova_compute[222017]: 2026-01-23 11:11:45.489 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 140] Generated table #219: 11995 keys, 14003183 bytes, temperature: kUnknown
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166705799114, "cf_name": "default", "job": 140, "event": "table_file_creation", "file_number": 219, "file_size": 14003183, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13928392, "index_size": 43702, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30021, "raw_key_size": 317164, "raw_average_key_size": 26, "raw_value_size": 13721663, "raw_average_value_size": 1143, "num_data_blocks": 1651, "num_entries": 11995, "num_filter_entries": 11995, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769166704, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 219, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:45.799417) [db/compaction/compaction_job.cc:1663] [default] [JOB 140] Compacted 1@0 + 1@6 files to L6 => 14003183 bytes
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:45.965753) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 13.5 rd, 11.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 11.6 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(7.8) write-amplify(3.6) OK, records in: 12512, records dropped: 517 output_compression: NoCompression
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:45.965864) EVENT_LOG_v1 {"time_micros": 1769166705965839, "job": 140, "event": "compaction_finished", "compaction_time_micros": 1187694, "compaction_time_cpu_micros": 64017, "output_level": 6, "num_output_files": 1, "total_output_size": 14003183, "num_input_records": 12512, "num_output_records": 11995, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000218.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166705967512, "job": 140, "event": "table_file_deletion", "file_number": 218}
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000216.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166705971560, "job": 140, "event": "table_file_deletion", "file_number": 216}
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:44.611200) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:45.971604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:45.971611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:45.971613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:45.971616) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:45 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:11:45.971619) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:46 np0005593233 nova_compute[222017]: 2026-01-23 11:11:46.780 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:47.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:47 np0005593233 podman[315205]: 2026-01-23 11:11:47.087998225 +0000 UTC m=+0.100831928 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 06:11:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:47.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:11:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:49.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:11:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:49.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:49 np0005593233 nova_compute[222017]: 2026-01-23 11:11:49.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:50 np0005593233 nova_compute[222017]: 2026-01-23 11:11:50.492 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:51.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:51.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:51 np0005593233 nova_compute[222017]: 2026-01-23 11:11:51.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:51 np0005593233 nova_compute[222017]: 2026-01-23 11:11:51.784 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:52 np0005593233 nova_compute[222017]: 2026-01-23 11:11:52.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:52 np0005593233 nova_compute[222017]: 2026-01-23 11:11:52.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:11:52 np0005593233 nova_compute[222017]: 2026-01-23 11:11:52.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:11:52 np0005593233 nova_compute[222017]: 2026-01-23 11:11:52.401 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:11:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:53.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:53.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:55.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:55.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:55 np0005593233 nova_compute[222017]: 2026-01-23 11:11:55.493 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:11:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.0 total, 600.0 interval#012Cumulative writes: 20K writes, 104K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.21 GB, 0.03 MB/s#012Cumulative WAL: 20K writes, 20K syncs, 1.00 writes per sync, written: 0.21 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1392 writes, 6943 keys, 1392 commit groups, 1.0 writes per commit group, ingest: 15.31 MB, 0.03 MB/s#012Interval WAL: 1391 writes, 1391 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     44.7      2.96              0.58        70    0.042       0      0       0.0       0.0#012  L6      1/0   13.35 MB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   5.5     68.5     58.8     12.34              2.63        69    0.179    576K    36K       0.0       0.0#012 Sum      1/0   13.35 MB   0.0      0.8     0.1      0.7       0.8      0.1       0.0   6.5     55.2     56.1     15.30              3.21       139    0.110    576K    36K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.0     27.5     28.1      2.75              0.26        10    0.275     61K   2603       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   0.0     68.5     58.8     12.34              2.63        69    0.179    576K    36K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     44.7      2.96              0.58        69    0.043       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 7800.0 total, 600.0 interval#012Flush(GB): cumulative 0.129, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.84 GB write, 0.11 MB/s write, 0.83 GB read, 0.11 MB/s read, 15.3 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.13 MB/s read, 2.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55962d2cb1f0#2 capacity: 304.00 MB usage: 89.94 MB table_size: 0 occupancy: 18446744073709551615 collections: 14 last_copies: 0 last_secs: 0.000801 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(5114,85.93 MB,28.2672%) FilterBlock(139,1.57 MB,0.516405%) IndexBlock(139,2.44 MB,0.80322%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 06:11:56 np0005593233 nova_compute[222017]: 2026-01-23 11:11:56.787 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:57.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:11:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:57.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:59.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:11:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:11:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:59.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:00 np0005593233 nova_compute[222017]: 2026-01-23 11:12:00.536 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:01.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:01.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:01 np0005593233 nova_compute[222017]: 2026-01-23 11:12:01.824 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:03.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:03 np0005593233 podman[315332]: 2026-01-23 11:12:03.204768571 +0000 UTC m=+0.083059078 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:12:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:12:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:03.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:12:03 np0005593233 podman[315421]: 2026-01-23 11:12:03.724154071 +0000 UTC m=+0.073583601 container exec 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Jan 23 06:12:03 np0005593233 podman[315421]: 2026-01-23 11:12:03.824799032 +0000 UTC m=+0.174228622 container exec_died 0b0f174e85f8a0f730afd47db251e5da157a09cdb57ae414e8fafdc5156516dc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-1, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 06:12:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:04 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:05.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:05.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:05 np0005593233 nova_compute[222017]: 2026-01-23 11:12:05.538 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:12:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:06 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:12:06 np0005593233 nova_compute[222017]: 2026-01-23 11:12:06.858 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:07.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:07.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:07 np0005593233 ovn_controller[130653]: 2026-01-23T11:12:07Z|00932|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 06:12:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:09.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:09.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:10 np0005593233 nova_compute[222017]: 2026-01-23 11:12:10.574 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:11.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:11.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:11 np0005593233 nova_compute[222017]: 2026-01-23 11:12:11.906 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:13.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:13.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:15 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:15.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:15.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:15 np0005593233 nova_compute[222017]: 2026-01-23 11:12:15.605 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:16 np0005593233 nova_compute[222017]: 2026-01-23 11:12:16.911 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:17.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:17.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:12:17 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/45780279' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:12:18 np0005593233 podman[315727]: 2026-01-23 11:12:18.050115043 +0000 UTC m=+0.138033824 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:12:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:19.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:19.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:20 np0005593233 nova_compute[222017]: 2026-01-23 11:12:20.608 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:21.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:21.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:12:21.718 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=104, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=103) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:12:21 np0005593233 nova_compute[222017]: 2026-01-23 11:12:21.718 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:21 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:12:21.719 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:12:21 np0005593233 nova_compute[222017]: 2026-01-23 11:12:21.913 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:23.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:23.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:25.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:25.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:25 np0005593233 nova_compute[222017]: 2026-01-23 11:12:25.610 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:26 np0005593233 nova_compute[222017]: 2026-01-23 11:12:26.916 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:12:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:27.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:12:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:27.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:12:28.721 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '104'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:12:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:29.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:29.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:30 np0005593233 nova_compute[222017]: 2026-01-23 11:12:30.645 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:31.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:31.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:31 np0005593233 nova_compute[222017]: 2026-01-23 11:12:31.949 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:33.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:33.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:34 np0005593233 podman[315753]: 2026-01-23 11:12:34.078857764 +0000 UTC m=+0.075382772 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:12:34 np0005593233 nova_compute[222017]: 2026-01-23 11:12:34.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:34 np0005593233 nova_compute[222017]: 2026-01-23 11:12:34.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:34 np0005593233 nova_compute[222017]: 2026-01-23 11:12:34.413 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:12:34 np0005593233 nova_compute[222017]: 2026-01-23 11:12:34.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:12:34 np0005593233 nova_compute[222017]: 2026-01-23 11:12:34.415 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:12:34 np0005593233 nova_compute[222017]: 2026-01-23 11:12:34.415 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:12:34 np0005593233 nova_compute[222017]: 2026-01-23 11:12:34.416 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:12:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:12:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3306028728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:12:34 np0005593233 nova_compute[222017]: 2026-01-23 11:12:34.868 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.052 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.053 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4256MB free_disk=20.946685791015625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.054 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.054 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:12:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:35.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.129 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.129 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.151 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:12:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:35.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:12:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/18476170' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.663 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.669 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.693 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.747 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.749 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:12:35 np0005593233 nova_compute[222017]: 2026-01-23 11:12:35.749 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:12:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:36 np0005593233 nova_compute[222017]: 2026-01-23 11:12:36.749 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:36 np0005593233 nova_compute[222017]: 2026-01-23 11:12:36.953 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:37.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:37.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:38 np0005593233 nova_compute[222017]: 2026-01-23 11:12:38.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:38 np0005593233 nova_compute[222017]: 2026-01-23 11:12:38.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:12:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:39.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:39.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:40 np0005593233 nova_compute[222017]: 2026-01-23 11:12:40.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:40 np0005593233 nova_compute[222017]: 2026-01-23 11:12:40.694 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:41.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:12:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:41.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:12:41 np0005593233 nova_compute[222017]: 2026-01-23 11:12:41.955 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:12:42.739 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:12:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:12:42.739 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:12:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:12:42.739 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:12:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:43.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:43.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:44 np0005593233 nova_compute[222017]: 2026-01-23 11:12:44.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:45.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:45.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:45 np0005593233 nova_compute[222017]: 2026-01-23 11:12:45.699 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:46 np0005593233 nova_compute[222017]: 2026-01-23 11:12:46.959 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:47.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:47.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:49 np0005593233 podman[315816]: 2026-01-23 11:12:49.104769675 +0000 UTC m=+0.110157680 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 23 06:12:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:49.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:49.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:50 np0005593233 nova_compute[222017]: 2026-01-23 11:12:50.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:50 np0005593233 nova_compute[222017]: 2026-01-23 11:12:50.699 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:51.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:51.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:51 np0005593233 nova_compute[222017]: 2026-01-23 11:12:51.963 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:52 np0005593233 nova_compute[222017]: 2026-01-23 11:12:52.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:53.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:53 np0005593233 nova_compute[222017]: 2026-01-23 11:12:53.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:53 np0005593233 nova_compute[222017]: 2026-01-23 11:12:53.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:12:53 np0005593233 nova_compute[222017]: 2026-01-23 11:12:53.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:12:53 np0005593233 nova_compute[222017]: 2026-01-23 11:12:53.413 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:12:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:53.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:55.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:12:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:55.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:12:55 np0005593233 nova_compute[222017]: 2026-01-23 11:12:55.735 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:56 np0005593233 nova_compute[222017]: 2026-01-23 11:12:56.966 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:57.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:57.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:12:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:59.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:12:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:12:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:59.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:00 np0005593233 nova_compute[222017]: 2026-01-23 11:13:00.774 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:01.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:01.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:01 np0005593233 nova_compute[222017]: 2026-01-23 11:13:01.969 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:03.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:03.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:05 np0005593233 podman[315842]: 2026-01-23 11:13:05.05543672 +0000 UTC m=+0.062711215 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 06:13:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:05.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:05.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:05 np0005593233 nova_compute[222017]: 2026-01-23 11:13:05.776 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e428 e428: 3 total, 3 up, 3 in
Jan 23 06:13:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e428 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:06 np0005593233 nova_compute[222017]: 2026-01-23 11:13:06.972 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:07.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e429 e429: 3 total, 3 up, 3 in
Jan 23 06:13:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:07.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e430 e430: 3 total, 3 up, 3 in
Jan 23 06:13:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:09.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:09.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:10 np0005593233 nova_compute[222017]: 2026-01-23 11:13:10.778 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:11.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:11.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:11 np0005593233 nova_compute[222017]: 2026-01-23 11:13:11.975 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:13.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:13.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 06:13:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 06:13:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 23 06:13:15 np0005593233 radosgw[84337]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 23 06:13:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:15.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:15.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:15 np0005593233 nova_compute[222017]: 2026-01-23 11:13:15.781 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e431 e431: 3 total, 3 up, 3 in
Jan 23 06:13:16 np0005593233 nova_compute[222017]: 2026-01-23 11:13:16.979 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:17.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:17.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:17 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:18 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:13:18 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:18 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:13:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:19.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:19 np0005593233 nova_compute[222017]: 2026-01-23 11:13:19.409 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:19.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:20 np0005593233 podman[315994]: 2026-01-23 11:13:20.105176641 +0000 UTC m=+0.117825955 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 06:13:20 np0005593233 nova_compute[222017]: 2026-01-23 11:13:20.783 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:21.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:21.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:22 np0005593233 nova_compute[222017]: 2026-01-23 11:13:22.024 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:23.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:23.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:25.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:25.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:25 np0005593233 nova_compute[222017]: 2026-01-23 11:13:25.827 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:25 np0005593233 systemd[1]: Starting dnf makecache...
Jan 23 06:13:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:26 np0005593233 dnf[316020]: Metadata cache refreshed recently.
Jan 23 06:13:26 np0005593233 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 06:13:26 np0005593233 systemd[1]: Finished dnf makecache.
Jan 23 06:13:27 np0005593233 nova_compute[222017]: 2026-01-23 11:13:27.060 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:27.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:27.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:27 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:13:28.151 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=105, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=104) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:13:28 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:13:28.152 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:13:28 np0005593233 nova_compute[222017]: 2026-01-23 11:13:28.153 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:29.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:29.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:30 np0005593233 nova_compute[222017]: 2026-01-23 11:13:30.862 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:31 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:13:31.155 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '105'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:13:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:31.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:31.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:31 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #220. Immutable memtables: 0.
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.736245) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 141] Flushing memtable with next log file: 220
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811736310, "job": 141, "event": "flush_started", "num_memtables": 1, "num_entries": 1357, "num_deletes": 256, "total_data_size": 2997956, "memory_usage": 3045056, "flush_reason": "Manual Compaction"}
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 141] Level-0 flush table #221: started
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811756882, "cf_name": "default", "job": 141, "event": "table_file_creation", "file_number": 221, "file_size": 1958201, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 104304, "largest_seqno": 105656, "table_properties": {"data_size": 1952288, "index_size": 3243, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12748, "raw_average_key_size": 19, "raw_value_size": 1940261, "raw_average_value_size": 3041, "num_data_blocks": 141, "num_entries": 638, "num_filter_entries": 638, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166704, "oldest_key_time": 1769166704, "file_creation_time": 1769166811, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 221, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 141] Flush lasted 20820 microseconds, and 10756 cpu microseconds.
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.757056) [db/flush_job.cc:967] [default] [JOB 141] Level-0 flush table #221: 1958201 bytes OK
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.757097) [db/memtable_list.cc:519] [default] Level-0 commit table #221 started
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.759882) [db/memtable_list.cc:722] [default] Level-0 commit table #221: memtable #1 done
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.759992) EVENT_LOG_v1 {"time_micros": 1769166811759977, "job": 141, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.760026) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 141] Try to delete WAL files size 2991453, prev total WAL file size 3006897, number of live WAL files 2.
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000217.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.761160) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323733' seq:72057594037927935, type:22 .. '6C6F676D0034353234' seq:0, type:0; will stop at (end)
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 142] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 141 Base level 0, inputs: [221(1912KB)], [219(13MB)]
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811761202, "job": 142, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [221], "files_L6": [219], "score": -1, "input_data_size": 15961384, "oldest_snapshot_seqno": -1}
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 142] Generated table #222: 12102 keys, 15831371 bytes, temperature: kUnknown
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811950809, "cf_name": "default", "job": 142, "event": "table_file_creation", "file_number": 222, "file_size": 15831371, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15753611, "index_size": 46415, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30277, "raw_key_size": 320457, "raw_average_key_size": 26, "raw_value_size": 15542755, "raw_average_value_size": 1284, "num_data_blocks": 1762, "num_entries": 12102, "num_filter_entries": 12102, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769166811, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 222, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.951364) [db/compaction/compaction_job.cc:1663] [default] [JOB 142] Compacted 1@0 + 1@6 files to L6 => 15831371 bytes
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.953082) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 84.1 rd, 83.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 13.4 +0.0 blob) out(15.1 +0.0 blob), read-write-amplify(16.2) write-amplify(8.1) OK, records in: 12633, records dropped: 531 output_compression: NoCompression
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.953115) EVENT_LOG_v1 {"time_micros": 1769166811953100, "job": 142, "event": "compaction_finished", "compaction_time_micros": 189833, "compaction_time_cpu_micros": 39423, "output_level": 6, "num_output_files": 1, "total_output_size": 15831371, "num_input_records": 12633, "num_output_records": 12102, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000221.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811953817, "job": 142, "event": "table_file_deletion", "file_number": 221}
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000219.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811958360, "job": 142, "event": "table_file_deletion", "file_number": 219}
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.761096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.958565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.958576) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.958578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.958579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:13:31.958581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:32 np0005593233 nova_compute[222017]: 2026-01-23 11:13:32.062 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:33.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:33.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:35.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:35 np0005593233 nova_compute[222017]: 2026-01-23 11:13:35.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:35 np0005593233 nova_compute[222017]: 2026-01-23 11:13:35.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:13:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:35.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:13:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e432 e432: 3 total, 3 up, 3 in
Jan 23 06:13:35 np0005593233 nova_compute[222017]: 2026-01-23 11:13:35.865 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:36 np0005593233 podman[316072]: 2026-01-23 11:13:36.069195951 +0000 UTC m=+0.069609259 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 06:13:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e432 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:36 np0005593233 nova_compute[222017]: 2026-01-23 11:13:36.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:36 np0005593233 nova_compute[222017]: 2026-01-23 11:13:36.464 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:36 np0005593233 nova_compute[222017]: 2026-01-23 11:13:36.465 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:36 np0005593233 nova_compute[222017]: 2026-01-23 11:13:36.465 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:36 np0005593233 nova_compute[222017]: 2026-01-23 11:13:36.465 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:13:36 np0005593233 nova_compute[222017]: 2026-01-23 11:13:36.465 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:13:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:13:36 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1247197691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.009 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.065 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.199 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.201 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4267MB free_disk=20.94275665283203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.201 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.202 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:37.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.270 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.270 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.321 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:13:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:13:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:37.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:13:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:13:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1264482497' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.803 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.812 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.830 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.859 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:13:37 np0005593233 nova_compute[222017]: 2026-01-23 11:13:37.860 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:39.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:39.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:39 np0005593233 nova_compute[222017]: 2026-01-23 11:13:39.861 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:39 np0005593233 nova_compute[222017]: 2026-01-23 11:13:39.861 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:13:40 np0005593233 nova_compute[222017]: 2026-01-23 11:13:40.867 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e432 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:41.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:41 np0005593233 nova_compute[222017]: 2026-01-23 11:13:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:41.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 e433: 3 total, 3 up, 3 in
Jan 23 06:13:42 np0005593233 nova_compute[222017]: 2026-01-23 11:13:42.068 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:13:42.739 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:13:42.740 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:13:42.740 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:43.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:43.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:44 np0005593233 nova_compute[222017]: 2026-01-23 11:13:44.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:13:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:45.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:13:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:13:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:45.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:13:45 np0005593233 nova_compute[222017]: 2026-01-23 11:13:45.869 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:47 np0005593233 nova_compute[222017]: 2026-01-23 11:13:47.071 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:47.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:47.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:49.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:49.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:50 np0005593233 nova_compute[222017]: 2026-01-23 11:13:50.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:50 np0005593233 nova_compute[222017]: 2026-01-23 11:13:50.917 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:51 np0005593233 podman[316135]: 2026-01-23 11:13:51.136901038 +0000 UTC m=+0.147560462 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 06:13:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:13:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:51.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:13:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:51.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:52 np0005593233 nova_compute[222017]: 2026-01-23 11:13:52.074 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:52 np0005593233 nova_compute[222017]: 2026-01-23 11:13:52.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:53.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:53 np0005593233 nova_compute[222017]: 2026-01-23 11:13:53.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:53 np0005593233 nova_compute[222017]: 2026-01-23 11:13:53.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:13:53 np0005593233 nova_compute[222017]: 2026-01-23 11:13:53.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:13:53 np0005593233 nova_compute[222017]: 2026-01-23 11:13:53.454 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:13:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:53.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:55.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:13:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:55.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:55 np0005593233 nova_compute[222017]: 2026-01-23 11:13:55.921 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:57 np0005593233 nova_compute[222017]: 2026-01-23 11:13:57.078 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:57.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:57.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:59.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:13:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:13:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:59.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:00 np0005593233 nova_compute[222017]: 2026-01-23 11:14:00.969 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:01.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:01.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:02 np0005593233 nova_compute[222017]: 2026-01-23 11:14:02.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:03.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:03.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:05.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:05.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:05 np0005593233 nova_compute[222017]: 2026-01-23 11:14:05.972 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:07 np0005593233 podman[316161]: 2026-01-23 11:14:07.081997754 +0000 UTC m=+0.086568826 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 06:14:07 np0005593233 nova_compute[222017]: 2026-01-23 11:14:07.082 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:07.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:07.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:09.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:09.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:14:09.940 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=106, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=105) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:14:09 np0005593233 nova_compute[222017]: 2026-01-23 11:14:09.941 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:14:09.942 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:14:09 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:14:09.943 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '106'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:10 np0005593233 nova_compute[222017]: 2026-01-23 11:14:10.974 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:14:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:11.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:14:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:11.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:12 np0005593233 nova_compute[222017]: 2026-01-23 11:14:12.086 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:13.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:13.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:15.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:15.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:16 np0005593233 nova_compute[222017]: 2026-01-23 11:14:16.019 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:17 np0005593233 nova_compute[222017]: 2026-01-23 11:14:17.125 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:17.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:17.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:19.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 06:14:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:19.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 06:14:21 np0005593233 nova_compute[222017]: 2026-01-23 11:14:21.022 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:21.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:21.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:22 np0005593233 nova_compute[222017]: 2026-01-23 11:14:22.128 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:22 np0005593233 podman[316181]: 2026-01-23 11:14:22.130831252 +0000 UTC m=+0.142709946 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 06:14:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:23.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:23.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:25.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:25.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:26 np0005593233 nova_compute[222017]: 2026-01-23 11:14:26.025 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:27 np0005593233 nova_compute[222017]: 2026-01-23 11:14:27.130 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:27.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:27.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:14:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:14:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:14:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:29.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:29.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:31 np0005593233 nova_compute[222017]: 2026-01-23 11:14:31.028 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:31.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:31.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:32 np0005593233 nova_compute[222017]: 2026-01-23 11:14:32.133 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:33.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:33.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:14:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:35.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:14:35 np0005593233 nova_compute[222017]: 2026-01-23 11:14:35.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:35 np0005593233 nova_compute[222017]: 2026-01-23 11:14:35.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:35.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:36 np0005593233 nova_compute[222017]: 2026-01-23 11:14:36.032 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:36 np0005593233 nova_compute[222017]: 2026-01-23 11:14:36.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:36 np0005593233 nova_compute[222017]: 2026-01-23 11:14:36.532 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:36 np0005593233 nova_compute[222017]: 2026-01-23 11:14:36.533 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:36 np0005593233 nova_compute[222017]: 2026-01-23 11:14:36.535 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:36 np0005593233 nova_compute[222017]: 2026-01-23 11:14:36.535 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:14:36 np0005593233 nova_compute[222017]: 2026-01-23 11:14:36.536 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:14:36 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:14:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:14:36 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3095556044' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.012 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.168 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.267 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.269 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4263MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.269 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.270 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:37.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.360 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.361 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.379 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:37.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:14:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2029075520' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.874 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.883 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.905 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.924 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:14:37 np0005593233 nova_compute[222017]: 2026-01-23 11:14:37.925 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:38 np0005593233 podman[316430]: 2026-01-23 11:14:38.064056856 +0000 UTC m=+0.073286662 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 06:14:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:39.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:39.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:41 np0005593233 nova_compute[222017]: 2026-01-23 11:14:41.034 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:41.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:41.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:41 np0005593233 nova_compute[222017]: 2026-01-23 11:14:41.926 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:41 np0005593233 nova_compute[222017]: 2026-01-23 11:14:41.926 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:41 np0005593233 nova_compute[222017]: 2026-01-23 11:14:41.926 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:14:42 np0005593233 nova_compute[222017]: 2026-01-23 11:14:42.170 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:14:42.740 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:14:42.741 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:14:42.741 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:43.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:43.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:45.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:45 np0005593233 nova_compute[222017]: 2026-01-23 11:14:45.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:45.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:46 np0005593233 nova_compute[222017]: 2026-01-23 11:14:46.035 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:47 np0005593233 nova_compute[222017]: 2026-01-23 11:14:47.214 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:47.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:47.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:49.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:49.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:50 np0005593233 nova_compute[222017]: 2026-01-23 11:14:50.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:51 np0005593233 nova_compute[222017]: 2026-01-23 11:14:51.037 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:51.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:51.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:52 np0005593233 nova_compute[222017]: 2026-01-23 11:14:52.217 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:53 np0005593233 podman[316450]: 2026-01-23 11:14:53.121040102 +0000 UTC m=+0.120405908 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:14:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:53.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:53 np0005593233 nova_compute[222017]: 2026-01-23 11:14:53.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:53.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:55.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:55 np0005593233 nova_compute[222017]: 2026-01-23 11:14:55.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:55 np0005593233 nova_compute[222017]: 2026-01-23 11:14:55.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:14:55 np0005593233 nova_compute[222017]: 2026-01-23 11:14:55.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:14:55 np0005593233 nova_compute[222017]: 2026-01-23 11:14:55.422 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:14:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:55.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:56 np0005593233 nova_compute[222017]: 2026-01-23 11:14:56.039 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:56 np0005593233 nova_compute[222017]: 2026-01-23 11:14:56.390 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #223. Immutable memtables: 0.
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:56.804316) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 143] Flushing memtable with next log file: 223
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166896804379, "job": 143, "event": "flush_started", "num_memtables": 1, "num_entries": 1093, "num_deletes": 251, "total_data_size": 2267701, "memory_usage": 2304728, "flush_reason": "Manual Compaction"}
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 143] Level-0 flush table #224: started
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166896818673, "cf_name": "default", "job": 143, "event": "table_file_creation", "file_number": 224, "file_size": 931973, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 105661, "largest_seqno": 106749, "table_properties": {"data_size": 928020, "index_size": 1604, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10798, "raw_average_key_size": 21, "raw_value_size": 919386, "raw_average_value_size": 1802, "num_data_blocks": 69, "num_entries": 510, "num_filter_entries": 510, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166811, "oldest_key_time": 1769166811, "file_creation_time": 1769166896, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 224, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 143] Flush lasted 14512 microseconds, and 7566 cpu microseconds.
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:56.818818) [db/flush_job.cc:967] [default] [JOB 143] Level-0 flush table #224: 931973 bytes OK
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:56.818863) [db/memtable_list.cc:519] [default] Level-0 commit table #224 started
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:56.820640) [db/memtable_list.cc:722] [default] Level-0 commit table #224: memtable #1 done
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:56.820680) EVENT_LOG_v1 {"time_micros": 1769166896820667, "job": 143, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:56.820719) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 143] Try to delete WAL files size 2262295, prev total WAL file size 2262295, number of live WAL files 2.
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000220.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:56.822381) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373732' seq:72057594037927935, type:22 .. '6D6772737461740034303233' seq:0, type:0; will stop at (end)
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 144] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 143 Base level 0, inputs: [224(910KB)], [222(15MB)]
Jan 23 06:14:56 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166896822480, "job": 144, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [224], "files_L6": [222], "score": -1, "input_data_size": 16763344, "oldest_snapshot_seqno": -1}
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 144] Generated table #225: 12130 keys, 13491186 bytes, temperature: kUnknown
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166897000019, "cf_name": "default", "job": 144, "event": "table_file_creation", "file_number": 225, "file_size": 13491186, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13416867, "index_size": 42910, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30341, "raw_key_size": 321365, "raw_average_key_size": 26, "raw_value_size": 13209173, "raw_average_value_size": 1088, "num_data_blocks": 1616, "num_entries": 12130, "num_filter_entries": 12130, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769166896, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 225, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:57.000304) [db/compaction/compaction_job.cc:1663] [default] [JOB 144] Compacted 1@0 + 1@6 files to L6 => 13491186 bytes
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:57.028554) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 94.5 rd, 76.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 15.1 +0.0 blob) out(12.9 +0.0 blob), read-write-amplify(32.5) write-amplify(14.5) OK, records in: 12612, records dropped: 482 output_compression: NoCompression
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:57.028616) EVENT_LOG_v1 {"time_micros": 1769166897028596, "job": 144, "event": "compaction_finished", "compaction_time_micros": 177323, "compaction_time_cpu_micros": 61557, "output_level": 6, "num_output_files": 1, "total_output_size": 13491186, "num_input_records": 12612, "num_output_records": 12130, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000224.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166897029284, "job": 144, "event": "table_file_deletion", "file_number": 224}
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000222.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166897034632, "job": 144, "event": "table_file_deletion", "file_number": 222}
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:56.822251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:57.034721) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:57.034726) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:57.034727) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:57.034729) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:57 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:14:57.034730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:57 np0005593233 nova_compute[222017]: 2026-01-23 11:14:57.220 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:14:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:57.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:14:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:57.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:14:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:59.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:14:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:14:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:59.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:01 np0005593233 nova_compute[222017]: 2026-01-23 11:15:01.042 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:01.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:01.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:02 np0005593233 nova_compute[222017]: 2026-01-23 11:15:02.229 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:03.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:03.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:05.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:05.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:06 np0005593233 nova_compute[222017]: 2026-01-23 11:15:06.077 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:07 np0005593233 nova_compute[222017]: 2026-01-23 11:15:07.286 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:07.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:07.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:09 np0005593233 podman[316477]: 2026-01-23 11:15:09.052332892 +0000 UTC m=+0.059270179 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 06:15:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:09.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:09.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:11 np0005593233 nova_compute[222017]: 2026-01-23 11:15:11.080 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:15:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:11.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:15:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:11.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:12 np0005593233 nova_compute[222017]: 2026-01-23 11:15:12.320 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:13.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:13.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #226. Immutable memtables: 0.
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:14.220534) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 145] Flushing memtable with next log file: 226
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166914220586, "job": 145, "event": "flush_started", "num_memtables": 1, "num_entries": 403, "num_deletes": 251, "total_data_size": 457220, "memory_usage": 465736, "flush_reason": "Manual Compaction"}
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 145] Level-0 flush table #227: started
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166914474546, "cf_name": "default", "job": 145, "event": "table_file_creation", "file_number": 227, "file_size": 301788, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 106750, "largest_seqno": 107152, "table_properties": {"data_size": 299466, "index_size": 485, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5716, "raw_average_key_size": 18, "raw_value_size": 294876, "raw_average_value_size": 963, "num_data_blocks": 21, "num_entries": 306, "num_filter_entries": 306, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166897, "oldest_key_time": 1769166897, "file_creation_time": 1769166914, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 227, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 145] Flush lasted 254091 microseconds, and 1928 cpu microseconds.
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:14.474615) [db/flush_job.cc:967] [default] [JOB 145] Level-0 flush table #227: 301788 bytes OK
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:14.474646) [db/memtable_list.cc:519] [default] Level-0 commit table #227 started
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:14.819257) [db/memtable_list.cc:722] [default] Level-0 commit table #227: memtable #1 done
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:14.819337) EVENT_LOG_v1 {"time_micros": 1769166914819319, "job": 145, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:14.819379) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 145] Try to delete WAL files size 454616, prev total WAL file size 454616, number of live WAL files 2.
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000223.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:14.820370) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039353338' seq:72057594037927935, type:22 .. '7061786F730039373930' seq:0, type:0; will stop at (end)
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 146] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 145 Base level 0, inputs: [227(294KB)], [225(12MB)]
Jan 23 06:15:14 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166914820648, "job": 146, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [227], "files_L6": [225], "score": -1, "input_data_size": 13792974, "oldest_snapshot_seqno": -1}
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 146] Generated table #228: 11926 keys, 11803384 bytes, temperature: kUnknown
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166915285092, "cf_name": "default", "job": 146, "event": "table_file_creation", "file_number": 228, "file_size": 11803384, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11731854, "index_size": 40633, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29829, "raw_key_size": 317827, "raw_average_key_size": 26, "raw_value_size": 11528881, "raw_average_value_size": 966, "num_data_blocks": 1514, "num_entries": 11926, "num_filter_entries": 11926, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769166914, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 228, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:15.285512) [db/compaction/compaction_job.cc:1663] [default] [JOB 146] Compacted 1@0 + 1@6 files to L6 => 11803384 bytes
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:15.369477) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 29.7 rd, 25.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 12.9 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(84.8) write-amplify(39.1) OK, records in: 12436, records dropped: 510 output_compression: NoCompression
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:15.369539) EVENT_LOG_v1 {"time_micros": 1769166915369516, "job": 146, "event": "compaction_finished", "compaction_time_micros": 464510, "compaction_time_cpu_micros": 65166, "output_level": 6, "num_output_files": 1, "total_output_size": 11803384, "num_input_records": 12436, "num_output_records": 11926, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000227.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:15:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:14.820236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166915370118, "job": 0, "event": "table_file_deletion", "file_number": 227}
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:15.370134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:15.370145) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:15.370150) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:15.370154) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:15:15.370268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:15.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000225.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:15:15 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166915375883, "job": 0, "event": "table_file_deletion", "file_number": 225}
Jan 23 06:15:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:15.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:16 np0005593233 nova_compute[222017]: 2026-01-23 11:15:16.134 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:17 np0005593233 nova_compute[222017]: 2026-01-23 11:15:17.324 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:17.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:17.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:19.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:19.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:21 np0005593233 nova_compute[222017]: 2026-01-23 11:15:21.137 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:21.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:21.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:22 np0005593233 nova_compute[222017]: 2026-01-23 11:15:22.466 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:15:22.979 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=107, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=106) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:15:22 np0005593233 nova_compute[222017]: 2026-01-23 11:15:22.980 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:22 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:15:22.980 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:15:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:23.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:23.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:24 np0005593233 podman[316496]: 2026-01-23 11:15:24.101022444 +0000 UTC m=+0.101513687 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:15:24 np0005593233 nova_compute[222017]: 2026-01-23 11:15:24.406 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:25.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:25.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:26 np0005593233 nova_compute[222017]: 2026-01-23 11:15:26.139 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:27.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:27 np0005593233 nova_compute[222017]: 2026-01-23 11:15:27.506 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:27.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:27 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:15:27.982 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '107'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:15:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:29.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:15:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:29.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:15:31 np0005593233 nova_compute[222017]: 2026-01-23 11:15:31.149 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:31.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:31.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:32 np0005593233 nova_compute[222017]: 2026-01-23 11:15:32.509 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:33 np0005593233 nova_compute[222017]: 2026-01-23 11:15:33.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:33.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:33.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:35.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:35.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:36 np0005593233 nova_compute[222017]: 2026-01-23 11:15:36.038 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:36 np0005593233 nova_compute[222017]: 2026-01-23 11:15:36.039 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:36 np0005593233 nova_compute[222017]: 2026-01-23 11:15:36.152 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:37 np0005593233 nova_compute[222017]: 2026-01-23 11:15:37.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:37.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:37 np0005593233 nova_compute[222017]: 2026-01-23 11:15:37.505 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:37 np0005593233 nova_compute[222017]: 2026-01-23 11:15:37.506 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:37 np0005593233 nova_compute[222017]: 2026-01-23 11:15:37.506 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:15:37 np0005593233 nova_compute[222017]: 2026-01-23 11:15:37.506 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:15:37 np0005593233 nova_compute[222017]: 2026-01-23 11:15:37.507 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:15:37 np0005593233 nova_compute[222017]: 2026-01-23 11:15:37.544 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:37.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:15:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/694482692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:15:37 np0005593233 nova_compute[222017]: 2026-01-23 11:15:37.972 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:15:38 np0005593233 nova_compute[222017]: 2026-01-23 11:15:38.146 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:15:38 np0005593233 nova_compute[222017]: 2026-01-23 11:15:38.147 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4252MB free_disk=20.98813247680664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:15:38 np0005593233 nova_compute[222017]: 2026-01-23 11:15:38.147 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:38 np0005593233 nova_compute[222017]: 2026-01-23 11:15:38.148 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:38 np0005593233 nova_compute[222017]: 2026-01-23 11:15:38.724 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:15:38 np0005593233 nova_compute[222017]: 2026-01-23 11:15:38.725 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:15:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:38 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:38 np0005593233 nova_compute[222017]: 2026-01-23 11:15:38.967 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:15:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:39.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:15:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/394837812' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:15:39 np0005593233 nova_compute[222017]: 2026-01-23 11:15:39.487 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:15:39 np0005593233 nova_compute[222017]: 2026-01-23 11:15:39.495 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:15:39 np0005593233 nova_compute[222017]: 2026-01-23 11:15:39.566 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:15:39 np0005593233 nova_compute[222017]: 2026-01-23 11:15:39.568 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:15:39 np0005593233 nova_compute[222017]: 2026-01-23 11:15:39.568 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:15:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:39.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:40 np0005593233 podman[316699]: 2026-01-23 11:15:40.054529768 +0000 UTC m=+0.061716297 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 06:15:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:15:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:40 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:15:41 np0005593233 nova_compute[222017]: 2026-01-23 11:15:41.155 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:41 np0005593233 nova_compute[222017]: 2026-01-23 11:15:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:41 np0005593233 nova_compute[222017]: 2026-01-23 11:15:41.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:15:41 np0005593233 nova_compute[222017]: 2026-01-23 11:15:41.387 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:41 np0005593233 nova_compute[222017]: 2026-01-23 11:15:41.387 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 06:15:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:41.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:41 np0005593233 nova_compute[222017]: 2026-01-23 11:15:41.411 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 06:15:41 np0005593233 nova_compute[222017]: 2026-01-23 11:15:41.411 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:41 np0005593233 nova_compute[222017]: 2026-01-23 11:15:41.411 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 06:15:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:41.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:42 np0005593233 nova_compute[222017]: 2026-01-23 11:15:42.547 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:15:42.741 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:15:42.741 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:15:42.741 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:15:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:43.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:43 np0005593233 nova_compute[222017]: 2026-01-23 11:15:43.430 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:43.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:43 np0005593233 nova_compute[222017]: 2026-01-23 11:15:43.936 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:43 np0005593233 nova_compute[222017]: 2026-01-23 11:15:43.937 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:43 np0005593233 nova_compute[222017]: 2026-01-23 11:15:43.937 222021 INFO nova.compute.manager [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Unshelving#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.003 222021 INFO nova.virt.block_device [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Booting with volume 525f185e-d0f4-4a0b-bd48-9219445747c5 at /dev/vda#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.305 222021 DEBUG os_brick.utils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.307 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.324 229882 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.325 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[77818f52-b775-4d2f-9c12-4f34ffda7c89]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.325 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.338 229882 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.338 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[f383cd6f-0b4c-42d4-9e25-bcd61e2e88e7]: (4, ('InitiatorName=iqn.1994-05.com.redhat:6b3426e5528c', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.340 229882 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.352 229882 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.352 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[0d399e69-a858-4e00-bf40-a315b6965656]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.354 229882 DEBUG oslo.privsep.daemon [-] privsep: reply[57647d4d-8ea3-4c47-a7fc-2773ad9e5830]: (4, '5e159ac4-110b-464c-8264-d020fcde6246') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.354 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.399 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "nvme version" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.401 222021 DEBUG os_brick.initiator.connectors.lightos [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.402 222021 DEBUG os_brick.initiator.connectors.lightos [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.402 222021 DEBUG os_brick.initiator.connectors.lightos [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.402 222021 DEBUG os_brick.utils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] <== get_connector_properties: return (96ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:6b3426e5528c', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '5e159ac4-110b-464c-8264-d020fcde6246', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 06:15:44 np0005593233 nova_compute[222017]: 2026-01-23 11:15:44.403 222021 DEBUG nova.virt.block_device [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating existing volume attachment record: b5f73267-3591-4cf8-9f14-9564a497d0cd _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 06:15:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:45.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:45.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:46 np0005593233 nova_compute[222017]: 2026-01-23 11:15:46.157 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:46 np0005593233 nova_compute[222017]: 2026-01-23 11:15:46.887 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:46 np0005593233 nova_compute[222017]: 2026-01-23 11:15:46.887 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:46 np0005593233 nova_compute[222017]: 2026-01-23 11:15:46.899 222021 DEBUG nova.objects.instance [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'pci_requests' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:15:46 np0005593233 nova_compute[222017]: 2026-01-23 11:15:46.928 222021 DEBUG nova.objects.instance [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'numa_topology' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:15:46 np0005593233 nova_compute[222017]: 2026-01-23 11:15:46.944 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:15:46 np0005593233 nova_compute[222017]: 2026-01-23 11:15:46.945 222021 INFO nova.compute.claims [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 06:15:47 np0005593233 nova_compute[222017]: 2026-01-23 11:15:47.074 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:15:47 np0005593233 nova_compute[222017]: 2026-01-23 11:15:47.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:47.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:47 np0005593233 nova_compute[222017]: 2026-01-23 11:15:47.549 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:15:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3939394115' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:15:47 np0005593233 nova_compute[222017]: 2026-01-23 11:15:47.682 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.607s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:15:47 np0005593233 nova_compute[222017]: 2026-01-23 11:15:47.688 222021 DEBUG nova.compute.provider_tree [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:15:47 np0005593233 nova_compute[222017]: 2026-01-23 11:15:47.713 222021 DEBUG nova.scheduler.client.report [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:15:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:47.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:47 np0005593233 nova_compute[222017]: 2026-01-23 11:15:47.749 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.862s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:15:48 np0005593233 nova_compute[222017]: 2026-01-23 11:15:48.436 222021 INFO nova.network.neutron [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 06:15:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:48 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:49.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:49.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:49 np0005593233 nova_compute[222017]: 2026-01-23 11:15:49.730 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:15:49 np0005593233 nova_compute[222017]: 2026-01-23 11:15:49.730 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:15:49 np0005593233 nova_compute[222017]: 2026-01-23 11:15:49.730 222021 DEBUG nova.network.neutron [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:15:49 np0005593233 nova_compute[222017]: 2026-01-23 11:15:49.931 222021 DEBUG nova.compute.manager [req-090044bf-06f6-41df-b25f-6ad535431bd2 req-3fc98f96-8191-4483-8e1b-8ed7d1697a65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:15:49 np0005593233 nova_compute[222017]: 2026-01-23 11:15:49.932 222021 DEBUG nova.compute.manager [req-090044bf-06f6-41df-b25f-6ad535431bd2 req-3fc98f96-8191-4483-8e1b-8ed7d1697a65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing instance network info cache due to event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:15:49 np0005593233 nova_compute[222017]: 2026-01-23 11:15:49.932 222021 DEBUG oslo_concurrency.lockutils [req-090044bf-06f6-41df-b25f-6ad535431bd2 req-3fc98f96-8191-4483-8e1b-8ed7d1697a65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:15:51 np0005593233 nova_compute[222017]: 2026-01-23 11:15:51.159 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:51 np0005593233 nova_compute[222017]: 2026-01-23 11:15:51.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:15:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:51.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:15:51 np0005593233 nova_compute[222017]: 2026-01-23 11:15:51.429 222021 DEBUG nova.network.neutron [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:15:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:51.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.397 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.399 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.399 222021 INFO nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Creating image(s)#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.400 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.400 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Ensure instance console log exists: /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.400 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.400 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.401 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.403 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Start _get_guest_xml network_info=[{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-525f185e-d0f4-4a0b-bd48-9219445747c5', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '525f185e-d0f4-4a0b-bd48-9219445747c5', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76', 'attached_at': '', 'detached_at': '', 'volume_id': '525f185e-d0f4-4a0b-bd48-9219445747c5', 'serial': '525f185e-d0f4-4a0b-bd48-9219445747c5'}, 'delete_on_termination': True, 'guest_format': None, 'mount_device': '/dev/vda', 'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'b5f73267-3591-4cf8-9f14-9564a497d0cd', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.405 222021 DEBUG oslo_concurrency.lockutils [req-090044bf-06f6-41df-b25f-6ad535431bd2 req-3fc98f96-8191-4483-8e1b-8ed7d1697a65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.405 222021 DEBUG nova.network.neutron [req-090044bf-06f6-41df-b25f-6ad535431bd2 req-3fc98f96-8191-4483-8e1b-8ed7d1697a65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.412 222021 WARNING nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.418 222021 DEBUG nova.virt.libvirt.host [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.419 222021 DEBUG nova.virt.libvirt.host [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.422 222021 DEBUG nova.virt.libvirt.host [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.423 222021 DEBUG nova.virt.libvirt.host [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.425 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.425 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.425 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.425 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.426 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.426 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.426 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.426 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.426 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.427 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.427 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.427 222021 DEBUG nova.virt.hardware [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.427 222021 DEBUG nova.objects.instance [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'vcpu_model' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:15:52 np0005593233 nova_compute[222017]: 2026-01-23 11:15:52.551 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:15:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:53.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:15:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:53.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:54 np0005593233 nova_compute[222017]: 2026-01-23 11:15:54.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:54 np0005593233 nova_compute[222017]: 2026-01-23 11:15:54.754 222021 DEBUG nova.storage.rbd_utils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:15:54 np0005593233 nova_compute[222017]: 2026-01-23 11:15:54.761 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:15:55 np0005593233 podman[316835]: 2026-01-23 11:15:55.115282461 +0000 UTC m=+0.123215887 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller)
Jan 23 06:15:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:15:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1874377332' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:15:55 np0005593233 nova_compute[222017]: 2026-01-23 11:15:55.271 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:15:55 np0005593233 nova_compute[222017]: 2026-01-23 11:15:55.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:55 np0005593233 nova_compute[222017]: 2026-01-23 11:15:55.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:15:55 np0005593233 nova_compute[222017]: 2026-01-23 11:15:55.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:15:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:55.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:55.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:56 np0005593233 nova_compute[222017]: 2026-01-23 11:15:56.161 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:56 np0005593233 nova_compute[222017]: 2026-01-23 11:15:56.234 222021 DEBUG nova.network.neutron [req-090044bf-06f6-41df-b25f-6ad535431bd2 req-3fc98f96-8191-4483-8e1b-8ed7d1697a65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updated VIF entry in instance network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:15:56 np0005593233 nova_compute[222017]: 2026-01-23 11:15:56.235 222021 DEBUG nova.network.neutron [req-090044bf-06f6-41df-b25f-6ad535431bd2 req-3fc98f96-8191-4483-8e1b-8ed7d1697a65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:15:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:15:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:57.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:15:57 np0005593233 nova_compute[222017]: 2026-01-23 11:15:57.553 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:57.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.367 222021 DEBUG oslo_concurrency.lockutils [req-090044bf-06f6-41df-b25f-6ad535431bd2 req-3fc98f96-8191-4483-8e1b-8ed7d1697a65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.374 222021 DEBUG nova.virt.libvirt.vif [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T11:14:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2003932207',display_name='tempest-TestShelveInstance-server-2003932207',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2003932207',id=224,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-609770395',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:14:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='3a245f7970f14fffa60af2ff972b4bfd',ramdisk_id='',reservation_id='r-p0f4vjg5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestShelveInstance-869807080',owner_user_name='tempest-TestShelveInstance-869807080-project-member'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:15:43Z,user_data=None,user_id='5d6a458f5d9345379b05f0cdb69a7b0f',uuid=ed71c532-711c-49b9-b0d5-eaf409f0bc76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.375 222021 DEBUG nova.network.os_vif_util [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converting VIF {"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.375 222021 DEBUG nova.network.os_vif_util [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.377 222021 DEBUG nova.objects.instance [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'pci_devices' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.378 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.378 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.379 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.379 222021 DEBUG nova.objects.instance [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.448 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <uuid>ed71c532-711c-49b9-b0d5-eaf409f0bc76</uuid>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <name>instance-000000e0</name>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <memory>131072</memory>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <vcpu>1</vcpu>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <metadata>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <nova:name>tempest-TestShelveInstance-server-2003932207</nova:name>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <nova:creationTime>2026-01-23 11:15:52</nova:creationTime>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <nova:flavor name="m1.nano">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <nova:memory>128</nova:memory>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <nova:disk>1</nova:disk>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <nova:swap>0</nova:swap>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      </nova:flavor>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <nova:owner>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <nova:user uuid="5d6a458f5d9345379b05f0cdb69a7b0f">tempest-TestShelveInstance-869807080-project-member</nova:user>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <nova:project uuid="3a245f7970f14fffa60af2ff972b4bfd">tempest-TestShelveInstance-869807080</nova:project>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      </nova:owner>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <nova:ports>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <nova:port uuid="617c9ef0-df2c-4bd2-8d4c-fafc1723eb55">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        </nova:port>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      </nova:ports>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    </nova:instance>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  </metadata>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <sysinfo type="smbios">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <system>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <entry name="serial">ed71c532-711c-49b9-b0d5-eaf409f0bc76</entry>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <entry name="uuid">ed71c532-711c-49b9-b0d5-eaf409f0bc76</entry>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    </system>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  </sysinfo>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <os>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <boot dev="hd"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <smbios mode="sysinfo"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  </os>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <features>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <acpi/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <apic/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <vmcoreinfo/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  </features>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <clock offset="utc">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <timer name="hpet" present="no"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  </clock>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <cpu mode="custom" match="exact">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <model>Nehalem</model>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  </cpu>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  <devices>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <disk type="network" device="cdrom">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <driver type="raw" cache="none"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="vms/ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      </source>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      </auth>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <target dev="sda" bus="sata"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    </disk>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <disk type="network" device="disk">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <source protocol="rbd" name="volumes/volume-525f185e-d0f4-4a0b-bd48-9219445747c5">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      </source>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <auth username="openstack">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      </auth>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <target dev="vda" bus="virtio"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <serial>525f185e-d0f4-4a0b-bd48-9219445747c5</serial>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    </disk>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <interface type="ethernet">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <mac address="fa:16:3e:1d:a1:56"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <mtu size="1442"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <target dev="tap617c9ef0-df"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    </interface>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <serial type="pty">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <log file="/var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/console.log" append="off"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    </serial>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <video>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <model type="virtio"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    </video>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <input type="tablet" bus="usb"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <input type="keyboard" bus="usb"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <rng model="virtio">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    </rng>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <controller type="usb" index="0"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    <memballoon model="virtio">
Jan 23 06:15:58 np0005593233 nova_compute[222017]:      <stats period="10"/>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:    </memballoon>
Jan 23 06:15:58 np0005593233 nova_compute[222017]:  </devices>
Jan 23 06:15:58 np0005593233 nova_compute[222017]: </domain>
Jan 23 06:15:58 np0005593233 nova_compute[222017]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.450 222021 DEBUG nova.compute.manager [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Preparing to wait for external event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.451 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.451 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.451 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.452 222021 DEBUG nova.virt.libvirt.vif [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T11:14:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2003932207',display_name='tempest-TestShelveInstance-server-2003932207',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2003932207',id=224,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-609770395',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:14:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='3a245f7970f14fffa60af2ff972b4bfd',ramdisk_id='',reservation_id='r-p0f4vjg5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='F
alse',owner_project_name='tempest-TestShelveInstance-869807080',owner_user_name='tempest-TestShelveInstance-869807080-project-member'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:15:43Z,user_data=None,user_id='5d6a458f5d9345379b05f0cdb69a7b0f',uuid=ed71c532-711c-49b9-b0d5-eaf409f0bc76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.452 222021 DEBUG nova.network.os_vif_util [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converting VIF {"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.453 222021 DEBUG nova.network.os_vif_util [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.453 222021 DEBUG os_vif [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.454 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.454 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.455 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.459 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.459 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap617c9ef0-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.460 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap617c9ef0-df, col_values=(('external_ids', {'iface-id': '617c9ef0-df2c-4bd2-8d4c-fafc1723eb55', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:a1:56', 'vm-uuid': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.512 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:58 np0005593233 NetworkManager[48871]: <info>  [1769166958.5149] manager: (tap617c9ef0-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/424)
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.515 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.522 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.523 222021 INFO os_vif [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df')#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.584 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.584 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.584 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] No VIF found with MAC fa:16:3e:1d:a1:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.585 222021 INFO nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Using config drive#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.619 222021 DEBUG nova.storage.rbd_utils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.639 222021 DEBUG nova.objects.instance [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'ec2_ids' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:15:58 np0005593233 nova_compute[222017]: 2026-01-23 11:15:58.687 222021 DEBUG nova.objects.instance [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'keypairs' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:15:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:59.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:15:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:59.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:01 np0005593233 nova_compute[222017]: 2026-01-23 11:16:01.162 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:01.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:01.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:03 np0005593233 nova_compute[222017]: 2026-01-23 11:16:03.142 222021 INFO nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Creating config drive at /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config#033[00m
Jan 23 06:16:03 np0005593233 nova_compute[222017]: 2026-01-23 11:16:03.147 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgk9twpis execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:16:03 np0005593233 nova_compute[222017]: 2026-01-23 11:16:03.284 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgk9twpis" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:16:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:03.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:03 np0005593233 nova_compute[222017]: 2026-01-23 11:16:03.462 222021 DEBUG nova.storage.rbd_utils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:16:03 np0005593233 nova_compute[222017]: 2026-01-23 11:16:03.467 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:16:03 np0005593233 nova_compute[222017]: 2026-01-23 11:16:03.550 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:03.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:05 np0005593233 nova_compute[222017]: 2026-01-23 11:16:05.268 222021 DEBUG nova.network.neutron [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:16:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:05.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:05.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:06 np0005593233 nova_compute[222017]: 2026-01-23 11:16:06.165 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:06 np0005593233 nova_compute[222017]: 2026-01-23 11:16:06.931 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:16:06 np0005593233 nova_compute[222017]: 2026-01-23 11:16:06.932 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.060 222021 DEBUG oslo_concurrency.processutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.592s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.060 222021 INFO nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Deleting local config drive /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config because it was imported into RBD.#033[00m
Jan 23 06:16:07 np0005593233 kernel: tap617c9ef0-df: entered promiscuous mode
Jan 23 06:16:07 np0005593233 NetworkManager[48871]: <info>  [1769166967.1436] manager: (tap617c9ef0-df): new Tun device (/org/freedesktop/NetworkManager/Devices/425)
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.192 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:07 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:07Z|00933|binding|INFO|Claiming lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 for this chassis.
Jan 23 06:16:07 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:07Z|00934|binding|INFO|617c9ef0-df2c-4bd2-8d4c-fafc1723eb55: Claiming fa:16:3e:1d:a1:56 10.100.0.9
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.195 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.200 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:07 np0005593233 NetworkManager[48871]: <info>  [1769166967.2062] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/426)
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.205 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:07 np0005593233 NetworkManager[48871]: <info>  [1769166967.2084] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/427)
Jan 23 06:16:07 np0005593233 systemd-udevd[316936]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:16:07 np0005593233 systemd-machined[190954]: New machine qemu-100-instance-000000e0.
Jan 23 06:16:07 np0005593233 NetworkManager[48871]: <info>  [1769166967.2440] device (tap617c9ef0-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:16:07 np0005593233 NetworkManager[48871]: <info>  [1769166967.2449] device (tap617c9ef0-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.318 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:07 np0005593233 systemd[1]: Started Virtual Machine qemu-100-instance-000000e0.
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.326 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:07.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:07.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.816 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769166967.8163118, ed71c532-711c-49b9-b0d5-eaf409f0bc76 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:16:07 np0005593233 nova_compute[222017]: 2026-01-23 11:16:07.817 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] VM Started (Lifecycle Event)#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.013 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:a1:56 10.100.0.9'], port_security=['fa:16:3e:1d:a1:56 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42899517-91b9-42e3-96a7-29180211a7a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a245f7970f14fffa60af2ff972b4bfd', 'neutron:revision_number': '7', 'neutron:security_group_ids': '2243181a-ba78-49d0-a310-35ec5fa364b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef3519b3-9b5b-4b40-8630-d2487396abc0, chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.016 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 in datapath 42899517-91b9-42e3-96a7-29180211a7a4 bound to our chassis#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.019 140224 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42899517-91b9-42e3-96a7-29180211a7a4#033[00m
Jan 23 06:16:08 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:08Z|00935|binding|INFO|Setting lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 ovn-installed in OVS
Jan 23 06:16:08 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:08Z|00936|binding|INFO|Setting lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 up in Southbound
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.026 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.036 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.038 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[60794472-9fc8-46ec-bdb3-414ca5508b5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.039 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42899517-91 in ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.042 225275 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42899517-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.042 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[092cf050-6996-4500-a9d0-08eb5b674a4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.043 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[782c0877-c087-4bfe-bfb1-58436a4db168]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.060 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[77865b85-2a41-4d02-9883-a3e577398b59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.078 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[26348099-b233-49d4-917d-4deae687e632]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.127 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[8569c616-2bb5-4c0b-b50d-6536178771ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.135 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae2f32c-2317-4cb3-91ca-3fea472a0999]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 NetworkManager[48871]: <info>  [1769166968.1374] manager: (tap42899517-90): new Veth device (/org/freedesktop/NetworkManager/Devices/428)
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.177 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[32e0caa3-ea5e-4c31-8f1f-ee540aca17f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.182 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec1bd81-c5cd-4c46-a348-367061383e3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 NetworkManager[48871]: <info>  [1769166968.2082] device (tap42899517-90): carrier: link connected
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.213 225295 DEBUG oslo.privsep.daemon [-] privsep: reply[56d337cd-a2a0-45ce-b217-10e317412ca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.235 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f0fd5f-2805-45c0-9bbb-5d6bdb940c0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42899517-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:09:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 278], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1088083, 'reachable_time': 36676, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 317011, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.255 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[60505c77-771c-41cd-9a5d-370ebcf65cf5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe74:998'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1088083, 'tstamp': 1088083}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 317012, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.274 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[6d12b719-228a-48ec-a08b-379456271366]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42899517-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:09:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 278], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1088083, 'reachable_time': 36676, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 317013, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.311 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[462edf21-0266-4a4a-bfe3-3c9dd2fd8294]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.406 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[70b52941-81ed-46e8-9183-e738a5d67480]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.408 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42899517-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.408 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.409 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42899517-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.416 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:08 np0005593233 NetworkManager[48871]: <info>  [1769166968.4172] manager: (tap42899517-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/429)
Jan 23 06:16:08 np0005593233 kernel: tap42899517-90: entered promiscuous mode
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.423 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42899517-90, col_values=(('external_ids', {'iface-id': '82ae71e6-e83a-4506-8f0f-261163163937'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:16:08 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:08Z|00937|binding|INFO|Releasing lport 82ae71e6-e83a-4506-8f0f-261163163937 from this chassis (sb_readonly=1)
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.425 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.443 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.445 140224 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42899517-91b9-42e3-96a7-29180211a7a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42899517-91b9-42e3-96a7-29180211a7a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.446 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6531c8-3f2b-4a4e-ba83-7c54a9a2e677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.447 140224 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: global
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    log         /dev/log local0 debug
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    log-tag     haproxy-metadata-proxy-42899517-91b9-42e3-96a7-29180211a7a4
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    user        root
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    group       root
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    maxconn     1024
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    pidfile     /var/lib/neutron/external/pids/42899517-91b9-42e3-96a7-29180211a7a4.pid.haproxy
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    daemon
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: defaults
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    log global
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    mode http
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    option httplog
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    option dontlognull
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    option http-server-close
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    option forwardfor
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    retries                 3
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    timeout http-request    30s
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    timeout connect         30s
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    timeout client          32s
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    timeout server          32s
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    timeout http-keep-alive 30s
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: listen listener
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    bind 169.254.169.254:80
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]:    http-request add-header X-OVN-Network-ID 42899517-91b9-42e3-96a7-29180211a7a4
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:16:08 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:08.448 140224 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'env', 'PROCESS_TAG=haproxy-42899517-91b9-42e3-96a7-29180211a7a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42899517-91b9-42e3-96a7-29180211a7a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.553 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.597 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.602 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769166967.8199787, ed71c532-711c-49b9-b0d5-eaf409f0bc76 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.603 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.629 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.633 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:16:08 np0005593233 nova_compute[222017]: 2026-01-23 11:16:08.655 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:16:08 np0005593233 podman[317046]: 2026-01-23 11:16:08.846302908 +0000 UTC m=+0.060414521 container create 5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:16:08 np0005593233 systemd[1]: Started libpod-conmon-5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad.scope.
Jan 23 06:16:08 np0005593233 podman[317046]: 2026-01-23 11:16:08.811312554 +0000 UTC m=+0.025424187 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:16:08 np0005593233 systemd[1]: Started libcrun container.
Jan 23 06:16:08 np0005593233 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3574637ee8ed5fe1ea08fde11459d2398ea5860df3b23efe776631cb22806fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:16:08 np0005593233 podman[317046]: 2026-01-23 11:16:08.958433622 +0000 UTC m=+0.172545245 container init 5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 06:16:08 np0005593233 podman[317046]: 2026-01-23 11:16:08.967328792 +0000 UTC m=+0.181440395 container start 5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:16:09 np0005593233 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[317061]: [NOTICE]   (317065) : New worker (317067) forked
Jan 23 06:16:09 np0005593233 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[317061]: [NOTICE]   (317065) : Loading success.
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.253 222021 DEBUG nova.compute.manager [req-4c9ee3d0-e969-4858-9658-3c2cece326be req-53783432-caef-46d2-a4db-417b66189672 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.254 222021 DEBUG oslo_concurrency.lockutils [req-4c9ee3d0-e969-4858-9658-3c2cece326be req-53783432-caef-46d2-a4db-417b66189672 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.255 222021 DEBUG oslo_concurrency.lockutils [req-4c9ee3d0-e969-4858-9658-3c2cece326be req-53783432-caef-46d2-a4db-417b66189672 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.255 222021 DEBUG oslo_concurrency.lockutils [req-4c9ee3d0-e969-4858-9658-3c2cece326be req-53783432-caef-46d2-a4db-417b66189672 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.255 222021 DEBUG nova.compute.manager [req-4c9ee3d0-e969-4858-9658-3c2cece326be req-53783432-caef-46d2-a4db-417b66189672 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Processing event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.256 222021 DEBUG nova.compute.manager [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.262 222021 DEBUG nova.virt.driver [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] Emitting event <LifecycleEvent: 1769166969.262263, ed71c532-711c-49b9-b0d5-eaf409f0bc76 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.263 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.266 222021 DEBUG nova.virt.libvirt.driver [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.270 222021 INFO nova.virt.libvirt.driver [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance spawned successfully.#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.271 222021 DEBUG nova.compute.manager [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.336 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.342 222021 DEBUG nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.383 222021 INFO nova.compute.manager [None req-095fba34-e9e2-406f-9338-7af7b214f44b - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:16:09 np0005593233 nova_compute[222017]: 2026-01-23 11:16:09.408 222021 DEBUG oslo_concurrency.lockutils [None req-2409ad2f-7a21-47c6-bece-0ff0ce0d471e 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 25.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:09.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:09.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:11 np0005593233 podman[317076]: 2026-01-23 11:16:11.068243801 +0000 UTC m=+0.071158983 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 23 06:16:11 np0005593233 nova_compute[222017]: 2026-01-23 11:16:11.167 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:11.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:11.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:11 np0005593233 nova_compute[222017]: 2026-01-23 11:16:11.948 222021 DEBUG nova.compute.manager [req-e31cd8d6-223f-4445-a74e-d317292982e9 req-c2b16920-f279-4c1e-885d-4062e439cc20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:16:11 np0005593233 nova_compute[222017]: 2026-01-23 11:16:11.949 222021 DEBUG oslo_concurrency.lockutils [req-e31cd8d6-223f-4445-a74e-d317292982e9 req-c2b16920-f279-4c1e-885d-4062e439cc20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:11 np0005593233 nova_compute[222017]: 2026-01-23 11:16:11.949 222021 DEBUG oslo_concurrency.lockutils [req-e31cd8d6-223f-4445-a74e-d317292982e9 req-c2b16920-f279-4c1e-885d-4062e439cc20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:11 np0005593233 nova_compute[222017]: 2026-01-23 11:16:11.950 222021 DEBUG oslo_concurrency.lockutils [req-e31cd8d6-223f-4445-a74e-d317292982e9 req-c2b16920-f279-4c1e-885d-4062e439cc20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:11 np0005593233 nova_compute[222017]: 2026-01-23 11:16:11.950 222021 DEBUG nova.compute.manager [req-e31cd8d6-223f-4445-a74e-d317292982e9 req-c2b16920-f279-4c1e-885d-4062e439cc20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] No waiting events found dispatching network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:16:11 np0005593233 nova_compute[222017]: 2026-01-23 11:16:11.950 222021 WARNING nova.compute.manager [req-e31cd8d6-223f-4445-a74e-d317292982e9 req-c2b16920-f279-4c1e-885d-4062e439cc20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received unexpected event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:16:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:13.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:13 np0005593233 nova_compute[222017]: 2026-01-23 11:16:13.554 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:13.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:15.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:15.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:16 np0005593233 nova_compute[222017]: 2026-01-23 11:16:16.169 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:17.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:17.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:18 np0005593233 nova_compute[222017]: 2026-01-23 11:16:18.600 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:19.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:19.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:21 np0005593233 nova_compute[222017]: 2026-01-23 11:16:21.171 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:21.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:21.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:23.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:23 np0005593233 nova_compute[222017]: 2026-01-23 11:16:23.602 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:23.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:24 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:24Z|00137|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1d:a1:56 10.100.0.9
Jan 23 06:16:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:25.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:25.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:25 np0005593233 podman[317097]: 2026-01-23 11:16:25.884690259 +0000 UTC m=+0.096505296 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 06:16:26 np0005593233 nova_compute[222017]: 2026-01-23 11:16:26.174 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:27.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:27.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:28 np0005593233 nova_compute[222017]: 2026-01-23 11:16:28.604 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:29.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:29.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:31 np0005593233 nova_compute[222017]: 2026-01-23 11:16:31.177 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:31.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:31.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:33.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:33 np0005593233 nova_compute[222017]: 2026-01-23 11:16:33.607 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:33.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:35 np0005593233 nova_compute[222017]: 2026-01-23 11:16:35.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:35 np0005593233 nova_compute[222017]: 2026-01-23 11:16:35.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:35.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:35.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:36 np0005593233 nova_compute[222017]: 2026-01-23 11:16:36.179 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:37.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:37.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:38 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:38Z|00938|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 06:16:38 np0005593233 nova_compute[222017]: 2026-01-23 11:16:38.609 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:39 np0005593233 nova_compute[222017]: 2026-01-23 11:16:39.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:39 np0005593233 nova_compute[222017]: 2026-01-23 11:16:39.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:39 np0005593233 nova_compute[222017]: 2026-01-23 11:16:39.410 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:39 np0005593233 nova_compute[222017]: 2026-01-23 11:16:39.411 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:39 np0005593233 nova_compute[222017]: 2026-01-23 11:16:39.411 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:16:39 np0005593233 nova_compute[222017]: 2026-01-23 11:16:39.411 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:16:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:39.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:39.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:16:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4051486092' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:16:39 np0005593233 nova_compute[222017]: 2026-01-23 11:16:39.929 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.047 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000e0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.048 222021 DEBUG nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] skipping disk for instance-000000e0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.244 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.245 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4053MB free_disk=20.98813247680664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.245 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.246 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.451 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Instance ed71c532-711c-49b9-b0d5-eaf409f0bc76 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.452 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.452 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.483 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing inventories for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.572 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating ProviderTree inventory for provider 929812a2-38ca-4ee7-9f24-090d633cb42b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.573 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Updating inventory in ProviderTree for provider 929812a2-38ca-4ee7-9f24-090d633cb42b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.642 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing aggregate associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.694 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Refreshing trait associations for resource provider 929812a2-38ca-4ee7-9f24-090d633cb42b, traits: COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 06:16:40 np0005593233 nova_compute[222017]: 2026-01-23 11:16:40.765 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:16:41 np0005593233 nova_compute[222017]: 2026-01-23 11:16:41.182 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:41 np0005593233 nova_compute[222017]: 2026-01-23 11:16:41.281 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:16:41 np0005593233 nova_compute[222017]: 2026-01-23 11:16:41.289 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:16:41 np0005593233 nova_compute[222017]: 2026-01-23 11:16:41.337 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:16:41 np0005593233 nova_compute[222017]: 2026-01-23 11:16:41.372 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:16:41 np0005593233 nova_compute[222017]: 2026-01-23 11:16:41.373 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:41.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:41.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:42 np0005593233 podman[317168]: 2026-01-23 11:16:42.060288241 +0000 UTC m=+0.070447093 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 06:16:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:42.742 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:42.742 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:42.743 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:43 np0005593233 nova_compute[222017]: 2026-01-23 11:16:43.373 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:43 np0005593233 nova_compute[222017]: 2026-01-23 11:16:43.374 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:16:43 np0005593233 nova_compute[222017]: 2026-01-23 11:16:43.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 06:16:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:43.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 06:16:43 np0005593233 nova_compute[222017]: 2026-01-23 11:16:43.612 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:43.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:45.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:45.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:46.162 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=108, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=107) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:16:46 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:46.163 140224 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.163 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.184 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.534 222021 DEBUG nova.compute.manager [req-13c56ef3-f914-420c-8110-30d03a09b6a7 req-e51b0834-6813-4bc5-a2ae-2d94e4a3e67f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.535 222021 DEBUG nova.compute.manager [req-13c56ef3-f914-420c-8110-30d03a09b6a7 req-e51b0834-6813-4bc5-a2ae-2d94e4a3e67f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing instance network info cache due to event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.535 222021 DEBUG oslo_concurrency.lockutils [req-13c56ef3-f914-420c-8110-30d03a09b6a7 req-e51b0834-6813-4bc5-a2ae-2d94e4a3e67f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.535 222021 DEBUG oslo_concurrency.lockutils [req-13c56ef3-f914-420c-8110-30d03a09b6a7 req-e51b0834-6813-4bc5-a2ae-2d94e4a3e67f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.536 222021 DEBUG nova.network.neutron [req-13c56ef3-f914-420c-8110-30d03a09b6a7 req-e51b0834-6813-4bc5-a2ae-2d94e4a3e67f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.746 222021 DEBUG oslo_concurrency.lockutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.747 222021 DEBUG oslo_concurrency.lockutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.747 222021 DEBUG oslo_concurrency.lockutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.747 222021 DEBUG oslo_concurrency.lockutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.747 222021 DEBUG oslo_concurrency.lockutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.748 222021 INFO nova.compute.manager [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Terminating instance#033[00m
Jan 23 06:16:46 np0005593233 nova_compute[222017]: 2026-01-23 11:16:46.751 222021 DEBUG nova.compute.manager [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 06:16:47 np0005593233 kernel: tap617c9ef0-df (unregistering): left promiscuous mode
Jan 23 06:16:47 np0005593233 NetworkManager[48871]: <info>  [1769167007.1646] device (tap617c9ef0-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.166 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=539cfa5a-1c2f-4cb4-97af-2edb819f72fc, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '108'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.174 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:47 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:47Z|00939|binding|INFO|Releasing lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 from this chassis (sb_readonly=0)
Jan 23 06:16:47 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:47Z|00940|binding|INFO|Setting lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 down in Southbound
Jan 23 06:16:47 np0005593233 ovn_controller[130653]: 2026-01-23T11:16:47Z|00941|binding|INFO|Removing iface tap617c9ef0-df ovn-installed in OVS
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.188 140224 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:a1:56 10.100.0.9'], port_security=['fa:16:3e:1d:a1:56 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42899517-91b9-42e3-96a7-29180211a7a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a245f7970f14fffa60af2ff972b4bfd', 'neutron:revision_number': '9', 'neutron:security_group_ids': '2243181a-ba78-49d0-a310-35ec5fa364b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef3519b3-9b5b-4b40-8630-d2487396abc0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>], logical_port=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fb103e24640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.190 140224 INFO neutron.agent.ovn.metadata.agent [-] Port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 in datapath 42899517-91b9-42e3-96a7-29180211a7a4 unbound from our chassis#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.191 140224 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42899517-91b9-42e3-96a7-29180211a7a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.192 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[67144a0e-41ef-43dc-acb6-5ec23be228e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.192 140224 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 namespace which is not needed anymore#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.233 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:47 np0005593233 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000e0.scope: Deactivated successfully.
Jan 23 06:16:47 np0005593233 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000e0.scope: Consumed 15.368s CPU time.
Jan 23 06:16:47 np0005593233 systemd-machined[190954]: Machine qemu-100-instance-000000e0 terminated.
Jan 23 06:16:47 np0005593233 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[317061]: [NOTICE]   (317065) : haproxy version is 2.8.14-c23fe91
Jan 23 06:16:47 np0005593233 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[317061]: [NOTICE]   (317065) : path to executable is /usr/sbin/haproxy
Jan 23 06:16:47 np0005593233 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[317061]: [WARNING]  (317065) : Exiting Master process...
Jan 23 06:16:47 np0005593233 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[317061]: [ALERT]    (317065) : Current worker (317067) exited with code 143 (Terminated)
Jan 23 06:16:47 np0005593233 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[317061]: [WARNING]  (317065) : All workers exited. Exiting... (0)
Jan 23 06:16:47 np0005593233 systemd[1]: libpod-5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad.scope: Deactivated successfully.
Jan 23 06:16:47 np0005593233 podman[317212]: 2026-01-23 11:16:47.372328455 +0000 UTC m=+0.058927799 container died 5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.400 222021 INFO nova.virt.libvirt.driver [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance destroyed successfully.#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.401 222021 DEBUG nova.objects.instance [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'resources' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:16:47 np0005593233 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad-userdata-shm.mount: Deactivated successfully.
Jan 23 06:16:47 np0005593233 systemd[1]: var-lib-containers-storage-overlay-d3574637ee8ed5fe1ea08fde11459d2398ea5860df3b23efe776631cb22806fa-merged.mount: Deactivated successfully.
Jan 23 06:16:47 np0005593233 podman[317212]: 2026-01-23 11:16:47.438686022 +0000 UTC m=+0.125285386 container cleanup 5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 06:16:47 np0005593233 systemd[1]: libpod-conmon-5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad.scope: Deactivated successfully.
Jan 23 06:16:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:47.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:47 np0005593233 podman[317253]: 2026-01-23 11:16:47.518787645 +0000 UTC m=+0.051261963 container remove 5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.524 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[980f978a-c6a6-48e0-bd77-803ee4deadec]: (4, ('Fri Jan 23 11:16:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 (5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad)\n5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad\nFri Jan 23 11:16:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 (5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad)\n5c9b51478e4bcfcfb6ede7a6b9c6b194f58f78ef61ec82b7cb3dfee5410c77ad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.526 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[58fd2a1b-872e-40e7-a720-09eca0b3ad46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.527 140224 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42899517-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.529 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:47 np0005593233 kernel: tap42899517-90: left promiscuous mode
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.547 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.551 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[75ed2dc1-401c-43f7-8c94-5b3714bd6b71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.568 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[b49e2487-62d9-467c-9b1d-4205248e7879]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.569 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[c5ea2ab3-0e20-4266-a9d0-fe401db3d96b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.584 225275 DEBUG oslo.privsep.daemon [-] privsep: reply[07aa4a6e-5504-4d42-8b0d-ec74047b0d6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1088075, 'reachable_time': 35128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 317272, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:47 np0005593233 systemd[1]: run-netns-ovnmeta\x2d42899517\x2d91b9\x2d42e3\x2d96a7\x2d29180211a7a4.mount: Deactivated successfully.
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.587 140664 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:16:47 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:16:47.587 140664 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e5af42-43ff-4a89-b435-96f9f54cb9b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.653 222021 DEBUG nova.virt.libvirt.vif [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T11:14:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2003932207',display_name='tempest-TestShelveInstance-server-2003932207',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2003932207',id=224,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3bhlUYE6XIKDWLP/uD+8jgtWoi2zAcS0lWBO+SzamqVUAvHDBegRP4BFxXqktx7WnHXLwe9Z4SStWrBiFMiHWsxXNyXjRJKpQMiCgvbWujMjyVx4RONf0TXgED6xft/g==',key_name='tempest-TestShelveInstance-609770395',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:16:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a245f7970f14fffa60af2ff972b4bfd',ramdisk_id='',reservation_id='r-p0f4vjg5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestShelveInstance-869807080',owner_user_name='tempest-TestShelveInstance-869807080-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:16:09Z,user_data=None,user_id='5d6a458f5d9345379b05f0cdb69a7b0f',uuid=ed71c532-711c-49b9-b0d5-eaf409f0bc76,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.654 222021 DEBUG nova.network.os_vif_util [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converting VIF {"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.655 222021 DEBUG nova.network.os_vif_util [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.656 222021 DEBUG os_vif [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.660 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.661 222021 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap617c9ef0-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.663 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.666 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.668 222021 INFO os_vif [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df')#033[00m
Jan 23 06:16:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:47.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.912 222021 INFO nova.virt.libvirt.driver [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Deleting instance files /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76_del#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.913 222021 INFO nova.virt.libvirt.driver [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Deletion of /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76_del complete#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.923 222021 DEBUG nova.compute.manager [req-ec34b01a-f862-43ef-af37-ef0a6f8e95f5 req-3089322a-7124-4643-b7f9-db0b3f8c0fda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-unplugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.924 222021 DEBUG oslo_concurrency.lockutils [req-ec34b01a-f862-43ef-af37-ef0a6f8e95f5 req-3089322a-7124-4643-b7f9-db0b3f8c0fda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.924 222021 DEBUG oslo_concurrency.lockutils [req-ec34b01a-f862-43ef-af37-ef0a6f8e95f5 req-3089322a-7124-4643-b7f9-db0b3f8c0fda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.925 222021 DEBUG oslo_concurrency.lockutils [req-ec34b01a-f862-43ef-af37-ef0a6f8e95f5 req-3089322a-7124-4643-b7f9-db0b3f8c0fda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.925 222021 DEBUG nova.compute.manager [req-ec34b01a-f862-43ef-af37-ef0a6f8e95f5 req-3089322a-7124-4643-b7f9-db0b3f8c0fda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] No waiting events found dispatching network-vif-unplugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.925 222021 DEBUG nova.compute.manager [req-ec34b01a-f862-43ef-af37-ef0a6f8e95f5 req-3089322a-7124-4643-b7f9-db0b3f8c0fda 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-unplugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.996 222021 INFO nova.compute.manager [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Took 1.24 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.997 222021 DEBUG oslo.service.loopingcall [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.998 222021 DEBUG nova.compute.manager [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:16:47 np0005593233 nova_compute[222017]: 2026-01-23 11:16:47.999 222021 DEBUG nova.network.neutron [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:16:48 np0005593233 nova_compute[222017]: 2026-01-23 11:16:48.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:49 np0005593233 nova_compute[222017]: 2026-01-23 11:16:49.322 222021 DEBUG nova.network.neutron [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:16:49 np0005593233 nova_compute[222017]: 2026-01-23 11:16:49.382 222021 INFO nova.compute.manager [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Took 1.38 seconds to deallocate network for instance.#033[00m
Jan 23 06:16:49 np0005593233 nova_compute[222017]: 2026-01-23 11:16:49.517 222021 DEBUG nova.network.neutron [req-13c56ef3-f914-420c-8110-30d03a09b6a7 req-e51b0834-6813-4bc5-a2ae-2d94e4a3e67f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updated VIF entry in instance network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:16:49 np0005593233 nova_compute[222017]: 2026-01-23 11:16:49.517 222021 DEBUG nova.network.neutron [req-13c56ef3-f914-420c-8110-30d03a09b6a7 req-e51b0834-6813-4bc5-a2ae-2d94e4a3e67f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:16:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:49.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:49 np0005593233 nova_compute[222017]: 2026-01-23 11:16:49.521 222021 DEBUG nova.compute.manager [req-fc9313d7-9276-44dd-b975-c2e757f2bd79 req-a8a4460b-42d1-4b74-938c-57057ced1401 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-deleted-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:16:49 np0005593233 nova_compute[222017]: 2026-01-23 11:16:49.555 222021 DEBUG oslo_concurrency.lockutils [req-13c56ef3-f914-420c-8110-30d03a09b6a7 req-e51b0834-6813-4bc5-a2ae-2d94e4a3e67f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:16:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:16:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:16:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:16:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:16:49 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:16:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:49.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:49 np0005593233 nova_compute[222017]: 2026-01-23 11:16:49.816 222021 INFO nova.compute.manager [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Took 0.43 seconds to detach 1 volumes for instance.#033[00m
Jan 23 06:16:49 np0005593233 nova_compute[222017]: 2026-01-23 11:16:49.818 222021 DEBUG nova.compute.manager [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Deleting volume: 525f185e-d0f4-4a0b-bd48-9219445747c5 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.168 222021 DEBUG nova.compute.manager [req-3431018d-ee1c-4727-b23f-18b2d4dd5d8c req-a0be0dd8-792f-4ee8-afb4-cbf8297906b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.169 222021 DEBUG oslo_concurrency.lockutils [req-3431018d-ee1c-4727-b23f-18b2d4dd5d8c req-a0be0dd8-792f-4ee8-afb4-cbf8297906b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.169 222021 DEBUG oslo_concurrency.lockutils [req-3431018d-ee1c-4727-b23f-18b2d4dd5d8c req-a0be0dd8-792f-4ee8-afb4-cbf8297906b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.169 222021 DEBUG oslo_concurrency.lockutils [req-3431018d-ee1c-4727-b23f-18b2d4dd5d8c req-a0be0dd8-792f-4ee8-afb4-cbf8297906b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.170 222021 DEBUG nova.compute.manager [req-3431018d-ee1c-4727-b23f-18b2d4dd5d8c req-a0be0dd8-792f-4ee8-afb4-cbf8297906b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] No waiting events found dispatching network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.170 222021 WARNING nova.compute.manager [req-3431018d-ee1c-4727-b23f-18b2d4dd5d8c req-a0be0dd8-792f-4ee8-afb4-cbf8297906b2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received unexpected event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.254 222021 DEBUG oslo_concurrency.lockutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.254 222021 DEBUG oslo_concurrency.lockutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.407 222021 DEBUG oslo_concurrency.processutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:16:50 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:16:50 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2199671619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.880 222021 DEBUG oslo_concurrency.processutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.890 222021 DEBUG nova.compute.provider_tree [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.910 222021 DEBUG nova.scheduler.client.report [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.940 222021 DEBUG oslo_concurrency.lockutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:50 np0005593233 nova_compute[222017]: 2026-01-23 11:16:50.973 222021 INFO nova.scheduler.client.report [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Deleted allocations for instance ed71c532-711c-49b9-b0d5-eaf409f0bc76#033[00m
Jan 23 06:16:51 np0005593233 nova_compute[222017]: 2026-01-23 11:16:51.124 222021 DEBUG oslo_concurrency.lockutils [None req-98d6a979-5578-4015-8f83-4eaf9bcf6785 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:51 np0005593233 nova_compute[222017]: 2026-01-23 11:16:51.186 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:51 np0005593233 nova_compute[222017]: 2026-01-23 11:16:51.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:51.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:51.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:52 np0005593233 nova_compute[222017]: 2026-01-23 11:16:52.665 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:53.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:53.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:54 np0005593233 nova_compute[222017]: 2026-01-23 11:16:54.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:55 np0005593233 nova_compute[222017]: 2026-01-23 11:16:55.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:55 np0005593233 nova_compute[222017]: 2026-01-23 11:16:55.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:16:55 np0005593233 nova_compute[222017]: 2026-01-23 11:16:55.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:16:55 np0005593233 nova_compute[222017]: 2026-01-23 11:16:55.402 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:16:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:55.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:16:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:55.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:16:56 np0005593233 podman[317447]: 2026-01-23 11:16:56.086639644 +0000 UTC m=+0.094948332 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:16:56 np0005593233 nova_compute[222017]: 2026-01-23 11:16:56.189 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:16:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:57.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:16:57 np0005593233 nova_compute[222017]: 2026-01-23 11:16:57.666 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:57.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:58 np0005593233 nova_compute[222017]: 2026-01-23 11:16:58.538 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:58 np0005593233 nova_compute[222017]: 2026-01-23 11:16:58.660 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:59.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:16:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:59.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:01 np0005593233 nova_compute[222017]: 2026-01-23 11:17:01.192 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:01.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:01.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:17:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:17:02 np0005593233 nova_compute[222017]: 2026-01-23 11:17:02.394 222021 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769167007.3921466, ed71c532-711c-49b9-b0d5-eaf409f0bc76 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:17:02 np0005593233 nova_compute[222017]: 2026-01-23 11:17:02.394 222021 INFO nova.compute.manager [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:17:02 np0005593233 nova_compute[222017]: 2026-01-23 11:17:02.429 222021 DEBUG nova.compute.manager [None req-a9438588-fb28-4baf-866a-6a68bb58dd8e - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:17:02 np0005593233 nova_compute[222017]: 2026-01-23 11:17:02.668 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:03.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:03.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:05.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:05.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:06 np0005593233 nova_compute[222017]: 2026-01-23 11:17:06.194 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:07.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:07 np0005593233 nova_compute[222017]: 2026-01-23 11:17:07.670 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:17:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:07.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:17:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:09.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:09.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:11 np0005593233 nova_compute[222017]: 2026-01-23 11:17:11.198 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:11.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:11.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:12 np0005593233 nova_compute[222017]: 2026-01-23 11:17:12.673 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:13 np0005593233 podman[317525]: 2026-01-23 11:17:13.067537627 +0000 UTC m=+0.073586611 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 06:17:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:13.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:13.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:15.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:15.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:16 np0005593233 nova_compute[222017]: 2026-01-23 11:17:16.200 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:17.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:17 np0005593233 nova_compute[222017]: 2026-01-23 11:17:17.675 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:17.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:19.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:17:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:19.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:17:21 np0005593233 nova_compute[222017]: 2026-01-23 11:17:21.203 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:21.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:21.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:22 np0005593233 nova_compute[222017]: 2026-01-23 11:17:22.677 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:23.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:23.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:25.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:25.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:26 np0005593233 nova_compute[222017]: 2026-01-23 11:17:26.204 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:27 np0005593233 podman[317546]: 2026-01-23 11:17:27.123878615 +0000 UTC m=+0.135606465 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 06:17:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:27.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:27 np0005593233 nova_compute[222017]: 2026-01-23 11:17:27.678 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:27.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:28 np0005593233 nova_compute[222017]: 2026-01-23 11:17:28.397 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:17:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:29.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:17:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:29.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:31 np0005593233 nova_compute[222017]: 2026-01-23 11:17:31.206 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:31.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:31.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:32 np0005593233 nova_compute[222017]: 2026-01-23 11:17:32.681 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:33.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:33.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:35 np0005593233 ceph-mgr[81930]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 06:17:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:35.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:35.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:36 np0005593233 nova_compute[222017]: 2026-01-23 11:17:36.209 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:37 np0005593233 nova_compute[222017]: 2026-01-23 11:17:37.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:37 np0005593233 nova_compute[222017]: 2026-01-23 11:17:37.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:37.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:37 np0005593233 nova_compute[222017]: 2026-01-23 11:17:37.683 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:37.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:39.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:39 np0005593233 ovn_controller[130653]: 2026-01-23T11:17:39Z|00942|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 23 06:17:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:39.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:40 np0005593233 nova_compute[222017]: 2026-01-23 11:17:40.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:40 np0005593233 nova_compute[222017]: 2026-01-23 11:17:40.668 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:17:40 np0005593233 nova_compute[222017]: 2026-01-23 11:17:40.669 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:17:40 np0005593233 nova_compute[222017]: 2026-01-23 11:17:40.669 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:17:40 np0005593233 nova_compute[222017]: 2026-01-23 11:17:40.670 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:17:40 np0005593233 nova_compute[222017]: 2026-01-23 11:17:40.670 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:17:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:17:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2319129841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:17:41 np0005593233 nova_compute[222017]: 2026-01-23 11:17:41.178 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:17:41 np0005593233 nova_compute[222017]: 2026-01-23 11:17:41.211 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:41 np0005593233 nova_compute[222017]: 2026-01-23 11:17:41.397 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:17:41 np0005593233 nova_compute[222017]: 2026-01-23 11:17:41.401 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4256MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:17:41 np0005593233 nova_compute[222017]: 2026-01-23 11:17:41.402 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:17:41 np0005593233 nova_compute[222017]: 2026-01-23 11:17:41.403 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:17:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:41.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:41.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:41 np0005593233 nova_compute[222017]: 2026-01-23 11:17:41.927 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:17:41 np0005593233 nova_compute[222017]: 2026-01-23 11:17:41.928 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:17:41 np0005593233 nova_compute[222017]: 2026-01-23 11:17:41.976 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:17:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:17:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3396874479' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:17:42 np0005593233 nova_compute[222017]: 2026-01-23 11:17:42.441 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:17:42 np0005593233 nova_compute[222017]: 2026-01-23 11:17:42.448 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:17:42 np0005593233 nova_compute[222017]: 2026-01-23 11:17:42.724 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:17:42.743 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:17:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:17:42.743 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:17:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:17:42.744 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:17:42 np0005593233 nova_compute[222017]: 2026-01-23 11:17:42.898 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:17:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:43.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:43.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:44 np0005593233 podman[317618]: 2026-01-23 11:17:44.063794593 +0000 UTC m=+0.076139546 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 06:17:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:17:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2268937051' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:17:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:17:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2268937051' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:17:44 np0005593233 nova_compute[222017]: 2026-01-23 11:17:44.938 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:17:44 np0005593233 nova_compute[222017]: 2026-01-23 11:17:44.939 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:17:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:45.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:45.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:46 np0005593233 nova_compute[222017]: 2026-01-23 11:17:46.213 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:47.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:47 np0005593233 nova_compute[222017]: 2026-01-23 11:17:47.726 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:17:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:47.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:17:47 np0005593233 nova_compute[222017]: 2026-01-23 11:17:47.939 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:47 np0005593233 nova_compute[222017]: 2026-01-23 11:17:47.940 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:47 np0005593233 nova_compute[222017]: 2026-01-23 11:17:47.940 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:17:49 np0005593233 nova_compute[222017]: 2026-01-23 11:17:49.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:49.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:49.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:51 np0005593233 nova_compute[222017]: 2026-01-23 11:17:51.216 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:51.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:51.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:52 np0005593233 nova_compute[222017]: 2026-01-23 11:17:52.728 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:53 np0005593233 nova_compute[222017]: 2026-01-23 11:17:53.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:53.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:53.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:55 np0005593233 nova_compute[222017]: 2026-01-23 11:17:55.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:55 np0005593233 nova_compute[222017]: 2026-01-23 11:17:55.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:55 np0005593233 nova_compute[222017]: 2026-01-23 11:17:55.384 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:17:55 np0005593233 nova_compute[222017]: 2026-01-23 11:17:55.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:17:55 np0005593233 nova_compute[222017]: 2026-01-23 11:17:55.418 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:17:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:55.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:55.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:56 np0005593233 nova_compute[222017]: 2026-01-23 11:17:56.219 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:57.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:57 np0005593233 nova_compute[222017]: 2026-01-23 11:17:57.730 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:57.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:58 np0005593233 podman[317638]: 2026-01-23 11:17:58.093626875 +0000 UTC m=+0.095351288 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Jan 23 06:17:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:59.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:17:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:17:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:17:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:59.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:01 np0005593233 nova_compute[222017]: 2026-01-23 11:18:01.221 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:01.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:01.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:02 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:02 np0005593233 nova_compute[222017]: 2026-01-23 11:18:02.732 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:18:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:03 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:18:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:03.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:03.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:05.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:05.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:06 np0005593233 nova_compute[222017]: 2026-01-23 11:18:06.223 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:07.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #229. Immutable memtables: 0.
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.698587) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 147] Flushing memtable with next log file: 229
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087698638, "job": 147, "event": "flush_started", "num_memtables": 1, "num_entries": 1928, "num_deletes": 250, "total_data_size": 4600511, "memory_usage": 4666256, "flush_reason": "Manual Compaction"}
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 147] Level-0 flush table #230: started
Jan 23 06:18:07 np0005593233 nova_compute[222017]: 2026-01-23 11:18:07.734 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087743461, "cf_name": "default", "job": 147, "event": "table_file_creation", "file_number": 230, "file_size": 3018078, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 107157, "largest_seqno": 109080, "table_properties": {"data_size": 3009983, "index_size": 4909, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 15809, "raw_average_key_size": 19, "raw_value_size": 2994027, "raw_average_value_size": 3651, "num_data_blocks": 211, "num_entries": 820, "num_filter_entries": 820, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166915, "oldest_key_time": 1769166915, "file_creation_time": 1769167087, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 230, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 147] Flush lasted 44951 microseconds, and 9759 cpu microseconds.
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.743532) [db/flush_job.cc:967] [default] [JOB 147] Level-0 flush table #230: 3018078 bytes OK
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.743562) [db/memtable_list.cc:519] [default] Level-0 commit table #230 started
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.746467) [db/memtable_list.cc:722] [default] Level-0 commit table #230: memtable #1 done
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.746534) EVENT_LOG_v1 {"time_micros": 1769167087746519, "job": 147, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.746564) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 147] Try to delete WAL files size 4591745, prev total WAL file size 4607527, number of live WAL files 2.
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000226.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.747977) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600353030' seq:72057594037927935, type:22 .. '6B7600373531' seq:0, type:0; will stop at (end)
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 148] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 147 Base level 0, inputs: [230(2947KB)], [228(11MB)]
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087748034, "job": 148, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [230], "files_L6": [228], "score": -1, "input_data_size": 14821462, "oldest_snapshot_seqno": -1}
Jan 23 06:18:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:07.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 148] Generated table #231: 12229 keys, 13699215 bytes, temperature: kUnknown
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087892869, "cf_name": "default", "job": 148, "event": "table_file_creation", "file_number": 231, "file_size": 13699215, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13623989, "index_size": 43561, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30597, "raw_key_size": 325997, "raw_average_key_size": 26, "raw_value_size": 13413895, "raw_average_value_size": 1096, "num_data_blocks": 1621, "num_entries": 12229, "num_filter_entries": 12229, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769167087, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 231, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.893231) [db/compaction/compaction_job.cc:1663] [default] [JOB 148] Compacted 1@0 + 1@6 files to L6 => 13699215 bytes
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.894640) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 102.2 rd, 94.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 11.3 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(9.4) write-amplify(4.5) OK, records in: 12746, records dropped: 517 output_compression: NoCompression
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.894661) EVENT_LOG_v1 {"time_micros": 1769167087894651, "job": 148, "event": "compaction_finished", "compaction_time_micros": 144976, "compaction_time_cpu_micros": 50107, "output_level": 6, "num_output_files": 1, "total_output_size": 13699215, "num_input_records": 12746, "num_output_records": 12229, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000230.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087895627, "job": 148, "event": "table_file_deletion", "file_number": 230}
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000228.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087898146, "job": 148, "event": "table_file_deletion", "file_number": 228}
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.747879) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.898252) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.898260) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.898262) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.898264) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:18:07.898266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:09.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:09.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:10 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:11 np0005593233 nova_compute[222017]: 2026-01-23 11:18:11.226 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:11.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:11.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:12 np0005593233 nova_compute[222017]: 2026-01-23 11:18:12.793 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:13.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:13.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:15 np0005593233 podman[317968]: 2026-01-23 11:18:15.072787663 +0000 UTC m=+0.082074924 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:18:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:18:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:15.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:18:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:15.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:16 np0005593233 nova_compute[222017]: 2026-01-23 11:18:16.246 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:17.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:17 np0005593233 nova_compute[222017]: 2026-01-23 11:18:17.795 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:17.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:19.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:19.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:21 np0005593233 nova_compute[222017]: 2026-01-23 11:18:21.248 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:21.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:21.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:22 np0005593233 nova_compute[222017]: 2026-01-23 11:18:22.798 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:23.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:23.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:24 np0005593233 systemd-logind[804]: New session 63 of user zuul.
Jan 23 06:18:24 np0005593233 systemd[1]: Started Session 63 of User zuul.
Jan 23 06:18:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:18:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:25.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:18:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:25.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:26 np0005593233 nova_compute[222017]: 2026-01-23 11:18:26.250 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:27.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:27 np0005593233 nova_compute[222017]: 2026-01-23 11:18:27.800 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.003000084s ======
Jan 23 06:18:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:27.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000084s
Jan 23 06:18:29 np0005593233 podman[318194]: 2026-01-23 11:18:29.169183889 +0000 UTC m=+0.164942189 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:18:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 23 06:18:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2865303566' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 06:18:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:29.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:29.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:31 np0005593233 nova_compute[222017]: 2026-01-23 11:18:31.254 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:31.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:31.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:32 np0005593233 nova_compute[222017]: 2026-01-23 11:18:32.848 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:33.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:33.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:34 np0005593233 ovs-vsctl[318303]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 06:18:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:35.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:35 np0005593233 virtqemud[221325]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 06:18:35 np0005593233 virtqemud[221325]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 06:18:35 np0005593233 virtqemud[221325]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 06:18:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:35.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:36 np0005593233 nova_compute[222017]: 2026-01-23 11:18:36.295 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:36 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: cache status {prefix=cache status} (starting...)
Jan 23 06:18:36 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:36 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: client ls {prefix=client ls} (starting...)
Jan 23 06:18:36 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:36 np0005593233 lvm[318618]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 06:18:36 np0005593233 lvm[318618]: VG ceph_vg0 finished
Jan 23 06:18:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 23 06:18:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3310176834' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 06:18:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:37.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:37 np0005593233 nova_compute[222017]: 2026-01-23 11:18:37.852 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:37.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 06:18:37 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 23 06:18:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2915887231' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 06:18:38 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 06:18:38 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:38 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 06:18:38 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 23 06:18:38 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/470846291' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 06:18:38 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: ops {prefix=ops} (starting...)
Jan 23 06:18:38 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 23 06:18:38 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/535508387' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 06:18:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 23 06:18:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/202401808' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 06:18:39 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: session ls {prefix=session ls} (starting...)
Jan 23 06:18:39 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:18:39 np0005593233 nova_compute[222017]: 2026-01-23 11:18:39.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:39 np0005593233 nova_compute[222017]: 2026-01-23 11:18:39.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:39 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: status {prefix=status} (starting...)
Jan 23 06:18:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 23 06:18:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1121777495' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 06:18:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:39.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:39.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3293098762' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/640270957' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 06:18:40 np0005593233 nova_compute[222017]: 2026-01-23 11:18:40.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3701004835' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2007937461' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 06:18:40 np0005593233 nova_compute[222017]: 2026-01-23 11:18:40.773 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:18:40 np0005593233 nova_compute[222017]: 2026-01-23 11:18:40.774 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:18:40 np0005593233 nova_compute[222017]: 2026-01-23 11:18:40.774 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:18:40 np0005593233 nova_compute[222017]: 2026-01-23 11:18:40.774 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:18:40 np0005593233 nova_compute[222017]: 2026-01-23 11:18:40.774 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 23 06:18:40 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3601677555' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 06:18:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:18:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2798286302' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:18:41 np0005593233 nova_compute[222017]: 2026-01-23 11:18:41.277 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:18:41 np0005593233 nova_compute[222017]: 2026-01-23 11:18:41.296 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 23 06:18:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1476432408' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 06:18:41 np0005593233 nova_compute[222017]: 2026-01-23 11:18:41.459 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:18:41 np0005593233 nova_compute[222017]: 2026-01-23 11:18:41.460 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=3953MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:18:41 np0005593233 nova_compute[222017]: 2026-01-23 11:18:41.460 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:18:41 np0005593233 nova_compute[222017]: 2026-01-23 11:18:41.461 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:18:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 23 06:18:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/162234480' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 06:18:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:41.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:41.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 23 06:18:41 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3354154787' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 06:18:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 23 06:18:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3535051016' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 06:18:42 np0005593233 nova_compute[222017]: 2026-01-23 11:18:42.466 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:18:42 np0005593233 nova_compute[222017]: 2026-01-23 11:18:42.466 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:18:42 np0005593233 nova_compute[222017]: 2026-01-23 11:18:42.533 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:18:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:18:42.744 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:18:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:18:42.744 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:18:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:18:42.744 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:18:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 23 06:18:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4003013795' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 06:18:42 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 23 06:18:42 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3832402337' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 06:18:42 np0005593233 nova_compute[222017]: 2026-01-23 11:18:42.892 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:18:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2773986785' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:18:43 np0005593233 nova_compute[222017]: 2026-01-23 11:18:43.055 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:18:43 np0005593233 nova_compute[222017]: 2026-01-23 11:18:43.060 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:18:43 np0005593233 nova_compute[222017]: 2026-01-23 11:18:43.081 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:18:43 np0005593233 nova_compute[222017]: 2026-01-23 11:18:43.083 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:18:43 np0005593233 nova_compute[222017]: 2026-01-23 11:18:43.083 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437665792 unmapped: 70107136 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 407 ms_handle_reset con 0x55f13f3ca800 session 0x55f133e034a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437673984 unmapped: 70098944 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437673984 unmapped: 70098944 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 407 heartbeat osd_stat(store_statfs(0x1a2a88000/0x0/0x1bfc00000, data 0x346edab/0x3341000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437673984 unmapped: 70098944 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4761840 data_alloc: 234881024 data_used: 20279296
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437682176 unmapped: 70090752 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437682176 unmapped: 70090752 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437682176 unmapped: 70090752 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 407 ms_handle_reset con 0x55f135e9a800 session 0x55f135d89e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437682176 unmapped: 70090752 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437682176 unmapped: 70090752 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 407 heartbeat osd_stat(store_statfs(0x1a2a88000/0x0/0x1bfc00000, data 0x346edab/0x3341000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4761664 data_alloc: 234881024 data_used: 20279296
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437682176 unmapped: 70090752 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.214059830s of 10.308024406s, submitted: 24
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 407 ms_handle_reset con 0x55f135e9f000 session 0x55f13423a5a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437690368 unmapped: 70082560 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 407 ms_handle_reset con 0x55f13631f400 session 0x55f136fe3680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 407 heartbeat osd_stat(store_statfs(0x1a2c3e000/0x0/0x1bfc00000, data 0x32dddab/0x31b0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 407 handle_osd_map epochs [408,408], i have 407, src has [1,408]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 ms_handle_reset con 0x55f1400f9400 session 0x55f1369b34a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437706752 unmapped: 70066176 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 ms_handle_reset con 0x55f135eb0000 session 0x55f142f10000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437706752 unmapped: 70066176 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 heartbeat osd_stat(store_statfs(0x1a2c44000/0x0/0x1bfc00000, data 0x2f8e9f6/0x31aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 ms_handle_reset con 0x55f134aeac00 session 0x55f1342fed20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 ms_handle_reset con 0x55f13631d000 session 0x55f1363bb0e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437706752 unmapped: 70066176 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 heartbeat osd_stat(store_statfs(0x1a2c44000/0x0/0x1bfc00000, data 0x2f8e9f6/0x31aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4732612 data_alloc: 234881024 data_used: 20172800
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 heartbeat osd_stat(store_statfs(0x1a2c44000/0x0/0x1bfc00000, data 0x2f8e9f6/0x31aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437714944 unmapped: 70057984 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437714944 unmapped: 70057984 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437714944 unmapped: 70057984 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 ms_handle_reset con 0x55f135eb0000 session 0x55f1363610e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437714944 unmapped: 70057984 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 ms_handle_reset con 0x55f136a1f400 session 0x55f13704e780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 ms_handle_reset con 0x55f1454e9c00 session 0x55f142f10960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437714944 unmapped: 70057984 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4733880 data_alloc: 234881024 data_used: 20172800
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 ms_handle_reset con 0x55f134aeac00 session 0x55f1367ef4a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 ms_handle_reset con 0x55f135e9c000 session 0x55f136999c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437723136 unmapped: 70049792 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 heartbeat osd_stat(store_statfs(0x1a2c35000/0x0/0x1bfc00000, data 0x2f9d9f6/0x31b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437723136 unmapped: 70049792 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 408 handle_osd_map epochs [409,409], i have 408, src has [1,409]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.018541336s of 11.250031471s, submitted: 41
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 409 handle_osd_map epochs [409,410], i have 409, src has [1,410]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 438788096 unmapped: 68984832 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 410 heartbeat osd_stat(store_statfs(0x1a2c2d000/0x0/0x1bfc00000, data 0x2fa118e/0x31bf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x19e0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 410 ms_handle_reset con 0x55f13631d000 session 0x55f133e13680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 438788096 unmapped: 68984832 heap: 507772928 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 410 ms_handle_reset con 0x55f135e9a800 session 0x55f142f101e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 410 ms_handle_reset con 0x55f136a1f400 session 0x55f13a8e70e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 410 ms_handle_reset con 0x55f13631f400 session 0x55f1369b8f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 438861824 unmapped: 73113600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4774245 data_alloc: 234881024 data_used: 17956864
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 410 heartbeat osd_stat(store_statfs(0x1a11de000/0x0/0x1bfc00000, data 0x3850252/0x3a70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437379072 unmapped: 74596352 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 410 ms_handle_reset con 0x55f135e9f000 session 0x55f1342f90e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 410 ms_handle_reset con 0x55f134aeac00 session 0x55f1369b9680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437379072 unmapped: 74596352 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 410 handle_osd_map epochs [411,411], i have 410, src has [1,411]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 437387264 unmapped: 74588160 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 411 ms_handle_reset con 0x55f135e9a800 session 0x55f13423af00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 411 handle_osd_map epochs [412,412], i have 411, src has [1,412]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 ms_handle_reset con 0x55f13631d000 session 0x55f136a42000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 433684480 unmapped: 78290944 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 heartbeat osd_stat(store_statfs(0x1a11dc000/0x0/0x1bfc00000, data 0x3851e9d/0x3a72000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 ms_handle_reset con 0x55f136a1f400 session 0x55f135d24d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 433692672 unmapped: 78282752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 ms_handle_reset con 0x55f134aeac00 session 0x55f13446f680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4769456 data_alloc: 234881024 data_used: 17977344
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 ms_handle_reset con 0x55f135e9a800 session 0x55f136361e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 heartbeat osd_stat(store_statfs(0x1a11d8000/0x0/0x1bfc00000, data 0x3853b12/0x3a75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 433717248 unmapped: 78258176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 433717248 unmapped: 78258176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 heartbeat osd_stat(store_statfs(0x1a11d8000/0x0/0x1bfc00000, data 0x3853b12/0x3a75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 433717248 unmapped: 78258176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 heartbeat osd_stat(store_statfs(0x1a11d8000/0x0/0x1bfc00000, data 0x3853b12/0x3a75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 433717248 unmapped: 78258176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 433717248 unmapped: 78258176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.858445168s of 12.549383163s, submitted: 129
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4674684 data_alloc: 218103808 data_used: 10170368
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 ms_handle_reset con 0x55f135e9f000 session 0x55f135d1a1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431955968 unmapped: 80019456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431955968 unmapped: 80019456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 412 handle_osd_map epochs [413,413], i have 412, src has [1,413]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431964160 unmapped: 80011264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 heartbeat osd_stat(store_statfs(0x1a1949000/0x0/0x1bfc00000, data 0x30e35cc/0x3304000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431964160 unmapped: 80011264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 heartbeat osd_stat(store_statfs(0x1a1949000/0x0/0x1bfc00000, data 0x30e35cc/0x3304000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431964160 unmapped: 80011264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4677050 data_alloc: 218103808 data_used: 10174464
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431964160 unmapped: 80011264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431964160 unmapped: 80011264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431964160 unmapped: 80011264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 heartbeat osd_stat(store_statfs(0x1a1949000/0x0/0x1bfc00000, data 0x30e35cc/0x3304000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431964160 unmapped: 80011264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431964160 unmapped: 80011264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4677050 data_alloc: 218103808 data_used: 10174464
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431972352 unmapped: 80003072 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 heartbeat osd_stat(store_statfs(0x1a1949000/0x0/0x1bfc00000, data 0x30e35cc/0x3304000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431972352 unmapped: 80003072 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431972352 unmapped: 80003072 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431972352 unmapped: 80003072 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431972352 unmapped: 80003072 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f13631d000 session 0x55f133e12d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f1400f9400 session 0x55f136ab54a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f134aeac00 session 0x55f133e052c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4677050 data_alloc: 218103808 data_used: 10174464
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 431980544 unmapped: 79994880 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f135e9a800 session 0x55f133e04960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.010217667s of 16.125297546s, submitted: 44
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f13631d000 session 0x55f142f11e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f135e9f000 session 0x55f135cb8f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f13602ec00 session 0x55f1363614a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f134aeac00 session 0x55f135d39e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f135e9a800 session 0x55f1363ba5a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f135e9f000 session 0x55f134fb4780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 heartbeat osd_stat(store_statfs(0x1a1949000/0x0/0x1bfc00000, data 0x30e35cc/0x3304000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446398464 unmapped: 65576960 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f13631d000 session 0x55f136a42b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446398464 unmapped: 65576960 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 heartbeat osd_stat(store_statfs(0x1a14ea000/0x0/0x1bfc00000, data 0x35435cc/0x3764000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446398464 unmapped: 65576960 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 ms_handle_reset con 0x55f14a131c00 session 0x55f142f10000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 443318272 unmapped: 68657152 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4777448 data_alloc: 234881024 data_used: 32944128
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 443318272 unmapped: 68657152 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 413 handle_osd_map epochs [414,414], i have 413, src has [1,414]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 414 ms_handle_reset con 0x55f135e9f000 session 0x55f1342de1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 438575104 unmapped: 73400320 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 414 heartbeat osd_stat(store_statfs(0x1a14bc000/0x0/0x1bfc00000, data 0x356f279/0x3791000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 438575104 unmapped: 73400320 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 414 heartbeat osd_stat(store_statfs(0x1a2a72000/0x0/0x1bfc00000, data 0x1fb9217/0x21da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 438575104 unmapped: 73400320 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 438575104 unmapped: 73400320 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4565048 data_alloc: 218103808 data_used: 10223616
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 414 heartbeat osd_stat(store_statfs(0x1a2a72000/0x0/0x1bfc00000, data 0x1fb9217/0x21da000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 438575104 unmapped: 73400320 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 438575104 unmapped: 73400320 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 414 handle_osd_map epochs [415,415], i have 414, src has [1,415]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.537888527s of 11.046134949s, submitted: 68
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439623680 unmapped: 72351744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2a70000/0x0/0x1bfc00000, data 0x1fbad56/0x21dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439623680 unmapped: 72351744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439623680 unmapped: 72351744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4567830 data_alloc: 218103808 data_used: 10223616
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2a70000/0x0/0x1bfc00000, data 0x1fbad56/0x21dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439623680 unmapped: 72351744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439623680 unmapped: 72351744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439631872 unmapped: 72343552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2a70000/0x0/0x1bfc00000, data 0x1fbad56/0x21dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439631872 unmapped: 72343552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439631872 unmapped: 72343552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4575462 data_alloc: 218103808 data_used: 10686464
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439631872 unmapped: 72343552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439631872 unmapped: 72343552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2a71000/0x0/0x1bfc00000, data 0x1fbad56/0x21dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439631872 unmapped: 72343552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.991353989s of 11.015851021s, submitted: 17
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439640064 unmapped: 72335360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439640064 unmapped: 72335360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4578592 data_alloc: 218103808 data_used: 10682368
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 ms_handle_reset con 0x55f134aeac00 session 0x55f135cb83c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 ms_handle_reset con 0x55f135e9a800 session 0x55f1341e5860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439640064 unmapped: 72335360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 ms_handle_reset con 0x55f13631d000 session 0x55f135ccd680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efb000/0x0/0x1bfc00000, data 0x1b30d56/0x1d53000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439648256 unmapped: 72327168 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439648256 unmapped: 72327168 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439656448 unmapped: 72318976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439656448 unmapped: 72318976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4528988 data_alloc: 218103808 data_used: 10186752
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439656448 unmapped: 72318976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439656448 unmapped: 72318976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efb000/0x0/0x1bfc00000, data 0x1b30d56/0x1d53000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efb000/0x0/0x1bfc00000, data 0x1b30d56/0x1d53000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439656448 unmapped: 72318976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 ms_handle_reset con 0x55f13adde800 session 0x55f1363614a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439656448 unmapped: 72318976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efb000/0x0/0x1bfc00000, data 0x1b30d56/0x1d53000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 ms_handle_reset con 0x55f13adde800 session 0x55f133e04960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efb000/0x0/0x1bfc00000, data 0x1b30d56/0x1d53000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.770365715s of 11.872770309s, submitted: 27
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439672832 unmapped: 72302592 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 ms_handle_reset con 0x55f134aeac00 session 0x55f133e052c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4528108 data_alloc: 218103808 data_used: 10182656
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439672832 unmapped: 72302592 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439672832 unmapped: 72302592 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439672832 unmapped: 72302592 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439672832 unmapped: 72302592 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439672832 unmapped: 72302592 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4528108 data_alloc: 218103808 data_used: 10182656
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439672832 unmapped: 72302592 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439672832 unmapped: 72302592 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 72294400 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 72294400 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 72294400 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4528108 data_alloc: 218103808 data_used: 10182656
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 72294400 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 72294400 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439681024 unmapped: 72294400 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439689216 unmapped: 72286208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439689216 unmapped: 72286208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4528108 data_alloc: 218103808 data_used: 10182656
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439689216 unmapped: 72286208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439689216 unmapped: 72286208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439689216 unmapped: 72286208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439689216 unmapped: 72286208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439689216 unmapped: 72286208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4528108 data_alloc: 218103808 data_used: 10182656
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439689216 unmapped: 72286208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439697408 unmapped: 72278016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439697408 unmapped: 72278016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439697408 unmapped: 72278016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439697408 unmapped: 72278016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4528428 data_alloc: 218103808 data_used: 10190848
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439697408 unmapped: 72278016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 26.543550491s of 26.567317963s, submitted: 6
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 ms_handle_reset con 0x55f135e9a800 session 0x55f13423af00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 heartbeat osd_stat(store_statfs(0x1a2efc000/0x0/0x1bfc00000, data 0x1b30d46/0x1d52000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439697408 unmapped: 72278016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 415 handle_osd_map epochs [416,416], i have 415, src has [1,416]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439713792 unmapped: 72261632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 416 ms_handle_reset con 0x55f135e9f000 session 0x55f1369b8f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 416 ms_handle_reset con 0x55f13631d000 session 0x55f1363610e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 416 handle_osd_map epochs [416,417], i have 416, src has [1,417]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439754752 unmapped: 72220672 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 417 ms_handle_reset con 0x55f134aeac00 session 0x55f1369994a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439754752 unmapped: 72220672 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4542392 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 417 ms_handle_reset con 0x55f135e9a800 session 0x55f13423a000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439754752 unmapped: 72220672 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 417 ms_handle_reset con 0x55f135e9f000 session 0x55f135d5af00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439771136 unmapped: 72204288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 417 heartbeat osd_stat(store_statfs(0x1a2ef6000/0x0/0x1bfc00000, data 0x1b3464c/0x1d58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 417 heartbeat osd_stat(store_statfs(0x1a2ef6000/0x0/0x1bfc00000, data 0x1b3464c/0x1d58000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439771136 unmapped: 72204288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439771136 unmapped: 72204288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439771136 unmapped: 72204288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4537442 data_alloc: 218103808 data_used: 10199040
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439771136 unmapped: 72204288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 417 handle_osd_map epochs [418,418], i have 417, src has [1,418]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439787520 unmapped: 72187904 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439795712 unmapped: 72179712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439795712 unmapped: 72179712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.211915970s of 13.497264862s, submitted: 83
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f13adde800 session 0x55f133e11e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439795712 unmapped: 72179712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4542758 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439795712 unmapped: 72179712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439795712 unmapped: 72179712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b361ed/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1369b9e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b361ed/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439795712 unmapped: 72179712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1367efc20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439795712 unmapped: 72179712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef0000/0x0/0x1bfc00000, data 0x1b3625e/0x1d5e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439812096 unmapped: 72163328 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4547662 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef0000/0x0/0x1bfc00000, data 0x1b3625e/0x1d5e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439812096 unmapped: 72163328 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439812096 unmapped: 72163328 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439812096 unmapped: 72163328 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f1369b94a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439812096 unmapped: 72163328 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f1341e4d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439836672 unmapped: 72138752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4545326 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.409068108s of 10.787988663s, submitted: 36
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439836672 unmapped: 72138752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f135cf23c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b361ed/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f13adde800 session 0x55f1335db680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439853056 unmapped: 72122368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439853056 unmapped: 72122368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439853056 unmapped: 72122368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439853056 unmapped: 72122368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439877632 unmapped: 72097792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544789 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544789 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439934976 unmapped: 72040448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 43.547557831s of 44.114562988s, submitted: 17
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4546617 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f134968d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4546617 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439951360 unmapped: 72024064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439951360 unmapped: 72024064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439951360 unmapped: 72024064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439959552 unmapped: 72015872 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f13a8e6d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f133e13680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1342f90e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135069400 session 0x55f13446fa40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.264505386s of 10.274923325s, submitted: 2
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f134b125a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f1369b3c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f1363bab40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1367ef2c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1400f8800 session 0x55f136998d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4571236 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439869440 unmapped: 72105984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4571236 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439869440 unmapped: 72105984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439869440 unmapped: 72105984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439869440 unmapped: 72105984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439877632 unmapped: 72097792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439877632 unmapped: 72097792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4571236 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439877632 unmapped: 72097792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f1369b3860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4571236 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.963664055s of 16.056951523s, submitted: 15
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.8 total, 600.0 interval#012Cumulative writes: 65K writes, 247K keys, 65K commit groups, 1.0 writes per commit group, ingest: 0.23 GB, 0.04 MB/s#012Cumulative WAL: 65K writes, 24K syncs, 2.63 writes per sync, written: 0.23 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4600 writes, 13K keys, 4600 commit groups, 1.0 writes per commit group, ingest: 11.03 MB, 0.02 MB/s#012Interval WAL: 4600 writes, 1959 syncs, 2.35 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4593928 data_alloc: 218103808 data_used: 13381632
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4593928 data_alloc: 218103808 data_used: 13381632
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439910400 unmapped: 72065024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.895181656s of 12.986706734s, submitted: 1
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440819712 unmapped: 71155712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440418304 unmapped: 71557120 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4603780 data_alloc: 218103808 data_used: 13373440
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ab8000/0x0/0x1bfc00000, data 0x1f6a19b/0x2190000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ab8000/0x0/0x1bfc00000, data 0x1f6a19b/0x2190000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440893440 unmapped: 71081984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440918016 unmapped: 71057408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440918016 unmapped: 71057408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440918016 unmapped: 71057408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440918016 unmapped: 71057408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440926208 unmapped: 71049216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440926208 unmapped: 71049216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440926208 unmapped: 71049216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440926208 unmapped: 71049216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440950784 unmapped: 71024640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 28.065114975s of 31.968029022s, submitted: 36
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440967168 unmapped: 71008256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440967168 unmapped: 71008256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440967168 unmapped: 71008256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440967168 unmapped: 71008256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1369e21e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440975360 unmapped: 71000064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619946 data_alloc: 218103808 data_used: 13410304
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f142f114a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f136360780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440844288 unmapped: 71131136 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f136b3c000 session 0x55f1363bb4a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554946 data_alloc: 218103808 data_used: 10215424
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f1342df0e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.321082115s of 11.436309814s, submitted: 34
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f142f10780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440877056 unmapped: 71098368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440877056 unmapped: 71098368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440877056 unmapped: 71098368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440893440 unmapped: 71081984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440893440 unmapped: 71081984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440893440 unmapped: 71081984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.356424332s of 31.384279251s, submitted: 6
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f1367ee1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1369b2960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f136a16400 session 0x55f1369b3e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f136a16400 session 0x55f136999e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f1369b30e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441532416 unmapped: 70443008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae7000/0x0/0x1bfc00000, data 0x1f4218b/0x2167000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441532416 unmapped: 70443008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4593598 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441532416 unmapped: 70443008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441532416 unmapped: 70443008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441540608 unmapped: 70434816 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae7000/0x0/0x1bfc00000, data 0x1f4218b/0x2167000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441540608 unmapped: 70434816 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441540608 unmapped: 70434816 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4593598 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f13446e000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441548800 unmapped: 70426624 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f136fe2d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441556992 unmapped: 70418432 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae7000/0x0/0x1bfc00000, data 0x1f4218b/0x2167000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441556992 unmapped: 70418432 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1342f8960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.714527130s of 10.794013977s, submitted: 13
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f135cb90e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441556992 unmapped: 70418432 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441556992 unmapped: 70418432 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4595436 data_alloc: 218103808 data_used: 10211328
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442204160 unmapped: 69771264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615116 data_alloc: 218103808 data_used: 12935168
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615116 data_alloc: 218103808 data_used: 12935168
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442277888 unmapped: 69697536 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.146739006s of 13.157499313s, submitted: 1
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448045056 unmapped: 63930368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 447217664 unmapped: 64757760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 447217664 unmapped: 64757760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4699138 data_alloc: 218103808 data_used: 13131776
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446414848 unmapped: 65560576 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2016000/0x0/0x1bfc00000, data 0x2a0919b/0x2c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446636032 unmapped: 65339392 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446636032 unmapped: 65339392 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1fa1000/0x0/0x1bfc00000, data 0x2a8719b/0x2cad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446636032 unmapped: 65339392 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446644224 unmapped: 65331200 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4703656 data_alloc: 218103808 data_used: 13139968
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446652416 unmapped: 65323008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446652416 unmapped: 65323008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.548395157s of 10.023729324s, submitted: 110
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446693376 unmapped: 65282048 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6f000/0x0/0x1bfc00000, data 0x2aa919b/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704512 data_alloc: 218103808 data_used: 13139968
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6f000/0x0/0x1bfc00000, data 0x2aa919b/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6f000/0x0/0x1bfc00000, data 0x2aa919b/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704908 data_alloc: 218103808 data_used: 13139968
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6f000/0x0/0x1bfc00000, data 0x2aa919b/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6c000/0x0/0x1bfc00000, data 0x2aac19b/0x2cd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6c000/0x0/0x1bfc00000, data 0x2aac19b/0x2cd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.920547485s of 12.751147270s, submitted: 230
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4706032 data_alloc: 218103808 data_used: 13139968
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f134fb4780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f133e12d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b5f000/0x0/0x1bfc00000, data 0x2ab919b/0x2cdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b5f000/0x0/0x1bfc00000, data 0x2ab919b/0x2cdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4706032 data_alloc: 218103808 data_used: 13139968
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b5f000/0x0/0x1bfc00000, data 0x2ab919b/0x2cdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4706032 data_alloc: 218103808 data_used: 13139968
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f133e12f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.336071014s of 11.364871025s, submitted: 6
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f136a16400 session 0x55f136999680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f136a16400 session 0x55f136361e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f134aeac00 session 0x55f136fe2000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 heartbeat osd_stat(store_statfs(0x1a1b5b000/0x0/0x1bfc00000, data 0x2abadf4/0x2ce2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f135e9a800 session 0x55f1367ee780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f135e9f000 session 0x55f135d39e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446816256 unmapped: 65159168 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4711390 data_alloc: 218103808 data_used: 13328384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 heartbeat osd_stat(store_statfs(0x1a1b5b000/0x0/0x1bfc00000, data 0x2abadf4/0x2ce2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f1454e8400 session 0x55f13446e960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 heartbeat osd_stat(store_statfs(0x1a1b5b000/0x0/0x1bfc00000, data 0x2abadf4/0x2ce2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 420 ms_handle_reset con 0x55f134aeac00 session 0x55f136869a40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4714364 data_alloc: 218103808 data_used: 13328384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446832640 unmapped: 65142784 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446832640 unmapped: 65142784 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446832640 unmapped: 65142784 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 420 heartbeat osd_stat(store_statfs(0x1a1b58000/0x0/0x1bfc00000, data 0x2abcaa1/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 420 heartbeat osd_stat(store_statfs(0x1a1b58000/0x0/0x1bfc00000, data 0x2abcaa1/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446832640 unmapped: 65142784 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.557323456s of 13.611905098s, submitted: 14
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4721068 data_alloc: 218103808 data_used: 13963264
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a1b55000/0x0/0x1bfc00000, data 0x2abe5e0/0x2ce8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446816256 unmapped: 65159168 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4726682 data_alloc: 218103808 data_used: 14082048
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a09b1000/0x0/0x1bfc00000, data 0x2ac35e0/0x2ced000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a09b1000/0x0/0x1bfc00000, data 0x2ac35e0/0x2ced000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f1445eb800 session 0x55f1369b2780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f136a08c00 session 0x55f136868b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.619669914s of 10.684004784s, submitted: 56
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4577296 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448905216 unmapped: 63070208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9a800 session 0x55f136a43e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448937984 unmapped: 63037440 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448946176 unmapped: 63029248 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448946176 unmapped: 63029248 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448946176 unmapped: 63029248 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448946176 unmapped: 63029248 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448970752 unmapped: 63004672 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448970752 unmapped: 63004672 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448978944 unmapped: 62996480 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449003520 unmapped: 62971904 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449028096 unmapped: 62947328 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9f000 session 0x55f1342fe000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9f000 session 0x55f13446e1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f134aeac00 session 0x55f136fb1c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9a800 session 0x55f1369b23c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449028096 unmapped: 62947328 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 67.869766235s of 67.935157776s, submitted: 19
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453812224 unmapped: 58163200 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f136a08c00 session 0x55f135d47e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f1445eb800 session 0x55f135cccd20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f1445eb800 session 0x55f13446f680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f134aeac00 session 0x55f13704f0e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9a800 session 0x55f136fe2780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449347584 unmapped: 62627840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4654140 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449347584 unmapped: 62627840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449347584 unmapped: 62627840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449347584 unmapped: 62627840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fe6000/0x0/0x1bfc00000, data 0x248f5d0/0x26b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449355776 unmapped: 62619648 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449355776 unmapped: 62619648 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4654140 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449355776 unmapped: 62619648 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fe6000/0x0/0x1bfc00000, data 0x248f5d0/0x26b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9f000 session 0x55f133e10b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f136a08c00 session 0x55f135ce8d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fe6000/0x0/0x1bfc00000, data 0x248f5d0/0x26b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f136a08c00 session 0x55f136869680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.845645905s of 12.574873924s, submitted: 28
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4655920 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449511424 unmapped: 62464000 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449511424 unmapped: 62464000 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f134aeac00 session 0x55f1363ba000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449511424 unmapped: 62464000 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449519616 unmapped: 62455808 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4657825 data_alloc: 218103808 data_used: 10240000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449519616 unmapped: 62455808 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449519616 unmapped: 62455808 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449536000 unmapped: 62439424 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449536000 unmapped: 62439424 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4718389 data_alloc: 234881024 data_used: 18714624
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4718389 data_alloc: 234881024 data_used: 18714624
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.832292557s of 19.761795044s, submitted: 6
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4721361 data_alloc: 234881024 data_used: 18788352
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454139904 unmapped: 57835520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455000064 unmapped: 56975360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455458816 unmapped: 56516608 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a09c7000/0x0/0x1bfc00000, data 0x2aa55f3/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,4])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a09c7000/0x0/0x1bfc00000, data 0x2aa55f3/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,10])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455819264 unmapped: 56156160 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455819264 unmapped: 56156160 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4784849 data_alloc: 234881024 data_used: 19849216
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455852032 unmapped: 56123392 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a094a000/0x0/0x1bfc00000, data 0x2b225f3/0x2d4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4790661 data_alloc: 234881024 data_used: 19857408
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.305270195s of 11.759785652s, submitted: 78
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4790677 data_alloc: 234881024 data_used: 19857408
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4790677 data_alloc: 234881024 data_used: 19857408
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4790677 data_alloc: 234881024 data_used: 19857408
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456269824 unmapped: 55705600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.074939728s of 17.010576248s, submitted: 1
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456269824 unmapped: 55705600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456269824 unmapped: 55705600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0936000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4788073 data_alloc: 234881024 data_used: 19865600
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0936000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9a800 session 0x55f135d1a1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9f000 session 0x55f136fb10e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4787321 data_alloc: 234881024 data_used: 19865600
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456286208 unmapped: 55689216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456286208 unmapped: 55689216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f1445eb800 session 0x55f133e034a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.452046394s of 10.840573311s, submitted: 7
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456294400 unmapped: 55681024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 421 handle_osd_map epochs [422,422], i have 421, src has [1,422]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4793311 data_alloc: 234881024 data_used: 19873792
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f134aeac00 session 0x55f1341e5c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f135e9a800 session 0x55f13446e1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f135e9f000 session 0x55f136a43e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 55672832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f136a08c00 session 0x55f13446e960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 55672832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f136a16400 session 0x55f135d39e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 55672832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4794723 data_alloc: 234881024 data_used: 19976192
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.652312279s of 10.148161888s, submitted: 24
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f136a16400 session 0x55f1367ee780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f134aeac00 session 0x55f1342f8780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4794591 data_alloc: 234881024 data_used: 19976192
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456318976 unmapped: 55656448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456318976 unmapped: 55656448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4794591 data_alloc: 234881024 data_used: 19976192
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456318976 unmapped: 55656448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f135e9a800 session 0x55f1369983c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456327168 unmapped: 55648256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,5,3])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f135e9f000 session 0x55f136a421e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457408512 unmapped: 54566912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457408512 unmapped: 54566912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457408512 unmapped: 54566912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 handle_osd_map epochs [423,423], i have 422, src has [1,423]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.739651680s of 10.780681610s, submitted: 22
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0934000/0x0/0x1bfc00000, data 0x3b1d24c/0x2d6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 handle_osd_map epochs [423,423], i have 423, src has [1,423]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 422 handle_osd_map epochs [423,423], i have 423, src has [1,423]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4927971 data_alloc: 234881024 data_used: 19984384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457433088 unmapped: 54542336 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457449472 unmapped: 54525952 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 423 ms_handle_reset con 0x55f136a08c00 session 0x55f1342df0e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 423 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x2b40ef9/0x2d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 423 ms_handle_reset con 0x55f136a08c00 session 0x55f1342f8960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4851611 data_alloc: 234881024 data_used: 19984384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457465856 unmapped: 54509568 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 423 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x2b40ef9/0x2d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457474048 unmapped: 54501376 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 423 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x2b40ef9/0x2d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 423 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x2b40ef9/0x2d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457474048 unmapped: 54501376 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4851743 data_alloc: 234881024 data_used: 19984384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457474048 unmapped: 54501376 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 423 handle_osd_map epochs [424,424], i have 423, src has [1,424]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.620392799s of 12.359407425s, submitted: 31
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a092d000/0x0/0x1bfc00000, data 0x2b42a38/0x2d70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a092d000/0x0/0x1bfc00000, data 0x2b42a38/0x2d70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4854717 data_alloc: 234881024 data_used: 19984384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457490432 unmapped: 54484992 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a092d000/0x0/0x1bfc00000, data 0x2b42a38/0x2d70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4862305 data_alloc: 234881024 data_used: 20303872
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0928000/0x0/0x1bfc00000, data 0x2b42a38/0x2d70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.015429497s of 12.123022079s, submitted: 17
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4863783 data_alloc: 234881024 data_used: 20303872
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a090f000/0x0/0x1bfc00000, data 0x2b47a38/0x2d75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4864263 data_alloc: 234881024 data_used: 20316160
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0921000/0x0/0x1bfc00000, data 0x2b48a38/0x2d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4863295 data_alloc: 234881024 data_used: 20316160
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0921000/0x0/0x1bfc00000, data 0x2b48a38/0x2d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.135271072s of 13.823410034s, submitted: 13
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4863253 data_alloc: 234881024 data_used: 20316160
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a091b000/0x0/0x1bfc00000, data 0x2b4da38/0x2d7b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f136869a40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9a800 session 0x55f136fe34a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457523200 unmapped: 54452224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 60776448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451149824 unmapped: 60825600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451149824 unmapped: 60825600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f1369b8780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451149824 unmapped: 60825600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451149824 unmapped: 60825600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 60776448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 60776448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.746734619s of 36.266899109s, submitted: 52
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 60776448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4617390 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452263936 unmapped: 59711488 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a3e/0x1d6e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,12])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a16400 session 0x55f135d5a780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a16400 session 0x55f133e12000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f133e132c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9a800 session 0x55f1342fe000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f13a8e6000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a08c00 session 0x55f13a8e72c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f134fb4d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1025000/0x0/0x1bfc00000, data 0x244ba77/0x2679000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9a800 session 0x55f133e12d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4695740 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f142f10780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1024000/0x0/0x1bfc00000, data 0x244ba87/0x267a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451297280 unmapped: 60678144 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4754620 data_alloc: 218103808 data_used: 18493440
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1024000/0x0/0x1bfc00000, data 0x244ba87/0x267a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4754620 data_alloc: 218103808 data_used: 18493440
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1024000/0x0/0x1bfc00000, data 0x244ba87/0x267a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1024000/0x0/0x1bfc00000, data 0x244ba87/0x267a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453533696 unmapped: 58441728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453533696 unmapped: 58441728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.415086746s of 19.606620789s, submitted: 38
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0813000/0x0/0x1bfc00000, data 0x2c5ca87/0x2e8b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458563584 unmapped: 53411840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4880304 data_alloc: 234881024 data_used: 20680704
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 53346304 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459448320 unmapped: 52527104 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459448320 unmapped: 52527104 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4886742 data_alloc: 234881024 data_used: 20729856
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a2000/0x0/0x1bfc00000, data 0x32c5a87/0x34f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4881830 data_alloc: 234881024 data_used: 20742144
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4881830 data_alloc: 234881024 data_used: 20742144
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.877428055s of 20.293407440s, submitted: 150
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a16400 session 0x55f135d885a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f1369b21e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4882078 data_alloc: 234881024 data_used: 20750336
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4882078 data_alloc: 234881024 data_used: 20750336
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13602e800 session 0x55f136ab52c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f136999860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.553461075s of 10.557051659s, submitted: 1
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f135cb8d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f133e12d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9a800 session 0x55f1368685a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f135d89e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f136a432c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.660188675s of 24.752002716s, submitted: 30
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13602e800 session 0x55f134fb5c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f136998f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f133e10960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f134fb4780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f1369981e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716395 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f135d5b0e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13602e800 session 0x55f1367ee1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f135ca23c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3e000/0x0/0x1bfc00000, data 0x2633a15/0x2860000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f13446e1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4736313 data_alloc: 218103808 data_used: 12836864
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4791033 data_alloc: 234881024 data_used: 20553728
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4791033 data_alloc: 234881024 data_used: 20553728
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.904300690s of 17.033201218s, submitted: 16
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457269248 unmapped: 65208320 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457302016 unmapped: 65175552 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4902267 data_alloc: 234881024 data_used: 21004288
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c4000/0x0/0x1bfc00000, data 0x33aba25/0x35d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897703 data_alloc: 234881024 data_used: 21004288
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c3000/0x0/0x1bfc00000, data 0x33ada25/0x35db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897703 data_alloc: 234881024 data_used: 21004288
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c3000/0x0/0x1bfc00000, data 0x33ada25/0x35db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c3000/0x0/0x1bfc00000, data 0x33ada25/0x35db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897703 data_alloc: 234881024 data_used: 21004288
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.130870819s of 20.470256805s, submitted: 93
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 23 06:18:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1294221829' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f136fb1e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f1342f9680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c2000/0x0/0x1bfc00000, data 0x33aea25/0x35dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897931 data_alloc: 234881024 data_used: 21004288
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c2000/0x0/0x1bfc00000, data 0x33aea25/0x35dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c2000/0x0/0x1bfc00000, data 0x33aea25/0x35dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f13446fe00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457351168 unmapped: 65126400 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457351168 unmapped: 65126400 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13602e800 session 0x55f133e05c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f1368683c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f135ce9860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f135d39e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 36.675395966s of 36.936244965s, submitted: 21
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f133e10b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a16400 session 0x55f136fe34a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f136fb10e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f133e05e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f136a42000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4658895 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4658895 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457375744 unmapped: 65101824 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457375744 unmapped: 65101824 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f142f105a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457375744 unmapped: 65101824 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.654913902s of 11.721105576s, submitted: 13
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4659027 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670707 data_alloc: 218103808 data_used: 11902976
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457392128 unmapped: 65085440 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457392128 unmapped: 65085440 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457400320 unmapped: 65077248 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457400320 unmapped: 65077248 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457400320 unmapped: 65077248 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670707 data_alloc: 218103808 data_used: 11902976
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457400320 unmapped: 65077248 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.669104576s of 11.673585892s, submitted: 1
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459366400 unmapped: 63111168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458702848 unmapped: 63774720 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14d9000/0x0/0x1bfc00000, data 0x1f97a24/0x21c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458702848 unmapped: 63774720 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458702848 unmapped: 63774720 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710525 data_alloc: 218103808 data_used: 12615680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710861 data_alloc: 218103808 data_used: 12623872
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.951898575s of 13.287283897s, submitted: 36
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136385c00 session 0x55f1342f8960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f139285000 session 0x55f136fe2000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4709437 data_alloc: 218103808 data_used: 12627968
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710049 data_alloc: 218103808 data_used: 12644352
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710049 data_alloc: 218103808 data_used: 12644352
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.8 total, 600.0 interval#012Cumulative writes: 67K writes, 256K keys, 67K commit groups, 1.0 writes per commit group, ingest: 0.24 GB, 0.03 MB/s#012Cumulative WAL: 67K writes, 25K syncs, 2.63 writes per sync, written: 0.24 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1932 writes, 8335 keys, 1932 commit groups, 1.0 writes per commit group, ingest: 7.98 MB, 0.01 MB/s#012Interval WAL: 1932 writes, 726 syncs, 2.66 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.704678535s of 12.729924202s, submitted: 3
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458735616 unmapped: 63741952 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458735616 unmapped: 63741952 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4712093 data_alloc: 218103808 data_used: 13004800
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458735616 unmapped: 63741952 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458743808 unmapped: 63733760 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458743808 unmapped: 63733760 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458743808 unmapped: 63733760 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: mgrc ms_handle_reset ms_handle_reset con 0x55f136b3d400
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/530399322
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/530399322,v1:192.168.122.100:6801/530399322]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: mgrc handle_mgr_configure stats_period=5
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f135d38780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f136a42000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453943296 unmapped: 68534272 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f133e11e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134a73000 session 0x55f1363bad20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136385000 session 0x55f136ab5c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f1335da5a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453943296 unmapped: 68534272 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453943296 unmapped: 68534272 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453976064 unmapped: 68501504 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453976064 unmapped: 68501504 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 55.346656799s of 55.513092041s, submitted: 42
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462700544 unmapped: 59777024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f136361a40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f139285000 session 0x55f1369b94a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1341e4d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f1367ee780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136026400 session 0x55f133e12d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1813000/0x0/0x1bfc00000, data 0x1c5da3e/0x1e8b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455360512 unmapped: 67117056 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455360512 unmapped: 67117056 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4719153 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4719153 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455376896 unmapped: 67100672 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455376896 unmapped: 67100672 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455376896 unmapped: 67100672 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.784545898s of 10.241132736s, submitted: 32
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f136868f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455385088 unmapped: 67092480 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455385088 unmapped: 67092480 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4720542 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455417856 unmapped: 67059712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4783262 data_alloc: 218103808 data_used: 19001344
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4783262 data_alloc: 218103808 data_used: 19001344
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.576991081s of 13.728077888s, submitted: 6
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459513856 unmapped: 62963712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a060c000/0x0/0x1bfc00000, data 0x2e64a77/0x3092000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459685888 unmapped: 62791680 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4887774 data_alloc: 218103808 data_used: 20160512
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0454000/0x0/0x1bfc00000, data 0x301ca77/0x324a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4882466 data_alloc: 218103808 data_used: 20164608
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 61636608 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0426000/0x0/0x1bfc00000, data 0x304aa77/0x3278000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 61636608 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 61636608 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.963484764s of 11.457964897s, submitted: 134
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f139285000 session 0x55f142f11c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1363ba000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460857344 unmapped: 61620224 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f133e052c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0420000/0x0/0x1bfc00000, data 0x3050a77/0x327e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453009408 unmapped: 69468160 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13648c800 session 0x55f1341e50e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f142f10b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f139285000 session 0x55f133e134a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1369b2000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f1367efa40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4674675 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f1341e4960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a17a6000/0x0/0x1bfc00000, data 0x1ccaa38/0x1ef8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4676460 data_alloc: 218103808 data_used: 10256384
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a17a6000/0x0/0x1bfc00000, data 0x1ccaa38/0x1ef8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a17a6000/0x0/0x1bfc00000, data 0x1ccaa38/0x1ef8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4687820 data_alloc: 218103808 data_used: 11870208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a17a6000/0x0/0x1bfc00000, data 0x1ccaa38/0x1ef8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4687820 data_alloc: 218103808 data_used: 11870208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.862291336s of 18.072654724s, submitted: 60
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135eb0000 session 0x55f1367ef680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455794688 unmapped: 66682880 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453099520 unmapped: 69378048 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453148672 unmapped: 69328896 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1076000/0x0/0x1bfc00000, data 0x23faa38/0x2628000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [1,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453165056 unmapped: 69312512 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4754492 data_alloc: 218103808 data_used: 12021760
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453165056 unmapped: 69312512 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453165056 unmapped: 69312512 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453181440 unmapped: 69296128 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453181440 unmapped: 69296128 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1076000/0x0/0x1bfc00000, data 0x23faa38/0x2628000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453197824 unmapped: 69279744 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4750796 data_alloc: 218103808 data_used: 12021760
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453214208 unmapped: 69263360 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.339035988s of 10.036103249s, submitted: 184
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1073000/0x0/0x1bfc00000, data 0x23fda38/0x262b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453238784 unmapped: 69238784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453238784 unmapped: 69238784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f135d1b2c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f1369b2b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453246976 unmapped: 69230592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451944448 unmapped: 70533120 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1367eeb40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1073000/0x0/0x1bfc00000, data 0x23fda38/0x262b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [1,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452001792 unmapped: 70475776 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f134968d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135068000 session 0x55f136fb14a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f1342fed20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f135d25860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 27.541412354s of 31.675952911s, submitted: 215
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453066752 unmapped: 69410816 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,14,5])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f135cf2000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f1369b2780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f14a130000 session 0x55f136fe3680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f14a130000 session 0x55f13a8e7c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f1341e5860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4801478 data_alloc: 218103808 data_used: 10268672
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f133e12000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f133e12b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0861000/0x0/0x1bfc00000, data 0x2c0fa25/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f133e02000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453091328 unmapped: 75210752 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f1367ee960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4803581 data_alloc: 218103808 data_used: 10268672
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453091328 unmapped: 75210752 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453091328 unmapped: 75210752 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453943296 unmapped: 74358784 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921313 data_alloc: 234881024 data_used: 26689536
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921313 data_alloc: 234881024 data_used: 26689536
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 21.172000885s of 21.923461914s, submitted: 27
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462774272 unmapped: 65527808 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00e2000/0x0/0x1bfc00000, data 0x338da35/0x35bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4982751 data_alloc: 234881024 data_used: 26726400
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464543744 unmapped: 63758336 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464560128 unmapped: 63741952 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a006a000/0x0/0x1bfc00000, data 0x3405a35/0x3634000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5001729 data_alloc: 234881024 data_used: 28971008
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5002209 data_alloc: 234881024 data_used: 28983296
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.061190605s of 14.412494659s, submitted: 94
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5001473 data_alloc: 234881024 data_used: 28995584
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b23c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f134968d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5001341 data_alloc: 234881024 data_used: 28995584
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464666624 unmapped: 63635456 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f135cb8780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.726196289s of 10.059652328s, submitted: 3
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464666624 unmapped: 63635456 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f14a130000 session 0x55f1369b9c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f14a130000 session 0x55f1369990e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f135fffc00 session 0x55f1342fe000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f136b41400 session 0x55f1363601e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464666624 unmapped: 63635456 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5005163 data_alloc: 234881024 data_used: 29003776
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464674816 unmapped: 63627264 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f13f3cbc00 session 0x55f13a8e7680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464674816 unmapped: 63627264 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464674816 unmapped: 63627264 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5006255 data_alloc: 234881024 data_used: 29089792
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5006255 data_alloc: 234881024 data_used: 29089792
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f139a81400 session 0x55f136fb1c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f135fffc00 session 0x55f1342f9c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 425 handle_osd_map epochs [426,426], i have 425, src has [1,426]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.723518372s of 16.029331207s, submitted: 2
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464732160 unmapped: 63569920 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5014061 data_alloc: 234881024 data_used: 29474816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464748544 unmapped: 63553536 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464756736 unmapped: 63545344 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5014861 data_alloc: 234881024 data_used: 29548544
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5015341 data_alloc: 234881024 data_used: 29560832
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 426 handle_osd_map epochs [427,427], i have 426, src has [1,427]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 426 handle_osd_map epochs [427,427], i have 427, src has [1,427]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.787799835s of 12.030132294s, submitted: 13
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f13423b680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136868f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5016953 data_alloc: 234881024 data_used: 29556736
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a004d000/0x0/0x1bfc00000, data 0x1b50e6a/0x1d82000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f136fb01e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693598 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693598 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693598 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693598 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1367ef680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f135cb9680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136fe3860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455098368 unmapped: 73203712 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135ce8000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 50.622058868s of 54.810539246s, submitted: 47
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455098368 unmapped: 73203712 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1926000/0x0/0x1bfc00000, data 0x1b45e93/0x1d78000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455098368 unmapped: 73203712 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462733312 unmapped: 65568768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f1363ba5a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f14a130000 session 0x55f13704fc20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455131136 unmapped: 73170944 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f14a130000 session 0x55f133e03680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f13a8e6960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4761140 data_alloc: 218103808 data_used: 10289152
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136a421e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f142f10b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f136fe2960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f13446e1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f135d5a780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4761272 data_alloc: 218103808 data_used: 10289152
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455139328 unmapped: 73162752 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455147520 unmapped: 73154560 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4815672 data_alloc: 218103808 data_used: 17928192
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455229440 unmapped: 73072640 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455229440 unmapped: 73072640 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4815672 data_alloc: 218103808 data_used: 17928192
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455229440 unmapped: 73072640 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455229440 unmapped: 73072640 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.918918610s of 21.329256058s, submitted: 44
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455262208 unmapped: 73039872 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456982528 unmapped: 71319552 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0dce000/0x0/0x1bfc00000, data 0x2695ecc/0x28c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4845574 data_alloc: 218103808 data_used: 18001920
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x272becc/0x295e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a08ae000/0x0/0x1bfc00000, data 0x27adecc/0x29e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859686 data_alloc: 218103808 data_used: 18522112
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.001146317s of 10.467357635s, submitted: 66
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859026 data_alloc: 218103808 data_used: 18526208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f135cf2b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f136ab54a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4857498 data_alloc: 218103808 data_used: 18526208
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f14a130000 session 0x55f142f101e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.138523102s of 11.733281136s, submitted: 10
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4858750 data_alloc: 218103808 data_used: 18653184
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859390 data_alloc: 218103808 data_used: 18714624
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859390 data_alloc: 218103808 data_used: 18714624
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.540402412s of 11.780837059s, submitted: 1
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4864546 data_alloc: 218103808 data_used: 19070976
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4865074 data_alloc: 218103808 data_used: 19066880
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.979717255s of 11.657351494s, submitted: 11
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4865074 data_alloc: 218103808 data_used: 19066880
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1367ef860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f135d46b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f134b963c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f13704e1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13addf400 session 0x55f136a43a40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1341e4960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136fe30e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 40.794708252s of 44.714622498s, submitted: 44
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f1368692c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f136360f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13602bc00 session 0x55f13704f2c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f13704f680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1369b25a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12e9000/0x0/0x1bfc00000, data 0x1d73e83/0x1fa5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451330048 unmapped: 76972032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451330048 unmapped: 76972032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12e9000/0x0/0x1bfc00000, data 0x1d73ebc/0x1fa5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451330048 unmapped: 76972032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135ccc5a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4724367 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f13a8e6b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451330048 unmapped: 76972032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f139384800 session 0x55f13704fa40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f13704e3c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451493888 unmapped: 76808192 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12c4000/0x0/0x1bfc00000, data 0x1d97edf/0x1fca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451493888 unmapped: 76808192 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12c4000/0x0/0x1bfc00000, data 0x1d97edf/0x1fca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4744312 data_alloc: 218103808 data_used: 12484608
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12c4000/0x0/0x1bfc00000, data 0x1d97edf/0x1fca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4744312 data_alloc: 218103808 data_used: 12484608
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.000379562s of 17.736671448s, submitted: 27
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12c4000/0x0/0x1bfc00000, data 0x1d97edf/0x1fca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451510272 unmapped: 76791808 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4759852 data_alloc: 218103808 data_used: 12509184
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451510272 unmapped: 76791808 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453230592 unmapped: 75071488 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453230592 unmapped: 75071488 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c6000/0x0/0x1bfc00000, data 0x1f95edf/0x21c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4764890 data_alloc: 218103808 data_used: 12718080
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c6000/0x0/0x1bfc00000, data 0x1f95edf/0x21c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c6000/0x0/0x1bfc00000, data 0x1f95edf/0x21c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 3.305987597s of 10.130201340s, submitted: 32
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4767300 data_alloc: 218103808 data_used: 12718080
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766548 data_alloc: 218103808 data_used: 12722176
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f133e101e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f1363614a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452575232 unmapped: 75726848 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452575232 unmapped: 75726848 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766476 data_alloc: 218103808 data_used: 12722176
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766476 data_alloc: 218103808 data_used: 12722176
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.317569733s of 18.511228561s, submitted: 5
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f136fb0b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1a800 session 0x55f136360960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766460 data_alloc: 218103808 data_used: 12722176
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f135d5a000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 42.061229706s of 42.817886353s, submitted: 33
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452599808 unmapped: 75702272 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f135d463c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1a800 session 0x55f1342f90e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f1369e2960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f1341e4f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f133e56780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0ebe000/0x0/0x1bfc00000, data 0x219fe5a/0x23d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4764975 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0ebe000/0x0/0x1bfc00000, data 0x219fe5a/0x23d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0ebe000/0x0/0x1bfc00000, data 0x219fe5a/0x23d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f136360d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454172672 unmapped: 74129408 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454180864 unmapped: 74121216 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4772407 data_alloc: 218103808 data_used: 10932224
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.808236122s of 11.594923019s, submitted: 22
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f13a8e6960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1a800 session 0x55f135d5a000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813555 data_alloc: 218103808 data_used: 16748544
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813687 data_alloc: 218103808 data_used: 16748544
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,2])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135d47680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f133e101e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813555 data_alloc: 218103808 data_used: 16748544
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.312306404s of 13.516556740s, submitted: 3
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f1342df4a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b25a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136998780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 73908224 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 73908224 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454410240 unmapped: 73891840 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454410240 unmapped: 73891840 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1a800 session 0x55f135d89c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f136fb10e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f142f10d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f135cf2d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 43.204605103s of 44.053894043s, submitted: 30
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135cf3680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f1368685a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b2960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1367eeb40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f136868d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766156 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0fe5000/0x0/0x1bfc00000, data 0x2076ecc/0x22a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766156 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455499776 unmapped: 72802304 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0fe4000/0x0/0x1bfc00000, data 0x2076ef5/0x22aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455499776 unmapped: 72802304 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459685888 unmapped: 68616192 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465387520 unmapped: 62914560 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475217920 unmapped: 62586880 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a065e000/0x0/0x1bfc00000, data 0x29fcef5/0x2c30000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,9,0,4,7])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4959952 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fd03000/0x0/0x1bfc00000, data 0x3357ef5/0x358b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,9,0,11])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 6.006268978s of 11.423579216s, submitted: 64
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470769664 unmapped: 67035136 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470769664 unmapped: 67035136 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f136361860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135e9c000 session 0x55f142f10b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f13423a000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455737344 unmapped: 82067456 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f133e04d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f133e05c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13602fc00 session 0x55f135d1a960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13adde000 session 0x55f136ab5e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135d46b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f135cb9c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136ab5a40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13602fc00 session 0x55f1369e3680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456048640 unmapped: 81756160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9e000/0x0/0x1bfc00000, data 0x33bcf2e/0x35f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456048640 unmapped: 81756160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4915148 data_alloc: 218103808 data_used: 10293248
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456048640 unmapped: 81756160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456048640 unmapped: 81756160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456056832 unmapped: 81747968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9e000/0x0/0x1bfc00000, data 0x33bcf2e/0x35f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456056832 unmapped: 81747968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9e000/0x0/0x1bfc00000, data 0x33bcf2e/0x35f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457072640 unmapped: 80732160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4947684 data_alloc: 218103808 data_used: 14323712
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 6.965469837s of 10.133234978s, submitted: 25
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135eb0000 session 0x55f136ab43c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a19000 session 0x55f142f114a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9d000/0x0/0x1bfc00000, data 0x33bcf51/0x35f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4947282 data_alloc: 218103808 data_used: 14327808
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9d000/0x0/0x1bfc00000, data 0x33bcf51/0x35f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9d000/0x0/0x1bfc00000, data 0x33bcf51/0x35f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457310208 unmapped: 80494592 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 79273984 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 79273984 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 79273984 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9d000/0x0/0x1bfc00000, data 0x33bcf51/0x35f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5017682 data_alloc: 234881024 data_used: 24203264
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 79273984 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.825445175s of 10.174798012s, submitted: 10
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460357632 unmapped: 77447168 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462028800 unmapped: 75776000 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f69a000/0x0/0x1bfc00000, data 0x39bef51/0x3bf3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f660000/0x0/0x1bfc00000, data 0x39f9f51/0x3c2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5134180 data_alloc: 234881024 data_used: 34156544
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f658000/0x0/0x1bfc00000, data 0x3a01f51/0x3c36000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468754432 unmapped: 69050368 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470532096 unmapped: 67272704 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5172302 data_alloc: 234881024 data_used: 34160640
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f142f10f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f135d463c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 2.601797819s of 10.000380516s, submitted: 136
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468631552 unmapped: 69173248 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468639744 unmapped: 69165056 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f1d5000/0x0/0x1bfc00000, data 0x30a9eef/0x32dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468639744 unmapped: 69165056 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13648c000 session 0x55f1342fe000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468377600 unmapped: 69427200 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469999616 unmapped: 67805184 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5067736 data_alloc: 234881024 data_used: 24604672
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f8a3000/0x0/0x1bfc00000, data 0x37afeef/0x39e3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470089728 unmapped: 67715072 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13602fc00 session 0x55f133e11860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f142f114a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465223680 unmapped: 72581120 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1363ba3c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.8 total, 600.0 interval#012Cumulative writes: 69K writes, 266K keys, 69K commit groups, 1.0 writes per commit group, ingest: 0.25 GB, 0.03 MB/s#012Cumulative WAL: 69K writes, 26K syncs, 2.63 writes per sync, written: 0.25 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2524 writes, 10K keys, 2524 commit groups, 1.0 writes per commit group, ingest: 10.14 MB, 0.02 MB/s#012Interval WAL: 2524 writes, 998 syncs, 2.53 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0920000/0x0/0x1bfc00000, data 0x273aecc/0x296d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4879398 data_alloc: 218103808 data_used: 15605760
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.405301094s of 10.917118073s, submitted: 104
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0920000/0x0/0x1bfc00000, data 0x273aecc/0x296d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f142f11860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135045800 session 0x55f1342fd0e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0920000/0x0/0x1bfc00000, data 0x273aecc/0x296d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4753552 data_alloc: 218103808 data_used: 10395648
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a14f3000/0x0/0x1bfc00000, data 0x1b69e5a/0x1d9a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1369e3680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1369e21e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135045800 session 0x55f13446e1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b8f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f1342f8960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 48.122467041s of 50.524135590s, submitted: 29
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f13704e1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135045800 session 0x55f136fe3c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b94a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f142f11c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f136868000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4800406 data_alloc: 218103808 data_used: 10285056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f2e000/0x0/0x1bfc00000, data 0x212fe5a/0x2360000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13648c000 session 0x55f135cb8f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4805884 data_alloc: 218103808 data_used: 10289152
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4848444 data_alloc: 218103808 data_used: 16191488
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4848444 data_alloc: 218103808 data_used: 16191488
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.138151169s of 19.535371780s, submitted: 27
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468279296 unmapped: 69525504 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468279296 unmapped: 69525504 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468115456 unmapped: 69689344 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4979316 data_alloc: 218103808 data_used: 17743872
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edf3000/0x0/0x1bfc00000, data 0x30c9e7d/0x32fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4981584 data_alloc: 218103808 data_used: 17743872
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcf000/0x0/0x1bfc00000, data 0x30ede7d/0x331f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4977592 data_alloc: 218103808 data_used: 17743872
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcf000/0x0/0x1bfc00000, data 0x30ede7d/0x331f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4977592 data_alloc: 218103808 data_used: 17743872
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.841932297s of 19.731657028s, submitted: 129
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcf000/0x0/0x1bfc00000, data 0x30ede7d/0x331f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4977284 data_alloc: 218103808 data_used: 17743872
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcc000/0x0/0x1bfc00000, data 0x30f0e7d/0x3322000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468336640 unmapped: 69468160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468336640 unmapped: 69468160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcc000/0x0/0x1bfc00000, data 0x30f0e7d/0x3322000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,2])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edc9000/0x0/0x1bfc00000, data 0x30f2e7d/0x3324000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468336640 unmapped: 69468160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468336640 unmapped: 69468160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edb8000/0x0/0x1bfc00000, data 0x3103e7d/0x3335000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4979836 data_alloc: 218103808 data_used: 17764352
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135045800 session 0x55f134fb5c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.664420128s of 11.280209541s, submitted: 18
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369e2780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1369b3c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edaa000/0x0/0x1bfc00000, data 0x3112e7d/0x3344000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4979704 data_alloc: 218103808 data_used: 17764352
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f136a1f400 session 0x55f1363614a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f135eb0800 session 0x55f136fe3680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f136a19000 session 0x55f1367ee3c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f135045800 session 0x55f135d5a960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484261888 unmapped: 53542912 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f135fffc00 session 0x55f13a8e7c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 428 heartbeat osd_stat(store_statfs(0x19df53000/0x0/0x1bfc00000, data 0x3f66b48/0x419b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 428 handle_osd_map epochs [429,429], i have 428, src has [1,429]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 428 handle_osd_map epochs [429,429], i have 429, src has [1,429]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 429 heartbeat osd_stat(store_statfs(0x19df53000/0x0/0x1bfc00000, data 0x3f66b48/0x419b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484261888 unmapped: 53542912 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 429 handle_osd_map epochs [430,430], i have 429, src has [1,430]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f136027000 session 0x55f133e034a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484261888 unmapped: 53542912 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484261888 unmapped: 53542912 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f136a1f400 session 0x55f136ab4780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f136a1f400 session 0x55f133e11e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f135045800 session 0x55f1369b8d20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484302848 unmapped: 53501952 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5174173 data_alloc: 234881024 data_used: 33067008
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484302848 unmapped: 53501952 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 478642176 unmapped: 59162624 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 430 heartbeat osd_stat(store_statfs(0x19df4a000/0x0/0x1bfc00000, data 0x3f6a4cc/0x41a2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [0,0,0,0,0,0,1,8])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.816694260s of 10.062507629s, submitted: 71
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f135fffc00 session 0x55f1368692c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468082688 unmapped: 69722112 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468082688 unmapped: 69722112 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468082688 unmapped: 69722112 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4892239 data_alloc: 218103808 data_used: 10305536
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468082688 unmapped: 69722112 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 430 handle_osd_map epochs [431,431], i have 430, src has [1,431]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19f4f5000/0x0/0x1bfc00000, data 0x299d4a9/0x2bd4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468090880 unmapped: 69713920 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468156416 unmapped: 69648384 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4895357 data_alloc: 218103808 data_used: 10313728
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19f517000/0x0/0x1bfc00000, data 0x299efe8/0x2bd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4895357 data_alloc: 218103808 data_used: 10313728
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19f517000/0x0/0x1bfc00000, data 0x299efe8/0x2bd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19f517000/0x0/0x1bfc00000, data 0x299efe8/0x2bd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136027000 session 0x55f13446e1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a19000 session 0x55f1369e21e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a19000 session 0x55f1369e3680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f135045800 session 0x55f142f11860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.625346184s of 17.649908066s, submitted: 301
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136027000 session 0x55f1369b8780
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f135fffc00 session 0x55f1363ba3c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a1f400 session 0x55f142f114a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a1f400 session 0x55f133e11860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468197376 unmapped: 69607424 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f135045800 session 0x55f142f10f00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f135fffc00 session 0x55f136a42b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897299 data_alloc: 218103808 data_used: 10317824
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136027000 session 0x55f1342ffc20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475201536 unmapped: 72851456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475201536 unmapped: 72851456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19cbf0000/0x0/0x1bfc00000, data 0x412504a/0x435e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475201536 unmapped: 72851456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a19000 session 0x55f1342f8960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475209728 unmapped: 72843264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 431 handle_osd_map epochs [431,432], i have 431, src has [1,432]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470548480 unmapped: 77504512 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 432 ms_handle_reset con 0x55f136027000 session 0x55f135d46000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5002273 data_alloc: 218103808 data_used: 13885440
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473186304 unmapped: 74866688 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 432 heartbeat osd_stat(store_statfs(0x19d6c9000/0x0/0x1bfc00000, data 0x32d4c46/0x350d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473186304 unmapped: 74866688 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 432 heartbeat osd_stat(store_statfs(0x19d6c9000/0x0/0x1bfc00000, data 0x32d4c46/0x350d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 432 heartbeat osd_stat(store_statfs(0x19d6c9000/0x0/0x1bfc00000, data 0x32d4c46/0x350d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070433 data_alloc: 234881024 data_used: 23470080
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 432 handle_osd_map epochs [432,433], i have 432, src has [1,433]
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.400728226s of 10.873732567s, submitted: 89
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5073215 data_alloc: 234881024 data_used: 23470080
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3d000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473251840 unmapped: 74801152 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5096647 data_alloc: 234881024 data_used: 25784320
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5096647 data_alloc: 234881024 data_used: 25784320
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5096647 data_alloc: 234881024 data_used: 25784320
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.438421249s of 20.862281799s, submitted: 29
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5095415 data_alloc: 234881024 data_used: 25780224
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473423872 unmapped: 74629120 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473554944 unmapped: 74498048 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f135045800 session 0x55f136869860
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f135fffc00 session 0x55f133e57a40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466214912 unmapped: 81838080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136a1f400 session 0x55f133e045a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4804505 data_alloc: 218103808 data_used: 10326016
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c5000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c5000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4804505 data_alloc: 218103808 data_used: 10326016
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c5000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.781463623s of 16.977258682s, submitted: 61
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1445e8400 session 0x55f1369e25a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1445e8400 session 0x55f1369b9e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c5000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4805449 data_alloc: 218103808 data_used: 14127104
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f138f03c00 session 0x55f135d5a960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13a125400 session 0x55f13704e1e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13b9b5c00 session 0x55f136a430e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19efe8000/0x0/0x1bfc00000, data 0x1d2e700/0x1f66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468705280 unmapped: 79347712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4867713 data_alloc: 218103808 data_used: 14127104
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468705280 unmapped: 79347712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 79339520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 79339520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 79339520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 79339520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4867713 data_alloc: 218103808 data_used: 14127104
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136b3cc00 session 0x55f136361e00
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f138f03c00 session 0x55f136ab5680
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4867713 data_alloc: 218103808 data_used: 14127104
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13a125400 session 0x55f133e04960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13b9b5c00 session 0x55f134fb43c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468828160 unmapped: 79224832 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4907393 data_alloc: 234881024 data_used: 19374080
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4907393 data_alloc: 234881024 data_used: 19374080
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469065728 unmapped: 78987264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469065728 unmapped: 78987264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469065728 unmapped: 78987264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469065728 unmapped: 78987264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.486818314s of 31.777320862s, submitted: 28
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4948785 data_alloc: 234881024 data_used: 19398656
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471539712 unmapped: 76513280 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472588288 unmapped: 75464704 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2d0000/0x0/0x1bfc00000, data 0x2a46700/0x2c7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2d0000/0x0/0x1bfc00000, data 0x2a46700/0x2c7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4972213 data_alloc: 234881024 data_used: 19816448
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2d0000/0x0/0x1bfc00000, data 0x2a46700/0x2c7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4971817 data_alloc: 234881024 data_used: 19816448
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2af000/0x0/0x1bfc00000, data 0x2a67700/0x2c9f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4971817 data_alloc: 234881024 data_used: 19816448
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2af000/0x0/0x1bfc00000, data 0x2a67700/0x2c9f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.674192429s of 20.244110107s, submitted: 61
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4972033 data_alloc: 234881024 data_used: 19816448
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e29c000/0x0/0x1bfc00000, data 0x2a7a700/0x2cb2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472809472 unmapped: 75243520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4974401 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1445e8400 session 0x55f13a8e6b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1365ce800 session 0x55f1342f83c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4974401 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4974401 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4974401 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.431100845s of 21.130962372s, submitted: 10
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f138f03c00 session 0x55f135d1a960
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13a125400 session 0x55f136fe23c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13b9b5c00 session 0x55f135d392c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1445e8400 session 0x55f135d46b40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 33.194938660s of 33.254158020s, submitted: 4
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472973312 unmapped: 75079680 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4976261 data_alloc: 234881024 data_used: 19845120
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f135e9bc00 session 0x55f1342fed20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472973312 unmapped: 75079680 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472973312 unmapped: 75079680 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472973312 unmapped: 75079680 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4978341 data_alloc: 234881024 data_used: 20037632
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4978341 data_alloc: 234881024 data_used: 20037632
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472989696 unmapped: 75063296 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.098130226s of 15.642007828s, submitted: 1
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4989413 data_alloc: 234881024 data_used: 20328448
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4993893 data_alloc: 234881024 data_used: 20619264
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f134aeac00 session 0x55f1369b85a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136a0c800 session 0x55f1369e3c20
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136a0a800 session 0x55f1368683c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4993893 data_alloc: 234881024 data_used: 20619264
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.028177261s of 15.051651955s, submitted: 2
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4993893 data_alloc: 234881024 data_used: 20619264
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4993893 data_alloc: 234881024 data_used: 20619264
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f135e9bc00 session 0x55f1367efa40
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4998053 data_alloc: 234881024 data_used: 21549056
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f138f03c00 session 0x55f135ccc5a0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.078834534s of 11.086947441s, submitted: 2
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136a0c800 session 0x55f135d46000
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4991385 data_alloc: 234881024 data_used: 21438464
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1454e9000 session 0x55f134fb43c0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472367104 unmapped: 75685888 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472367104 unmapped: 75685888 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472367104 unmapped: 75685888 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472367104 unmapped: 75685888 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 75677696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 75677696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 75677696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472399872 unmapped: 75653120 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472399872 unmapped: 75653120 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472399872 unmapped: 75653120 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472408064 unmapped: 75644928 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472416256 unmapped: 75636736 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472416256 unmapped: 75636736 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472416256 unmapped: 75636736 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472416256 unmapped: 75636736 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472424448 unmapped: 75628544 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472424448 unmapped: 75628544 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472424448 unmapped: 75628544 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472449024 unmapped: 75603968 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472449024 unmapped: 75603968 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f139285000 session 0x55f1367ef0e0
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472449024 unmapped: 75603968 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472449024 unmapped: 75603968 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472457216 unmapped: 75595776 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472457216 unmapped: 75595776 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472457216 unmapped: 75595776 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472457216 unmapped: 75595776 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472465408 unmapped: 75587584 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472465408 unmapped: 75587584 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472481792 unmapped: 75571200 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472481792 unmapped: 75571200 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472481792 unmapped: 75571200 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472514560 unmapped: 75538432 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472514560 unmapped: 75538432 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: do_command 'config diff' '{prefix=config diff}'
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: do_command 'config show' '{prefix=config show}'
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472309760 unmapped: 75743232 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472006656 unmapped: 76046336 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:18:43 np0005593233 ceph-osd[78880]: do_command 'log dump' '{prefix=log dump}'
Jan 23 06:18:43 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 06:18:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:43.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:43.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 23 06:18:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3149341960' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 06:18:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 23 06:18:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2615551441' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 06:18:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 23 06:18:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4082998767' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 06:18:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:45.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:45.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1311540754' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 06:18:46 np0005593233 podman[320103]: 2026-01-23 11:18:46.098301188 +0000 UTC m=+0.095812331 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3920445850' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 06:18:46 np0005593233 nova_compute[222017]: 2026-01-23 11:18:46.330 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/543199158' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/31574832' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 23 06:18:46 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/627838575' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 06:18:47 np0005593233 nova_compute[222017]: 2026-01-23 11:18:47.083 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:47 np0005593233 nova_compute[222017]: 2026-01-23 11:18:47.084 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:47 np0005593233 nova_compute[222017]: 2026-01-23 11:18:47.084 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:18:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 23 06:18:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2576712278' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 06:18:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 23 06:18:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3135222063' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 06:18:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:47.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 23 06:18:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3441331062' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 06:18:47 np0005593233 nova_compute[222017]: 2026-01-23 11:18:47.894 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:47.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Jan 23 06:18:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/699751526' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 06:18:48 np0005593233 systemd[1]: Starting Hostname Service...
Jan 23 06:18:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 23 06:18:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1926464222' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 06:18:48 np0005593233 systemd[1]: Started Hostname Service.
Jan 23 06:18:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 23 06:18:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4117489907' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 06:18:48 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 23 06:18:48 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3741411182' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 06:18:49 np0005593233 nova_compute[222017]: 2026-01-23 11:18:49.538 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:49.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:49 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Jan 23 06:18:49 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1816850369' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 06:18:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:18:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:49.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/284449848' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 06:18:51 np0005593233 nova_compute[222017]: 2026-01-23 11:18:51.332 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2280594174' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 06:18:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:51.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 06:18:51 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 06:18:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:51.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 23 06:18:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/298886846' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 06:18:52 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 23 06:18:52 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/823703449' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 06:18:52 np0005593233 nova_compute[222017]: 2026-01-23 11:18:52.934 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 06:18:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 06:18:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:53.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 06:18:53 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 06:18:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:53.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:54 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Jan 23 06:18:54 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/625422393' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 06:18:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Jan 23 06:18:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1074979259' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 06:18:55 np0005593233 nova_compute[222017]: 2026-01-23 11:18:55.381 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:55 np0005593233 nova_compute[222017]: 2026-01-23 11:18:55.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:55.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:55 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0) v1
Jan 23 06:18:55 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/172395570' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 06:18:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:18:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:55.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:18:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0) v1
Jan 23 06:18:56 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4112691072' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 06:18:56 np0005593233 nova_compute[222017]: 2026-01-23 11:18:56.334 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:56 np0005593233 nova_compute[222017]: 2026-01-23 11:18:56.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:56 np0005593233 nova_compute[222017]: 2026-01-23 11:18:56.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:18:56 np0005593233 nova_compute[222017]: 2026-01-23 11:18:56.385 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:18:56 np0005593233 nova_compute[222017]: 2026-01-23 11:18:56.414 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:18:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0) v1
Jan 23 06:18:56 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3010164390' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 06:18:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0) v1
Jan 23 06:18:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1957223201' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 06:18:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:57.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:57 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0) v1
Jan 23 06:18:57 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1462689973' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 06:18:57 np0005593233 nova_compute[222017]: 2026-01-23 11:18:57.935 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:57.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:59 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0) v1
Jan 23 06:18:59 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/572795903' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 06:18:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:59.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:18:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:59.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:00 np0005593233 podman[321701]: 2026-01-23 11:19:00.236885693 +0000 UTC m=+0.233598634 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 23 06:19:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0) v1
Jan 23 06:19:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1629422729' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 06:19:01 np0005593233 nova_compute[222017]: 2026-01-23 11:19:01.385 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:01.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:01.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0) v1
Jan 23 06:19:01 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3388027606' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 06:19:02 np0005593233 nova_compute[222017]: 2026-01-23 11:19:02.938 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0) v1
Jan 23 06:19:03 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/813426340' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 06:19:03 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:19:03 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1327028268' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:19:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:03.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:03 np0005593233 ovs-appctl[322800]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 06:19:03 np0005593233 ovs-appctl[322806]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 06:19:03 np0005593233 ovs-appctl[322816]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 06:19:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:03.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:05.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:05.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 23 06:19:06 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/288951382' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 06:19:06 np0005593233 nova_compute[222017]: 2026-01-23 11:19:06.388 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0) v1
Jan 23 06:19:06 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2023484631' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 06:19:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0) v1
Jan 23 06:19:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1868003869' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 06:19:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:07.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:07.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 23 06:19:07 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3650368283' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 06:19:07 np0005593233 nova_compute[222017]: 2026-01-23 11:19:07.997 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0) v1
Jan 23 06:19:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1446668470' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 06:19:08 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0) v1
Jan 23 06:19:08 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/330936122' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 06:19:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:09.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:09 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0) v1
Jan 23 06:19:09 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/175644701' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 06:19:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:09.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:10 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0) v1
Jan 23 06:19:10 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3142423481' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 06:19:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0) v1
Jan 23 06:19:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2191510219' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 06:19:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:11 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:11 np0005593233 nova_compute[222017]: 2026-01-23 11:19:11.420 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:11.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0) v1
Jan 23 06:19:11 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3645585865' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 06:19:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:11.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 06:19:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 06:19:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:19:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:12 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:19:13 np0005593233 nova_compute[222017]: 2026-01-23 11:19:13.000 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0) v1
Jan 23 06:19:13 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1982856786' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 06:19:13 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0) v1
Jan 23 06:19:13 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2792021158' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 06:19:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:13.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:13.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:14 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 23 06:19:14 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2489116697' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 06:19:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0) v1
Jan 23 06:19:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1635264569' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 06:19:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:15.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:15 np0005593233 virtqemud[221325]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 06:19:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:15.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Jan 23 06:19:16 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/964548394' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 06:19:16 np0005593233 podman[324684]: 2026-01-23 11:19:16.280081209 +0000 UTC m=+0.100360879 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 06:19:16 np0005593233 nova_compute[222017]: 2026-01-23 11:19:16.422 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:16 np0005593233 systemd[1]: Starting Time & Date Service...
Jan 23 06:19:16 np0005593233 systemd[1]: Started Time & Date Service.
Jan 23 06:19:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:17.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:17.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:18 np0005593233 nova_compute[222017]: 2026-01-23 11:19:18.002 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:19.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:19.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:20 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:21 np0005593233 nova_compute[222017]: 2026-01-23 11:19:21.513 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:21.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:19:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:21.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:19:23 np0005593233 nova_compute[222017]: 2026-01-23 11:19:23.005 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:23.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:23.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:25.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:25.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:26 np0005593233 nova_compute[222017]: 2026-01-23 11:19:26.516 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:27.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:27.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:28 np0005593233 nova_compute[222017]: 2026-01-23 11:19:28.007 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:29.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:30.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:31 np0005593233 podman[324929]: 2026-01-23 11:19:31.143646743 +0000 UTC m=+0.140179791 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 06:19:31 np0005593233 nova_compute[222017]: 2026-01-23 11:19:31.520 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:31.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:19:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:32.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:19:32 np0005593233 nova_compute[222017]: 2026-01-23 11:19:32.409 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:33 np0005593233 nova_compute[222017]: 2026-01-23 11:19:33.010 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:33.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:34.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:35.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:36.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:36 np0005593233 nova_compute[222017]: 2026-01-23 11:19:36.550 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:37.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:38.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:38 np0005593233 nova_compute[222017]: 2026-01-23 11:19:38.061 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:39 np0005593233 nova_compute[222017]: 2026-01-23 11:19:39.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:39.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:40.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:40 np0005593233 nova_compute[222017]: 2026-01-23 11:19:40.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:41 np0005593233 nova_compute[222017]: 2026-01-23 11:19:41.604 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:41.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:19:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:42.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:19:42 np0005593233 nova_compute[222017]: 2026-01-23 11:19:42.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:19:42.745 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:19:42.746 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:19:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:19:42.746 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:19:43 np0005593233 nova_compute[222017]: 2026-01-23 11:19:43.064 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:43.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:44.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:44 np0005593233 nova_compute[222017]: 2026-01-23 11:19:44.899 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:19:44 np0005593233 nova_compute[222017]: 2026-01-23 11:19:44.900 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:19:44 np0005593233 nova_compute[222017]: 2026-01-23 11:19:44.900 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:19:44 np0005593233 nova_compute[222017]: 2026-01-23 11:19:44.901 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:19:44 np0005593233 nova_compute[222017]: 2026-01-23 11:19:44.902 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:19:45 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:19:45 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/654807366' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:19:45 np0005593233 nova_compute[222017]: 2026-01-23 11:19:45.368 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:19:45 np0005593233 nova_compute[222017]: 2026-01-23 11:19:45.589 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:19:45 np0005593233 nova_compute[222017]: 2026-01-23 11:19:45.590 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4179MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:19:45 np0005593233 nova_compute[222017]: 2026-01-23 11:19:45.591 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:19:45 np0005593233 nova_compute[222017]: 2026-01-23 11:19:45.591 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:19:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:45.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:46.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:46 np0005593233 nova_compute[222017]: 2026-01-23 11:19:46.657 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:46 np0005593233 podman[324977]: 2026-01-23 11:19:46.701027888 +0000 UTC m=+0.109295921 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 06:19:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:46 np0005593233 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 06:19:46 np0005593233 nova_compute[222017]: 2026-01-23 11:19:46.951 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:19:46 np0005593233 nova_compute[222017]: 2026-01-23 11:19:46.951 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:19:46 np0005593233 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 06:19:47 np0005593233 nova_compute[222017]: 2026-01-23 11:19:47.045 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:19:47 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:19:47 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1561667550' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:19:47 np0005593233 nova_compute[222017]: 2026-01-23 11:19:47.603 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.558s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:19:47 np0005593233 nova_compute[222017]: 2026-01-23 11:19:47.615 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:19:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:47.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:48.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:48 np0005593233 nova_compute[222017]: 2026-01-23 11:19:48.067 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:49 np0005593233 nova_compute[222017]: 2026-01-23 11:19:49.077 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:19:49 np0005593233 nova_compute[222017]: 2026-01-23 11:19:49.079 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:19:49 np0005593233 nova_compute[222017]: 2026-01-23 11:19:49.080 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:19:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:49.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:50.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:51 np0005593233 nova_compute[222017]: 2026-01-23 11:19:51.080 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:51 np0005593233 nova_compute[222017]: 2026-01-23 11:19:51.081 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:51 np0005593233 nova_compute[222017]: 2026-01-23 11:19:51.081 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:51 np0005593233 nova_compute[222017]: 2026-01-23 11:19:51.082 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:19:51 np0005593233 nova_compute[222017]: 2026-01-23 11:19:51.660 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:51.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:52.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:53 np0005593233 nova_compute[222017]: 2026-01-23 11:19:53.071 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:53.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:54.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:55 np0005593233 nova_compute[222017]: 2026-01-23 11:19:55.380 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:19:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:55.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:19:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:56.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:56 np0005593233 nova_compute[222017]: 2026-01-23 11:19:56.660 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:57 np0005593233 nova_compute[222017]: 2026-01-23 11:19:57.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:57.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:58.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:58 np0005593233 nova_compute[222017]: 2026-01-23 11:19:58.081 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:58 np0005593233 nova_compute[222017]: 2026-01-23 11:19:58.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:58 np0005593233 nova_compute[222017]: 2026-01-23 11:19:58.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:19:58 np0005593233 nova_compute[222017]: 2026-01-23 11:19:58.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:19:58 np0005593233 nova_compute[222017]: 2026-01-23 11:19:58.523 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:19:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:19:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:59.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:00.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:00 np0005593233 ceph-mon[81574]: overall HEALTH_OK
Jan 23 06:20:01 np0005593233 nova_compute[222017]: 2026-01-23 11:20:01.700 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:01.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:01 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:02.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:02 np0005593233 podman[325024]: 2026-01-23 11:20:02.086110798 +0000 UTC m=+0.096936703 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 06:20:03 np0005593233 nova_compute[222017]: 2026-01-23 11:20:03.085 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:03.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:04.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:05.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:05 np0005593233 systemd[1]: session-63.scope: Deactivated successfully.
Jan 23 06:20:05 np0005593233 systemd[1]: session-63.scope: Consumed 3min 3.811s CPU time, 1.0G memory peak, read 501.1M from disk, written 315.3M to disk.
Jan 23 06:20:05 np0005593233 systemd-logind[804]: Session 63 logged out. Waiting for processes to exit.
Jan 23 06:20:05 np0005593233 systemd-logind[804]: Removed session 63.
Jan 23 06:20:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:06.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:06 np0005593233 systemd-logind[804]: New session 64 of user zuul.
Jan 23 06:20:06 np0005593233 systemd[1]: Started Session 64 of User zuul.
Jan 23 06:20:06 np0005593233 nova_compute[222017]: 2026-01-23 11:20:06.739 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:06 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:06 np0005593233 systemd[1]: session-64.scope: Deactivated successfully.
Jan 23 06:20:06 np0005593233 systemd-logind[804]: Session 64 logged out. Waiting for processes to exit.
Jan 23 06:20:06 np0005593233 systemd-logind[804]: Removed session 64.
Jan 23 06:20:07 np0005593233 systemd-logind[804]: New session 65 of user zuul.
Jan 23 06:20:07 np0005593233 systemd[1]: Started Session 65 of User zuul.
Jan 23 06:20:07 np0005593233 systemd[1]: session-65.scope: Deactivated successfully.
Jan 23 06:20:07 np0005593233 systemd-logind[804]: Session 65 logged out. Waiting for processes to exit.
Jan 23 06:20:07 np0005593233 systemd-logind[804]: Removed session 65.
Jan 23 06:20:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:07.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:08.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:08 np0005593233 nova_compute[222017]: 2026-01-23 11:20:08.088 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:09.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:10.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:11 np0005593233 nova_compute[222017]: 2026-01-23 11:20:11.742 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:11 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:11.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:12.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:13 np0005593233 nova_compute[222017]: 2026-01-23 11:20:13.091 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:13.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:14.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:15.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:16.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:16 np0005593233 nova_compute[222017]: 2026-01-23 11:20:16.793 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:16 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:17 np0005593233 podman[325108]: 2026-01-23 11:20:17.056995436 +0000 UTC m=+0.069324315 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:20:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:17.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:18.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:18 np0005593233 nova_compute[222017]: 2026-01-23 11:20:18.096 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:19.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:20.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 06:20:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:20:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:20:21 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:20:21 np0005593233 nova_compute[222017]: 2026-01-23 11:20:21.795 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:21 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:21.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:22.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #232. Immutable memtables: 0.
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:22.774129) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:856] [default] [JOB 149] Flushing memtable with next log file: 232
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167222774195, "job": 149, "event": "flush_started", "num_memtables": 1, "num_entries": 2229, "num_deletes": 506, "total_data_size": 4229522, "memory_usage": 4299760, "flush_reason": "Manual Compaction"}
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:885] [default] [JOB 149] Level-0 flush table #233: started
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167222806276, "cf_name": "default", "job": 149, "event": "table_file_creation", "file_number": 233, "file_size": 2780748, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 109085, "largest_seqno": 111309, "table_properties": {"data_size": 2770783, "index_size": 5498, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 28941, "raw_average_key_size": 21, "raw_value_size": 2747336, "raw_average_value_size": 2050, "num_data_blocks": 234, "num_entries": 1340, "num_filter_entries": 1340, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769167087, "oldest_key_time": 1769167087, "file_creation_time": 1769167222, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 233, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 149] Flush lasted 32226 microseconds, and 8150 cpu microseconds.
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:22.806353) [db/flush_job.cc:967] [default] [JOB 149] Level-0 flush table #233: 2780748 bytes OK
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:22.806377) [db/memtable_list.cc:519] [default] Level-0 commit table #233 started
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:22.976596) [db/memtable_list.cc:722] [default] Level-0 commit table #233: memtable #1 done
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:22.976666) EVENT_LOG_v1 {"time_micros": 1769167222976653, "job": 149, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:22.976695) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 149] Try to delete WAL files size 4217433, prev total WAL file size 4218073, number of live WAL files 2.
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000229.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:22.982101) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353233' seq:72057594037927935, type:22 .. '6C6F676D0034373734' seq:0, type:0; will stop at (end)
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 150] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 149 Base level 0, inputs: [233(2715KB)], [231(13MB)]
Jan 23 06:20:22 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167222982203, "job": 150, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [233], "files_L6": [231], "score": -1, "input_data_size": 16479963, "oldest_snapshot_seqno": -1}
Jan 23 06:20:23 np0005593233 nova_compute[222017]: 2026-01-23 11:20:23.099 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 150] Generated table #234: 12541 keys, 14393987 bytes, temperature: kUnknown
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167223179301, "cf_name": "default", "job": 150, "event": "table_file_creation", "file_number": 234, "file_size": 14393987, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14315618, "index_size": 45936, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31365, "raw_key_size": 336538, "raw_average_key_size": 26, "raw_value_size": 14098938, "raw_average_value_size": 1124, "num_data_blocks": 1708, "num_entries": 12541, "num_filter_entries": 12541, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158916, "oldest_key_time": 0, "file_creation_time": 1769167222, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d3e46583-96e0-4f5f-ac42-7b628f2a09c0", "db_session_id": "KQET2Q3DBZ4VI5YCO0ZU", "orig_file_number": 234, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:23.179838) [db/compaction/compaction_job.cc:1663] [default] [JOB 150] Compacted 1@0 + 1@6 files to L6 => 14393987 bytes
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:23.207855) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 83.5 rd, 73.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 13.1 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(11.1) write-amplify(5.2) OK, records in: 13569, records dropped: 1028 output_compression: NoCompression
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:23.207958) EVENT_LOG_v1 {"time_micros": 1769167223207893, "job": 150, "event": "compaction_finished", "compaction_time_micros": 197252, "compaction_time_cpu_micros": 66948, "output_level": 6, "num_output_files": 1, "total_output_size": 14393987, "num_input_records": 13569, "num_output_records": 12541, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000233.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167223208966, "job": 150, "event": "table_file_deletion", "file_number": 233}
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000231.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167223212311, "job": 150, "event": "table_file_deletion", "file_number": 231}
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:22.981972) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:23.212456) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:23.212466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:23.212468) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:23.212470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:23 np0005593233 ceph-mon[81574]: rocksdb: (Original Log Time 2026/01/23-11:20:23.212471) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:23.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:24.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:25.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:26.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:26 np0005593233 nova_compute[222017]: 2026-01-23 11:20:26.797 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:27.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:28.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:28 np0005593233 nova_compute[222017]: 2026-01-23 11:20:28.154 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:20:28 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:20:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:29.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:20:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:30.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:20:31 np0005593233 nova_compute[222017]: 2026-01-23 11:20:31.842 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:31.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:32.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:33 np0005593233 podman[325308]: 2026-01-23 11:20:33.120174624 +0000 UTC m=+0.116803182 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 06:20:33 np0005593233 nova_compute[222017]: 2026-01-23 11:20:33.156 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:33 np0005593233 nova_compute[222017]: 2026-01-23 11:20:33.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:33.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:34.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:35.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:36.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:36 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:36 np0005593233 nova_compute[222017]: 2026-01-23 11:20:36.846 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:37.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:38.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:38 np0005593233 nova_compute[222017]: 2026-01-23 11:20:38.184 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:39.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:40.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:40 np0005593233 nova_compute[222017]: 2026-01-23 11:20:40.440 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:41 np0005593233 nova_compute[222017]: 2026-01-23 11:20:41.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:41 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:41 np0005593233 nova_compute[222017]: 2026-01-23 11:20:41.881 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:41 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:41 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:41 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:41.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:42 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:42 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:42 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:42.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:42 np0005593233 nova_compute[222017]: 2026-01-23 11:20:42.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:42 np0005593233 nova_compute[222017]: 2026-01-23 11:20:42.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 06:20:42 np0005593233 nova_compute[222017]: 2026-01-23 11:20:42.409 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 06:20:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:20:42.747 140224 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:20:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:20:42.747 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:20:42 np0005593233 ovn_metadata_agent[140219]: 2026-01-23 11:20:42.747 140224 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:20:43 np0005593233 nova_compute[222017]: 2026-01-23 11:20:43.211 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:43 np0005593233 nova_compute[222017]: 2026-01-23 11:20:43.408 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:43 np0005593233 nova_compute[222017]: 2026-01-23 11:20:43.476 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:20:43 np0005593233 nova_compute[222017]: 2026-01-23 11:20:43.477 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:20:43 np0005593233 nova_compute[222017]: 2026-01-23 11:20:43.477 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:20:43 np0005593233 nova_compute[222017]: 2026-01-23 11:20:43.477 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:20:43 np0005593233 nova_compute[222017]: 2026-01-23 11:20:43.478 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:20:43 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:43 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:43 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:43.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:43 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:20:43 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1110509380' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:20:43 np0005593233 nova_compute[222017]: 2026-01-23 11:20:43.989 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:20:44 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:44 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:20:44 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:44.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.150 222021 WARNING nova.virt.libvirt.driver [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.153 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4208MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.154 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.154 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.217 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.218 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.234 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:20:44 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:20:44 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3676425090' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.703 222021 DEBUG oslo_concurrency.processutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.711 222021 DEBUG nova.compute.provider_tree [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 929812a2-38ca-4ee7-9f24-090d633cb42b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.729 222021 DEBUG nova.scheduler.client.report [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Inventory has not changed for provider 929812a2-38ca-4ee7-9f24-090d633cb42b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.730 222021 DEBUG nova.compute.resource_tracker [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.731 222021 DEBUG oslo_concurrency.lockutils [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.731 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:44 np0005593233 nova_compute[222017]: 2026-01-23 11:20:44.731 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 06:20:45 np0005593233 nova_compute[222017]: 2026-01-23 11:20:45.720 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:45 np0005593233 nova_compute[222017]: 2026-01-23 11:20:45.721 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:20:45 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:45 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:45 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:45.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:46 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:46 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:46 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:46.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:46 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:46 np0005593233 nova_compute[222017]: 2026-01-23 11:20:46.884 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:47 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:47 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:47 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:47.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:48 np0005593233 podman[325378]: 2026-01-23 11:20:48.062144997 +0000 UTC m=+0.074389208 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:20:48 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:48 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:48 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:48.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:48 np0005593233 nova_compute[222017]: 2026-01-23 11:20:48.214 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:48 np0005593233 nova_compute[222017]: 2026-01-23 11:20:48.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:49 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:49 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.002000056s ======
Jan 23 06:20:49 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:49.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Jan 23 06:20:50 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:50 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:50 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:50.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:50 np0005593233 nova_compute[222017]: 2026-01-23 11:20:50.386 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:51 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:51 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:51 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:51 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:51.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:51 np0005593233 nova_compute[222017]: 2026-01-23 11:20:51.925 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:52 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:52 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:52 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:52.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:53 np0005593233 nova_compute[222017]: 2026-01-23 11:20:53.250 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:53 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:53 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:53 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:53.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:54 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:54 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:54 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:54.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:55 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:55 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:55 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:55.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:56 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:56 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:56 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:56.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:56 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:56 np0005593233 nova_compute[222017]: 2026-01-23 11:20:56.928 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:57 np0005593233 nova_compute[222017]: 2026-01-23 11:20:57.379 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:57 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:57 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:57 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:57.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:58 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:58 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:20:58 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:58.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:20:58 np0005593233 nova_compute[222017]: 2026-01-23 11:20:58.293 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:58 np0005593233 nova_compute[222017]: 2026-01-23 11:20:58.384 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:59 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:20:59 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:20:59 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:59.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:21:00 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:00 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:00 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:00.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:00 np0005593233 nova_compute[222017]: 2026-01-23 11:21:00.385 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:21:00 np0005593233 nova_compute[222017]: 2026-01-23 11:21:00.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:21:00 np0005593233 nova_compute[222017]: 2026-01-23 11:21:00.386 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:21:00 np0005593233 nova_compute[222017]: 2026-01-23 11:21:00.732 222021 DEBUG nova.compute.manager [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:21:01 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:01 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:01 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:01.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:02 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:02 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:02 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:02.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:02 np0005593233 nova_compute[222017]: 2026-01-23 11:21:01.968 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:02 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:03 np0005593233 nova_compute[222017]: 2026-01-23 11:21:03.297 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:03 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:03 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:03 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:03.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:04 np0005593233 podman[325397]: 2026-01-23 11:21:04.091526723 +0000 UTC m=+0.098256560 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 06:21:04 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:04 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:04 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:04.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:05 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:05 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:05 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:05.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:06 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:06 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:06 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:06.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:06 np0005593233 nova_compute[222017]: 2026-01-23 11:21:06.972 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:07 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:07 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:07 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:07 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:07.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:08 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:08 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:08 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:08.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:08 np0005593233 nova_compute[222017]: 2026-01-23 11:21:08.329 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:09 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:09 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:21:09 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:09.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:21:10 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:10 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:10 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:10.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:11 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:11 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:11 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:11.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:11 np0005593233 systemd-logind[804]: New session 66 of user zuul.
Jan 23 06:21:11 np0005593233 systemd[1]: Started Session 66 of User zuul.
Jan 23 06:21:11 np0005593233 nova_compute[222017]: 2026-01-23 11:21:11.973 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:12 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:12 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:12 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:12.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:12 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:13 np0005593233 nova_compute[222017]: 2026-01-23 11:21:13.331 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:13 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:13 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:13 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:13.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:14 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:14 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:14 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:14.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:15 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 23 06:21:15 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2057684457' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 06:21:15 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:15 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:15 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:15.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:16 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:16 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:16 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:16.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:16 np0005593233 nova_compute[222017]: 2026-01-23 11:21:16.974 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:17 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:17 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:17 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:17 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:17.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:21:17 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.8 total, 600.0 interval#012Cumulative writes: 71K writes, 271K keys, 71K commit groups, 1.0 writes per commit group, ingest: 0.26 GB, 0.03 MB/s#012Cumulative WAL: 71K writes, 27K syncs, 2.62 writes per sync, written: 0.26 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1522 writes, 5398 keys, 1522 commit groups, 1.0 writes per commit group, ingest: 5.34 MB, 0.01 MB/s#012Interval WAL: 1522 writes, 642 syncs, 2.37 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:21:18 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:18 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:18 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:18.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:18 np0005593233 nova_compute[222017]: 2026-01-23 11:21:18.334 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:19 np0005593233 ovs-vsctl[325726]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 06:21:19 np0005593233 podman[325704]: 2026-01-23 11:21:19.075209561 +0000 UTC m=+0.076694702 container health_status 2c4c0fa6b571e76c99d6e771875ed37b48d1f1b3bf463066a65f2d1bc29bc357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 06:21:19 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:19 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:19 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:19.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:20 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:20 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:20 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:20.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:20 np0005593233 virtqemud[221325]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 06:21:20 np0005593233 virtqemud[221325]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 06:21:20 np0005593233 virtqemud[221325]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 06:21:20 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: cache status {prefix=cache status} (starting...)
Jan 23 06:21:20 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:21 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: client ls {prefix=client ls} (starting...)
Jan 23 06:21:21 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:21 np0005593233 lvm[326077]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 06:21:21 np0005593233 lvm[326077]: VG ceph_vg0 finished
Jan 23 06:21:21 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 06:21:21 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:21 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 06:21:21 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:21 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:21 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:21 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:21.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:21 np0005593233 nova_compute[222017]: 2026-01-23 11:21:21.976 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:22 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:22 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:22 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:22.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 23 06:21:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2296032499' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:22 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 23 06:21:22 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1176153335' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 06:21:22 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 23 06:21:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1271510014' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 06:21:23 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: ops {prefix=ops} (starting...)
Jan 23 06:21:23 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:23 np0005593233 nova_compute[222017]: 2026-01-23 11:21:23.337 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 23 06:21:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/43242302' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 06:21:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 23 06:21:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/789231160' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 06:21:23 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 23 06:21:23 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3321165586' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 06:21:23 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:23 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:23 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:23.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:24 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: session ls {prefix=session ls} (starting...)
Jan 23 06:21:24 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx Can't run that command on an inactive MDS!
Jan 23 06:21:24 np0005593233 ceph-mds[85262]: mds.cephfs.compute-1.elkrlx asok_command: status {prefix=status} (starting...)
Jan 23 06:21:24 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:24 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:24 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:24.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 23 06:21:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1788867118' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 06:21:24 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 23 06:21:24 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2496344219' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 06:21:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 23 06:21:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1342120866' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 06:21:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 23 06:21:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1952988231' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 06:21:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 23 06:21:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2570300927' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 06:21:25 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 23 06:21:25 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4213442731' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 06:21:25 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:25 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:21:25 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:25.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:21:26 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:26 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:26 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:26.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 23 06:21:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1030050351' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 06:21:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 23 06:21:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/400147772' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 06:21:26 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 23 06:21:26 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3155483086' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 06:21:26 np0005593233 nova_compute[222017]: 2026-01-23 11:21:26.977 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 23 06:21:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2354321469' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439844864 unmapped: 72130560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439853056 unmapped: 72122368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439853056 unmapped: 72122368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439853056 unmapped: 72122368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439853056 unmapped: 72122368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439877632 unmapped: 72097792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544629 data_alloc: 218103808 data_used: 10207232
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544789 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439918592 unmapped: 72056832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4544789 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439926784 unmapped: 72048640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439934976 unmapped: 72040448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 43.547557831s of 44.114562988s, submitted: 17
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4546617 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f134968d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439943168 unmapped: 72032256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4546617 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439951360 unmapped: 72024064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439951360 unmapped: 72024064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439951360 unmapped: 72024064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439959552 unmapped: 72015872 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f13a8e6d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f133e13680
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1342f90e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135069400 session 0x55f13446fa40
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.264505386s of 10.274923325s, submitted: 2
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f134b125a0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f1369b3c20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f1363bab40
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1367ef2c0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1400f8800 session 0x55f136998d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4571236 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439861248 unmapped: 72114176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439869440 unmapped: 72105984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4571236 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439869440 unmapped: 72105984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439869440 unmapped: 72105984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439869440 unmapped: 72105984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439877632 unmapped: 72097792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439877632 unmapped: 72097792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4571236 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439877632 unmapped: 72097792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439885824 unmapped: 72089600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f1369b3860
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4571236 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.963664055s of 16.056951523s, submitted: 15
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6600.8 total, 600.0 interval
Cumulative writes: 65K writes, 247K keys, 65K commit groups, 1.0 writes per commit group, ingest: 0.23 GB, 0.04 MB/s
Cumulative WAL: 65K writes, 24K syncs, 2.63 writes per sync, written: 0.23 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 4600 writes, 13K keys, 4600 commit groups, 1.0 writes per commit group, ingest: 11.03 MB, 0.02 MB/s
Interval WAL: 4600 writes, 1959 syncs, 2.35 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439894016 unmapped: 72081408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4593928 data_alloc: 218103808 data_used: 13381632
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4593928 data_alloc: 218103808 data_used: 13381632
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439902208 unmapped: 72073216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2bec000/0x0/0x1bfc00000, data 0x1e3c19b/0x2062000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 439910400 unmapped: 72065024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.895181656s of 12.986706734s, submitted: 1
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440819712 unmapped: 71155712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440418304 unmapped: 71557120 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4603780 data_alloc: 218103808 data_used: 13373440
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ab8000/0x0/0x1bfc00000, data 0x1f6a19b/0x2190000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ab8000/0x0/0x1bfc00000, data 0x1f6a19b/0x2190000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440893440 unmapped: 71081984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440918016 unmapped: 71057408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440918016 unmapped: 71057408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440918016 unmapped: 71057408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440918016 unmapped: 71057408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440926208 unmapped: 71049216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440926208 unmapped: 71049216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440926208 unmapped: 71049216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440926208 unmapped: 71049216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440934400 unmapped: 71041024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440950784 unmapped: 71024640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619786 data_alloc: 218103808 data_used: 13406208
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 28.065114975s of 31.968029022s, submitted: 36
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440967168 unmapped: 71008256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440967168 unmapped: 71008256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440967168 unmapped: 71008256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440967168 unmapped: 71008256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1369e21e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440975360 unmapped: 71000064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4619946 data_alloc: 218103808 data_used: 13410304
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2a98000/0x0/0x1bfc00000, data 0x1f8819b/0x21ae000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f142f114a0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f136360780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440844288 unmapped: 71131136 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f136b3c000 session 0x55f1363bb4a0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554946 data_alloc: 218103808 data_used: 10215424
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef2000/0x0/0x1bfc00000, data 0x1b3619b/0x1d5c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f1342df0e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.321082115s of 11.436309814s, submitted: 34
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f142f10780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440860672 unmapped: 71114752 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440868864 unmapped: 71106560 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440877056 unmapped: 71098368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440877056 unmapped: 71098368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440877056 unmapped: 71098368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440885248 unmapped: 71090176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440893440 unmapped: 71081984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440893440 unmapped: 71081984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440893440 unmapped: 71081984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4554066 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ef3000/0x0/0x1bfc00000, data 0x1b3618b/0x1d5b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 440901632 unmapped: 71073792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.356424332s of 31.384279251s, submitted: 6
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f1367ee1e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1369b2960
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f136a16400 session 0x55f1369b3e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f136a16400 session 0x55f136999e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f1369b30e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441532416 unmapped: 70443008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae7000/0x0/0x1bfc00000, data 0x1f4218b/0x2167000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441532416 unmapped: 70443008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4593598 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441532416 unmapped: 70443008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441532416 unmapped: 70443008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441540608 unmapped: 70434816 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae7000/0x0/0x1bfc00000, data 0x1f4218b/0x2167000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441540608 unmapped: 70434816 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441540608 unmapped: 70434816 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4593598 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f13446e000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441548800 unmapped: 70426624 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f136fe2d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441556992 unmapped: 70418432 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae7000/0x0/0x1bfc00000, data 0x1f4218b/0x2167000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441556992 unmapped: 70418432 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f1342f8960
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.714527130s of 10.794013977s, submitted: 13
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f1445eb800 session 0x55f135cb90e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441556992 unmapped: 70418432 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 441556992 unmapped: 70418432 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4595436 data_alloc: 218103808 data_used: 10211328
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442204160 unmapped: 69771264 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615116 data_alloc: 218103808 data_used: 12935168
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2ae6000/0x0/0x1bfc00000, data 0x1f4219b/0x2168000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615116 data_alloc: 218103808 data_used: 12935168
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442269696 unmapped: 69705728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 442277888 unmapped: 69697536 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.146739006s of 13.157499313s, submitted: 1
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448045056 unmapped: 63930368 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 447217664 unmapped: 64757760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 447217664 unmapped: 64757760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4699138 data_alloc: 218103808 data_used: 13131776
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446414848 unmapped: 65560576 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a2016000/0x0/0x1bfc00000, data 0x2a0919b/0x2c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446636032 unmapped: 65339392 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446636032 unmapped: 65339392 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1fa1000/0x0/0x1bfc00000, data 0x2a8719b/0x2cad000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1afaf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446636032 unmapped: 65339392 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446644224 unmapped: 65331200 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4703656 data_alloc: 218103808 data_used: 13139968
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446652416 unmapped: 65323008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446652416 unmapped: 65323008 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.548395157s of 10.023729324s, submitted: 110
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446693376 unmapped: 65282048 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6f000/0x0/0x1bfc00000, data 0x2aa919b/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704512 data_alloc: 218103808 data_used: 13139968
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6f000/0x0/0x1bfc00000, data 0x2aa919b/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6f000/0x0/0x1bfc00000, data 0x2aa919b/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704908 data_alloc: 218103808 data_used: 13139968
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6f000/0x0/0x1bfc00000, data 0x2aa919b/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6c000/0x0/0x1bfc00000, data 0x2aac19b/0x2cd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b6c000/0x0/0x1bfc00000, data 0x2aac19b/0x2cd2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446791680 unmapped: 65183744 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.920547485s of 12.751147270s, submitted: 230
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4706032 data_alloc: 218103808 data_used: 13139968
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f134aeac00 session 0x55f134fb4780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9a800 session 0x55f133e12d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b5f000/0x0/0x1bfc00000, data 0x2ab919b/0x2cdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b5f000/0x0/0x1bfc00000, data 0x2ab919b/0x2cdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4706032 data_alloc: 218103808 data_used: 13139968
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 heartbeat osd_stat(store_statfs(0x1a1b5f000/0x0/0x1bfc00000, data 0x2ab919b/0x2cdf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4706032 data_alloc: 218103808 data_used: 13139968
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446799872 unmapped: 65175552 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 ms_handle_reset con 0x55f135e9f000 session 0x55f133e12f00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.336071014s of 11.364871025s, submitted: 6
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f136a16400 session 0x55f136999680
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f136a16400 session 0x55f136361e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f134aeac00 session 0x55f136fe2000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 heartbeat osd_stat(store_statfs(0x1a1b5b000/0x0/0x1bfc00000, data 0x2abadf4/0x2ce2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f135e9a800 session 0x55f1367ee780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f135e9f000 session 0x55f135d39e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446816256 unmapped: 65159168 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4711390 data_alloc: 218103808 data_used: 13328384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 heartbeat osd_stat(store_statfs(0x1a1b5b000/0x0/0x1bfc00000, data 0x2abadf4/0x2ce2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 ms_handle_reset con 0x55f1454e8400 session 0x55f13446e960
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 heartbeat osd_stat(store_statfs(0x1a1b5b000/0x0/0x1bfc00000, data 0x2abadf4/0x2ce2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 420 ms_handle_reset con 0x55f134aeac00 session 0x55f136869a40
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4714364 data_alloc: 218103808 data_used: 13328384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446824448 unmapped: 65150976 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446832640 unmapped: 65142784 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446832640 unmapped: 65142784 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446832640 unmapped: 65142784 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 420 heartbeat osd_stat(store_statfs(0x1a1b58000/0x0/0x1bfc00000, data 0x2abcaa1/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 420 heartbeat osd_stat(store_statfs(0x1a1b58000/0x0/0x1bfc00000, data 0x2abcaa1/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446832640 unmapped: 65142784 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.557323456s of 13.611905098s, submitted: 14
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4721068 data_alloc: 218103808 data_used: 13963264
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a1b55000/0x0/0x1bfc00000, data 0x2abe5e0/0x2ce8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1b3bf9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446808064 unmapped: 65167360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 446816256 unmapped: 65159168 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4726682 data_alloc: 218103808 data_used: 14082048
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a09b1000/0x0/0x1bfc00000, data 0x2ac35e0/0x2ced000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a09b1000/0x0/0x1bfc00000, data 0x2ac35e0/0x2ced000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f1445eb800 session 0x55f1369b2780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f136a08c00 session 0x55f136868b40
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.619669914s of 10.684004784s, submitted: 56
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4577296 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448905216 unmapped: 63070208 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9a800 session 0x55f136a43e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448913408 unmapped: 63062016 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448921600 unmapped: 63053824 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448929792 unmapped: 63045632 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448937984 unmapped: 63037440 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448946176 unmapped: 63029248 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448946176 unmapped: 63029248 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448946176 unmapped: 63029248 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448946176 unmapped: 63029248 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574309 data_alloc: 218103808 data_used: 10227712
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448962560 unmapped: 63012864 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448970752 unmapped: 63004672 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448970752 unmapped: 63004672 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448978944 unmapped: 62996480 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448987136 unmapped: 62988288 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 448995328 unmapped: 62980096 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449003520 unmapped: 62971904 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449011712 unmapped: 62963712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4574789 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449019904 unmapped: 62955520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449028096 unmapped: 62947328 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9f000 session 0x55f1342fe000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9f000 session 0x55f13446e1e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f134aeac00 session 0x55f136fb1c20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9a800 session 0x55f1369b23c0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449028096 unmapped: 62947328 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 67.869766235s of 67.935157776s, submitted: 19
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a193a000/0x0/0x1bfc00000, data 0x1b3b5d0/0x1d64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453812224 unmapped: 58163200 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f136a08c00 session 0x55f135d47e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f1445eb800 session 0x55f135cccd20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f1445eb800 session 0x55f13446f680
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f134aeac00 session 0x55f13704f0e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9a800 session 0x55f136fe2780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449347584 unmapped: 62627840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4654140 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449347584 unmapped: 62627840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449347584 unmapped: 62627840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449347584 unmapped: 62627840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fe6000/0x0/0x1bfc00000, data 0x248f5d0/0x26b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449355776 unmapped: 62619648 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449355776 unmapped: 62619648 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4654140 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449355776 unmapped: 62619648 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fe6000/0x0/0x1bfc00000, data 0x248f5d0/0x26b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9f000 session 0x55f133e10b40
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f136a08c00 session 0x55f135ce8d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fe6000/0x0/0x1bfc00000, data 0x248f5d0/0x26b8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f136a08c00 session 0x55f136869680
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.845645905s of 12.574873924s, submitted: 28
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4655920 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449363968 unmapped: 62611456 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449511424 unmapped: 62464000 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449511424 unmapped: 62464000 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f134aeac00 session 0x55f1363ba000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449511424 unmapped: 62464000 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449519616 unmapped: 62455808 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4657825 data_alloc: 218103808 data_used: 10240000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449519616 unmapped: 62455808 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449519616 unmapped: 62455808 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,1])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449536000 unmapped: 62439424 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 449536000 unmapped: 62439424 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4718389 data_alloc: 234881024 data_used: 18714624
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4718389 data_alloc: 234881024 data_used: 18714624
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0fc1000/0x0/0x1bfc00000, data 0x24b35f3/0x26dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454131712 unmapped: 57843712 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.832292557s of 19.761795044s, submitted: 6
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4721361 data_alloc: 234881024 data_used: 18788352
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454139904 unmapped: 57835520 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455000064 unmapped: 56975360 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455458816 unmapped: 56516608 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a09c7000/0x0/0x1bfc00000, data 0x2aa55f3/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,4])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a09c7000/0x0/0x1bfc00000, data 0x2aa55f3/0x2ccf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,10])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455819264 unmapped: 56156160 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 23 06:21:27 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2016658749' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 06:21:27 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:27 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:27 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:27.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455819264 unmapped: 56156160 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4784849 data_alloc: 234881024 data_used: 19849216
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455852032 unmapped: 56123392 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a094a000/0x0/0x1bfc00000, data 0x2b225f3/0x2d4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4790661 data_alloc: 234881024 data_used: 19857408
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456245248 unmapped: 55730176 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.305270195s of 11.759785652s, submitted: 78
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4790677 data_alloc: 234881024 data_used: 19857408
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4790677 data_alloc: 234881024 data_used: 19857408
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456253440 unmapped: 55721984 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4790677 data_alloc: 234881024 data_used: 19857408
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456261632 unmapped: 55713792 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456269824 unmapped: 55705600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3c5f3/0x2d66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.074939728s of 17.010576248s, submitted: 1
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456269824 unmapped: 55705600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456269824 unmapped: 55705600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0936000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4788073 data_alloc: 234881024 data_used: 19865600
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0936000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9a800 session 0x55f135d1a1e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f135e9f000 session 0x55f136fb10e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4787321 data_alloc: 234881024 data_used: 19865600
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456278016 unmapped: 55697408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456286208 unmapped: 55689216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456286208 unmapped: 55689216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 heartbeat osd_stat(store_statfs(0x1a0937000/0x0/0x1bfc00000, data 0x2b3d5f3/0x2d67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 ms_handle_reset con 0x55f1445eb800 session 0x55f133e034a0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.452046394s of 10.840573311s, submitted: 7
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456294400 unmapped: 55681024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 421 handle_osd_map epochs [422,422], i have 421, src has [1,422]
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4793311 data_alloc: 234881024 data_used: 19873792
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f134aeac00 session 0x55f1341e5c20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f135e9a800 session 0x55f13446e1e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f135e9f000 session 0x55f136a43e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 55672832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f136a08c00 session 0x55f13446e960
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 55672832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f136a16400 session 0x55f135d39e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456302592 unmapped: 55672832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4794723 data_alloc: 234881024 data_used: 19976192
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.652312279s of 10.148161888s, submitted: 24
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f136a16400 session 0x55f1367ee780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f134aeac00 session 0x55f1342f8780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4794591 data_alloc: 234881024 data_used: 19976192
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456310784 unmapped: 55664640 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456318976 unmapped: 55656448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456318976 unmapped: 55656448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4794591 data_alloc: 234881024 data_used: 19976192
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456318976 unmapped: 55656448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f135e9a800 session 0x55f1369983c0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456327168 unmapped: 55648256 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0932000/0x0/0x1bfc00000, data 0x2b3f2ae/0x2d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,5,3])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 ms_handle_reset con 0x55f135e9f000 session 0x55f136a421e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457408512 unmapped: 54566912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457408512 unmapped: 54566912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457408512 unmapped: 54566912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 handle_osd_map epochs [423,423], i have 422, src has [1,423]
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.739651680s of 10.780681610s, submitted: 22
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 heartbeat osd_stat(store_statfs(0x1a0934000/0x0/0x1bfc00000, data 0x3b1d24c/0x2d6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 handle_osd_map epochs [423,423], i have 423, src has [1,423]
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 422 handle_osd_map epochs [423,423], i have 423, src has [1,423]
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4927971 data_alloc: 234881024 data_used: 19984384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457433088 unmapped: 54542336 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457449472 unmapped: 54525952 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 423 ms_handle_reset con 0x55f136a08c00 session 0x55f1342df0e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 423 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x2b40ef9/0x2d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 423 ms_handle_reset con 0x55f136a08c00 session 0x55f1342f8960
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4851611 data_alloc: 234881024 data_used: 19984384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457457664 unmapped: 54517760 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457465856 unmapped: 54509568 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 423 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x2b40ef9/0x2d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457474048 unmapped: 54501376 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 423 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x2b40ef9/0x2d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 423 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x2b40ef9/0x2d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457474048 unmapped: 54501376 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4851743 data_alloc: 234881024 data_used: 19984384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457474048 unmapped: 54501376 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 423 handle_osd_map epochs [424,424], i have 423, src has [1,424]
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.620392799s of 12.359407425s, submitted: 31
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a092d000/0x0/0x1bfc00000, data 0x2b42a38/0x2d70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a092d000/0x0/0x1bfc00000, data 0x2b42a38/0x2d70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4854717 data_alloc: 234881024 data_used: 19984384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457482240 unmapped: 54493184 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457490432 unmapped: 54484992 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a092d000/0x0/0x1bfc00000, data 0x2b42a38/0x2d70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4862305 data_alloc: 234881024 data_used: 20303872
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0928000/0x0/0x1bfc00000, data 0x2b42a38/0x2d70000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.015429497s of 12.123022079s, submitted: 17
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4863783 data_alloc: 234881024 data_used: 20303872
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a090f000/0x0/0x1bfc00000, data 0x2b47a38/0x2d75000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4864263 data_alloc: 234881024 data_used: 20316160
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0921000/0x0/0x1bfc00000, data 0x2b48a38/0x2d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4863295 data_alloc: 234881024 data_used: 20316160
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0921000/0x0/0x1bfc00000, data 0x2b48a38/0x2d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.135271072s of 13.823410034s, submitted: 13
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4863253 data_alloc: 234881024 data_used: 20316160
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457498624 unmapped: 54476800 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a091b000/0x0/0x1bfc00000, data 0x2b4da38/0x2d7b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f136869a40
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9a800 session 0x55f136fe34a0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457523200 unmapped: 54452224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 60776448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451149824 unmapped: 60825600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451149824 unmapped: 60825600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f1369b8780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451149824 unmapped: 60825600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451149824 unmapped: 60825600 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451158016 unmapped: 60817408 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451166208 unmapped: 60809216 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451174400 unmapped: 60801024 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4615217 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451182592 unmapped: 60792832 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 60776448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 60776448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.746734619s of 36.266899109s, submitted: 52
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 60776448 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4617390 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452263936 unmapped: 59711488 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a3e/0x1d6e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,12])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a16400 session 0x55f135d5a780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a16400 session 0x55f133e12000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f133e132c0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9a800 session 0x55f1342fe000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f13a8e6000
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a08c00 session 0x55f13a8e72c0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f134fb4d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1025000/0x0/0x1bfc00000, data 0x244ba77/0x2679000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9a800 session 0x55f133e12d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4695740 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f142f10780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451379200 unmapped: 60596224 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1024000/0x0/0x1bfc00000, data 0x244ba87/0x267a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451297280 unmapped: 60678144 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4754620 data_alloc: 218103808 data_used: 18493440
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1024000/0x0/0x1bfc00000, data 0x244ba87/0x267a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4754620 data_alloc: 218103808 data_used: 18493440
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453525504 unmapped: 58449920 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1024000/0x0/0x1bfc00000, data 0x244ba87/0x267a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1024000/0x0/0x1bfc00000, data 0x244ba87/0x267a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453533696 unmapped: 58441728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453533696 unmapped: 58441728 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.415086746s of 19.606620789s, submitted: 38
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0813000/0x0/0x1bfc00000, data 0x2c5ca87/0x2e8b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458563584 unmapped: 53411840 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4880304 data_alloc: 234881024 data_used: 20680704
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458629120 unmapped: 53346304 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459448320 unmapped: 52527104 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459448320 unmapped: 52527104 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4886742 data_alloc: 234881024 data_used: 20729856
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a2000/0x0/0x1bfc00000, data 0x32c5a87/0x34f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4881830 data_alloc: 234881024 data_used: 20742144
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4881830 data_alloc: 234881024 data_used: 20742144
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459456512 unmapped: 52518912 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.877428055s of 20.293407440s, submitted: 150
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a16400 session 0x55f135d885a0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f1369b21e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4882078 data_alloc: 234881024 data_used: 20750336
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4882078 data_alloc: 234881024 data_used: 20750336
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a01a7000/0x0/0x1bfc00000, data 0x32c8a87/0x34f7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13602e800 session 0x55f136ab52c0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f136999860
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.553461075s of 10.557051659s, submitted: 1
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459464704 unmapped: 52510720 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f135cb8d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4629143 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a13c4000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453263360 unmapped: 58712064 heap: 511975424 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f133e12d20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9a800 session 0x55f1368685a0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f135d89e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f136a432c0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.660188675s of 24.752002716s, submitted: 30
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13602e800 session 0x55f134fb5c20
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f136998f00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f133e10960
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f134fb4780
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f1369981e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716395 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f135d5b0e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13602e800 session 0x55f1367ee1e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f135ca23c0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3e000/0x0/0x1bfc00000, data 0x2633a15/0x2860000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f13446e1e0
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4736313 data_alloc: 218103808 data_used: 12836864
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453345280 unmapped: 69132288 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4791033 data_alloc: 234881024 data_used: 20553728
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0e3d000/0x0/0x1bfc00000, data 0x2633a25/0x2861000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4791033 data_alloc: 234881024 data_used: 20553728
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 68083712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.904300690s of 17.033201218s, submitted: 16
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457269248 unmapped: 65208320 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457302016 unmapped: 65175552 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4902267 data_alloc: 234881024 data_used: 21004288
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c4000/0x0/0x1bfc00000, data 0x33aba25/0x35d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897703 data_alloc: 234881024 data_used: 21004288
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c3000/0x0/0x1bfc00000, data 0x33ada25/0x35db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897703 data_alloc: 234881024 data_used: 21004288
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c3000/0x0/0x1bfc00000, data 0x33ada25/0x35db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c3000/0x0/0x1bfc00000, data 0x33ada25/0x35db000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897703 data_alloc: 234881024 data_used: 21004288
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.130870819s of 20.470256805s, submitted: 93
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f136fb1e00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f1342f9680
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c2000/0x0/0x1bfc00000, data 0x33aea25/0x35dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897931 data_alloc: 234881024 data_used: 21004288
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c2000/0x0/0x1bfc00000, data 0x33aea25/0x35dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00c2000/0x0/0x1bfc00000, data 0x33aea25/0x35dc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f13446fe00
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457334784 unmapped: 65142784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457342976 unmapped: 65134592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457351168 unmapped: 65126400 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:27 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457351168 unmapped: 65126400 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4641032 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457359360 unmapped: 65118208 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13602e800 session 0x55f133e05c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f1368683c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f135ce9860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f135d39e00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 36.675395966s of 36.936244965s, submitted: 21
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f133e10b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136a16400 session 0x55f136fe34a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f136fb10e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f133e05e00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f136a42000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4658895 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4658895 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457367552 unmapped: 65110016 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457375744 unmapped: 65101824 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457375744 unmapped: 65101824 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f142f105a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457375744 unmapped: 65101824 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.654913902s of 11.721105576s, submitted: 13
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4659027 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457383936 unmapped: 65093632 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670707 data_alloc: 218103808 data_used: 11902976
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457392128 unmapped: 65085440 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457392128 unmapped: 65085440 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1788000/0x0/0x1bfc00000, data 0x1ce8a24/0x1f16000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457400320 unmapped: 65077248 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457400320 unmapped: 65077248 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457400320 unmapped: 65077248 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670707 data_alloc: 218103808 data_used: 11902976
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457400320 unmapped: 65077248 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.669104576s of 11.673585892s, submitted: 1
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459366400 unmapped: 63111168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458702848 unmapped: 63774720 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14d9000/0x0/0x1bfc00000, data 0x1f97a24/0x21c5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458702848 unmapped: 63774720 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458702848 unmapped: 63774720 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710525 data_alloc: 218103808 data_used: 12615680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710861 data_alloc: 218103808 data_used: 12623872
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458711040 unmapped: 63766528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.951898575s of 13.287283897s, submitted: 36
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136385c00 session 0x55f1342f8960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f139285000 session 0x55f136fe2000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4709437 data_alloc: 218103808 data_used: 12627968
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710049 data_alloc: 218103808 data_used: 12644352
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458719232 unmapped: 63758336 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710049 data_alloc: 218103808 data_used: 12644352
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.8 total, 600.0 interval#012Cumulative writes: 67K writes, 256K keys, 67K commit groups, 1.0 writes per commit group, ingest: 0.24 GB, 0.03 MB/s#012Cumulative WAL: 67K writes, 25K syncs, 2.63 writes per sync, written: 0.24 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1932 writes, 8335 keys, 1932 commit groups, 1.0 writes per commit group, ingest: 7.98 MB, 0.01 MB/s#012Interval WAL: 1932 writes, 726 syncs, 2.66 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.704678535s of 12.729924202s, submitted: 3
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458727424 unmapped: 63750144 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458735616 unmapped: 63741952 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458735616 unmapped: 63741952 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a14cf000/0x0/0x1bfc00000, data 0x1fa1a24/0x21cf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4712093 data_alloc: 218103808 data_used: 13004800
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458735616 unmapped: 63741952 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458743808 unmapped: 63733760 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458743808 unmapped: 63733760 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458743808 unmapped: 63733760 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: mgrc ms_handle_reset ms_handle_reset con 0x55f136b3d400
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/530399322
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/530399322,v1:192.168.122.100:6801/530399322]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: mgrc handle_mgr_configure stats_period=5
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134aeac00 session 0x55f135d38780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135811000 session 0x55f136a42000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453943296 unmapped: 68534272 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135e9f000 session 0x55f133e11e00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f134a73000 session 0x55f1363bad20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136385000 session 0x55f136ab5c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f1335da5a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453943296 unmapped: 68534272 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453943296 unmapped: 68534272 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453951488 unmapped: 68526080 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453959680 unmapped: 68517888 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453967872 unmapped: 68509696 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453976064 unmapped: 68501504 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453976064 unmapped: 68501504 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453984256 unmapped: 68493312 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4648508 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454000640 unmapped: 68476928 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1931000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 55.346656799s of 55.513092041s, submitted: 42
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462700544 unmapped: 59777024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f136361a40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f139285000 session 0x55f1369b94a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1341e4d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f1367ee780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136026400 session 0x55f133e12d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1813000/0x0/0x1bfc00000, data 0x1c5da3e/0x1e8b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455360512 unmapped: 67117056 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455360512 unmapped: 67117056 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4719153 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455368704 unmapped: 67108864 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4719153 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455376896 unmapped: 67100672 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455376896 unmapped: 67100672 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455376896 unmapped: 67100672 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.784545898s of 10.241132736s, submitted: 32
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f136868f00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455385088 unmapped: 67092480 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455385088 unmapped: 67092480 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4720542 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455417856 unmapped: 67059712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4783262 data_alloc: 218103808 data_used: 19001344
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457318400 unmapped: 65159168 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1093000/0x0/0x1bfc00000, data 0x23dda77/0x260b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4783262 data_alloc: 218103808 data_used: 19001344
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457326592 unmapped: 65150976 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.576991081s of 13.728077888s, submitted: 6
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459513856 unmapped: 62963712 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a060c000/0x0/0x1bfc00000, data 0x2e64a77/0x3092000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459685888 unmapped: 62791680 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4887774 data_alloc: 218103808 data_used: 20160512
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0454000/0x0/0x1bfc00000, data 0x301ca77/0x324a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460759040 unmapped: 61718528 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4882466 data_alloc: 218103808 data_used: 20164608
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 61636608 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0426000/0x0/0x1bfc00000, data 0x304aa77/0x3278000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 61636608 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460840960 unmapped: 61636608 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.963484764s of 11.457964897s, submitted: 134
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f139285000 session 0x55f142f11c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1363ba000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460857344 unmapped: 61620224 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f133e052c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0420000/0x0/0x1bfc00000, data 0x3050a77/0x327e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453009408 unmapped: 69468160 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13648c800 session 0x55f1341e50e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f142f10b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f139285000 session 0x55f133e134a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1369b2000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f1367efa40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4674675 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f1341e4960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451411968 unmapped: 71065600 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a17a6000/0x0/0x1bfc00000, data 0x1ccaa38/0x1ef8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4676460 data_alloc: 218103808 data_used: 10256384
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a17a6000/0x0/0x1bfc00000, data 0x1ccaa38/0x1ef8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a17a6000/0x0/0x1bfc00000, data 0x1ccaa38/0x1ef8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4687820 data_alloc: 218103808 data_used: 11870208
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a17a6000/0x0/0x1bfc00000, data 0x1ccaa38/0x1ef8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4687820 data_alloc: 218103808 data_used: 11870208
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451436544 unmapped: 71041024 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.862291336s of 18.072654724s, submitted: 60
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135eb0000 session 0x55f1367ef680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455794688 unmapped: 66682880 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453099520 unmapped: 69378048 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453148672 unmapped: 69328896 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1076000/0x0/0x1bfc00000, data 0x23faa38/0x2628000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [1,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453165056 unmapped: 69312512 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4754492 data_alloc: 218103808 data_used: 12021760
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453165056 unmapped: 69312512 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453165056 unmapped: 69312512 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453181440 unmapped: 69296128 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453181440 unmapped: 69296128 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1076000/0x0/0x1bfc00000, data 0x23faa38/0x2628000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453197824 unmapped: 69279744 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4750796 data_alloc: 218103808 data_used: 12021760
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453214208 unmapped: 69263360 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.339035988s of 10.036103249s, submitted: 184
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1073000/0x0/0x1bfc00000, data 0x23fda38/0x262b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453238784 unmapped: 69238784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453238784 unmapped: 69238784 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f135d1b2c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f1369b2b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453246976 unmapped: 69230592 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451944448 unmapped: 70533120 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1367eeb40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1073000/0x0/0x1bfc00000, data 0x23fda38/0x262b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [1,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452001792 unmapped: 70475776 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4670232 data_alloc: 218103808 data_used: 10268672
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452018176 unmapped: 70459392 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f134968d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135068000 session 0x55f136fb14a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f1342fed20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f135d25860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 27.541412354s of 31.675952911s, submitted: 215
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453066752 unmapped: 69410816 heap: 522477568 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a1930000/0x0/0x1bfc00000, data 0x1b40a15/0x1d6d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,14,5])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f135cf2000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f1369b2780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f14a130000 session 0x55f136fe3680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f14a130000 session 0x55f13a8e7c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f1341e5860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4801478 data_alloc: 218103808 data_used: 10268672
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f133e12000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f133e12b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0861000/0x0/0x1bfc00000, data 0x2c0fa25/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452042752 unmapped: 76259328 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f133e02000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453091328 unmapped: 75210752 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f1445eb000 session 0x55f1367ee960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4803581 data_alloc: 218103808 data_used: 10268672
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453091328 unmapped: 75210752 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453091328 unmapped: 75210752 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453943296 unmapped: 74358784 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921313 data_alloc: 234881024 data_used: 26689536
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0860000/0x0/0x1bfc00000, data 0x2c0fa35/0x2e3e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4921313 data_alloc: 234881024 data_used: 26689536
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459579392 unmapped: 68722688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 21.172000885s of 21.923461914s, submitted: 27
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462774272 unmapped: 65527808 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a00e2000/0x0/0x1bfc00000, data 0x338da35/0x35bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 4982751 data_alloc: 234881024 data_used: 26726400
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464543744 unmapped: 63758336 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464560128 unmapped: 63741952 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a006a000/0x0/0x1bfc00000, data 0x3405a35/0x3634000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5001729 data_alloc: 234881024 data_used: 28971008
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5002209 data_alloc: 234881024 data_used: 28983296
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.061190605s of 14.412494659s, submitted: 94
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5001473 data_alloc: 234881024 data_used: 28995584
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b23c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f136b41400 session 0x55f134968d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5001341 data_alloc: 234881024 data_used: 28995584
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464658432 unmapped: 63643648 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 heartbeat osd_stat(store_statfs(0x1a0055000/0x0/0x1bfc00000, data 0x341aa35/0x3649000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464666624 unmapped: 63635456 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 ms_handle_reset con 0x55f13f3cbc00 session 0x55f135cb8780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.726196289s of 10.059652328s, submitted: 3
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464666624 unmapped: 63635456 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f14a130000 session 0x55f1369b9c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f14a130000 session 0x55f1369990e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f135fffc00 session 0x55f1342fe000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f136b41400 session 0x55f1363601e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464666624 unmapped: 63635456 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5005163 data_alloc: 234881024 data_used: 29003776
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464674816 unmapped: 63627264 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f13f3cbc00 session 0x55f13a8e7680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464674816 unmapped: 63627264 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464674816 unmapped: 63627264 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5006255 data_alloc: 234881024 data_used: 29089792
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5006255 data_alloc: 234881024 data_used: 29089792
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464723968 unmapped: 63578112 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f139a81400 session 0x55f136fb1c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 heartbeat osd_stat(store_statfs(0x1a0051000/0x0/0x1bfc00000, data 0x341c68e/0x364c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 ms_handle_reset con 0x55f135fffc00 session 0x55f1342f9c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 425 handle_osd_map epochs [426,426], i have 425, src has [1,426]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.723518372s of 16.029331207s, submitted: 2
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464732160 unmapped: 63569920 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5014061 data_alloc: 234881024 data_used: 29474816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464748544 unmapped: 63553536 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464756736 unmapped: 63545344 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5014861 data_alloc: 234881024 data_used: 29548544
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5015341 data_alloc: 234881024 data_used: 29560832
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464764928 unmapped: 63537152 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 426 heartbeat osd_stat(store_statfs(0x1a004e000/0x0/0x1bfc00000, data 0x341e33b/0x364f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 426 handle_osd_map epochs [427,427], i have 426, src has [1,427]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 426 handle_osd_map epochs [427,427], i have 427, src has [1,427]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.787799835s of 12.030132294s, submitted: 13
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f13423b680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136868f00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5016953 data_alloc: 234881024 data_used: 29556736
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 464781312 unmapped: 63520768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a004d000/0x0/0x1bfc00000, data 0x1b50e6a/0x1d82000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f136fb01e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455073792 unmapped: 73228288 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693278 data_alloc: 218103808 data_used: 10276864
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455081984 unmapped: 73220096 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693598 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693598 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693598 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455090176 unmapped: 73211904 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1928000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4693598 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13f3cbc00 session 0x55f1367ef680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f135cb9680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136fe3860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455098368 unmapped: 73203712 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135ce8000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 50.622058868s of 54.810539246s, submitted: 47
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455098368 unmapped: 73203712 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1926000/0x0/0x1bfc00000, data 0x1b45e93/0x1d78000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455098368 unmapped: 73203712 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462733312 unmapped: 65568768 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f1363ba5a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f14a130000 session 0x55f13704fc20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455131136 unmapped: 73170944 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f14a130000 session 0x55f133e03680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f13a8e6960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4761140 data_alloc: 218103808 data_used: 10289152
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136a421e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f142f10b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f136fe2960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f13446e1e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f135d5a780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4761272 data_alloc: 218103808 data_used: 10289152
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455139328 unmapped: 73162752 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455147520 unmapped: 73154560 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4815672 data_alloc: 218103808 data_used: 17928192
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455221248 unmapped: 73080832 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455229440 unmapped: 73072640 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455229440 unmapped: 73072640 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4815672 data_alloc: 218103808 data_used: 17928192
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455229440 unmapped: 73072640 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1176000/0x0/0x1bfc00000, data 0x22f5ecc/0x2528000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455229440 unmapped: 73072640 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.918918610s of 21.329256058s, submitted: 44
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455262208 unmapped: 73039872 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456982528 unmapped: 71319552 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0dce000/0x0/0x1bfc00000, data 0x2695ecc/0x28c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c55f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4845574 data_alloc: 218103808 data_used: 18001920
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0930000/0x0/0x1bfc00000, data 0x272becc/0x295e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456179712 unmapped: 72122368 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a08ae000/0x0/0x1bfc00000, data 0x27adecc/0x29e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859686 data_alloc: 218103808 data_used: 18522112
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.001146317s of 10.467357635s, submitted: 66
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456409088 unmapped: 71892992 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859026 data_alloc: 218103808 data_used: 18526208
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f135cf2b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f136ab54a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4857498 data_alloc: 218103808 data_used: 18526208
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f14a130000 session 0x55f142f101e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.138523102s of 11.733281136s, submitted: 10
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4858750 data_alloc: 218103808 data_used: 18653184
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859390 data_alloc: 218103808 data_used: 18714624
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4859390 data_alloc: 218103808 data_used: 18714624
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.540402412s of 11.780837059s, submitted: 1
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4864546 data_alloc: 218103808 data_used: 19070976
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456417280 unmapped: 71884800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4865074 data_alloc: 218103808 data_used: 19066880
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.979717255s of 11.657351494s, submitted: 11
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4865074 data_alloc: 218103808 data_used: 19066880
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1367ef860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f135d46b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456425472 unmapped: 71876608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a087f000/0x0/0x1bfc00000, data 0x27dcecc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f134b963c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4704166 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f13704e1e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13addf400 session 0x55f136a43a40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1341e4960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136fe30e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451198976 unmapped: 77103104 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 40.794708252s of 44.714622498s, submitted: 44
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f1368692c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f136360f00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13602bc00 session 0x55f13704f2c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f13704f680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1369b25a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12e9000/0x0/0x1bfc00000, data 0x1d73e83/0x1fa5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451330048 unmapped: 76972032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451330048 unmapped: 76972032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12e9000/0x0/0x1bfc00000, data 0x1d73ebc/0x1fa5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451330048 unmapped: 76972032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135ccc5a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4724367 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f13a8e6b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451330048 unmapped: 76972032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f139384800 session 0x55f13704fa40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f13704e3c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451493888 unmapped: 76808192 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12c4000/0x0/0x1bfc00000, data 0x1d97edf/0x1fca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451493888 unmapped: 76808192 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12c4000/0x0/0x1bfc00000, data 0x1d97edf/0x1fca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4744312 data_alloc: 218103808 data_used: 12484608
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12c4000/0x0/0x1bfc00000, data 0x1d97edf/0x1fca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4744312 data_alloc: 218103808 data_used: 12484608
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451502080 unmapped: 76800000 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.000379562s of 17.736671448s, submitted: 27
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a12c4000/0x0/0x1bfc00000, data 0x1d97edf/0x1fca000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451510272 unmapped: 76791808 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4759852 data_alloc: 218103808 data_used: 12509184
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 451510272 unmapped: 76791808 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453230592 unmapped: 75071488 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453230592 unmapped: 75071488 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c6000/0x0/0x1bfc00000, data 0x1f95edf/0x21c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4764890 data_alloc: 218103808 data_used: 12718080
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c6000/0x0/0x1bfc00000, data 0x1f95edf/0x21c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c6000/0x0/0x1bfc00000, data 0x1f95edf/0x21c8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 3.305987597s of 10.130201340s, submitted: 32
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4767300 data_alloc: 218103808 data_used: 12718080
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452558848 unmapped: 75743232 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766548 data_alloc: 218103808 data_used: 12722176
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452567040 unmapped: 75735040 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f133e101e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f1363614a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452575232 unmapped: 75726848 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452575232 unmapped: 75726848 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766476 data_alloc: 218103808 data_used: 12722176
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766476 data_alloc: 218103808 data_used: 12722176
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.317569733s of 18.511228561s, submitted: 5
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a10c2000/0x0/0x1bfc00000, data 0x1f99edf/0x21cc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f136fb0b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1a800 session 0x55f136360960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766460 data_alloc: 218103808 data_used: 12722176
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452583424 unmapped: 75718656 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f135d5a000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452591616 unmapped: 75710464 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1516000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4710176 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 42.061229706s of 42.817886353s, submitted: 33
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 452599808 unmapped: 75702272 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f135d463c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1a800 session 0x55f1342f90e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f1369e2960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f1341e4f00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f133e56780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0ebe000/0x0/0x1bfc00000, data 0x219fe5a/0x23d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4764975 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0ebe000/0x0/0x1bfc00000, data 0x219fe5a/0x23d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0ebe000/0x0/0x1bfc00000, data 0x219fe5a/0x23d0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 453869568 unmapped: 74432512 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f136360d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454172672 unmapped: 74129408 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454180864 unmapped: 74121216 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4772407 data_alloc: 218103808 data_used: 10932224
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.808236122s of 11.594923019s, submitted: 22
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f13a8e6960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1a800 session 0x55f135d5a000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813555 data_alloc: 218103808 data_used: 16748544
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813687 data_alloc: 218103808 data_used: 16748544
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,2])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135d47680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f133e101e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813555 data_alloc: 218103808 data_used: 16748544
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.312306404s of 13.516556740s, submitted: 3
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0e9a000/0x0/0x1bfc00000, data 0x21c3e5a/0x23f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f1342df4a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b25a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454352896 unmapped: 73949184 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136998780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454369280 unmapped: 73932800 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454377472 unmapped: 73924608 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454385664 unmapped: 73916416 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 73908224 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454393856 unmapped: 73908224 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4716505 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454402048 unmapped: 73900032 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1518000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454410240 unmapped: 73891840 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 454410240 unmapped: 73891840 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1a800 session 0x55f135d89c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f136fb10e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f142f10d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f135cf2d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 43.204605103s of 44.053894043s, submitted: 30
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135cf3680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f1445eb000 session 0x55f1368685a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b2960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1367eeb40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f136868d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766156 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0fe5000/0x0/0x1bfc00000, data 0x2076ecc/0x22a9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455483392 unmapped: 72818688 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4766156 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455499776 unmapped: 72802304 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0fe4000/0x0/0x1bfc00000, data 0x2076ef5/0x22aa000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455499776 unmapped: 72802304 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 459685888 unmapped: 68616192 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465387520 unmapped: 62914560 heap: 528302080 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475217920 unmapped: 62586880 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a065e000/0x0/0x1bfc00000, data 0x29fcef5/0x2c30000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,9,0,4,7])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4959952 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fd03000/0x0/0x1bfc00000, data 0x3357ef5/0x358b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,9,0,11])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 6.006268978s of 11.423579216s, submitted: 64
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470769664 unmapped: 67035136 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470769664 unmapped: 67035136 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f136361860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135e9c000 session 0x55f142f10b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f13423a000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 455737344 unmapped: 82067456 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f133e04d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f133e05c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13602fc00 session 0x55f135d1a960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13adde000 session 0x55f136ab5e00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f135d46b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f135cb9c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f136ab5a40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13602fc00 session 0x55f1369e3680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456048640 unmapped: 81756160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9e000/0x0/0x1bfc00000, data 0x33bcf2e/0x35f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456048640 unmapped: 81756160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4915148 data_alloc: 218103808 data_used: 10293248
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456048640 unmapped: 81756160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456048640 unmapped: 81756160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456056832 unmapped: 81747968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9e000/0x0/0x1bfc00000, data 0x33bcf2e/0x35f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 456056832 unmapped: 81747968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9e000/0x0/0x1bfc00000, data 0x33bcf2e/0x35f0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457072640 unmapped: 80732160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4947684 data_alloc: 218103808 data_used: 14323712
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 6.965469837s of 10.133234978s, submitted: 25
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135eb0000 session 0x55f136ab43c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a19000 session 0x55f142f114a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9d000/0x0/0x1bfc00000, data 0x33bcf51/0x35f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4947282 data_alloc: 218103808 data_used: 14327808
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9d000/0x0/0x1bfc00000, data 0x33bcf51/0x35f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457105408 unmapped: 80699392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9d000/0x0/0x1bfc00000, data 0x33bcf51/0x35f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 457310208 unmapped: 80494592 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 79273984 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 79273984 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 79273984 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19fc9d000/0x0/0x1bfc00000, data 0x33bcf51/0x35f1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5017682 data_alloc: 234881024 data_used: 24203264
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 458530816 unmapped: 79273984 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.825445175s of 10.174798012s, submitted: 10
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 460357632 unmapped: 77447168 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462028800 unmapped: 75776000 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f69a000/0x0/0x1bfc00000, data 0x39bef51/0x3bf3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f660000/0x0/0x1bfc00000, data 0x39f9f51/0x3c2e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5134180 data_alloc: 234881024 data_used: 34156544
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465297408 unmapped: 72507392 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f658000/0x0/0x1bfc00000, data 0x3a01f51/0x3c36000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468754432 unmapped: 69050368 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470532096 unmapped: 67272704 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5172302 data_alloc: 234881024 data_used: 34160640
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f142f10f00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f135d463c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 2.601797819s of 10.000380516s, submitted: 136
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468631552 unmapped: 69173248 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468639744 unmapped: 69165056 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f1d5000/0x0/0x1bfc00000, data 0x30a9eef/0x32dd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468639744 unmapped: 69165056 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13648c000 session 0x55f1342fe000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468377600 unmapped: 69427200 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469999616 unmapped: 67805184 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5067736 data_alloc: 234881024 data_used: 24604672
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19f8a3000/0x0/0x1bfc00000, data 0x37afeef/0x39e3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470089728 unmapped: 67715072 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13602fc00 session 0x55f133e11860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f142f114a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465223680 unmapped: 72581120 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1363ba3c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 7800.8 total, 600.0 interval
Cumulative writes: 69K writes, 266K keys, 69K commit groups, 1.0 writes per commit group, ingest: 0.25 GB, 0.03 MB/s
Cumulative WAL: 69K writes, 26K syncs, 2.63 writes per sync, written: 0.25 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2524 writes, 10K keys, 2524 commit groups, 1.0 writes per commit group, ingest: 10.14 MB, 0.02 MB/s
Interval WAL: 2524 writes, 998 syncs, 2.53 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0920000/0x0/0x1bfc00000, data 0x273aecc/0x296d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4879398 data_alloc: 218103808 data_used: 15605760
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.405301094s of 10.917118073s, submitted: 104
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0920000/0x0/0x1bfc00000, data 0x273aecc/0x296d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f142f11860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135045800 session 0x55f1342fd0e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 465231872 unmapped: 72572928 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0920000/0x0/0x1bfc00000, data 0x273aecc/0x296d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4753552 data_alloc: 218103808 data_used: 10395648
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a14f3000/0x0/0x1bfc00000, data 0x1b69e5a/0x1d9a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1369e3680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4746436 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a1517000/0x0/0x1bfc00000, data 0x1b45e5a/0x1d76000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1369e21e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135045800 session 0x55f13446e1e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b8f00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f1342f8960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 462143488 unmapped: 75661312 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 48.122467041s of 50.524135590s, submitted: 29
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136b41400 session 0x55f13704e1e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135045800 session 0x55f136fe3c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369b94a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f142f11c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136a1f400 session 0x55f136868000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4800406 data_alloc: 218103808 data_used: 10285056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f2e000/0x0/0x1bfc00000, data 0x212fe5a/0x2360000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463798272 unmapped: 74006528 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f13648c000 session 0x55f135cb8f00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4805884 data_alloc: 218103808 data_used: 10289152
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4848444 data_alloc: 218103808 data_used: 16191488
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4848444 data_alloc: 218103808 data_used: 16191488
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x1a0f09000/0x0/0x1bfc00000, data 0x2153e7d/0x2385000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1c96f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 463740928 unmapped: 74063872 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.138151169s of 19.535371780s, submitted: 27
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468279296 unmapped: 69525504 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468279296 unmapped: 69525504 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468115456 unmapped: 69689344 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4979316 data_alloc: 218103808 data_used: 17743872
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edf3000/0x0/0x1bfc00000, data 0x30c9e7d/0x32fb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4981584 data_alloc: 218103808 data_used: 17743872
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcf000/0x0/0x1bfc00000, data 0x30ede7d/0x331f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4977592 data_alloc: 218103808 data_used: 17743872
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcf000/0x0/0x1bfc00000, data 0x30ede7d/0x331f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4977592 data_alloc: 218103808 data_used: 17743872
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.841932297s of 19.731657028s, submitted: 129
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcf000/0x0/0x1bfc00000, data 0x30ede7d/0x331f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468328448 unmapped: 69476352 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4977284 data_alloc: 218103808 data_used: 17743872
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcc000/0x0/0x1bfc00000, data 0x30f0e7d/0x3322000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468336640 unmapped: 69468160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468336640 unmapped: 69468160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edcc000/0x0/0x1bfc00000, data 0x30f0e7d/0x3322000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,2])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edc9000/0x0/0x1bfc00000, data 0x30f2e7d/0x3324000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468336640 unmapped: 69468160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468336640 unmapped: 69468160 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edb8000/0x0/0x1bfc00000, data 0x3103e7d/0x3335000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4979836 data_alloc: 218103808 data_used: 17764352
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135045800 session 0x55f134fb5c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.664420128s of 11.280209541s, submitted: 18
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f135fffc00 session 0x55f1369e2780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 ms_handle_reset con 0x55f136027000 session 0x55f1369b3c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 heartbeat osd_stat(store_statfs(0x19edaa000/0x0/0x1bfc00000, data 0x3112e7d/0x3344000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468344832 unmapped: 69459968 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4979704 data_alloc: 218103808 data_used: 17764352
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f136a1f400 session 0x55f1363614a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f135eb0800 session 0x55f136fe3680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f136a19000 session 0x55f1367ee3c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f135045800 session 0x55f135d5a960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484261888 unmapped: 53542912 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 428 ms_handle_reset con 0x55f135fffc00 session 0x55f13a8e7c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 428 heartbeat osd_stat(store_statfs(0x19df53000/0x0/0x1bfc00000, data 0x3f66b48/0x419b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 428 handle_osd_map epochs [429,429], i have 428, src has [1,429]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 428 handle_osd_map epochs [429,429], i have 429, src has [1,429]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 429 heartbeat osd_stat(store_statfs(0x19df53000/0x0/0x1bfc00000, data 0x3f66b48/0x419b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484261888 unmapped: 53542912 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 429 handle_osd_map epochs [430,430], i have 429, src has [1,430]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f136027000 session 0x55f133e034a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484261888 unmapped: 53542912 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484261888 unmapped: 53542912 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f136a1f400 session 0x55f136ab4780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f136a1f400 session 0x55f133e11e00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f135045800 session 0x55f1369b8d20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484302848 unmapped: 53501952 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5174173 data_alloc: 234881024 data_used: 33067008
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 484302848 unmapped: 53501952 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 478642176 unmapped: 59162624 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 430 heartbeat osd_stat(store_statfs(0x19df4a000/0x0/0x1bfc00000, data 0x3f6a4cc/0x41a2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [0,0,0,0,0,0,1,8])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.816694260s of 10.062507629s, submitted: 71
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 430 ms_handle_reset con 0x55f135fffc00 session 0x55f1368692c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468082688 unmapped: 69722112 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468082688 unmapped: 69722112 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468082688 unmapped: 69722112 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4892239 data_alloc: 218103808 data_used: 10305536
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468082688 unmapped: 69722112 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 430 handle_osd_map epochs [431,431], i have 430, src has [1,431]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19f4f5000/0x0/0x1bfc00000, data 0x299d4a9/0x2bd4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468090880 unmapped: 69713920 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468156416 unmapped: 69648384 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4895357 data_alloc: 218103808 data_used: 10313728
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19f517000/0x0/0x1bfc00000, data 0x299efe8/0x2bd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4895357 data_alloc: 218103808 data_used: 10313728
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19f517000/0x0/0x1bfc00000, data 0x299efe8/0x2bd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19f517000/0x0/0x1bfc00000, data 0x299efe8/0x2bd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1db0f9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468189184 unmapped: 69615616 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136027000 session 0x55f13446e1e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a19000 session 0x55f1369e21e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a19000 session 0x55f1369e3680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f135045800 session 0x55f142f11860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.625346184s of 17.649908066s, submitted: 301
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136027000 session 0x55f1369b8780
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f135fffc00 session 0x55f1363ba3c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a1f400 session 0x55f142f114a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a1f400 session 0x55f133e11860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468197376 unmapped: 69607424 heap: 537804800 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f135045800 session 0x55f142f10f00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f135fffc00 session 0x55f136a42b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4897299 data_alloc: 218103808 data_used: 10317824
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136027000 session 0x55f1342ffc20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475201536 unmapped: 72851456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475201536 unmapped: 72851456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 heartbeat osd_stat(store_statfs(0x19cbf0000/0x0/0x1bfc00000, data 0x412504a/0x435e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475201536 unmapped: 72851456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 ms_handle_reset con 0x55f136a19000 session 0x55f1342f8960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 475209728 unmapped: 72843264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 431 handle_osd_map epochs [431,432], i have 431, src has [1,432]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 470548480 unmapped: 77504512 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 432 ms_handle_reset con 0x55f136027000 session 0x55f135d46000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5002273 data_alloc: 218103808 data_used: 13885440
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473186304 unmapped: 74866688 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 432 heartbeat osd_stat(store_statfs(0x19d6c9000/0x0/0x1bfc00000, data 0x32d4c46/0x350d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473186304 unmapped: 74866688 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 432 heartbeat osd_stat(store_statfs(0x19d6c9000/0x0/0x1bfc00000, data 0x32d4c46/0x350d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 432 heartbeat osd_stat(store_statfs(0x19d6c9000/0x0/0x1bfc00000, data 0x32d4c46/0x350d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5070433 data_alloc: 234881024 data_used: 23470080
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 432 handle_osd_map epochs [432,433], i have 432, src has [1,433]
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.400728226s of 10.873732567s, submitted: 89
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5073215 data_alloc: 234881024 data_used: 23470080
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3d000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473210880 unmapped: 74842112 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473251840 unmapped: 74801152 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5096647 data_alloc: 234881024 data_used: 25784320
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5096647 data_alloc: 234881024 data_used: 25784320
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5096647 data_alloc: 234881024 data_used: 25784320
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.438421249s of 20.862281799s, submitted: 29
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5095415 data_alloc: 234881024 data_used: 25780224
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19da3e000/0x0/0x1bfc00000, data 0x32d6785/0x3510000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473415680 unmapped: 74637312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473423872 unmapped: 74629120 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473554944 unmapped: 74498048 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f135045800 session 0x55f136869860
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f135fffc00 session 0x55f133e57a40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466214912 unmapped: 81838080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136a1f400 session 0x55f133e045a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4804505 data_alloc: 218103808 data_used: 10326016
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c5000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c5000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4804505 data_alloc: 218103808 data_used: 10326016
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c5000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 466231296 unmapped: 81821696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.781463623s of 16.977258682s, submitted: 61
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1445e8400 session 0x55f1369e25a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1445e8400 session 0x55f1369b9e00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c5000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4805449 data_alloc: 218103808 data_used: 14127104
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f138f03c00 session 0x55f135d5a960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468697088 unmapped: 79355904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13a125400 session 0x55f13704e1e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13b9b5c00 session 0x55f136a430e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19efe8000/0x0/0x1bfc00000, data 0x1d2e700/0x1f66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468705280 unmapped: 79347712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4867713 data_alloc: 218103808 data_used: 14127104
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468705280 unmapped: 79347712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 79339520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 79339520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 79339520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468713472 unmapped: 79339520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4867713 data_alloc: 218103808 data_used: 14127104
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136b3cc00 session 0x55f136361e00
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f138f03c00 session 0x55f136ab5680
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4867713 data_alloc: 218103808 data_used: 14127104
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13a125400 session 0x55f133e04960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13b9b5c00 session 0x55f134fb43c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468721664 unmapped: 79331328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 468828160 unmapped: 79224832 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4907393 data_alloc: 234881024 data_used: 19374080
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4907393 data_alloc: 234881024 data_used: 19374080
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469057536 unmapped: 78995456 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469065728 unmapped: 78987264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469065728 unmapped: 78987264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19eae5000/0x0/0x1bfc00000, data 0x2231700/0x2469000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469065728 unmapped: 78987264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 469065728 unmapped: 78987264 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.486818314s of 31.777320862s, submitted: 28
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4948785 data_alloc: 234881024 data_used: 19398656
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471539712 unmapped: 76513280 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472588288 unmapped: 75464704 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2d0000/0x0/0x1bfc00000, data 0x2a46700/0x2c7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2d0000/0x0/0x1bfc00000, data 0x2a46700/0x2c7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4972213 data_alloc: 234881024 data_used: 19816448
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473849856 unmapped: 74203136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2d0000/0x0/0x1bfc00000, data 0x2a46700/0x2c7e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4971817 data_alloc: 234881024 data_used: 19816448
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2af000/0x0/0x1bfc00000, data 0x2a67700/0x2c9f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4971817 data_alloc: 234881024 data_used: 19816448
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e2af000/0x0/0x1bfc00000, data 0x2a67700/0x2c9f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.674192429s of 20.244110107s, submitted: 61
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4972033 data_alloc: 234881024 data_used: 19816448
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e29c000/0x0/0x1bfc00000, data 0x2a7a700/0x2cb2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472801280 unmapped: 75251712 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472809472 unmapped: 75243520 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4974401 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1445e8400 session 0x55f13a8e6b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1365ce800 session 0x55f1342f83c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4974401 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4974401 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4974401 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.431100845s of 21.130962372s, submitted: 10
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f138f03c00 session 0x55f135d1a960
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472817664 unmapped: 75235328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13a125400 session 0x55f136fe23c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f13b9b5c00 session 0x55f135d392c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4973345 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1445e8400 session 0x55f135d46b40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472825856 unmapped: 75227136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 33.194938660s of 33.254158020s, submitted: 4
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e28d000/0x0/0x1bfc00000, data 0x2a89700/0x2cc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472973312 unmapped: 75079680 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4976261 data_alloc: 234881024 data_used: 19845120
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f135e9bc00 session 0x55f1342fed20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472973312 unmapped: 75079680 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472973312 unmapped: 75079680 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472973312 unmapped: 75079680 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4978341 data_alloc: 234881024 data_used: 20037632
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4978341 data_alloc: 234881024 data_used: 20037632
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472981504 unmapped: 75071488 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472989696 unmapped: 75063296 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.098130226s of 15.642007828s, submitted: 1
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4989413 data_alloc: 234881024 data_used: 20328448
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4993893 data_alloc: 234881024 data_used: 20619264
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f134aeac00 session 0x55f1369b85a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136a0c800 session 0x55f1369e3c20
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136a0a800 session 0x55f1368683c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472997888 unmapped: 75055104 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4993893 data_alloc: 234881024 data_used: 20619264
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.028177261s of 15.051651955s, submitted: 2
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4993893 data_alloc: 234881024 data_used: 20619264
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4993893 data_alloc: 234881024 data_used: 20619264
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473006080 unmapped: 75046912 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f135e9bc00 session 0x55f1367efa40
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4998053 data_alloc: 234881024 data_used: 21549056
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f138f03c00 session 0x55f135ccc5a0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.078834534s of 11.086947441s, submitted: 2
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19e269000/0x0/0x1bfc00000, data 0x2aad700/0x2ce5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f136a0c800 session 0x55f135d46000
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 473014272 unmapped: 75038720 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4991385 data_alloc: 234881024 data_used: 21438464
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f1454e9000 session 0x55f134fb43c0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472334336 unmapped: 75718656 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472350720 unmapped: 75702272 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472358912 unmapped: 75694080 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472367104 unmapped: 75685888 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472367104 unmapped: 75685888 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472367104 unmapped: 75685888 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472367104 unmapped: 75685888 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 75677696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 75677696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472375296 unmapped: 75677696 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472383488 unmapped: 75669504 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472391680 unmapped: 75661312 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472399872 unmapped: 75653120 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472399872 unmapped: 75653120 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472399872 unmapped: 75653120 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472408064 unmapped: 75644928 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472416256 unmapped: 75636736 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472416256 unmapped: 75636736 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472416256 unmapped: 75636736 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472416256 unmapped: 75636736 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472424448 unmapped: 75628544 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472424448 unmapped: 75628544 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472424448 unmapped: 75628544 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472440832 unmapped: 75612160 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472449024 unmapped: 75603968 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472449024 unmapped: 75603968 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 ms_handle_reset con 0x55f139285000 session 0x55f1367ef0e0
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472449024 unmapped: 75603968 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472449024 unmapped: 75603968 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472457216 unmapped: 75595776 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472457216 unmapped: 75595776 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472457216 unmapped: 75595776 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472457216 unmapped: 75595776 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472465408 unmapped: 75587584 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472465408 unmapped: 75587584 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472473600 unmapped: 75579392 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472481792 unmapped: 75571200 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472481792 unmapped: 75571200 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472481792 unmapped: 75571200 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472489984 unmapped: 75563008 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472506368 unmapped: 75546624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472514560 unmapped: 75538432 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472514560 unmapped: 75538432 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'config diff' '{prefix=config diff}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472342528 unmapped: 75710464 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'config show' '{prefix=config show}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472309760 unmapped: 75743232 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 472006656 unmapped: 76046336 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'log dump' '{prefix=log dump}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471572480 unmapped: 76480512 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'perf dump' '{prefix=perf dump}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'perf schema' '{prefix=perf schema}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471457792 unmapped: 76595200 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471457792 unmapped: 76595200 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471457792 unmapped: 76595200 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471474176 unmapped: 76578816 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471474176 unmapped: 76578816 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471474176 unmapped: 76578816 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471474176 unmapped: 76578816 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471474176 unmapped: 76578816 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471474176 unmapped: 76578816 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471474176 unmapped: 76578816 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471482368 unmapped: 76570624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471482368 unmapped: 76570624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471482368 unmapped: 76570624 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471490560 unmapped: 76562432 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471490560 unmapped: 76562432 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471490560 unmapped: 76562432 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471498752 unmapped: 76554240 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471498752 unmapped: 76554240 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471506944 unmapped: 76546048 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471506944 unmapped: 76546048 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471506944 unmapped: 76546048 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471515136 unmapped: 76537856 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471515136 unmapped: 76537856 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471515136 unmapped: 76537856 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471515136 unmapped: 76537856 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471515136 unmapped: 76537856 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471523328 unmapped: 76529664 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471523328 unmapped: 76529664 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471523328 unmapped: 76529664 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471523328 unmapped: 76529664 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471523328 unmapped: 76529664 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471531520 unmapped: 76521472 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471531520 unmapped: 76521472 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471547904 unmapped: 76505088 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471547904 unmapped: 76505088 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471547904 unmapped: 76505088 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471547904 unmapped: 76505088 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471556096 unmapped: 76496896 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471556096 unmapped: 76496896 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471564288 unmapped: 76488704 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471564288 unmapped: 76488704 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471564288 unmapped: 76488704 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471572480 unmapped: 76480512 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471580672 unmapped: 76472320 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471580672 unmapped: 76472320 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471580672 unmapped: 76472320 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471580672 unmapped: 76472320 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471580672 unmapped: 76472320 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471580672 unmapped: 76472320 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471580672 unmapped: 76472320 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471588864 unmapped: 76464128 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471588864 unmapped: 76464128 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471597056 unmapped: 76455936 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471597056 unmapped: 76455936 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471597056 unmapped: 76455936 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471597056 unmapped: 76455936 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471597056 unmapped: 76455936 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 76447744 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 76447744 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 76447744 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471605248 unmapped: 76447744 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471613440 unmapped: 76439552 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471613440 unmapped: 76439552 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471621632 unmapped: 76431360 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471621632 unmapped: 76431360 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471621632 unmapped: 76431360 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471621632 unmapped: 76431360 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471621632 unmapped: 76431360 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471621632 unmapped: 76431360 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471621632 unmapped: 76431360 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471629824 unmapped: 76423168 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471646208 unmapped: 76406784 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471646208 unmapped: 76406784 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471646208 unmapped: 76406784 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 76398592 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 76398592 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471654400 unmapped: 76398592 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471662592 unmapped: 76390400 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471662592 unmapped: 76390400 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471662592 unmapped: 76390400 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471662592 unmapped: 76390400 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471662592 unmapped: 76390400 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471670784 unmapped: 76382208 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471678976 unmapped: 76374016 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471678976 unmapped: 76374016 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471678976 unmapped: 76374016 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471678976 unmapped: 76374016 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471678976 unmapped: 76374016 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471678976 unmapped: 76374016 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471678976 unmapped: 76374016 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471687168 unmapped: 76365824 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471687168 unmapped: 76365824 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471687168 unmapped: 76365824 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471695360 unmapped: 76357632 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471695360 unmapped: 76357632 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471695360 unmapped: 76357632 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471695360 unmapped: 76357632 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471695360 unmapped: 76357632 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471703552 unmapped: 76349440 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471703552 unmapped: 76349440 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471703552 unmapped: 76349440 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471711744 unmapped: 76341248 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471711744 unmapped: 76341248 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471719936 unmapped: 76333056 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471719936 unmapped: 76333056 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471719936 unmapped: 76333056 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471719936 unmapped: 76333056 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471719936 unmapped: 76333056 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471719936 unmapped: 76333056 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471719936 unmapped: 76333056 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471719936 unmapped: 76333056 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471728128 unmapped: 76324864 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471728128 unmapped: 76324864 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471728128 unmapped: 76324864 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471744512 unmapped: 76308480 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471744512 unmapped: 76308480 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471744512 unmapped: 76308480 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471744512 unmapped: 76308480 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471744512 unmapped: 76308480 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471752704 unmapped: 76300288 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471752704 unmapped: 76300288 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471752704 unmapped: 76300288 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471752704 unmapped: 76300288 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471752704 unmapped: 76300288 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471760896 unmapped: 76292096 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471760896 unmapped: 76292096 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471769088 unmapped: 76283904 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471793664 unmapped: 76259328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471793664 unmapped: 76259328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471793664 unmapped: 76259328 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471801856 unmapped: 76251136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471801856 unmapped: 76251136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471801856 unmapped: 76251136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471801856 unmapped: 76251136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471801856 unmapped: 76251136 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471810048 unmapped: 76242944 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471810048 unmapped: 76242944 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471818240 unmapped: 76234752 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471818240 unmapped: 76234752 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471818240 unmapped: 76234752 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471826432 unmapped: 76226560 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471826432 unmapped: 76226560 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.8 total, 600.0 interval#012Cumulative writes: 71K writes, 271K keys, 71K commit groups, 1.0 writes per commit group, ingest: 0.26 GB, 0.03 MB/s#012Cumulative WAL: 71K writes, 27K syncs, 2.62 writes per sync, written: 0.26 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1522 writes, 5398 keys, 1522 commit groups, 1.0 writes per commit group, ingest: 5.34 MB, 0.01 MB/s#012Interval WAL: 1522 writes, 642 syncs, 2.37 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471826432 unmapped: 76226560 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471834624 unmapped: 76218368 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471834624 unmapped: 76218368 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471834624 unmapped: 76218368 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471834624 unmapped: 76218368 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: osd.1 433 heartbeat osd_stat(store_statfs(0x19f1c6000/0x0/0x1bfc00000, data 0x1b50700/0x1d88000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x1ecaf9c6), peers [0,2] op hist [])
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471834624 unmapped: 76218368 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: bluestore.MempoolThread(0x55f1328dfb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 4813041 data_alloc: 218103808 data_used: 13602816
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471834624 unmapped: 76218368 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'config diff' '{prefix=config diff}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471670784 unmapped: 76382208 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'config show' '{prefix=config show}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471932928 unmapped: 76120064 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: prioritycache tune_memory target: 4294967296 mapped: 471924736 unmapped: 76128256 heap: 548052992 old mem: 2845415833 new mem: 2845415833
Jan 23 06:21:28 np0005593233 ceph-osd[78880]: do_command 'log dump' '{prefix=log dump}'
Jan 23 06:21:28 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:28 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:28 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:28.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:28 np0005593233 nova_compute[222017]: 2026-01-23 11:21:28.378 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:28 np0005593233 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 06:21:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 23 06:21:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3893322876' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 06:21:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 23 06:21:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/595298638' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 06:21:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:21:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:21:29 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:21:29 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 23 06:21:29 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/602448918' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 06:21:29 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:29 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:29 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:29.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:30 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:30 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:30 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:30.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:30 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 23 06:21:30 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/142283704' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 06:21:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 23 06:21:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3908229302' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 06:21:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 23 06:21:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2965600401' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 06:21:31 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 23 06:21:31 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3451630365' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 06:21:31 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:31 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:31 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:31.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:31 np0005593233 nova_compute[222017]: 2026-01-23 11:21:31.977 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 23 06:21:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3844757475' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 06:21:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 23 06:21:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2437973649' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 06:21:32 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:32 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:32 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:32.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 23 06:21:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/601124630' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 06:21:32 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 23 06:21:32 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2452185189' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 06:21:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 23 06:21:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3544505127' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 06:21:33 np0005593233 systemd[1]: Starting Hostname Service...
Jan 23 06:21:33 np0005593233 systemd[1]: Started Hostname Service.
Jan 23 06:21:33 np0005593233 nova_compute[222017]: 2026-01-23 11:21:33.382 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Jan 23 06:21:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2526477133' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 06:21:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Jan 23 06:21:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2881115186' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 06:21:33 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 23 06:21:33 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/531467881' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 06:21:33 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:33 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:33 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:33.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 23 06:21:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2328928004' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 06:21:34 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:34 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:34 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:34.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Jan 23 06:21:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/574087736' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 06:21:34 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 23 06:21:34 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/298541437' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 06:21:35 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Jan 23 06:21:35 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1369147621' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 06:21:35 np0005593233 podman[328138]: 2026-01-23 11:21:35.119295463 +0000 UTC m=+0.122895394 container health_status f6f2d47714972dd494f25d724aa0ec282ba134ba1833dd6ee18b1793e99379fe (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d-f957eec682f883f79494f9c719c882eab609710a6b2112ac63f785a540dedb3d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 06:21:35 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:35 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:35 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:35.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:36 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:36 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:36 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:36.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:36 np0005593233 nova_compute[222017]: 2026-01-23 11:21:36.980 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/747085215' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 23 06:21:37 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3300384213' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 06:21:37 np0005593233 nova_compute[222017]: 2026-01-23 11:21:37.726 222021 DEBUG oslo_service.periodic_task [None req-02f6563a-04b0-4955-8876-da9cc7aa50a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:21:37 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:37 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:37 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:37.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:38 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 23 06:21:38 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2627670380' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 06:21:38 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:38 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 06:21:38 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:38.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 06:21:38 np0005593233 nova_compute[222017]: 2026-01-23 11:21:38.427 222021 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 06:21:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 06:21:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 06:21:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 06:21:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:21:39 np0005593233 ceph-mon[81574]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:21:39 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Jan 23 06:21:39 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4276581310' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 06:21:39 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:39 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:39 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:39.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:40 np0005593233 radosgw[84337]: ====== starting new request req=0x7fcd5c70e6f0 =====
Jan 23 06:21:40 np0005593233 radosgw[84337]: ====== req done req=0x7fcd5c70e6f0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 06:21:40 np0005593233 radosgw[84337]: beast: 0x7fcd5c70e6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:40.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 06:21:40 np0005593233 ceph-mon[81574]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Jan 23 06:21:40 np0005593233 ceph-mon[81574]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3328433218' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
